
Information Theory and Language

About Information Theory and Language

"Information Theory and Language" is a collection of 12 articles that appeared recently in Entropy as part of a Special Issue of the same title. These contributions represent state-of-the-art interdisciplinary research at the interface of information theory and language studies. They concern in particular: ¿ Applications of information theoretic concepts such as Shannon and Rényi entropies, mutual information, and rate-distortion curves to the research of natural languages; ¿ Mathematical work in information theory inspired by natural language phenomena, such as deriving moments of subword complexity or proving continuity of mutual information; ¿ Empirical and theoretical investigation of quantitative laws of natural language such as Zipf's law, Herdan's law, and Menzerath-Altmann's law; ¿ Empirical and theoretical investigations of statistical language models, including recently developed neural language models, their entropies, and other parameters; ¿ Standardizing language resources for statistical investigation of natural language; ¿ Other topics concerning semantics, syntax, and critical phenomena. Whereas the traditional divide between probabilistic and formal approaches to human language, cultivated in the disjoint scholarships of natural sciences and humanities, has been blurred in recent years, this book can contribute to pointing out potential areas of future research cross-fertilization.

  • Language: English
  • ISBN: 9783039360260
  • Format: Hardcover
  • Pages: 244
  • Published: August 3, 2020
  • Dimensions: 244 x 170 x 21 mm
  • Weight: 726 g
