Breaking Math Podcast

24: Language and Entropy (Information Theory in Language)

March 7, 2018

Information theory was founded in 1948 by Claude Shannon, and it describes, both qualitatively and quantitatively, the limits and processes involved in communication. Roughly speaking, when two entities communicate, there is a message, a medium, confusion, encoding, and decoding, and information is transferred between them. The amount of information that can be transmitted may be increased or decreased by manipulating any of these variables. One of the practical, and original, applications of information theory is the modeling of language. So what is entropy? How can we say language has it? And what structures within language, seen through the lens of information theory, reveal deep insights about the nature of language itself?
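
For a concrete sense of what it means for language to "have" entropy, here is a minimal Python sketch (not from the episode; the function name and sample string are purely illustrative) that estimates the entropy of a short text from its empirical letter frequencies, using Shannon's formula H = -Σ p·log₂(p).

```python
from collections import Counter
from math import log2

def char_entropy(text: str) -> float:
    """Empirical entropy of `text`, in bits per character."""
    counts = Counter(text)
    total = len(text)
    # Sum -p * log2(p) over the observed character frequencies.
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = "information theory was founded in 1948 by claude shannon"
print(f"{char_entropy(sample):.2f} bits per character")
# English text comes out well below log2(27) ≈ 4.75 bits, the maximum for
# 26 letters plus space if every symbol were equally likely; that gap is
# part of what makes language redundant, and therefore compressible.
```

This is only a rough illustration: real estimates of the entropy of English (Shannon's own experiments included) account for longer-range structure between letters and words, not just single-character frequencies.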