Version: v1, Published online: 1998
Retrieved July 21, 2019, from https://www.rep.routledge.com/articles/thematic/information-theory/v-1
Information theory was established in 1948 by Claude Shannon as a statistical analysis of factors pertaining to the transmission of messages through communication channels. Among basic concepts defined within the theory are information (the amount of uncertainty removed by the occurrence of an event), entropy (the average amount of information represented by events at the source of a channel), and equivocation (the ‘noise’ that impedes faithful transmission of a message through a channel). Information theory has proved essential to the development of space probes, high-speed computing machinery and modern communication systems.
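The three quantities named above have standard formulas: the information of an event of probability p is -log2(p) bits, entropy is the probability-weighted average of that quantity over a source's events, and equivocation is the uncertainty about the input that remains after the output is observed. A minimal Python sketch (the function names and the binary-symmetric-channel example are illustrative choices, not Shannon's own notation):

```python
import math

def surprisal(p):
    # Information of an event with probability p: the amount of
    # uncertainty removed by its occurrence, in bits.
    return -math.log2(p)

def entropy(dist):
    # Average information per event at the source (bits per symbol).
    return sum(p * surprisal(p) for p in dist if p > 0)

def equivocation_bsc(e):
    # Equivocation H(X|Y) for a binary symmetric channel that flips
    # each transmitted bit with probability e, assuming a uniform
    # (fair-coin) input: it equals the binary entropy of the error rate.
    return entropy([e, 1 - e])

print(surprisal(0.5))         # a fair coin flip carries 1 bit
print(entropy([0.5, 0.5]))    # source entropy of a fair coin: 1 bit
print(equivocation_bsc(0.0))  # noiseless channel: zero equivocation
```

At e = 0 the channel is noiseless and the equivocation vanishes; at e = 0.5 it reaches its maximum of 1 bit, and the output reveals nothing about the input.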
The information studied by Shannon is sharply distinct from information in the sense of knowledge or of propositional content. It is also distinct from most uses of the term in the popular press (‘information retrieval’, ‘information processing’, ‘information highway’, and so on). While Shannon’s work has strongly influenced academic psychology and philosophy, its reception in these disciplines has been largely impressionistic. A major problem for contemporary philosophy is to relate the statistical conceptions of information theory to information in the semantic sense of knowledge and content.
Sayre, Kenneth M. Information theory, 1998, doi:10.4324/9780415249126-Q051-1. Routledge Encyclopedia of Philosophy, Taylor and Francis, https://www.rep.routledge.com/articles/thematic/information-theory/v-1.
Copyright © 1998-2019 Routledge.