
Information theory

DOI: 10.4324/9780415249126-Q051-1
Version: v1, Published online: 1998
Retrieved March 29, 2024, from https://www.rep.routledge.com/articles/thematic/information-theory/v-1

Article Summary

Information theory was established in 1948 by Claude Shannon as a statistical analysis of factors pertaining to the transmission of messages through communication channels. Among basic concepts defined within the theory are information (the amount of uncertainty removed by the occurrence of an event), entropy (the average amount of information represented by events at the source of a channel), and equivocation (the ‘noise’ that impedes faithful transmission of a message through a channel). Information theory has proved essential to the development of space probes, high-speed computing machinery and modern communication systems.
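In standard notation (a conventional formulation of these quantities, not given in the summary itself), the self-information of an event x with probability p(x), the entropy of a source X, and the equivocation of X given the received signal Y may be written:

\[
I(x) = -\log_2 p(x), \qquad
H(X) = -\sum_x p(x)\,\log_2 p(x), \qquad
H(X \mid Y) = -\sum_{x,y} p(x,y)\,\log_2 p(x \mid y).
\]

On this formulation, the amount of information transmitted faithfully through the channel is the entropy of the source less the equivocation, H(X) − H(X|Y).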

The information studied by Shannon is sharply distinct from information in the sense of knowledge or of propositional content. It is also distinct from most uses of the term in the popular press (‘information retrieval’, ‘information processing’, ‘information highway’, and so on). While Shannon’s work has strongly influenced academic psychology and philosophy, its reception in these disciplines has been largely impressionistic. A major problem for contemporary philosophy is to relate the statistical conceptions of information theory to information in the semantic sense of knowledge and content.

Citing this article:
Sayre, Kenneth M. Information theory, 1998, doi:10.4324/9780415249126-Q051-1. Routledge Encyclopedia of Philosophy, Taylor and Francis, https://www.rep.routledge.com/articles/thematic/information-theory/v-1.
