Version: v1, Published online: 1998
The mathematical theory of information (also called communication theory) defines a quantity called mutual information that exists between a source, s, and receiver, r. Mutual information is a statistical construct, a quantity defined in terms of conditional probabilities between the events occurring at r and s. If what happens at r depends on what happens at s to some degree, then there is a communication ‘channel’ between r and s, and mutual information at r about s. If, on the other hand, the events at two points are statistically independent, there is zero mutual information.
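The quantity described above is standardly written I(S; R) = Σ p(s, r) · log₂[ p(s, r) / (p(s) p(r)) ], summed over all joint states; the formula and the code below are an illustrative sketch added here, not part of the original article. It shows the two cases the passage contrasts: a noiseless channel, where events at r perfectly track events at s, and statistical independence, where the mutual information is zero.

```python
import math

def mutual_information(joint):
    """Mutual information I(S;R) in bits, given a joint distribution
    p(s, r) as a 2-D list (rows: source states s, columns: receiver
    states r). All entries must be non-negative and sum to 1."""
    p_s = [sum(row) for row in joint]           # marginal p(s)
    p_r = [sum(col) for col in zip(*joint)]     # marginal p(r)
    info = 0.0
    for i, row in enumerate(joint):
        for j, p_sr in enumerate(row):
            if p_sr > 0:  # terms with zero probability contribute nothing
                info += p_sr * math.log2(p_sr / (p_s[i] * p_r[j]))
    return info

# A noiseless channel: the receiver state mirrors the source state exactly.
perfect = [[0.5, 0.0],
           [0.0, 0.5]]

# Statistical independence: what happens at r does not depend on s at all.
independent = [[0.25, 0.25],
               [0.25, 0.25]]

print(mutual_information(perfect))      # 1.0 bit: r carries full information about s
print(mutual_information(independent))  # 0.0 bits: no channel between s and r
```

Intermediate cases (a noisy channel) yield values between these extremes, which is why the passage speaks of events at r depending on events at s "to some degree".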
Philosophers and psychologists are attracted to information theory because of its potential as a useful tool in describing an organism’s cognitive relations to the world. The attractions are especially great for those who seek a naturalistic account of knowledge, an account that avoids normative – and, therefore, scientifically unusable – ideas such as rational warrant, sufficient reason and adequate justification. According to this approach, philosophically problematic notions like evidence, knowledge, recognition and perception – perhaps even meaning – can be understood in communication terms. Perceptual knowledge, for instance, might best be rendered in terms of a brain (r) receiving mutual information about a worldly source (s) via sensory channels. When incoming signals carry appropriate information, suitably equipped brains ‘decode’ these signals, extract information and thereby come to know what is happening in the outside world. Perception becomes information-produced belief.
Dretske, Fred. "Information Theory and Epistemology." Routledge Encyclopedia of Philosophy, Taylor and Francis, 1998, doi:10.4324/9780415249126-P026-1. https://www.rep.routledge.com/articles/thematic/information-theory-and-epistemology/v-1.