Mind, philosophy of

DOI: 10.4324/9780415249126-V038-1
Version: v1, Published online: 1998
Retrieved March 28, 2024, from https://www.rep.routledge.com/articles/overview/mind-philosophy-of/v-1

1. Functionalism and the computational theory of mind

These developments led to the emergence in the 1970s of the loose federation of disciplines called ‘cognitive science’, which brought together research from, for example, psychology, linguistics, computer science, neuroscience and a number of sub-areas of philosophy, such as logic, the philosophy of language, and action theory. In philosophy of mind, these developments led to Functionalism, according to which mental states are to be characterized in terms of the relations they bear among themselves and to inputs and outputs, for example, mediating perception and action in the way that belief and desire characteristically seem to do. The traditional problem of Other minds then became an exercise in inferring from behaviour to the nature of internal causal intermediaries.

This focus on functional organization brought with it the possibility of multiple realizations: if all that is essential to mental states is the roles they play in a system, then, in principle, mental states, and so minds, could be composed of (or ‘realized’ by) different substances: some minds might be carbon-based like ours, some might be computer ‘brains’ in robots of the future, and some might be silicon-based, as in some science fiction stories about ‘Martians’. These differences might also lead minds to be organized in different ways at different levels, an idea that has encouraged the co-existence of the many different disciplines of cognitive science, each often studying the mind at a different level of explanation.
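By way of illustration, the idea of multiple realization can be made concrete with a small sketch in which two toy systems, built from deliberately different ingredients, nonetheless play the same functional roles in mediating between the same inputs and outputs. The class names, the ‘desire’ role and the stimulus below are assumptions chosen purely for the example, not part of any particular theory.

# Illustrative sketch of multiple realization: two systems built from
# different 'stuff' that nonetheless play the same functional roles.
# All names and the toy 'desire' role are assumptions for this example only.

class CarbonRealization:
    """One substrate: the 'desire' state is kept as a simple attribute."""
    def __init__(self):
        self.wants_food = False          # the 'desire' role

    def perceive(self, stimulus):
        if stimulus == "empty stomach":
            self.wants_food = True       # perception fixes the desire state

    def act(self):
        # the desire state plays its role by producing behaviour
        return "seek food" if self.wants_food else "rest"


class SiliconRealization:
    """A different substrate: the same role is carried by a register-like dictionary."""
    def __init__(self):
        self.registers = {"hunger_flag": 0}

    def perceive(self, stimulus):
        if stimulus == "empty stomach":
            self.registers["hunger_flag"] = 1

    def act(self):
        return "seek food" if self.registers["hunger_flag"] else "rest"


# Functionally, the two systems are indistinguishable from the outside:
for mind in (CarbonRealization(), SiliconRealization()):
    mind.perceive("empty stomach")
    print(type(mind).__name__, "->", mind.act())   # both print "seek food"

What matters, on the functionalist view, is not what each system is made of but the role its internal state plays: the two realizations differ internally, yet in each a state is caused by the same input and causes the same output.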

Functionalism has played an important role in debates over the metaphysics of mind. Some see it as a way of avoiding Dualism and arguing for a version of materialism known as the identity theory of mind (see Mind, identity theory of). They argue that if mental states play distinctive functional roles, to identify mental states we simply need to find the states that play those roles, which are, almost certainly, various states of the brain. Here we must distinguish identifying mental state tokens with brain state tokens, from identifying mental state types with brain state types (see Type/token distinction). Many argue that multiple realizability shows it would be a mistake to identify any particular kind or type of mental phenomenon with a specific type of physical phenomenon (for example, depression with the depletion of norepinephrine in a certain area of the brain). For if depression is a multiply realized functional state, then it will not be identical with any particular type of physical phenomenon: different instances, or tokens, of depression might be identical with tokens of quite different types of physical phenomena (norepinephrine depletion in humans, too little silicon activation in a Martian). Indeed, a functionalist could allow (although few take this seriously) that there might be ghosts who realize the right functional organization in some special dualistic substance. However, some identity theorists insist that at least some mental state types – they often focus on states like pain and the taste of pineapple, states with Qualia (see also the discussion below) – ought to be identified with particular brain state types, in somewhat the way that lightning is identified with electrical discharge, or water with H2O. They typically think of these identifications as necessary a posteriori.

An important example of a functionalist theory, one that has come to dominate much research in cognitive science, is the computational theory of mind (see Mind, computational theories of), according to which mental states are either identified with, or closely linked to, the computational states of a computer. There have been three main versions of this theory, corresponding to three main proposals about the mind’s Cognitive architecture. According to the ‘classical’ theory, particularly associated with Jerry Fodor, the computations take place over representations that possess the kind of logical, syntactic structure captured in standard logical form: representations in a so-called Language of thought, encoded in our brains. A second proposal, sometimes inspired by F.P. Ramsey’s view that beliefs are maps by which we steer (see Belief), emphasizes the possible role in reasoning of maps and mental Imagery. A third, recently much-discussed proposal is Connectionism, which denies that there are any structured representations at all: the mind/brain consists rather of a vast network of nodes whose different and variable excitation levels explain intelligent Learning. This approach has aroused interest especially among those wary of positing much ‘hidden’ mental structure not evident in ordinary behaviour (see Ludwig Wittgenstein §3 and Daniel Dennett).
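The connectionist picture of learning can likewise be illustrated with a minimal sketch: a single artificial node whose connection strengths are gradually adjusted by an error-correction rule until its output matches a target. The task (the logical OR of two binary inputs), the learning rate and the number of training passes below are assumptions made only for the example.

# Illustrative sketch of connectionist-style learning: a single node whose
# connection weights are adjusted by an error-correction rule. The task,
# learning rate and number of epochs are assumptions for this example only.

def step(x):
    """Threshold activation: the node 'fires' (1) or stays quiet (0)."""
    return 1 if x >= 0 else 0

def train_node(examples, epochs=20, rate=0.1):
    """Adjust connection strengths so the node's output matches the targets."""
    weights = [0.0, 0.0]   # connection strengths from the two inputs
    bias = 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            output = step(weights[0] * x1 + weights[1] * x2 + bias)
            error = target - output
            # Error-correction: strengthen or weaken each connection
            weights[0] += rate * error * x1
            weights[1] += rate * error * x2
            bias += rate * error
    return weights, bias

# Toy task: the logical OR of two binary inputs.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_node(data)
for (x1, x2), target in data:
    print((x1, x2), "->", step(w[0] * x1 + w[1] * x2 + b), "target:", target)

What is learned is carried entirely by the adjusted connection weights; no structured, sentence-like representation is stored anywhere, which is precisely the feature connectionists emphasize.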

The areas that lend themselves most naturally to a computational theory are those associated with logic, common sense and practical reasoning, and natural language syntax (see Common-sense reasoning, theories of; Rationality, practical; Syntax); and research on these topics in psychology and Artificial intelligence has become deeply intertwined with philosophy (see Rationality of belief; Semantics; Language, philosophy of).

A particularly fruitful application of computational theories has been to Vision. Early work in Gestalt psychology uncovered a number of striking perceptual illusions that demonstrated ways in which the mind structures perceptual experience, and the pioneering work of the psychologist David Marr suggested that we might capture these structuring effects computationally. The idea that perception was highly cognitive, along with the functionalist picture that specifies a mental state by its place in a network, led many to holistic conceptions of mind and meaning, according to which parts of a person’s thought and experience cannot be understood apart from the person’s entire cognitive system (see Holism: mental and semantic; Semantics, conceptual role).
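A minimal sketch may indicate what capturing such structuring effects computationally can amount to in practice: an elementary edge detector that recovers a boundary in an image purely by computing local differences in intensity, broadly in the spirit of the early stages of computational theories of vision. The toy image and the threshold below are assumptions made only for the example.

# Illustrative sketch of recovering visual structure by computation:
# a toy edge detector that marks pixels where intensity changes sharply.
# The image values and threshold are assumptions for this example only.

image = [
    [0, 0, 0, 9, 9, 9],
    [0, 0, 0, 9, 9, 9],
    [0, 0, 0, 9, 9, 9],
    [0, 0, 0, 9, 9, 9],
]

def detect_edges(img, threshold=4):
    """Mark a pixel as an edge if intensity jumps between it and a neighbour."""
    rows, cols = len(img), len(img[0])
    edges = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            dx = img[r][c + 1] - img[r][c] if c + 1 < cols else 0
            dy = img[r + 1][c] - img[r][c] if r + 1 < rows else 0
            if abs(dx) > threshold or abs(dy) > threshold:
                edges[r][c] = 1
    return edges

for row in detect_edges(image):
    print(row)   # a vertical edge is recovered where the intensity jumps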

However, this view has recently been challenged by the work of Jerry Fodor. He has argued that perceptual systems are ‘modules’, whose processing is ‘informationally encapsulated’ and hence isolatable from the effects of the states of the central cognitive system (see Modularity of mind). He has also proposed accounts of meaning that treat it as a local (or ‘atomistic’) property to be understood in terms of certain kinds of causal dependence between states of the brain and the world (see Semantics, informational). Others have argued further that Perception, although contentful, is also importantly non-conceptual, as when one sees a square shape as a diamond but is unable to say wherein the essential difference between a square and a diamond shape consists (see Content, non-conceptual).

Citing this article:
Jackson, Frank and Georges Rey. Functionalism and the computational theory of mind. Mind, philosophy of, 1998, doi:10.4324/9780415249126-V038-1. Routledge Encyclopedia of Philosophy, Taylor and Francis, https://www.rep.routledge.com/articles/overview/mind-philosophy-of/v-1/sections/functionalism-and-the-computational-theory-of-mind.
