Twentieth-century philosophy

DOI: 10.4324/0123456789-DD3596-1
Version: v1, Published online: 2017

5. Mid-century critiques and radicalization of earlier pictures

Beginning around 1950, a number of prominent philosophers developed far-reaching critiques of what they saw as the foundationalist, structuralist, and idealizing assumptions of earlier analytic projects. An important example of this kind of critique can be seen in the development of W. V. Quine’s critical views about the programmatic assumptions of his friend and mentor, Rudolf Carnap. In the influential 1951 article “Two Dogmas of Empiricism,” Quine charged Carnap and logical positivism with commitment to two programmatic claims that were in fact dogmas, unsupported by any good evidence and unlikely to yield philosophical gains. One of these was the “reductionism” of logical positivism’s attempt to analyze each empirical sentence uniquely into separate components of empirical and logical content. The other was the assumption of a distinction between “analytic” truths – those held to be true simply in virtue of definitions, meanings, or conceptual relationships available to a priori analysis – and “synthetic” ones confirmed by empirical evidence. Within the actual empirical description of the use of a natural language, Quine argued, there was no way to maintain this distinction, and thus no support for either dogma.

Quine’s argument was broadly seen as challenging the very possibility of a philosophical analysis of meaning in a priori or purely conceptual terms. This theme was echoed by a number of broadly similar critical arguments in the philosophy of mind and perception. For instance, Wilfrid Sellars made a parallel argument in his influential 1956 essay “Empiricism and the Philosophy of Mind,” challenging what he called the “Myth of the Given”: the myth, that is, of a separable variety of immediately given or directly experienced contents of perception (such as sense-data or basic sensory contents) which would nevertheless play a role in the justification of empirical judgments. Quine then developed his own replacement for the project of semantic analysis in his 1960 Word and Object. Here he envisaged, in particular, the challenge of “radical translation” – the translation of a language at first wholly opaque to the interpreter, by means of the investigation of observable responses and other intersubjectively accessible data alone – and argued that translationally accessible meaning would be deeply indeterminate under these evidentiary conditions. This project of the translational analysis of meaning was subsequently taken up more positively by Donald Davidson, who envisioned adding to the interpretation of a particular language the structurally revealing constraints that Tarski had suggested for the theorization of truth (see Meaning and truth). The suggestion led Davidson and others to imagine the possibility of a comprehensive “semantics of natural language,” partially analogous to Noam Chomsky’s development of universal grammar, but focused directly on semantic meaning rather than syntactic structure alone.

In the somewhat different context of post-World War II France, a number of thinkers developed the structuralist picture of language first proposed by Saussure, radicalizing certain of its key concepts and applying them to heterogeneous domains. These thinkers, including Jacques Lacan, Michel Foucault, Louis Althusser, Jacques Derrida, and Gilles Deleuze, extended structuralist concepts in ways that often called into question the possibility of a single, univocal, or unitary analysis of the overall structure of language, knowledge, or conceptual reality (see Post-structuralism; Post-structuralism in the social sciences). Lacan, for example, accomplished a complex and heterodox synthesis of structuralism with Freudian psychoanalysis, holding that the “unconscious is structured like a language” and thus using linguistic structure as a decisive key to the constitution and role of the ego, fantasy, desire, and other psychic phenomena. Derrida’s project of deconstruction drew out the critical and problematic consequences of features already suggested by the structuralist concept of language, for instance the general iterability of signs and the constitution of language as a system of terms defined by their differences from one another. For Derrida, Althusser, and others, these radically structuralist or (as they have sometimes been termed) “post-structuralist” analyses involved a picture of the determination of social and political life, practice, and ideology as turning on large-scale structural effects rather than on individual action or subjectivity. With these analyses, they followed the later Heidegger in also critically contesting the legacy and contemporary applicability of the traditional picture of human subjectivity as entailing the subject’s free, self-conscious mastery of language, as well as the whole tradition of philosophical humanism which that picture had involved.

A concern with contingency, heterogeneity, and difference as relatively prior to identity characterizes much postwar social and political thought. In his 1966 Negative Dialectics, Theodor Adorno sought to recover a philosophically useful understanding of dialectics in the wake of the failure of Hegel’s classical version to account for the pervasive systematic violence and oppression of the twentieth century, which had often, Adorno argued, resulted from totalizing attempts to identify and assimilate diverse phenomena under a single unifying form. His argument, which accordingly defined negative dialectics as “the consistent consciousness of non-identity,” gave forceful expression to the Frankfurt School’s broader critique of abstract and instrumental forms of rationality and their tendency to support both capitalist and totalitarian forms of social organization and practice (see Critical theory). In a more explicitly metaphysical register, Deleuze argued in Difference and Repetition for a positive concept of difference “in itself,” not to be defined in terms of prior identities, and for its essential link to a positive concept of essentially creative repetition. Drawing on the groundbreaking phenomenological analysis given by Simone de Beauvoir in The Second Sex, feminist philosophy took up the profound and far-reaching implications of the thinking of sexual difference and the construction of gender for traditional philosophical themes and questions, including those of politics, epistemology, and ethics (see Feminist epistemology). And the development of critical race theory and post-colonial theory challenged traditional philosophy to find ways of acknowledging and thinking through the differences whose marginalization or exclusion it had constitutively presupposed.

Citing this article:
Livingston, Paul M. “Mid-century critiques and radicalization of earlier pictures.” Twentieth-century philosophy, 2017, doi:10.4324/0123456789-DD3596-1. Routledge Encyclopedia of Philosophy, Taylor and Francis.
Copyright © 1998-2024 Routledge.
