Twentieth-century philosophy

DOI: 10.4324/0123456789-DD3596-1
Version: v1,  Published online: 2017

4. Being, life, and practice

Considerations about the underlying structure of meaning or sense also played a decisive role in the ontologically radicalized phenomenological inquiry of Martin Heidegger’s Being and Time. While continuing the phenomenological project of his teacher, Husserl, Heidegger disputed its identification of subjective consciousness as the ultimate ground of all possible demonstration, as well as its tendency to idealize meaning and sense as existing in an atemporal realm distinct from temporal becoming. Rather than focusing on consciousness or idealized contents, Heidegger posed the central question of the “sense” or “meaning” of being. His investigation proceeded by way of a “preparatory” analysis of Dasein, the kind of being or entity that we ourselves are, in its characteristic phenomena of practical comportment, propositional description, and truth, followed by an investigation of its relationship to death, time, and the possibility of authenticity.

Drawing on Heidegger’s analysis but situating it within the more classically phenomenological project of the description of subjective consciousness, Jean-Paul Sartre’s popular Being and Nothingness developed an analysis of the conditions and nature of human freedom as grounded in the reflexive structure by which consciousness, or what Sartre termed the “for-itself,” is constitutively related to itself. In his Phenomenology of Perception, Maurice Merleau-Ponty extended Husserl’s phenomenological analyses of consciousness into a complex exploration of the experience of embodiment, including its implications for the constitution of space and time as phenomenologically experienced. Also drawing on Heidegger’s distinctive conception of the methodology of an interpretation or “hermeneutics” of actual life, Hans-Georg Gadamer developed the project of an overall philosophical hermeneutics, sharply contrasting the irreducibly contextual and situated “human understanding” it was to produce with scientific and other methodically, procedurally, and technically determinable conceptions of knowledge.

Another philosophical project that decisively situated meaning in the context of actual life and practice, rather than proposing a primarily formal or idealized analysis of it, was the “ordinary language philosophy” which emerged in England between the wars (see also Ordinary language philosophy, school of). Whereas earlier linguistic philosophers had often focused on the idealized structure of logical inference or on constructing artificial, formal notations, ordinary language philosophers such as Gilbert Ryle and J. L. Austin looked to actual patterns of English usage in order to clarify or rectify (as Ryle put it) the “logical geography” of our ordinary concepts. For this methodology of investigating ordinary usage rather than idealized meaning, the newer views of Wittgenstein, who had returned to Cambridge and to sustained philosophical work in 1929, were an important influence. Rejecting his earlier logical atomism, Wittgenstein now suggested that the diagnosis and treatment of philosophical questions could best proceed by means of a consideration of what is actually said, and meant, in the expressions of ordinary language. This kind of systematic reflection was, in some of its aspects at least, not far removed from phenomenology’s classical project of clarifying reflection on meaning or sense: Austin, for instance, sometimes referred to his own practice as “linguistic phenomenology.” However, the practice of ordinary language philosophy also had important critical implications, at least for the questions and problems typical of earlier philosophical theorizing, in showing how these questions or problems resulted from misunderstandings of the actual workings of our language, or from confusions between different kinds of usage.

Other results and problems began to suggest, at this time, fundamental limitations to the project of a total analysis of truth and meaning in terms of language as a rule-governed system of signs. Kurt Gödel’s two incompleteness theorems, published in 1931, showed that no consistent axiomatic system for arithmetic can capture all of its truths, and the closely related results of Alfred Tarski’s investigation into the possible definition of truth predicates indicated that the truth of sentences within a language cannot be wholly defined within that language itself, so long as the language remains self-consistent (see also Tarski’s definition of truth). With very different motivations, but also with direct critical bearing on the earlier idea of language as a total, rule-governed calculus, Wittgenstein launched, in his Philosophical Investigations, a deep-seated attack on its underlying conception of following a rule (see Meaning and rule following). No rule for the use of signs could, Wittgenstein argued, by itself determine its own application to each of the infinitely many possible cases and contexts in which a word can meaningfully be used. Wittgenstein concluded that linguistic use and practice cannot be understood as the mechanical application of rules at all; rather, they are to be clarified only on the basis of a more reflective and situated consideration of the roles of regularity, identity, and meaning in our complex, linguistically shaped lives.

Citing this article:
Livingston, Paul M. Being, life, and practice. Twentieth-century philosophy, 2017, doi:10.4324/0123456789-DD3596-1. Routledge Encyclopedia of Philosophy, Taylor and Francis.
Copyright © 1998-2024 Routledge.