Version: v1, Published online: 1998
Retrieved July 15, 2020, from https://www.rep.routledge.com/articles/thematic/complexity-computational/v-1
The theory of computational complexity is concerned with estimating the resources a computer needs to solve a given problem. The basic resources are time (number of steps executed) and space (amount of memory used). There are problems in logic, algebra and combinatorial games that are solvable in principle by a computer, but computationally intractable because even relatively small instances require resources far beyond what is practically available.
The theory of NP-completeness concerns a common type of problem in which a solution is easy to check but may be hard to find. Such problems belong to the class NP; the hardest ones of this type are the NP-complete problems. The problem of determining whether a formula of propositional logic is satisfiable or not is NP-complete. The class of problems with feasible solutions is commonly identified with the class P of problems solvable in polynomial time. Assuming this identification, the conjecture that some problems require infeasibly long times for their solution is equivalent to the conjecture that P ≠ NP. Although the conjecture remains open, it is widely believed that NP-complete problems are computationally intractable.
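The asymmetry described above can be made concrete with propositional satisfiability. The sketch below (an illustration, not part of the original article; the clause encoding and function names are the author's own choices) verifies a proposed truth assignment in time linear in the formula's size, while the only obvious way to *find* an assignment is to try all 2^n of them:

```python
from itertools import product

def check(clauses, assignment):
    """Verify an assignment in time linear in the formula size.

    Each clause is a list of literals: a positive integer x stands for
    variable x, and -x for its negation. A clause is satisfied when at
    least one of its literals is true under the assignment.
    """
    return all(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

def brute_force_sat(clauses, n_vars):
    """Search all 2**n_vars assignments -- exponential time in general."""
    for bits in product([False, True], repeat=n_vars):
        assignment = {i + 1: b for i, b in enumerate(bits)}
        if check(clauses, assignment):
            return assignment
    return None  # formula is unsatisfiable

# (x1 or not x2) and (not x1 or x2) and (x1 or x2)
clauses = [[1, -2], [-1, 2], [1, 2]]
model = brute_force_sat(clauses, 2)
```

Checking a candidate solution (`check`) is the "easy" direction that places satisfiability in NP; the exhaustive search in `brute_force_sat` is the only known kind of general method, and whether the exponential search can always be avoided is precisely the P versus NP question.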
Urquhart, Alasdair. Complexity, computational, 1998, doi:10.4324/9780415249126-Y007-1. Routledge Encyclopedia of Philosophy, Taylor and Francis, https://www.rep.routledge.com/articles/thematic/complexity-computational/v-1.
Copyright © 1998-2020 Routledge.