The importance of algorithms is now recognized in all mathematical sciences,
thanks to the development of computability and computational
complexity theory in the 20th century. The basic understanding of computability
theory developed in the nineteen thirties with the pioneering
work of mathematicians like Gödel, Church, Turing and Post. Their work
provided the mathematical basis for the study of algorithms as a formalized
concept. The work of Hartmanis, Stearns, Karp, Cook and others
in the nineteen sixties and seventies showed that the refinement of the
theory to resource-bounded computations gave the means to explain the
many intuitions concerning the complexity or hardness of algorithmic
problems in a precise and rigorous framework.
The theory has its roots in the older questions of definability, provability
and decidability in formal systems. The breakthrough in the nineteen
thirties was the formalisation of the intuitive concept of algorithmic
computability by Turing. In his famous 1936 paper, Turing [43] presented
a model of computation that was both mathematically rigorous and general
enough to capture the possible actions that any human computer
could carry out. Although the model was presented well before digital
computers arrived on the scene, it has the generality of describing
computations at the individual bit-level, using very basic control commands.
Computability and computational complexity theory are now
firmly founded on the Turing machine paradigm and its ramifications
in recursion theory. In this paper we will extend the Turing machine
paradigm to include several key features of contemporary information
processing systems.
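The bit-level generality of the model described above can be illustrated with a short sketch. The following is a minimal single-tape Turing machine simulator (the interface, rule encoding, and example machine are illustrative assumptions, not taken from this paper): the only control commands are read a symbol, write a symbol, move the head one cell, and change state.

```python
def run_turing_machine(tape, rules, state="start", blank="_", max_steps=10_000):
    """Run a deterministic single-tape Turing machine.

    rules maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left), 0 (stay), or +1 (right).
    The machine stops when it enters the "halt" state.
    """
    cells = dict(enumerate(tape))  # sparse tape, unbounded in both directions
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example machine (hypothetical): flip every bit while moving right,
# then halt on the first blank cell.
flip_rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine("1011", flip_rules))  # -> 0100
```

Even this toy machine exhibits the essential features: a finite control table, an unbounded tape, and purely local actions, which together suffice to describe any computation a human computer could carry out.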