Computability and Recursion

The History and Concept of Computability

*Annals of Pure and Applied Logic*, 160 (2009), 368--399.
We examine the birth of computability theory in the 1930s, the roles
of G\"odel, Church, and Turing, and the formalisms of recursive
functions and Turing automatic machines ($a$-machines). G\"odel and
most modern scholars credited Alan Turing with formulating the
Turing machine definition and with demonstrating Turing's Thesis that
every intuitively computable function is computable by a Turing machine.
We present Turing's notion [1939, \S4] of an {\em oracle machine\/}
($o$-machine) and Post's development of it in [1944, \S11], [1948],
and finally Kleene--Post [1954] into its present form.

A number of topics arose from Turing functionals, including continuous functionals on Cantor space and online computations. Almost all results in theoretical computability use relative reducibility and $o$-machines rather than $a$-machines, and most computing processes in the real world are potentially online or interactive. Therefore, we argue that Turing $o$-machines, relative computability, and online computing are the most important concepts in the subject, more so than Turing $a$-machines and standard computable functions, since the latter are special cases of the former and are presented first only for pedagogical clarity to beginning students. At the end, in \S10--\S13, we consider three displacements in computability theory and the historical reasons they occurred. Several brief conclusions are drawn in \S14.
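The idea of relative computability can be made concrete with a small sketch, not taken from the paper: an $o$-machine is an algorithm that may, during its run, ask an oracle whether a given number belongs to a fixed set $A$. The function names (`jump_style_query`, `make_oracle`) and the particular predicate below are illustrative assumptions, chosen only to show that the algorithm is uniform in the oracle while the function it computes depends on which set $A$ is plugged in.

```python
# Illustrative sketch of relative computability: an "oracle" answers
# membership queries for a fixed set A, and the algorithm may consult
# it as a black box. All names here are hypothetical, for illustration.
from typing import Callable, Set

Oracle = Callable[[int], bool]  # answers the query "is n in A?"


def jump_style_query(n: int, oracle: Oracle) -> bool:
    """A trivially A-computable predicate: is n in A, or n+1 in A?

    The point is uniformity: this single procedure, run with
    different oracles A, computes different functions.
    """
    return oracle(n) or oracle(n + 1)


def make_oracle(A: Set[int]) -> Oracle:
    """Wrap a concrete finite set as an oracle (membership test)."""
    return lambda n: n in A


evens = make_oracle({n for n in range(100) if n % 2 == 0})
print(jump_style_query(3, evens))  # True: 4 is in the oracle set
```

An ordinary $a$-machine is recovered as the special case where the oracle is never consulted (or is the empty set), which mirrors the paper's point that standard computable functions are a special case of Turing functionals.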

*Annals of Pure and Applied Logic*, to appear in the volume of
Proceedings of the Computability in Europe meeting, Siena, 2007.

Turing, born in 1912, would turn out to be one of the most brilliant minds of the century, conceiving as he did of the modern computer decades before it came into existence.

Turing went on to study mathematics at King’s College, Cambridge, and, after a long-distance run one afternoon, he lay down in a meadow to ponder. By the time he got up to run back, he had worked out a new definition of computability. Without consulting his mentor, or anyone else for that matter, he sat down and wrote “On Computable Numbers,” which conceived of abstract “machines” that function much like modern computers. The year was 1936. At the time, of course, actual computers did not exist. Turing was 23.

The great mathematician John von Neumann was one of the few at the time who appreciated Turing’s originality.