# Alan Turing

> The fruits of the earth are not brought to perfection immediately, but by time, rain, and care; similarly, the fruits of men ripen through ascetic practice, study, time, perseverance, self-control, and patience.
>
> *Anthony the Great*

Alan Mathison Turing (1912–1954) was an extraordinarily gifted man who was influential in the development of computer science: with his famous Turing machine he provided a formalization of the concepts of algorithm and computation, and he played a significant role in the creation of the modern computer. Turing also discovered something that would have delighted Leibniz: it is possible, in principle, to devise one single universal machine that, all by itself, can carry out any possible computation.

Turing first described the Turing machine in his 1936 article *On Computable Numbers, with an Application to the Entscheidungsproblem*.

The Turing machine is an idealized computing device consisting of a read/write head (or scanner) with a paper tape passing through it. The tape is divided into squares, each bearing a single symbol (0 or 1, for example). This tape is the machine’s general-purpose storage medium, serving both as the vehicle for input and output and as a working memory for the results of intermediate steps of the computation. In the original article of 1936, Turing actually imagines not a mechanical machine but a person, whom he calls the computer, who executes these deterministic mechanical rules slavishly, in a disciplined and unimaginative manner.

The input that is inscribed on the tape before the computation starts must consist of a finite number of symbols. However, the tape is of unbounded length, for Turing’s aim was to show that there are tasks that these machines are unable to perform, even given unlimited working memory and unlimited time. The read/write head is programmable. To compute with the device, you program it, inscribe the input on the tape, place the head over the square containing the leftmost input symbol, and set the machine in motion. Once the computation is completed, the machine will halt with the head positioned over the square containing the leftmost symbol of the output (or elsewhere if so programmed).

The Turing machine can perform six types of fundamental operations—read, write, move left, move right, change state, and halt. A complicated computation may consist of hundreds of thousands or even millions of such operations. Despite the Turing machine’s simplicity, it is capable of computing anything that any modern computer can compute.
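To make these six operations concrete, here is a minimal Python sketch of a Turing machine simulator. The transition-table encoding, state names, and the illustrative bit-flipping program are assumptions made for demonstration, not notation from Turing's paper.

```python
# Minimal Turing machine simulator. A program is a transition table mapping
# (state, scanned symbol) -> (symbol to write, head move, next state).
def run(program, tape, state="start", halt_state="halt", blank="_"):
    tape, head = list(tape), 0
    while state != halt_state:          # halt
        # Grow the "unbounded" tape on demand in either direction.
        if head < 0:
            tape.insert(0, blank)
            head = 0
        if head >= len(tape):
            tape.append(blank)
        scanned = tape[head]                         # read
        write, move, state = program[(state, scanned)]
        tape[head] = write                           # write, change state
        head += {"L": -1, "R": 1, "N": 0}[move]      # move left/right
    return "".join(tape).strip(blank)

# Illustrative program: invert every bit, halting at the first blank square.
FLIP = {
    ("start", "1"): ("0", "R", "start"),
    ("start", "0"): ("1", "R", "start"),
    ("start", "_"): ("_", "N", "halt"),
}
```

For example, `run(FLIP, "1011")` sweeps the head rightward, inverting each bit, and returns `"0100"`. Even a simulator this small exhibits all six operation types named above.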

A short definition of the thought experiment was given by Turing in his 1948 essay *Intelligent Machinery*. Referring back to his 1936 publication, he writes that the Turing machine, here called a Logical Computing Machine, consisted of:

> …an infinite memory capacity obtained in the form of an infinite tape marked out into squares, on each of which a symbol could be printed. At any moment there is one symbol in the machine; it is called the scanned symbol. The machine can alter the scanned symbol, and its behavior is in part determined by that symbol, but the symbols on the tape elsewhere do not affect the behavior of the machine. However, the tape can be moved back and forth through the machine, this being one of the elementary operations of the machine. Any symbol on the tape may therefore eventually have an innings.

Alan Turing played an important role in the creation of the first English electronic computers. During WWII, he worked for Britain’s codebreaking centre, the Government Code and Cypher School at Bletchley Park. He devised a number of techniques for breaking German ciphers and, together with Gordon Welchman, developed an improved version of the Bombe, an electromechanical machine that could find settings for the German Enigma coding machine. Later, Turing devised a technique for use against the Lorenz cipher, used in the Germans’ new Geheimschreiber machine. In 1948, Turing and his colleague D. G. Champernowne wrote the first computer chess program, called Turochamp, but there was as yet no computer on which to test it.

Turing’s greatest contributions to the development of the digital computer are:
1. The idea of controlling the function of a computing machine by storing a program of symbolically (or numerically) encoded instructions in the machine’s memory.
2. His proof that, by this means, a single machine, a universal machine, is able to carry out every computation that can be carried out by any other Turing machine whatsoever. As he put it: “Electronic computers are intended to carry out any definite rule of thumb process which could have been done by a human operator working in a disciplined but unintelligent manner.”
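The universality result can be sketched in the same spirit: one fixed interpreter whose input (here, a plain string argument standing in for part of the tape) contains an encoded description of some other machine. The textual encoding, state names, and example program below are invented for illustration; Turing's own "standard descriptions" used a different scheme.

```python
# Sketch of universality: one fixed interpreter runs any encoded machine.
# Rule format (invented here): "state,symbol->write,move,next" joined by ";".
def universal(description, tape, blank="_"):
    # Decode the machine description into a transition table.
    prog = {}
    for rule in description.split(";"):
        lhs, rhs = rule.split("->")
        prog[tuple(lhs.split(","))] = tuple(rhs.split(","))
    # Interpret the decoded machine on the given input tape.
    tape, head, state = list(tape), 0, "s"
    while state != "halt":
        if head < 0:
            tape.insert(0, blank)
            head = 0
        if head >= len(tape):
            tape.append(blank)
        write, move, state = prog[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else (-1 if move == "L" else 0)
    return "".join(tape).strip(blank)

# The same fixed interpreter runs any encoded machine, e.g. bit inversion:
INVERT = "s,1->0,R,s;s,0->1,R,s;s,_->_,N,halt"
```

Here `universal(INVERT, "110")` returns `"001"`: the program is itself data, which is precisely the stored-program idea listed above.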