Universal Computers


By Samuel Dartez, Logan Reed, and Kevin Taylor

The Ideas of Alan Turing

The Program in Your Head

    Turing was among the first to propose storing both the data and the program within the machine itself. This was a big jump in making a machine more like the human brain, which was part of the original motivation for creating a computer. Others had wanted to do this but had failed. The birth of the Turing machine began in Turing’s head. First Turing had to organize a system or process of thought, a program, which could be followed by a machine. This “program” was modeled on human logical reasoning; an example of logical thought is: if A > B and B > C, then A > C. The program was a system of 1s and 0s that acted as symbols for either “true” or “false”. This is called a binary number system. The idea resembles the punched cards used in Jacquard’s loom: when the cards were read, a metal rod either passed through a hole (true) or was stopped by the card (false). Unlike the other computing machines of Turing’s time, the Turing machine could hold both the program and the data. This meant that the operator of the computer no longer had to completely rewire the machine to run a different program.
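The true/false encoding described above can be sketched in a few lines of Python. The “card” layout below is invented purely for illustration; each hole is read as true (1) and each blocked position as false (0), and the same row of bits can also be read as a binary number.

```python
# A minimal sketch of the punch-card analogy: 1 = hole (rod passes, "true"),
# 0 = no hole (rod blocked, "false"). The card contents are invented.

card = [1, 0, 1, 1, 0]

# Each position reads as a true/false value...
readings = [bool(bit) for bit in card]

# ...and the whole row can equally be interpreted as one binary number.
value = int("".join(str(bit) for bit in card), 2)
```

Here `readings` is `[True, False, True, True, False]` and `value` is 22, since 10110 in binary is 16 + 4 + 2.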

    Conceptually, a Turing machine is composed of an unbounded tape divided into cells, a head that can read, write, and erase symbols on the tape while moving left or right, and a finite table of rules that tells it what to do. (Physical demonstration models typically realize this with rolls of tape, a marker, an eraser head, and a reading device.) The Turing machine could read and write information using programs stored within it. A universal Turing machine U is a Turing machine that can imitate the behavior of any other Turing machine T. It is a fundamental result that such machines exist and can be constructed effectively. Only a suitable description of T's finite program and input needs to be entered on U's tape initially. To execute the consecutive actions that T would perform on its own tape, U uses T's description to simulate T's actions on a representation of T's tape contents. Such a machine U is also called `computation universal.' In fact, there are infinitely many such U's (scholarpedia.org). This was a revolutionary turn for computers, moving them one step closer to the goal of matching the human brain in every aspect.

    In 1945 Turing was awarded an OBE (Order of the British Empire) for his pivotal work as a code breaker, although, because that work was secret, the official citation credited his services to the Foreign Office. In 1951 he was elected a Fellow of the Royal Society. King’s College, Cambridge, has named its computer center after Turing, and a life-size statue of him was unveiled in Manchester in 2001.
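The universal-machine idea described above — a machine that takes another machine's program as input data and simulates it — can be sketched in Python. The transition-table format and the example "bit-flipper" program below are illustrative assumptions, not Turing's original notation.

```python
# A minimal sketch of a Turing machine simulator. Because the program is
# itself ordinary data handed to a general-purpose interpreter, this one
# function plays the role of a universal machine U for any program T.

def run_turing_machine(program, tape, state="start", head=0, max_steps=1000):
    """Run a transition table `program` on `tape` (a dict of cell -> symbol).

    `program` maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left) or +1 (right). The machine stops in "halt".
    """
    tape = dict(tape)
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "0")                # blank cells read as "0"
        new_symbol, move, state = program[(state, symbol)]
        tape[head] = new_symbol                     # write, then move the head
        head += move
    return tape

# Example program T: flip 1s to 0s moving right, flip the first 0 and halt.
flipper = {
    ("start", "1"): ("0", +1, "start"),
    ("start", "0"): ("1", +1, "halt"),
}

result = run_turing_machine(flipper, {0: "1", 1: "1", 2: "0"})
```

Running the flipper on the tape `1 1 0` leaves `0 0 1`: the same interpreter would run any other transition table without being rewired, which is exactly the stored-program point made above.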

    The University of Manchester initiated the Alan Turing Institute in 2004. In addition, the Association for Computing Machinery (ACM) issues the Alan M. Turing Award for individuals’ technical contributions to computing. The prestigious award – considered the Nobel Prize of computing – comes with a $250,000 prize supported by Intel Corp. and Google Inc.

    A few notable recipients and contributions:

    - John McCarthy – artificial intelligence, 1971
    - Ivan Sutherland – computer graphics, 1988
    - Douglas Engelbart – interactive computing, including the computer mouse, 1997
    - Andrew Yao – pseudo-random number generation and cryptography, 2000
    - Ronald Rivest – public-key cryptography, 2002
    - Frances E. Allen – optimizing compilers and automatic parallel execution, 2006

    All modern general-purpose computers are built upon the idea of the Turing machine and are, in effect, universal Turing machines themselves (apart from having finite memory).

Artificial Intelligence

    Turing also delved into artificial intelligence. He designed a neural-network-like machine called a “B-type unorganized machine”. Turing was a founding father of modern cognitive science and a leading early exponent of the hypothesis that the human brain is in large part a digital computing machine. He theorized that the cortex at birth is an “unorganized machine” that through “training” becomes organized “into a universal machine or something like it.” In 1950 Turing proposed what subsequently became known as the Turing test, a criterion for deciding whether a machine thinks. He was never able to build his unorganized machine, however; the idea was years ahead of its time and was not even published until 14 years after his death. A working neural network computer was not successfully built until shortly after Turing’s death.
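Turing's B-type unorganized machines were, roughly, networks of simple two-input NAND units whose connections could be modified by "training". A minimal sketch of such a network follows; the wiring, initial values, and synchronous update rule are invented assumptions for illustration only.

```python
# A toy network of two-input NAND units, loosely inspired by Turing's
# "unorganized machines". The specific wiring here is invented.

def nand(a, b):
    """The single primitive gate: output 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def step(nodes, wiring):
    """Advance every unit one tick; unit k reads the pair wiring[k]."""
    return [nand(nodes[i], nodes[j]) for (i, j) in wiring]

nodes = [1, 1, 1]                    # current output of each unit
wiring = [(1, 2), (0, 2), (0, 1)]    # each unit listens to two others

nodes = step(nodes, wiring)          # one synchronous update
```

With every unit starting at 1, one step drives all outputs to 0 (and another step would flip them back to 1): even this tiny network oscillates, hinting at the rich behavior Turing expected from such circuits. "Training" in Turing's scheme would amount to enabling or disabling connections in `wiring`.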

John Von Neumann

Von Neumann Architecture

    John von Neumann was among the first to implement the idea of a stored program: he helped design computers whose programs were held in the machine’s memory rather than in its wiring. Before this, the ENIAC of the 1940s had to be rewired to perform different programs. Von Neumann proposed the idea of using a processing unit, a control unit, and a memory unit in which both the data and the program were stored. These improvements were incorporated into the design of the EDVAC, which was built to handle large-scale military and scientific calculations, including work connected with the Manhattan Project. Since such projects involved an enormous amount of data, an easier way to run different programs and store the data was needed. The 1945 EDVAC report contained one of the first descriptions of a universal stored-program computing machine.

    Such a computer implements a universal Turing machine and follows the common "referential model" of sequential architectures, in contrast with parallel architectures. The term "stored-program computer" is generally used to mean a computer of this design, although, since modern computers are almost all of this type, the term has fallen into disuse (udo-sobotta.net). To date, quantum computers have been implemented so that the programming of their operation was, in essence, hardwired into their essential structure. Although many useful demonstrations of quantum computing have resulted from such special-purpose devices, they are basically one-problem computers which cannot easily be reprogrammed or scaled to attack larger problems. As early models of practical quantum computers, they don't make the grade. The basis of essentially all practical classical computers is the von Neumann architecture, which comprises a central processing unit (CPU) to do calculations, a memory which holds both data and CPU instructions, and an interface which allows the input and output of the CPU to change the information in memory (gizmag.com).
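The fetch-decode-execute cycle of a von Neumann machine, with program and data sharing one memory, can be sketched as follows. The tiny four-instruction set here is an invented illustration, not any historical machine's instruction set.

```python
# A minimal sketch of the von Neumann idea: one memory array holds both
# the program (instruction tuples) and the data (plain numbers), and a
# simple CPU loop fetches and executes instructions from that memory.

def run(memory):
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        op, arg = memory[pc]             # fetch and decode
        pc += 1
        if op == "LOAD":                 # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":                # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":              # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program (addresses 0-3) and data (addresses 4-6) share the same memory.
memory = [
    ("LOAD", 4),     # 0: load the value at address 4
    ("ADD", 5),      # 1: add the value at address 5
    ("STORE", 6),    # 2: store the sum at address 6
    ("HALT", 0),     # 3: stop
    2,               # 4: data
    3,               # 5: data
    0,               # 6: result goes here
]
memory = run(memory)
```

After the run, address 6 holds 5. Changing the program means changing memory contents, not rewiring hardware — the contrast with the ENIAC described above.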

    At one time, the "computer" was a human being, usually female, who did calculations for men in suits. This all changed with Alan Turing and his idea of "a single machine that could be used to compute any computable sequence," which then forked into two lines: (1) the British "Colossus" computer and (2) the American ENIAC machine. These led to the companies Univac and IBM, which powered and shaped the industries of the mid-20th century. That then brought the emergence of other companies such as Xerox, Apple, Intel, and Microsoft. As a result we eventually arrive at a world in which nearly everything has a computer in it somewhere.

Just for Fun

    In addition, here are some great links to videos of The Turing Machine and The Universal Turing Machine in action.

Works Cited

1. http://www.udo-sobotta.net/pdf/vonneumann.pdf

2. "Alan M. Turing (British mathematician and logician): Artificial intelligence pioneer." Encyclopedia Britannica Online. Encyclopedia Britannica. 09 Dec. 2012. http://www.britannica.com/EBchecked/topic/609739/Alan-M-Turing/214879/Artificial-intelligence-pioneer.

3. "Alan Mathison Turing." Cyber Heroes of the Past. 09 Dec. 2012. http://wvegter.hivemind.net/abacus/CyberHeroes/Turing.htm.

4. Naughton, John. "The true fathers of computing." The Guardian. 25 Feb. 2012. Guardian News and Media. 09 Dec. 2012. http://www.guardian.co.uk/technology/2012/feb/26/first-computers-john-von-neumann.

5. "Quantum computer with separate CPU and memory represents significant breakthrough." 09 Dec. 2012. http://www.gizmag.com/quantum-computer-von-neumann/21340/.

6. "Turing, Father of the Modern Computer." 09 Dec. 2012. http://www.rutherfordjournal.org/article040101.html.

7. "Turing machine." Scholarpedia. 09 Dec. 2012. http://www.scholarpedia.org/article/Turing_machine.
