About Alan Turing

You cannot really claim to know computers if you know nothing about Alan Turing. He is one of the most important figures in the history of computing. If you want to understand these remarkable machines, it helps to know how it all started, and that means learning who Alan Turing was and how he contributed to the evolution of computers.

Who was Alan Turing?

So who was this man? Alan Turing is widely considered the father of modern computing. He was both a mathematician and a logician, and a brilliant one at that. He was among the first to formalize the idea of what we now call the modern computer. While earlier calculating machines and designs existed, it was his Turing machine that provided the fundamental concepts that would eventually lead to the computers of today. He also pioneered the idea of artificial intelligence, something we have yet to fully explore.

To know more about Alan Turing, let us start from the very beginning of his life.

Alan Turing was born in London in 1912. He attended Sherborne School and King's College, Cambridge. He later crossed the Atlantic to pursue further study at Princeton University in the US.

When World War II broke out, his genius was put to work by the British government. He was recruited to the Government Code and Cypher School to help break the codes produced by the German Enigma machine, the device the enemy used to send encrypted messages, including the especially difficult naval version. He also helped design the "Bombe", an electromechanical machine used to work out the Enigma settings. He was so effective that his efforts are credited with helping shorten the war by as much as two years. His work was so secret that for decades hardly anyone knew what he had done during the war.

After the war, he became deputy director of the computing laboratory at the University of Manchester. The later part of his life was marked by controversy, but nothing could overshadow his contributions to modern technology. Sadly, he died young, of an apparent suicide, in 1954. Thinking about Alan Turing and his early death, some people wonder whether the evolution of computers would have turned out differently had he lived to influence it for more years.

About the first computer

When it comes to the first computer, it is actually hard to pinpoint a single machine. Because of that, it makes more sense to talk about several "firsts" in the history of computers.

  • First use of the word "computer". The earliest known use dates from 1613, when it described a person who performed computations and calculations.
  • First mechanical computer. The Difference Engine, which Charles Babbage began designing in 1822, could compute tables of numbers and print hard copies of the results. Babbage later designed the more ambitious Analytical Engine, a general-purpose computing engine, and Ada Lovelace's notes on it lead many to consider her the first computer programmer.
  • First programmable computer. This was the Z1, created between 1936 and 1938 by Konrad Zuse, a talented German engineer who built it in his parents' living room.
  • First concept of the modern computer. The Turing machine, proposed by Alan Turing in 1936, is an abstract machine that reads and writes symbols on an endless paper tape. It is the foundation of the theories behind modern computers, and of computing in general.
  • First electronic programmable computer. Known as the Colossus, it was developed by Tommy Flowers and first demonstrated in December 1943. It helped the British break high-level German ciphers.
  • First digital computer. The Atanasoff-Berry Computer (ABC) was developed by Professor John Vincent Atanasoff and his graduate student Cliff Berry between 1937 and 1942. It used about 300 vacuum tubes to perform digital computation.
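The Turing machine in the list above is simple enough to sketch in a few lines of code. The sketch below is illustrative only: the rule format, the state names, and the bit-flipping example are our own assumptions for the demo, not anything taken from Turing's 1936 paper.

```python
# A minimal sketch of a Turing machine: a head that reads and writes
# symbols on a tape, following a fixed table of rules.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Run rules of the form (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left), 0 (stay) or +1 (right). Returns the final tape."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example rule table (our own toy example): sweep right, flipping each
# bit on the tape, and halt when a blank cell is reached.
flip_rules = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

print(run_turing_machine(flip_rules, "1011"))  # -> 0100
```

Despite how little it does, a machine of this shape, given a large enough rule table, can compute anything a modern computer can, which is exactly why it became the foundation of computing theory.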

This is a brief history of the first computers and how they developed in the early years. Obviously, this is probably not everything there is to know about Alan Turing, but you can bet that his contribution remains significant nevertheless.