By Clara Martens Avila

 

A Brief History of Computers #1

This is the first of a three-part series on the history of computers, made by Qlouder. The goal is to give an idea of why computers are the way they are today. This time: what makes a machine a computer?

Part II: The road from an idea to Artificial Intelligence

Part III: How accessible computers gave us the cloud

What makes a computer a computer?

The year was 1837 and a man named Charles Babbage had an idea.

 

Charles lived in the middle of the industrial revolution. Factories were emerging left and right, and Charles had the particular luck of being a mathematician with a mechanical mind. At the time he had spent years building an advanced mechanical calculator, and it was during the work on this “Difference Engine” that an idea struck him.

 

A “hard-coded” machine that turns input into output, in this case by addition.

Machines at the time were built with a very clear purpose. If you wanted a machine to add numbers, you would build a mechanism that did just that. If you wanted it to do subtraction on top of that, you’d have to build a mechanism for that too, or a whole new machine altogether; you had to spell out what the machine had to do in gears, step by step. And that’s what got Babbage thinking. Every instruction had to be built into the machine for it to fulfill its purpose, but what if instead the instructions came from outside?

 

A machine whose output depends on the instruction card that is inside.

Babbage started thinking of a machine that wouldn’t be “hard-coded”. A machine that wasn’t built for one process, but could execute several processes depending on the instructions given to it. The sets of instructions would be written on punched cards that could be swapped, so you would have the machine multiply with one card and divide with another. It was, in a way, the same idea another man named Alan Turing would expand on exactly 100 years later. At its core the idea was quite simple, but this thought laid the first foundation for what would one day become a computer.

The importance of alternatives; or the magic of if-else statements

If you write code nowadays you’re probably aware that computers read code from top to bottom. It’s a very simple concept and seems easy enough, but it has important consequences: you can’t use a variable (a “memory space”) before you have declared it for the computer, nor execute a function (a “set of instructions”) before you have written it out. That doesn’t mean a computer can’t jump around in your code: it can jump back to earlier instructions, it just can’t jump ahead to something it hasn’t read yet.
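To make that reading order concrete, here is a small sketch in Python (a language Babbage never saw, of course; the names are purely illustrative):

    # A name has to exist before the line that uses it is reached.
    greeting = "Hello, Mr. Babbage"    # declare the "memory space" first
    print(greeting)                    # only now can it be used

    # The same goes for a set of instructions: define it before you call it.
    def add(a, b):
        return a + b

    print(add(2, 3))                   # works, because add() was defined above
    # Calling subtract(5, 2) here would fail: it is only defined further down.
    def subtract(a, b):
        return a - b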

 

Punched cards were already widely used in the weaving industry to direct looms.

Punched cards were already in use in Babbage’s time: they directed the looms in weaving factories. There the cards were read linearly, from start to end in one long consistent line, just like a computer reading code. But that wouldn’t do for the use Babbage envisioned. He didn’t want to repeat instructions over and over again on the card; he wanted the machine to jump back and forth and be able to execute a set of instructions an indefinite number of times. There was one missing link to make that possible, and it was this concept that would make Babbage’s ideas even more crucial to the birth of the computer: conditional branching.

 

What use does jumping back in code have if you can’t link it to conditions? You would end up in an infinite loop, repeating the same instructions over and over again. And here is the really big step Babbage took towards what would one day become a computer: the first ideation of if-else statements. The implementation of a condition to stop the loop. Jump back, until a condition is met.
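In modern terms, that combination of jumping back and checking a condition is simply a loop. A minimal Python sketch (the numbers are arbitrary):

    # Jump back and repeat the same instructions, until a condition is met.
    total = 0
    n = 1
    while total < 100:      # the condition that eventually stops the loop
        total += n          # the instructions that get repeated
        n += 1
    print(total, n)         # without the condition, this would repeat forever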

 

This was effectively also the start of algorithms in machines. Babbage wrote a few sample programs for the machine he had in mind, but it was Ada Lovelace, a fellow mathematician he corresponded with, who would go down in history as the first programmer. In an annotation on Babbage’s work she wrote a neat algorithm to compute Bernoulli numbers.
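Her note was written for a machine that never existed, but the same computation takes only a few lines in a modern language. Here is a sketch in Python that uses the Akiyama–Tanigawa method, not Lovelace’s exact procedure:

    from fractions import Fraction

    def bernoulli_numbers(n):
        """Return B_0 .. B_n via the Akiyama-Tanigawa algorithm (B_1 = +1/2 convention)."""
        row = [Fraction(0)] * (n + 1)
        result = []
        for m in range(n + 1):
            row[m] = Fraction(1, m + 1)
            for j in range(m, 0, -1):          # loop back over the row, updating it
                row[j - 1] = j * (row[j - 1] - row[j])
            result.append(row[0])              # row[0] now holds B_m
        return result

    print([str(b) for b in bernoulli_numbers(8)])
    # ['1', '1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']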

An example of going back and forth till a condition is met. Also similar in a way to what a Turing Machine would look like.

 

 It was a small jump backward for the Analytical Engine, but a huge jump forward for the development of computers.

A Universal Machine, aka a computer

Babbage never intended his machine to be more than a calculator. He probably understood the importance of his project to a fair extent, but it was Ada Lovelace who speculated, in one of her notes on Babbage’s work, on using the engine for more than just mathematics. And it would take 100 years for that idea to take off.

 

The year was 1937 and a man named Alan Turing had an idea. A lot of his ideas overlapped with and expanded on Babbage’s, but Turing had a more practical mindset and 100 years of other ideas to build on. Turing was very aware of the fact that numbers were symbols and nothing more. This led him to two crucial thoughts:

  1. If numbers symbolize numerical values, then those values can be symbolized by other symbols too; and
  2. If numbers symbolize numerical values, then they can symbolize other values too

 The first point led him to embrace other symbols to symbolize numerical values, starting with the binary system (1’s and 0’s) and ending with very complicated systems his colleagues weren’t too happy with.
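Nothing about a string of 1’s and 0’s says it has to mean a number; that is an agreement between whoever writes it and whoever reads it. A small Python illustration (reading the pattern as a character is just an example):

    value = 42
    print(format(value, 'b'))   # '101010' - the same value written in binary symbols

    # The very same pattern can stand for something non-numerical as well,
    # depending on how we agree to read it - here, as a character code:
    print(chr(value))           # '*'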

 

The binary system was particularly interesting for electronic machines, whose circuits are built on Boolean algebra (named after the mathematician Boole). True or false, 1 or 0, is still the core language of computers today. And it was the binary system he used in the paper he published in 1937, On Computable Numbers, for the theoretical machine he ideated there. The machine relied on an infinite tape, but apart from that it formulated all the basics of computer science as we know it today.

And Turing’s ideas were never limited to a calculator: the “computer” he ideated in On Computable Numbers was what he called universal: it could be used for anything. That’s why it was so important to see numbers as the symbols they were: binary numbers could be used to compute numbers, but also to represent something like musical notes, as Ada Lovelace had already suggested. What Alan Turing did was formalize all these ideas. He bound his “general purpose computer” to a set of rules and concrete concepts, effectively turning computers into a science.
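To make the idea a bit more tangible, here is a minimal sketch of a Turing-style machine in Python: a tape, a read/write head, a state, and a table of rules. The rule table shown (adding 1 to a binary number) is only an illustration, not something taken from Turing’s paper:

    def run_turing_machine(tape, head, state, rules, blank='_', max_steps=1000):
        """Rules map (state, symbol) -> (symbol to write, move L/R/N, next state)."""
        cells = dict(enumerate(tape))      # sparse tape, unbounded in both directions
        for _ in range(max_steps):
            if state == 'halt':
                break
            symbol = cells.get(head, blank)
            write, move, state = rules[(state, symbol)]
            cells[head] = write
            head += {'L': -1, 'R': 1, 'N': 0}[move]
        return ''.join(cells.get(i, blank) for i in range(min(cells), max(cells) + 1))

    # Example rules: increment a binary number, with the head starting on its last digit.
    rules = {
        ('carry', '1'): ('0', 'L', 'carry'),   # 1 plus a carry becomes 0, carry moves left
        ('carry', '0'): ('1', 'N', 'halt'),    # 0 plus a carry becomes 1, done
        ('carry', '_'): ('1', 'N', 'halt'),    # ran off the left edge: write the new digit
    }

    print(run_turing_machine('1011', head=3, state='carry', rules=rules))   # -> 1100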

 

The computer we know today, the computer this article was written on, wasn’t invented by one man, or even by two men and a woman. It has been a long road of people building on each other’s ideas. Two turning points along that road, though, were the two machines that never got built: the Analytical Engine and the Turing Machine. Babbage set the foundation with an idea, but it was Turing who sat down and made a science out of it. Babbage might be the father of the computer, but Alan Turing was the father of Computer Science.

 

 Qlouder is a Google premium partner that helps clients with big data, application development and machine learning solutions in Google Cloud. We’re an organization working with the newest technologies but aware of where they come from. 

 

The main source for this article was Andrew Hodges’ Alan Turing: The Enigma of Intelligence. It goes very deep into Alan Turing’s mathematics and is recommended for anyone interested in computers, mathematics, artificial intelligence and gay history. We also recommend the Computer History Museum (in Mountain View, California) if you get the chance to go there.