By Clara Martens Avila

 

A Brief History of Computers #3

This is the last of a three-part series on the history of computers by Qlouder. The goal is to give an idea of why computers are the way they are today. This time: how making computers accessible to consumers paved the way for the cloud.

Part I: The difference between an advanced calculator and a computer.

Part II: The road from an idea to Artificial Intelligence

 

We left the last installment with the uncertain future of Artificial Intelligence, but for this final part of our series we’re going back to the 60s. The 60s saw the conception of many ideas that would shape the future of computers. We’ve come a long way from the supercomputers of that era to the cloud we use today, and in a way this story is also a story of how users interact with computers; the evolution of the different fields within technology has always been connected. We start this evolution with two research centers around Stanford. On one side the Stanford Artificial Intelligence Laboratory (SAIL), with its firm belief that machines would one day replace humanity; on the other, at the Stanford Research Institute, a lab with a radically different vision: the “Augmentation Research Center”, which envisioned a future in which computers extended, “augmented”, human capabilities. We’ve already covered the rocky road of Artificial Intelligence in the last article, so let us now move on to the other side of the campus.

 

At the time computers already existed and worked, but they were mainly used by companies and scientists for complicated calculations. No one had really considered them as consumer products… until Douglas Engelbart came along. Engelbart had a clear vision of what computers should be and for whom, and out of his lab came revolutionary inventions like the computer mouse.

The “Mother of All Demos”, in which Engelbart introduced the world to the computer mouse, and so much more.

Computers at the time were “supercomputers”: huge machines that could do very complicated calculations and that didn’t exactly look like they would fit in a living room any time soon. But Gordon Moore was a contemporary of Engelbart, and anyone familiar with Moore’s law (at the time more a piece of common knowledge in the Valley than a formal law) knew that computers would become smaller, and soon. People knew that, and yet only Engelbart and a handful of others saw their potential for the rest of the world.

From supercomputers to mainframes

Over the years his vision endured, and as computers’ usability increased, so did their actual use. Airlines had been experimenting with automated booking since the 1940s, but it wasn’t until 1963 that suddenly every office of Trans-Canada Air Lines had a terminal to access the company’s booking system. The key thing here was that not every office had an enormous computer in the back. Booking transactions were processed in one central place, on a central computer that was accessible through the terminals in all the locations. It wasn’t exactly a supercomputer either. Booking transactions were demanding because of their scale, not their complexity, and so they were processed by a central computer we call a mainframe. Mainframes couldn’t do very complex calculations, but they could handle a large number of different input streams.

Steve Jobs and his Macintosh personal computer

From mainframes to servers

These mainframes were used more and more, and people slowly grew familiar with using them in business settings. Then 1984 came along and Steve Jobs made his entrance. He introduced the public (or, at first, the Boston Computer Society) to the Macintosh: a computer for people, with a graphical interface instead of a terminal, and all of a sudden a computer was indeed something anyone could have.
Did the fact that everyone could now have a computer make mainframes go away? Not exactly: central transaction processing was still necessary. But a new concept emerged, “the internet”, and with the internet came servers. Central processing was still needed, yes, but now part of that processing could be done on everyone’s own desktop. The processing load was shared between computer and server. PCs could only handle so much work, so servers were needed to cover the rest (and to scale!).
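To make that split concrete, here is a minimal sketch in Python (a hypothetical example, not taken from the article): the server holds the shared, central state, while the client only fetches a result and presents it locally.

```python
# Minimal client/server sketch (hypothetical example): the server does the
# shared, central work; the client only formats and displays the result.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class BookingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # "Central" processing: the server owns the shared booking state.
        body = json.dumps({"flight": "TCA-123", "seats_left": 42}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Start the server on a free local port, in the background.
server = HTTPServer(("localhost", 0), BookingHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# "Local" processing: the client fetches the data and renders it itself.
with urllib.request.urlopen(f"http://localhost:{port}/") as resp:
    booking = json.load(resp)
print(f"Flight {booking['flight']}: {booking['seats_left']} seats left")

server.shutdown()
```

In the mainframe era the “client” was a dumb terminal and all the work happened centrally; with PCs, and later the cloud, the balance keeps shifting, but this basic division of labour is still recognisable.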

Google released App Engine to the public in 2008, the first of its Google Cloud services.

The Cloud

And now we have cloud computing. It has become such a buzzword that the reality of it seems almost like an anticlimax; the cloud is just a bunch of servers that you will (probably) never physically see. It is just that, but it has also become so much more. Qlouder works with the Google Cloud Platform, a “platform” that not only hosts your applications and databases and manages huge computing loads, but has also extended into a wide range of tools for developers and consumers, from Firebase and TensorFlow to Google Drive. It may just be a bunch of servers at its core, but the possibility of paying only for what you use, scaling (almost) indefinitely and not having to worry about keeping your own servers safe and running changes what a computer can be. We’ve come a long way from the mainframes and supercomputers of the 20th century, but in a way, with the cloud, our own computer has become a “supercomputer”.

 

Qlouder is a Google premium partner that helps clients with big data, application development and, yes, machine learning solutions on Google Cloud. We’re an organization that works with the newest technologies but is aware of where they come from. This is the last in a three-part series on the history of computers by Qlouder.

 

An important source for this article was John Markoff’s What the Dormouse Said. You can watch Engelbart’s Mother of All Demos here, and Steve Jobs’ presentation of the first Macintosh computer here. A special thanks to our “Chief Geek” Stefan Hogendoorn for his insights, and also to a good friend of the author for his insights and the book recommendation.