
From Abacus to Containers - A Brief History of Computing


We live in a world built by our collective ingenuity and imagination. We see further than our predecessors, not because of keener vision or greater height, but because we stand on the shoulders of the giants who came before us.
The Japanese word "sensei" literally means the person who came before.

Do you remember the first time you touched a computer keyboard?

Do you remember the clanging sound of a typewriter?

Do you remember your first HTML page rendered on the World Wide Web, or your first "hello world" application?

Maybe you were a gamer, and you blew into the cartridges of your family computer?

No doubt, your first lines of code relied on the combined outcomes of thousands of years of accumulated knowledge and wisdom.

I'm your host, Kassandra Russel, and today we will navigate through history to discover how our ancestors made knowledge out of information. We'll talk about the technologies that have marked ancient and modern history. Then we'll return to modern times to talk about the first web server, virtualization, cloud computing, Docker, and Kubernetes. This is the tale of computing, in which humans are the heroes and their greatest weapon is imagination.


On the banks of the Tigris and the Euphrates, more than 4,000 years ago, lived an ancient civilization: the Babylonians and their ancestors, the Sumerians.

They were fishermen, herdsmen, and farmers. Over time, with bountiful harvests, their society became more prosperous, their economy kept scaling, and they felt the need for a computational system: an information processing system that could be represented mathematically.

Imagination is more important than knowledge.

Albert Einstein

The Sumerians used the power of imagination and started abstracting the concept of numbers.

They left the oldest known written texts. These early writings made use of a sexagesimal system, the distant ancestor of the base 60 we still use when we count the seconds and minutes in each hour.
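To see that base-60 legacy in action, here is a tiny sketch (in Python, purely as an illustration and not part of the original story): turning a raw count of seconds into hours, minutes, and seconds is nothing more than reading the number in sexagesimal digits.

```python
# Base 60 in everyday timekeeping: repeated division by 60 turns a raw
# count of seconds into its sexagesimal "digits" (hours, minutes, seconds).

def to_sexagesimal(total_seconds: int) -> tuple[int, int, int]:
    minutes, seconds = divmod(total_seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return hours, minutes, seconds

print(to_sexagesimal(4321))  # -> (1, 12, 1): 1 hour, 12 minutes, 1 second
```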

In addition to the first love song and the first lullaby, the Sumerians invented the first calculator too.

How do we, as humans, conceptualize a number as big as 1,000 or 10,000? We might never know who came up with the idea of abstracting numbers, but we can still see its result: the first calculator, the first counting device used to handle large numbers, the abacus.

The Egyptians subsequently adopted the same system, followed by the Greeks around the 5th century BC, the Romans around 300 BC, and the Chinese around 200 BC.

After mastering basic arithmetic, attention turned to using it to solve problems far more complex than counting, selling, or buying, such as automated reasoning and data processing. This is when we started using algorithms.

The word algorithm comes from the name of the 9th-century Persian mathematician Al-Khwarizmi.

In the 9th century, history also met its first hacker: Al-Kindi. This Arab polymath gave birth to cryptanalysis and, in his famous treatise on deciphering cryptographic messages, documented the first known "crypto attack": frequency analysis.

Starting in the 12th century, Europeans focused on studying Greek and Arabic scholarship, and the Hindu-Arabic numerals were introduced to Europe through the writings of Al-Kindi and Al-Khwarizmi.

With the advent of the base-10 system and the Hindu-Arabic numerals, Europeans replaced Roman numerals and moved away from the abacus.


We are now in the middle of the European Renaissance, whose very first traces appeared in Italy, where a polymath named Leonardo da Vinci sketched a design for a mechanical calculator. He was followed by the French mathematician Blaise Pascal. To help his father, a tax collector in Rouen, Pascal developed the Pascaline, which could handle multiplication and division through repeated additions or subtractions.

Later, philosophical mathematicians, including Leibniz and Lambert, began treating the operations of formal logic mathematically. They were followed by George Boole, then Augustus De Morgan, and later others like Jevons, Schröder, and Huntington, all working toward a systematic mathematical treatment of logic. The result was the modern conception of an abstract mathematical structure: Boolean algebra. Their work later allowed Claude Shannon, an American electrical engineer, to design the first digital circuits that implement Boolean algebra.
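To make the link between Boolean algebra and digital circuits a little more concrete, here is a minimal sketch (in Python, as an illustration of the idea rather than anything from Shannon's own work): a half adder, a one-bit adder built purely from the AND and XOR operations that hardware gates implement.

```python
# A half adder built only from Boolean operations: it adds two one-bit
# numbers and produces a sum bit (XOR) and a carry bit (AND), the same
# logic that Shannon showed could be wired from switching circuits.

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    sum_bit = a ^ b       # XOR: true when exactly one input is true
    carry_bit = a and b   # AND: true when both inputs are true
    return sum_bit, carry_bit

# Print the full truth table.
for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"a={int(a)} b={int(b)} -> sum={int(s)} carry={int(c)}")
```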

We can't actually talk about the history of computing without mentioning two things: the transistor and the Turing machine.

The latter was invented by Alan Turing in 1936. It was the first rigorous approach to making thought calculable, and it is considered a foundation of artificial intelligence. The transistor, on the other hand, was invented in 1947 by John Bardeen, William Shockley, and Walter Brattain, researchers at Bell Laboratories. They were awarded the Nobel Prize in Physics in 1956 for this invention.
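To get a rough sense of what a Turing machine is, here is a toy sketch of our own (in Python; the machine and its rules are made up for illustration): a tape, a read/write head, and a table of rules is enough to mechanize a simple computation, in this case flipping every bit on the tape.

```python
# A toy Turing machine: a tape, a head, and a transition table.
# This made-up machine flips every bit on the tape and halts when it
# reads the blank symbol "_", showing how simple rules mechanize computation.

def run_turing_machine(tape: list) -> list:
    # (state, symbol read) -> (symbol to write, head movement, next state)
    rules = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
        ("flip", "_"): ("_", 0, "halt"),
    }
    state, head = "flip", 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        head += move
    return tape

print(run_turing_machine(list("10110")))  # -> ['0', '1', '0', '0', '1']
```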


In July 1969, the computer on the spacecraft carrying Neil Armstrong and Buzz Aldrin to the Moon had 60,000 times less RAM than an iPhone 3G, and the transmission speed between Apollo 11 and Houston was 6,000 times slower than a regular DSL line.

Indeed, it was not until 1971 that the invention of the microprocessor increased the power, speed, and reliability of computers while making it possible to reduce their size and cost.

So much so that they soon became part of every business and then every home, providing everyone with the means to access information and communicate easily.

There was a series of inventions and discoveries developed by researchers, hackers, scientists, and entrepreneurs that made our digital world what it is today.

Dennis Ritchie, Linus Torvalds, Richard Stallman, Roland Moreno, Bill Gates, Paul Allen, Steve Jobs, Steve Wozniak, Tim Berners-Lee, Larry Page, Sergey Brin, and many others.

They all contributed in one way or another to making computers and servers accessible to everyone.


The very first website was launched on December 20, 1990, by Tim Berners-Lee. At the time, he was a researcher at CERN, the European Organization for Nuclear Research, in Switzerland.

The website, info.cern.ch, was hosted on a NeXT computer at the research center. Tim Berners-Lee also published the source code of the HTTP core, as well as the first browser, which he developed with the help of an engineer from the center, Robert Cailliau.

Tim Berners-Lee's server technology then spread to other research centers throughout Europe and the world. In November 1992, there were 26 web servers, then 200 in October 1993.

After the democratization of the Internet and the increasing need for infrastructures that support higher workloads, server and hardware virtualization became more and more accessible, with free and open-source software such as KVM but also with proprietary software such as VMware Server or Microsoft Virtual Server. Long before that, much of the foundational work on virtualization had been done at IBM's Cambridge Scientific Center during the 1970s.

Cloud computing technologies benefited from hardware virtualization, grid computing, service-oriented architecture, and web services.

As with virtualization, cloud computing is attractive to the end user because of its scalability: the cost is a function of the time spent using the service, and no upfront investment is required.


In reality, cloud computing is the kind of concept for which it is not easy to attribute a precise origin. Some attribute it to the founding father of the Internet, ARPANET, the technology without which cloud computing would not exist. Others believe that the discipline only appeared later, in the 2000s, as a solution to a problem encountered by Amazon: a server fleet that was oversized outside of holiday periods. To make these machines profitable, Amazon decided to rent its servers to other companies, on demand.

The first person to use the term cloud computing was Professor Ramnath Chellappa in 1997, and the first company to turn the concept into a business was Salesforce in 1999. Amazon followed its lead in 2002.

In any case, none of this would have been possible without virtualization: it gave birth to cloud computing and optimized our use of physical resources.


The innovation did not stop there. A few years later, containerization became the new virtualization.

It's actually a type of virtualization used at the application level. The principle is based on creating several user spaces isolated from each other on a common kernel; the term "container" then refers to one such instance. This separation is conceptually similar to partitioning an application into modules that communicate through web services.

Containers, although independent, share a common kernel and the host's resources. They also have an interesting advantage for application developers: each container packages, in a standard and shared format, the code, runtime environment, configuration, network settings, file systems, and libraries necessary for the application to run properly.

In reality, the first work on containerization can be traced back to 1979. It all started when the chroot jail and the chroot system call were introduced during the development of Version 7 Unix.
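As a rough illustration of that original idea (a sketch assuming a Unix-like system, root privileges, and a prepared directory; /tmp/jail is a made-up path), the chroot call simply changes what a process sees as the root of the file system:

```python
# A minimal chroot "jail" sketch (illustrative only). It must run as root
# on a Unix-like system, and /tmp/jail (a made-up path) must already
# contain whatever files the jailed process needs.

import os

def enter_jail(new_root: str) -> None:
    os.chroot(new_root)  # from now on, new_root is "/" for this process
    os.chdir("/")        # step inside the new root

if __name__ == "__main__":
    enter_jail("/tmp/jail")
    print(os.listdir("/"))  # only the contents of /tmp/jail are visible
```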


At the annual PyCon in March 2013, Solomon Hykes showed Docker to the public for the first time:

Containers are self-contained units of software you can take from a server over there to a server over here, and they will run the same way.

Solomon Hykes

And after a 5-minute talk about Linux Containers or what we know as "LXC", Docker officially entered the world.

Now, Docker is the best-known and most widely used container technology, but in reality, it is the tree that hides a forest of other projects.

In the years since 2015, distributed computing has been one of the most in-demand skills, graduating from a niche specialty to a prominent one in the global workforce. Containerization and orchestration technologies became the trendiest topics in the cloud economy and the IT landscape.

Containerization and orchestration solve complex puzzles using the power of abstraction, and they have become mainstream. If this proves anything, it is the potential of abstraction.

Even now, 4,000 years later, we still abstract concepts, and we certainly need to.

In the boundless universes that we imagined and re-imagined, there is a secret door, a hidden gateway whose key is abstraction.

We are the designers of ideas and the builders of technologies, building on top of the legacy that already exists.

Isaac Newton once said:

If I have seen further than others, it is by standing upon the shoulders of giants.

Isaac Newton

Technologies like containers are an abstraction of the application layer, the same way the abacus was an abstraction of everything we can count.


In the upcoming episodes, we are going to look into the history of containers and the story behind the most popular orchestration framework: Kubernetes.

Stay in touch by following @joinFAUN on Twitter, reach us using the hashtag #AskFAUN, and join the community by visiting faun.dev/join.

