The word “computer” comes from the Latin computare, “to calculate”. A computer was originally a mechanical calculator, or a person who did calculations wherever they were needed (such as in engineering firms), first by hand and then with the help of a mechanical calculator.
Much of the math needed for the early space programmes was done by human computers, most of them women.
So how, from the elaboration of mathematics in Antiquity, to the first commercial computer sold in 1951 (the UNIVAC), to IBM’s first Personal Computer (PC) in 1981 and the graphical interface of Windows 95, did we get where we are today?
The world has transitioned from the Industrial Age to the Information Age. But Rome wasn’t built in a day: modern-day computing was established over many decades with the help of dozens of mathematicians, physicists and theorists.
Evolution of Computers: From Algorithms to the First Programme
al-Khwarizmi, the father of Algebra and algorithms

Let’s start with Abu Jaffar al-Khwarizmi, also called Mr. Algorithm - the word “algorithm” is, in fact, a latinisation of his name.
The development of computers is actually very closely linked to fundamental research in mathematics, most particularly logic and the algorithms that al-Khwarizmi elaborated in the 9th century AD. We also owe our modern Arabic numerals (1, 5, 10 as opposed to the Latin I, V, X) to him.
To calculate is basically to solve a specific problem by applying a specific set of rules.
Algebra - and the algorithm - is the science of organising the operations needed to complete a task. These are abstract operations - for example, adding two numbers together. To go from there to a concrete operation, you have to encode the abstract idea in a specific language: single signs to represent values, with the placement of those signs determining their meaning - “1” means something different in “31” and in “13” - and various other symbols encoding operations, such as “+”, “-” and “=” (all of which, however, came later than al-Khwarizmi).
This is how we go from "I have two apples and my friend gives me two more" to 2 + 2 = ?
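As a modern illustration (not something al-Khwarizmi could have written, of course), here is a minimal Python sketch of that encoding idea: each digit sign contributes a value that depends on its position, which is why “1” means something different in “31” and “13”.

```python
# A minimal sketch of positional notation: the value of each digit sign
# depends on where it stands in the string of signs.

def value_of(digits: str, base: int = 10) -> int:
    total = 0
    for sign in digits:
        total = total * base + int(sign)  # shift earlier digits one place left, then add
    return total

print(value_of("31"))  # 31
print(value_of("13"))  # 13
print(2 + 2)           # 4 -- the concrete form of "I have two apples and get two more"
```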
al-Khwarizmi gave mathematics its own "programming" language.
Improve your coding skills by taking programming courses on Superprof.
The First Computer Programme
The first programme to influence the running of a man-made system was not for a computing machine but for a mechanical loom. In 1801, the famous French weaver Joseph Marie Jacquard introduced a mechanical loom that could be programmed for different motifs using punch cards produced on special typewriter-like machines.
The position of the punches changed the position of the loom’s mechanical parts, chose which shuttles to use, etc. - rather like modern industrial robots. One punch card equaled one row of the motif being woven; the cards were bound together to make strips containing a whole piece - rug, wall hanging, upholstery fabric etc.

Charles Babbage planned to use punch cards for his Analytical Engine, and the Harvard Mark I later used punched rolls of paper for programming.
Ada Lovelace and the Analytical Engine
In the end, though, computing aims to remove thought from calculations in order to make it possible for a machine - incredibly fast but entirely without thought - to calculate by itself.
Charles Babbage is considered the father of modern computers. He was never able to finish his Difference Engine (a machine for tabulating polynomial functions), as the research and testing took so long that the Crown cut his funding. A working Difference Engine was, however, eventually completed from his plans; it can be seen at the London Science Museum and it still works!
Note: you can also take IT courses on Superprof.

He spent the rest of his life on the more complicated Analytical Engine, which had an arithmetic logic unit.
Ada Lovelace, a 19th-century mathematician, first published her work in 1843, signing only her initials. She is credited with writing the first computer programme, for Charles Babbage’s Analytical Engine. The Analytical Engine was supposed to execute any calculation demanded of it: both symbolic and numerical operations.
Inspired by Ada and want to learn programming? Find coding courses near you!
The Evolution of Computers: From Mechanical Calculators to Software
Man learned to make tools, then moved on to machines - objects that used a power source other than himself and which could execute certain tasks on their own.
But a mechanical machine cannot modify itself.
A computer, on the other hand, can modify its own programming, becoming a universal machine. It is capable of a certain kind of “mechanical intelligence”.
Alan Turing and Universal Algorithms
In 1936, Alan Turing wrote the founding article of computer science. He proved that a small set of elementary operations is universal: by combining nothing but these operations in different ways, it is possible to execute any algorithm at all. This laid down the principles of the universal, programmable calculator.
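As a hedged, modern illustration of that universality, here is a toy Python sketch of a Turing-style machine: its only operations are reading a symbol, writing a symbol, moving along a tape and changing state, yet swapping in a different rule table makes it carry out a different algorithm. The rule table shown is an invented example that simply flips every bit on the tape.

```python
# A toy Turing-style machine: read, write, move, change state - nothing more.
# The example rule table below (invented for this sketch) inverts a string of
# bits and then halts.

def run(tape, rules, state="scan", blank=" "):
    tape = list(tape)
    pos = 0
    while state != "halt":
        symbol = tape[pos] if 0 <= pos < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if 0 <= pos < len(tape):
            tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape)

rules = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", " "): (" ", "R", "halt"),
}

print(run("0110", rules))  # -> "1001"
```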
Alan Turing inaugurated the Information Age.
During the Second World War, the German armed forces communicated using Enigma encryption machines. They looked much like typewriters and were equipped with a set of cipher wheels that substituted one letter for another, producing a message that third parties were unable to decipher. An identical machine at the other end, set up with the same wheel positions, turned the text back into clear language. They were a combination of mechanical and electrical components.
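To give a feel for the rotor idea, here is a deliberately simplified Python sketch (nothing like the real Enigma, which used several rotors, a reflector and a plugboard): a single wheel substitutes each letter and steps forward after every keypress, so the same plaintext letter encrypts differently each time, while a receiver who knows the starting position can reverse the substitution.

```python
import string

ALPHABET = string.ascii_uppercase
# An arbitrary substitution alphabet (keyboard order), invented for this sketch.
ROTOR = "QWERTYUIOPASDFGHJKLZXCVBNM"

def encipher(text, offset=0):
    out = []
    for ch in text:
        i = (ALPHABET.index(ch) + offset) % 26
        out.append(ROTOR[i])
        offset += 1          # the wheel steps after every letter
    return "".join(out)

def decipher(text, offset=0):
    out = []
    for ch in text:
        i = ROTOR.index(ch)
        out.append(ALPHABET[(i - offset) % 26])
        offset += 1
    return "".join(out)

secret = encipher("ATTACKATDAWN")
print(secret)
print(decipher(secret))  # -> "ATTACKATDAWN"
```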
The British had managed to capture an Enigma machine but could not crack its code. The cipher was first broken in 1933 by Polish mathematicians, but the calculations took several days per message, and the Germans changed the settings every day. In the end, thanks to a combination of German negligence and hard cryptographic work, Enigma was cracked, and the decoded messages helped the war effort. Alan Turing was part of the team working on the Enigma cipher in Britain.

The Harvard Mark I
After Ada Lovelace, another exceptional woman to contribute to the history of computers is Grace Hopper, who worked on IBM’s first entirely automatic digital computer: the Harvard Mark I.

Fun Fact: Computer “bugs” could sometimes be actual insects. One day in 1947, a Mark II broke down, and the operators found a moth stuck in one of its relays. The moth was carefully removed and taped into the computer log with the note: “First actual case of bug being found.”

The first computers couldn’t be used universally; each was instead programmed for calculations in one specific field. Grace Hopper was one of the first to advocate a computer language based on English words. She invented the compiler, a programme that translates code written in such a language into machine language.
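To illustrate what a compiler does (a toy example invented for this article, far simpler than Hopper’s actual A-0 system), the Python sketch below translates a small English-like statement into instructions for an imaginary stack machine, then runs them.

```python
# A toy "compiler": it turns an English-like source line into instructions for
# an imaginary stack machine, mirroring the idea of translating human-readable
# code into machine-level operations.

def compile_line(source: str) -> list[str]:
    # expected form (invented for this sketch): "ADD <a> TO <b>"
    words = source.upper().split()
    if len(words) == 4 and words[0] == "ADD" and words[2] == "TO":
        return [f"PUSH {words[1]}", f"PUSH {words[3]}", "ADD", "STORE RESULT"]
    raise ValueError(f"cannot compile: {source}")

def run(program: list[str]) -> int:
    stack = []
    for instr in program:
        op, *arg = instr.split()
        if op == "PUSH":
            stack.append(int(arg[0]))
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "STORE":
            return stack.pop()

program = compile_line("ADD 2 TO 2")
print(program)       # ['PUSH 2', 'PUSH 2', 'ADD', 'STORE RESULT']
print(run(program))  # 4
```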
The First True Computers
The first computers were born around 1940. In 1944, the theoretical physicist John von Neumann described the first computer architecture, the “von Neumann architecture”, which has prevailed over its rivals and is the one used in almost all computers today. Von Neumann’s machine, the IAS, was built between 1945 and 1951 by engineers with soldering irons, while women programmed the machine.
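The essential point of the von Neumann architecture is that instructions and data live in the same memory, and the processor endlessly fetches, decodes and executes instructions. The following Python sketch of that loop uses an invented three-instruction machine; it is an illustration of the principle, not a description of the IAS.

```python
# A minimal stored-programme sketch: instructions and data share one memory,
# and the processor loops fetch -> decode -> execute. The instruction set
# (LOAD / ADD / PRINT / HALT) is invented for this illustration.

memory = [
    ("LOAD", 6),      # 0: load the value at address 6 into the accumulator
    ("ADD", 7),       # 1: add the value at address 7
    ("PRINT", None),  # 2: print the accumulator
    ("HALT", None),   # 3: stop
    None, None,
    40,               # 6: data
    2,                # 7: data
]

acc, pc = 0, 0            # accumulator and programme counter
while True:
    op, addr = memory[pc]  # fetch and decode
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "PRINT":
        print(acc)         # -> 42
    elif op == "HALT":
        break
```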
By the time Grace Hopper died in 1992, computers were becoming a staple in homes and no longer took up an entire room; they cost as much as a television set rather than the equivalent of a house or car. They revolutionised communication on a global level with the rise of the Internet.
The Web has become a fact of daily life for billions of humans worldwide.
The Computer Revolution: From Coding Information to Cyber-Descriptions
Information is an abstract concept, yet it can be measured. A message, whatever its real or supposed worth, and whether it is true or false, contains a specific quantity of information. What the atom is to a molecule, the bit is to information: “yes/no”, “true/false”, “0/1”. Describing someone as male or female, young or old, short or tall may not yet allow us to recognise them on the street, but it already gives us three pieces of information about them - three bits.
Binary arithmetic first appeared in Europe around 1697, thanks to the work of Leibniz, and binary calculation was at the heart of the first computers.
Shannon defined the quantification of information mathematically, using the probability theory that Kolmogorov had formalised. Together, they changed the face of digital exchanges.
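As a worked illustration of that quantification (a textbook formulation, not a claim about how Shannon originally wrote it): an outcome of probability p carries -log2(p) bits, so each equally likely yes/no answer carries one bit, and the three attributes above add up to three bits.

```python
# A worked illustration of Shannon's measure of information:
# an outcome of probability p carries -log2(p) bits.
from math import log2

print(-log2(1 / 2))      # 1.0 bit for a single yes/no answer
print(3 * -log2(1 / 2))  # 3.0 bits for three independent yes/no attributes
print(2 ** 3)            # 8 possible descriptions can be told apart with 3 bits
```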
Every object (image, sound, text, data) has a digital reflection that allows the information to be stored, transmitted, reproduced endlessly and manipulated in specific ways using all sorts of algorithms.
Metadata and the Semantic Web
Rose Dieng-Kuntz helped to define the semantic Web, a term designating a set of technologies that aim to make the information on the Web accessible and usable by any software programme - and by its users - by means of a metadata system.
Type “traffic accident” into a search engine and it will find all the documents where the words “traffic” and “accident” appear. But if the document mentions a “collision between a lorry and a bike” without mentioning “traffic accident” anywhere, it won’t appear on the list. The idea behind the semantic Web is to find a way for it to appear anyway.
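To make the idea concrete, here is a hedged Python sketch with invented document data (it does not follow any real semantic Web standard such as RDF): each document carries machine-readable metadata, so a search by concept matches the lorry-and-bike report even though the query words never appear in its text.

```python
# A toy illustration of the semantic Web idea: documents carry machine-readable
# metadata about the concepts they describe, so a search by concept can match a
# text that never uses the query words. (Invented data for this sketch.)

documents = [
    {
        "text": "A collision between a lorry and a bike blocked the ring road.",
        "metadata": {"topic": "traffic accident", "vehicles": ["lorry", "bicycle"]},
    },
    {
        "text": "The new cycle lane opened this morning.",
        "metadata": {"topic": "urban planning", "vehicles": ["bicycle"]},
    },
]

def search_by_topic(topic: str) -> list[str]:
    return [d["text"] for d in documents if d["metadata"]["topic"] == topic]

print(search_by_topic("traffic accident"))
# -> ['A collision between a lorry and a bike blocked the ring road.']
```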
The semantic web is a vast programme that is still being written.
Our Digital World Today: Evolution of the User/Computer Interface
At the beginning of the 21st century, interfaces between computers and the human brain were still in their infancy.
We are indebted to Xerox’s Palo Alto Research Center for the graphical user interface, developed at a time when the PC did not yet exist.
In 1968, taking advantage of the invention of colour TV, Douglas Engelbart presented a graphic environment with windows that could be opened and closed with the help of a pointing device plugged into the computer: the mouse.
From 1969 to 1983, the user interface was pretty minimal: a keyboard was used to enter information, which was displayed on a screen. At the time, computers were reserved for professional use in a few elite domains.
From 1984 to the present day, after various technological advances, user-friendly interfaces became the priority. The information displayed on the screen became WYSIWYG (What You See Is What You Get), an expression made popular by Apple for its famous Macintosh. Interaction with the machine became symbolic, with windows, icons, menus and various means of selecting content, making learning how to use a computer much more accessible to the general public.
It was the birth of mass-market computer products, the true rise of the Information Age.
Whether you are a programming student, a computer historian or simply curious, we hope that this article answered some of your questions about the invention of the computer, the Turing machine or the birth of computer programming.
Check out our Beginner's Guide To Computers.
See What Computer Accessories You Should Get.
Discover more about Facebook.