Today computers are a part of nearly every aspect of our lives. We have many uses for them, such as communicating with each other, entertainment and learning. They are our access to the World Wide Web, which provides unimaginable amounts of information to anyone who wants it. However, it is much less common for a computer to be used for the reason that it was originally invented, to calculate. The dictionary defines 'to compute' as to make a calculation [check]. In fact, it was only after 1945 that the word 'computer' began to be used to describe machines. Before this, the word 'computer' referred to a person who did calculations.
Looking at the hardware that we possess today, it is hard to believe that not so long ago computers could perform only simple mathematical operations.
In this essay on the development of computers we will focus on the mathematical roots of computing, not venturing too close to the computer-dependent time in which we live today.
Chapter 1: Early Calculating Devices.
The very first calculating device was most probably the 'counting board' which developed at around the same time in different parts of the world.
The counting board started as a board covered in sand; calculations could be written in the sand and then wiped away after use. Later versions of the counting board were very similar in principle to the abacus: they had vertical grooves representing powers of 10 and pebbles that were used to represent numbers. In fact, the Latin word for 'pebble' is 'calculus', whose plural is 'calculi' [1, p.168].
The device that we know as the abacus was originally developed in China in the 11th century. There are many variations of the abacus, but in general they consist of wooden frames with beads strung across them on wires. Calculations could be made by sliding the beads along the wires, each row of beads representing a different power of 10. There are methods for multiplying and dividing large numbers and also ways of finding square and cube roots [1, p.169-173] [2, p.15].
The next development in calculating devices came in the early 16th century, but was not actually known about until 1967, when one of Leonardo da Vinci's notebooks containing a sketch of a mechanical calculating machine was discovered. The machine was designed to perform addition and subtraction. However, the idea was well beyond its time, and the machine could not actually have been built [2, p.15-16].
A slide rule is a device consisting of scales of numbers that can slide against one another in order to perform calculations. The most basic of slide rules would have had two linear scales with equal distance between each number. These could have been used to help with simple addition and subtraction.
In 1621, William Oughtred invented a slide rule that used moving scales on which the distance to each number was proportional to the logarithm of that number. One could use this device to add and subtract logarithms, and so perform multiplication and division.
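The principle at work here is the logarithm identity log(ab) = log a + log b: adding two lengths proportional to logarithms corresponds to multiplying the numbers themselves. A minimal illustrative sketch of the idea (the function name is my own, and this is of course not a historical algorithm):

```python
import math

def slide_rule_multiply(a, b):
    """Multiply by adding logarithmic 'distances', as a slide rule does.

    Sliding one log-scaled ruler along another adds log(a) + log(b);
    reading the result off the scale recovers 10 to that sum, i.e. a*b.
    """
    distance = math.log10(a) + math.log10(b)   # physical offset on the rule
    return 10 ** distance                      # read the product off the scale

print(slide_rule_multiply(3, 7))   # approximately 21
```

Division works the same way in reverse, by subtracting one distance from the other.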
In the following years the slide rule was greatly improved by the addition of new logarithmic scales, which made it useful for tasks such as finding the reciprocal of a number, doing trigonometry and working with exponential functions [1, p.176-181].
The slide rule continued to be used right up until the first hand calculator was created in 1973 [3, p.xiv].
In 1642, Blaise Pascal, a French mathematician who contributed to a number of areas of mathematics [1, p.434], invented a calculating machine that in principle worked in the same way as Leonardo da Vinci's machine. Pascal's machine could add and subtract, and could also multiply, but only by performing repeated additions [2, p.17].
In 1671, Gottfried Leibniz, who would later become one of the founders of differential calculus [1, p.676], built an even more sophisticated machine. Leibniz found a way to automate multiplication and cut out the large number of repetitions that earlier calculating machines needed. Leibniz's machine was very effective and much faster than any previous machine; calculating machines after this point were all based on Leibniz's ideas.
In 1810, the first commercially produced calculating machine went into production in Germany; it was based on Leibniz's machine [2, p.17].
Chapter 2: Charles Babbage.
Charles Babbage can be considered one of the fathers of the modern computer. His ideas were very much ahead of his time, and he very accurately predicted the direction that modern computers would eventually take [4, p.36].
Babbage was born in Devon in 1791, and was already very fond of mathematics when he entered Trinity College, Cambridge in 1810, where he found that he already knew more than his tutors [4, p.36] [5, p.xi-xii].
In 1820 Babbage helped to found the Astronomical Society, which meant that he spent a large amount of time working with astronomical tables. He became frustrated by the number of errors he found in the tables, and it was this frustration that gave Babbage the idea of building a machine that could calculate tables accurately [4, p.36] [5, p.xii-xiv].
A few years earlier the French had adopted the metric system and hence had to recalculate many of their mathematical tables. A French mathematician named Gaspard de Prony was in the process of creating a new set of tables for the French government. De Prony tackled this problem by using the method of differences and a large group of 'computers': people trained in only the most basic mathematics. Because of the method that de Prony was using, any one 'computer' was required to understand only addition and subtraction [6, p.43-44].
Babbage saw what de Prony was doing and realised that a machine could exactly replicate the process that de Prony's team of human computers went through to calculate tables. Using the method of differences, Babbage could build a machine in which each individual part needed only to be able to add and subtract, but which as a whole could create perfectly accurate tables of polynomial values. Babbage called his machine the Difference Engine [5, p.34] [6, p.44].
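The method of differences rests on the fact that, for a polynomial of degree n, the n-th differences between successive values are constant, so once the first few values have been tabulated every later value follows by additions alone. A short illustrative sketch of the scheme (the function names are my own):

```python
def leading_differences(first_values):
    """Compute the leading entry of each row of the difference table
    built from the first few tabulated polynomial values."""
    diffs = [list(first_values)]
    while len(diffs[-1]) > 1:
        row = diffs[-1]
        diffs.append([row[i + 1] - row[i] for i in range(len(row) - 1)])
    return [row[0] for row in diffs]

def tabulate(first_values, count):
    """Produce `count` polynomial values using only addition, just as
    each part of the Difference Engine (or each human 'computer')
    needed only to add."""
    state = leading_differences(first_values)
    out = []
    for _ in range(count):
        out.append(state[0])
        for i in range(len(state) - 1):
            state[i] += state[i + 1]   # each level absorbs the one below
    return out

# x**2 for x = 0..5, reconstructed from just the first three values:
print(tabulate([0, 1, 4], 6))   # [0, 1, 4, 9, 16, 25]
```

Notice that after the initial set-up no multiplication occurs anywhere: the whole table is generated by cascaded additions, which is exactly what made the scheme suitable for a machine of gears and wheels.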
Babbage originally planned for his Difference Engine to consist of 96 wheels and 24 axes; however, this turned out to be too ambitious and the design was simplified. When the machine was built it consisted of 18 wheels and 3 axes [5, p.xiv], and it was able to calculate the values of quadratic functions to an accuracy of 8 decimal places [2, p.18-19].
In 1822 Babbage demonstrated his Difference Engine to the Royal Society, explaining how a larger machine would be able to calculate accurate tables for astronomy and navigation. Babbage promised that he could build a Difference Engine that could compute much more complicated functions to 20 decimal places; the Society agreed to fund Babbage, as did the British government [2, p.19] [5, p.37].
Babbage worked on his Difference Engine for several years, though, true to his nature, he kept having new ideas and restarting from scratch. In 1828 Babbage once again applied for funds and was once again successful; the government also built a fireproof workshop on land next to Babbage's house [5, p.xiv-xv].
In 1833, plans were being made to move all of Babbage's work to his new workshop. It was at this time that Babbage's engineer, Clement, demanded more money and refused to let Babbage move the various parts of the engine that they had worked on together.
After several months Clement allowed Babbage to take the drawings and engine parts but kept all of the tools that they had built specially for the making of the Difference Engine.
It was when work on the Difference Engine had ground to a halt that Babbage had an idea for a much more advanced calculating engine, which he called the Analytical Engine [5, p.xv].
The Analytical Engine was, in a sense, a mechanical version of a modern day computer. One could input a series of instructions into the Analytical Engine by the use of 'operation cards' which made Babbage's new engine able to automate the computation of any function [2, p.19] [5, p.55-56].
'Operation cards' were essentially pieces of card with holes punched in them. Babbage had previously encountered this method of instructing machines when he studied the Jacquard loom, which was told what to weave by a string of punched cards [2, p.20-21] [4, p.42-43].
Using these 'operation cards' one could create what was essentially a program: Babbage had designed the very first programmable computer [2, p.19].
The Analytical Engine caught the interest of a mathematician named Ada Byron, Countess of Lovelace, a woman now thought of as the world's first computer programmer. Byron worked closely with Babbage, writing programs for the not-yet-existent Analytical Engine. She gave very accurate descriptions of how to solve difficult mathematical problems using a program; she also described the now-familiar if-else statement and hinted at looping strings of cards multiple times [2, p.21-22] [4, p.43].
In 1834 Babbage asked the government whether he should try to salvage the Difference Engine or start working on his new Analytical Engine; after 8 years of waiting for an answer he was told that the government would no longer fund his work.
Babbage used his own funds to work on the Analytical Engine until 1848, when he made a complete set of plans for an improved Difference Engine. Babbage continued to work on his calculating engines until his death in 1871 [2, p.20] [5, p.xv-xvi].
Though very little of what Babbage designed was actually built, his idea of a machine that could be used for more than one function was completely revolutionary at the time. Charles Babbage had made the first step towards the modern computer [2, p.21] [4, p.42].
Chapter 3: The Development of Computers During World War II.
The 'Decidability Problem', posed by David Hilbert in 1928, asks whether every valid mathematical statement can be shown to be either true or false by means of an algorithm. Two men solved this problem independently, showing that there are indeed statements that are mathematically undecidable. One was Alonzo Church, who solved the problem using his own branch of mathematics, the 'lambda calculus'. The other was a Fellow at Cambridge University named Alan Turing. It is the method used by Turing that is most interesting: his concept of the universal Turing machine [2, p.23-28] [4, p.46-48].
The universal Turing machine is an abstract model: it consists of an infinitely long piece of tape divided into sections and a device that is able to read and write data. The machine moves up and down the tape, reading instructions from sections of the tape. It can also write data onto the tape and has the ability to overwrite old information.
Turing showed that his theoretical machine was able to carry out every possible method needed to prove or disprove a mathematical statement. He also showed that there are tasks that his machine could never complete, and so solved the Decidability Problem. The universal Turing machine was, in theory, a 'stored-program' computer, and this idea was to have a definite influence on the development of computers in the 1940s [2, p.28] [7, p.107-109].
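The model described above can be sketched in a few lines: a tape, a head position, a current state, and a table of rules saying what to write and where to move next. This is only an illustrative toy (the names and the example machine are my own), not Turing's original formulation:

```python
def run_turing_machine(rules, tape, state="start", steps=1000):
    """Minimal sketch of Turing's abstract machine.

    `tape` maps position -> symbol (blank = "_");
    `rules` maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is +1 (right), -1 (left) or 0 (stay).
    """
    pos = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(pos, "_")
        write, move, state = rules[(state, symbol)]
        tape[pos] = write        # overwrite old information on the tape
        pos += move
    return tape

# Example machine: invert a binary string, halting at the first blank.
invert = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

tape = {i: s for i, s in enumerate("1011")}
result = run_turing_machine(invert, tape)
print("".join(result[i] for i in range(4)))   # 0100
```

The striking point is that this one fixed mechanism, given a suitable rule table, can simulate any computation at all, which is what makes it a stored-program computer in miniature.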
During the war Turing worked at Bletchley Park as a cryptanalyst; his job was to find ways of quickly decrypting intercepted messages. German messages were encrypted using the 'Enigma machine'; the Enigma was a very well designed method of encryption and was long thought impossible to crack [7, p.110-111] [8, p.169-170].
Turing managed to find a flaw in the way the messages were written that allowed the code to be broken. Using this flaw, he found a way to design a machine that needed to check only 17,576 different options in order to break the code for a particular day. The Enigma machine had around 900,000,000,000,000,000,000 possible initial settings, so creating a machine that needed to check only 17,576 cases was a huge achievement. These machines were called Bombes; soon there were 16 Bombes in operation at Bletchley Park, working around the clock to provide the military with essential intelligence [8, p.170-177] [9, p.35].
In 1939 in the US, a professor named John Atanasoff and a student named Clifford Berry began constructing one of the first completely electronic computers. This computer (which would later become known as the 'Atanasoff-Berry Computer', or 'ABC') was built to solve systems of linear equations [10, p.3-5].
The ABC used capacitors to store binary information, so a circuit was needed to 'top up' the capacitors and keep them from losing their charge.
The machine was essentially complete in 1941, but when the US entered World War II both Atanasoff and Berry took other jobs and left the ABC on hold. When the chance came to return to the ABC, neither of the two men wanted to leave his current occupation, and so the ABC was eventually dismantled to make space [10, p.4-6].
Though the ABC never accomplished much it did succeed in inspiring a physics professor named John Mauchly.
It had always been Mauchly's ambition to create a machine powerful enough to quickly find solutions to the differential equations used to model the weather. These equations could take weeks to solve by hand, making it pointless to even attempt to predict the weather [10, p.7-8].
In 1940 Mauchly met Atanasoff, who introduced him to the concept of the digital computer, an idea that would make Mauchly's 'weather machine' a possibility. Inspired, Mauchly took a course in electrical engineering. It was on this course that he met Presper Eckert, one of his teachers. Mauchly and Eckert formed a partnership and set about designing their machine. However, they soon encountered a problem: the machine would cost a vast amount of money, and few people were willing to fund a machine whose purpose was to generate more accurate weather predictions [10, p.7-8].
Luckily for Mauchly and Eckert, a new option became available when World War II came to the US. The war saw the development of many new kinds of projectile weaponry; for each new weapon, the military needed to know the distance the projectile would travel for every firing angle. This was an enormous task: for each firing angle, making a table describing the trajectory could take around 40 hours, and each new weapon needed hundreds of firing angles to be calculated.
The military dealt with this problem by hiring large groups of mathematicians to work on the problems as a team. These mathematicians were given the title 'computers'. At the time there was little work for female mathematicians and so most 'computers' were women [10, p.10-11].
Using this method it would take one team of computers weeks to complete the firing tables for a single gun. Mauchly saw that this was a chance to have someone fund him to build his differential-equation solver; he was correct, and so began designing his computer, the ENIAC [2, p.38-39] [10, p.9-12].
The ENIAC was completed in 1946 and ended up costing the military in the region of $500,000; it was also enormous and weighed many tons. It was a success and could calculate the firing tables for a gun in hours. Many of the human 'computers', who were no longer needed, were hired as programmers for the ENIAC. This was not an easy job, as the ENIAC was programmed by re-wiring the whole machine [2, p.40-41] [10, p.16-19].
The first job given to the ENIAC was to work out whether the creation of a hydrogen bomb was feasible. The mathematics behind the question was very complicated, and at the time the ENIAC was the only way to answer it quickly [2, p.40] [10, p.19-20].
Chapter 4: Commercial Computing.
The US Constitution states that a census must be held every 10 years. In 1790, when the first census was held, the population was small enough that the whole process took relatively little time to complete. As the population of the US grew, so did the time it took to analyse the results of the census; by the late 19th century it was taking around 7 years to finish [10, p.xix].
It was this situation that inspired a man named Herman Hollerith to invent a machine called the mechanical tabulator. The tabulator was designed to handle and organise very large quantities of data: just what was needed to cut the time it took to analyse census results. Information was entered into the tabulator using punched cards, the same method that Babbage had planned to use with his Analytical Engine. Each card represented a person, and the tabulator could sort the cards into different sections depending on the information they carried [10, p.xix].
Many other businesses also needed to deal with large amounts of data; this created a strong market for tabulators and caused the machine to develop very quickly. Hollerith had invented the first computation machine to be widely used by businesses.
Hollerith was very successful, and in 1924 his company was renamed International Business Machines, now known as IBM. For a very long time IBM dominated the tabulator market, developing a large range of machines for working with punched cards [2, p.23] [10, p.xix-xx].
Computing developed very rapidly in the US after 1945. It was the time of the Cold War, and the military, inspired by the success of the ENIAC, poured money into the development of computers. One such project was the WHIRLWIND, a computer built to control the air defences at Cape Cod [9, p.375-377] [11, p.7].
At this time there was also a very strong market for commercial computers. IBM had grown into an enormous corporation using their punched-card technology, selling mainly to businesses [11, p.7-8]. At this time IBM was not convinced that electronic computers would ever replace their tabulators [11, p.34].
In 1946, John Mauchly and Presper Eckert set up their own company and moved on to a new project, the UNIVAC. They intended to produce multiple copies of the UNIVAC and sell them for use in businesses, science and the military. This of course meant that the UNIVAC needed to rival the widely accepted tabulators of IBM [10, p.31-32] [11, p.30].
By 1950 the census would once again take far too long to complete: the same problem that Hollerith had solved in the late 19th century. The Census Bureau was in trouble, and so it was easy for Mauchly and Eckert to find funding for the creation of their 'multi-purpose' computer [10, p.31-32]. There were many problems along the way, which resulted in their whole company being bought by Remington Rand, a rival of IBM more famous for its range of typewriters [10, p.32-39].
The first fully operational UNIVAC was given to the Census Bureau in 1951, just in time to process the results of the 1950 census. It was so big that they did not dare move it from where it was built, for fear of breaking it [10, p.40] [11, p.27].
The UNIVAC then became involved in a project to predict the result of the 1952 presidential election. This was something that had never been done before and gained a lot of public attention. The prediction was made correctly, and suddenly the general public became much more aware of the power that computers possessed [10, p.40-42].
In the end 25 UNIVACs were sold; this was a large number at the time, but IBM soon took control of the market once again with its own electronic computer [10, p.44] [11, p.34].
Another man worth mentioning is the German engineer Konrad Zuse. Zuse started designing computers in 1936, and by 1938 he had built his first computer, the Z1. Zuse was the first person to build a computer that used a binary number system rather than the classic decimal system. Zuse also built the Z1 to use floating-point arithmetic, which allowed both very small and very large numbers to be used in calculations.
Zuse decided that the Z1 was not as reliable as he would have liked, and so went on to build the Z2 and the Z3, both of which used a large number of telephone relays in the arithmetic unit. Zuse claimed that his Z3 was able to do any calculation and was even able to play chess [2, p.34-36] [10, p.xxi-xxii].
During the war all three of Zuse's computers were destroyed in an air raid, along with most of his plans and drawings. He began to build the Z4, but was forced into hiding by the war in 1945. Zuse completed the Z4 in 1948 and sold it to the Institute of Applied Mathematics in Zurich in 1950; this was the first computer to be sold commercially. Zuse formed his own company, and by 1969 he had sold over 250 computers [2, p.37-38] [10, p.xxii].
Chapter 5: The Dawn of the Information Age.
The period between 1952 and the current day has seen computers developing at an incredible rate [11, p.7]. The UNIVAC had kick-started the commercial selling of computers in the US [11, p.27].
By the early 1960s the first operating systems were being created, and programming languages such as FORTRAN and COBOL already existed. In 1965 Gordon Moore, later a co-founder of Intel, noticed a 'law' in the development of computers: since the invention of the integrated circuit in 1958, the number of transistors on a single chip had doubled every year. In 1997 this slowed to a doubling every year and a half, a rate it still keeps today; this effect is known as 'Moore's law' [3, p.12-13] [11, p.217].
In 1971 the first silicon microprocessor was created by engineers at Intel [11, p.217-218]. The mid-1970s saw the first PCs appearing, and it soon became common to work on a computer. Soon after, Microsoft BASIC was released, allowing applications to be written more easily. This was followed by the floppy disk and more advanced operating systems; software was no longer dependent on the model of computer it was running on. By 1977 PCs came with keyboards and monitors, and software companies had emerged. This was the creation of the PC that we use today [11, p.221-241].
Since then computers have become more and more a crucial part of our everyday lives.
In 1992 the World Wide Web, which had been created in 1989, brought computing into a whole new age. Computers were no longer being used purely for calculation; they became a part of people's everyday lives [3, p.xiv] [11, p.1].
Since then computers have continued to work their way into our lives at an alarming rate, with most of us keeping at least one computer with us at all times. It is strange to think that every time we give a computer a task, our request is turned into a mathematical problem which the computer then solves. As we have seen, the computers that now run our lives are simply by-products of a technology designed to speed up the process of solving mathematical problems.
- J. Gullberg, Mathematics: From the Birth of Numbers, (New York: W.W. Norton, 1997).
- N. Barrett, The Binary Revolution, (London: Weidenfeld & Nicolson, 2006).
- P.J. Denning, R.M. Metcalfe, Beyond Calculation: The Next Fifty Years of Computing, (New York: Springer, 1998).
- G. O'Regan, A Brief History of Modern Computing, (London: Springer, 2008).
- P. Morrison, E. Morrison (ed.), Charles Babbage and his Calculating Engines, (New York: Dover Publications, 1961).
- A. Hyman, Charles Babbage: Pioneer of the Computer, (New Jersey: Princeton University Press, 1985).
- B.J. Copeland, Alan Turing's Automatic Computing Engine: The Master Codebreaker's Struggle to Build the Modern Computer, (Oxford: Oxford University Press, 2005).
- S. Singh, The Code Book: The Secret History of Codes and Code-breaking, (London: Fourth Estate, 1999).
- N. Metropolis, J. Howlett, G. Rota (ed.), A History of Computing in the Twentieth Century, (New York: Academic Press, 1980).
- M. Hally, Electronic Brains: Stories From the Dawn of the Computer Age, (London: Granta Books, 2006).
- P.E. Ceruzzi, A History of Modern Computing, (Massachusetts: MIT Press, 1998).