The history of computers began with mathematical computation. In early centuries people struggled to perform calculations, counting on their hands and fingers. The abacus is often attributed to China, but it is said the Babylonians were using one as early as 300 B.C. In skilled hands, an abacus can perform addition and subtraction. For a long time after this device there was little sign of improvement or invention of new calculating devices.
Fast forward to the 17th century: in France in 1645 we find the first mechanical calculating machine, invented by Blaise Pascal and named the "Pascaline". Through a clever arrangement of cog wheels and dials, this early calculator could add and subtract. Soon after, in 1672, a German mathematician named Gottfried Leibniz came up with a new mechanical device that could add, subtract, multiply and divide. Unfortunately, both machines were too costly to manufacture on a commercial basis, and only a small number were ever sold.
One of the defining features of the modern computer is the ability to run a pre-programmed set of instructions. Today there are many means of storing a program but, as far back as 1801, Jacquard was programming textile patterns to be repeated indefinitely on his looms. Jacquard used a series of boards with punched holes which triggered the lifting or dropping of hooks to produce the desired pattern in the weaving process.
To resume the theme of the mechanical calculator: during the 19th century Charles Babbage devoted his life's work to designing and building a machine that could be programmed to perform complex calculations. The result was the Difference Engine, a machine weighing over fifteen tons, though the project was never completed. Working replicas can now be found in the Science Museum in London and in the Computer History Museum in California. Babbage did not stop there; he continued to develop his designs for a new model, the Analytical Engine, which occupied his genius for the rest of his life. Unlike the Difference Engine, the Analytical Engine was to use punched cards - similar to those used by Jacquard for his textile looms - to store and run programs. Unfortunately, the machine was never built, but it did provide an early conceptual model for the coming computer age. It was at this time that Ada Byron (Countess of Lovelace by marriage) earned her place in history as the first programmer, having written a series of "notes" and instructions for the Analytical Engine. She also devised the subroutine and was among the first to recognise the importance of looping.
Around that time, a young man named Herman Hollerith, who won a competition run by the US Census Bureau, used the concept of the punched card to invent a tabulating machine. His tabulating machine helped the US Census Bureau complete the 1890 census in around three years. In this machine, the holes in the punched card permitted electrical contact, while the absence of holes prevented it. Using this method, many different kinds of data could be stored, analysed and organised quickly and efficiently.
Hollerith founded a company, the Tabulating Machine Company, today better known as International Business Machines (IBM). IBM grew rapidly with the popularity of punched cards, which could record details about individual people. On entering a tollway (a highway that collects a fee from drivers) you were given a punched card recording where you started; when you exited the tollway, the fee was computed from the miles you had driven. At election time you were handed a ballot which was a punched card. In this way the punched card became ubiquitous. Incidentally, Hollerith's census machine became the first machine ever to be featured on the cover of a magazine. IBM continued to build mechanical calculators for business, to help with financial and inventory accounting.
In 1944 Harvard and IBM worked together to produce the Harvard Mark I, one of the early successes in computing history. The Mark I was the first programmable digital computer made in the US. Grace Hopper is known as one of the primary programmers of the Mark I. In the early 1950s she developed one of the first compilers, work on high-level programming languages that eventually led to COBOL.
Early 20th Century Developments
Over the next 30 years or so, many inventions appeared which, although not part of any particular design for a "computer" (a term which still had no settled definition at that time), were nevertheless important in the development of what we now know as the computer. These include advances in mathematics leading to the earliest programming languages (Kurt Gödel, Konrad Zuse and Alan Turing) and a gradual move from mechanical towards electronic devices.
The Enigma Machine
Alan Turing was involved in adapting the design of a Polish decryption machine called the Bomba; the British version adopted the name "Bombe". This electro-mechanical machine was highly successful in decrypting Enigma-encoded messages, thereby disrupting many of the German military campaigns. Towards the end of the war, Bletchley Park saw the arrival of what some consider the first electronic computer: "Colossus". The operative word here is "electronic". The hardware for Colossus was designed by Tommy Flowers, an engineer at the British Post Office (today known as British Telecom), and used electronic valve technology. Colossus was not a general-purpose computer; its function was specifically to break the new German Lorenz codes in use at the time. However, it was electronic and it was programmable (messages were fed in on punched tape rather than punched cards), which gives it a claim to being one of the first electronic computers in history.
The Post-War Computer Age
After the war, commercial interests recognised the potential of electronics in general and computers in particular. It was not long before electronic valve (or vacuum tube) technology appeared in everything from televisions to industrial control systems. Following work by the mathematician John von Neumann, a standard model for the design of the stored-program computer was adopted which has hardly changed since. This model integrated the basic workings of a Control Unit, an Arithmetic Logic Unit, memory and input/output (I/O) devices. Later, the Control and Arithmetic Logic Units were integrated into the Central Processing Unit (CPU).
The 1950s and 1960s saw the growth of big corporations manufacturing big computers, known as "mainframes". At that time, before Microsoft and Google, the most renowned names in the industry were IBM, Burroughs, Univac, Honeywell and Control Data, while ICL was the best known in the United Kingdom. The mainframe market declined during the late 1980s and 90s until some of the larger companies realised that there was still a place for the big machines as large-scale database servers. IBM still dominates what is left of the mainframe market today, but its place as market leader in the industry as a whole has long since been usurped by Bill Gates and his company, Microsoft Corporation.
Towards the end of the 1960s and early 1970s, new manufacturers began to appear - primarily in educational and research institutions - providing smaller computers. Where a mainframe was the size of a small truck and required its own temperature-controlled room, the new mini-computers were smaller than a desk - some even fitted under one - and they enabled an individual university department or a smaller business to have its own processing power.
By this time the vacuum tubes had disappeared, replaced first by the transistor and later by the "Integrated Circuit" (IC), or "microchip". The microelectronics revolution allowed what had once been hand-crafted wiring to be mass-produced as an integrated circuit on a small sliver of silicon the size of a thumbnail. The chief advantage of the IC is not that the transistors are minuscule, but that millions of transistors can be formed and interconnected in a mass-production process. All the elements on an integrated circuit are fabricated concurrently via a small number of optical masks which define the geometry of each layer. This speeds up the process of fabricating the computer and thus reduces its cost. By the early 1980s, huge numbers of transistors could be fabricated concurrently on a single integrated circuit; today the Pentium 4 microprocessor contains 42,000,000 transistors on that same thumbnail-sized piece of silicon.
At that time IBM was still the leader in the mini-computer market too, while other names such as Digital Equipment Corporation (DEC), Hewlett-Packard (HP) and Prime were also making an impact. Some mini-computer manufacturers began to install a new operating system, written originally in 1969 by staff at Bell Laboratories to run on a DEC PDP-7 mini, and eventually called Unix (the name evolved from Multics via Unics). This operating system later became widely available to all via the Open Software Foundation, though by then several slightly different incarnations of it had appeared: AIX from IBM, HP-UX from HP, and Solaris from Sun. Later still, a version designed specifically to run on Intel PCs was developed by Linus Torvalds and named Linux after him.
Now that smaller organisations and academic departments could afford the new mini-computers, the time was approaching for the first of the even smaller micro-computers to appear - not only in individual classrooms or in the back office of the small trader, but also in the home. The first few turned up as kits to be assembled by hobbyists, who had already performed outstandingly well as trailblazers for the electronic pocket-calculator industry. Indeed, as early as 1975 a DIY kit appeared in the US magazine Popular Electronics. The magazine featured the Altair, a computer which had no keyboard and no display and could be programmed by means of flip-switches on the front panel (as could the PDP-11 in the picture above). This magazine article is very significant in the history of computing because it was read by the young Bill Gates, who saw an opportunity to sell an interpreter for the BASIC programming language that he and Paul Allen had developed. Bill and Paul called their fledgling software company "Microsoft".