In 2008, the number of computers in use worldwide passed one billion, a total that continues to grow. Computers are an integral part of the modern Western home and office, and many people depend on them for communication, work, news, and much more. Personal computers did not become widely available until the late 1970s, but the development of computing devices stretches back centuries, beginning with the classical mathematicians.
Sometime before 1387 A.D.
Although the exact date is unknown, the abacus was developed in Asia sometime before 1387 A.D., around the time the word “abacus” was first used to describe a calculating tool. Merchants carried the abacus along their trade routes, spreading it to Egypt, Greece, India, Japan, Russia, and beyond. Since a computer is essentially a machine that solves equations, albeit far more difficult equations these days, the abacus is arguably the first computer, or at least the first computing device.
The first successful movable-type printing press in Europe was created around 1448 by Johann Gutenberg. Although Chinese and Korean inventors had developed movable type earlier, and a Dutch claim also exists, Gutenberg’s press was the first to print books, most notably the Bible, in quantities large enough to reach the masses.
Francis Pellos of Nice developed the decimal point system in 1492. Decimals are an important part of financial and other calculations computed by computers today.
In 1642, the French mathematician Blaise Pascal developed the Pascaline, a device that could perform addition and subtraction problems involving the “tens-carrying” function. It was originally created as a tax-calculation tool for the French government.
In the early 1800s, J.M. Jacquard developed a loom. Although the loom was used only to weave goods, it was a big step toward the invention of modern computers: it used punch cards to store different weave patterns and partially automated the weaving process. These punch cards were early ancestors of computer storage devices and programs.
In the 1820s and 1830s, Charles Babbage, often dubbed the “father of the computer”, designed his Difference Engine and then the more ambitious Analytical Engine. The Analytical Engine was to be powered by steam and included two components of a modern computer: a central processing unit and memory. The machine was never finished during Babbage’s lifetime, but it paved the way for the modern computer. Lord Byron’s daughter, Ada Augusta Lovelace, is credited with figuring out how to use punch cards to run programs on the Analytical Engine.
In 1911, the Computing-Tabulating-Recording Company (CTR) was founded, the company that would later change its name to IBM. CTR sold a range of office products, including Julius E. Pitrap’s computing scale, Alexander Dey’s dial recorder, and Willard Bundy’s time clock. These devices were used in offices across America, and CTR was one of the first companies to have enough faith in the future of computing devices to enter the business early on.
Another big company to enter the computer scene was Hewlett-Packard, founded in 1939 in Palo Alto, California. Hewlett-Packard designed test equipment widely used by engineers. Most notably, its equipment was used by Walt Disney to produce Fantasia in 1940.
Bell Telephone Laboratories commissioned a project to create a calculating machine that could be operated remotely. The result, the Complex Number Calculator (CNC), was completed in 1940 and demonstrated remotely over telephone lines.
The year 1952 was a big one for progress in the computer industry. Mathematician Grace Hopper completed her first compiler, a forerunner of programming languages built from English-like words; the Institute for Advanced Study in Princeton, New Jersey completed its IAS computer, prompting other research institutes to build machines of the same design; and IBM and other companies began using magnetic tape for mass storage of information, a far cheaper way to store data.
The decade from 1965 to 1975 is considered the third generation of computers. These machines were built around integrated circuits, which packed transistors, resistors, and capacitors onto a single chip, yet they were often still large enough to fill a whole room. In 1971, Ray Tomlinson made a vital contribution to the modern world of computing: email. Using ARPANET, the computer network many pinpoint as the forerunner of the Internet, Tomlinson sent the first email.
In 1975, the Altair 8800, a hobbyist’s build-it-yourself microcomputer, became available to the masses. This little computer made it into the hands of many people who shaped the modern computer world, notably Bill Gates and Paul Allen, the founders of Microsoft, who wrote Altair BASIC, a programming language for the machine.
Apple was founded by Steve Wozniak and Steve Jobs in 1976. The company began selling the Apple I computer that same year, and suddenly at-home computers became available to the general public, not just the military, government, and large companies.
MS-DOS, the operating system from Microsoft, was released in 1981 for computers built around Intel’s 8086 family of microprocessors. The Osborne 1 was released the same year; it was the first commercially successful portable computer.
In 1984, Apple released the Macintosh, whose graphical operating system was intended to transform the world of personal computers.
In 1995, Microsoft released Windows 95, catapulting the company to the top of the operating-system market. This OS introduced the Start menu, taskbar, and minimize, maximize, and close buttons that computer users are familiar with today.
In the year 1996, more e-mails were sent than postal letters, a true sign of the change in communication.
After being established in 1994, the World Wide Web Consortium (W3C) became a leading voice for web technology standards. As more and more people began using the Internet, it became apparent there would be a need for computer language standards. In 1998, W3C recommended XML as the standard for web programmers.
After 1995, the world of computers exploded into what users are familiar with today: laptops, tablets such as the iPad, new web browsers, and never-ending advancements, upgrades, and improvements. In 2008, IBM cut through the clutter with the release of the IBM Roadrunner, then recognized as the fastest computer in the world.
Timeline of Computer History
A timeline of computer history, complete with photos and documented advancements from 1939 to 1994.
The Charles Babbage Institute
As a nod to the “Father of Computers”, the University of Minnesota has created the Charles Babbage Institute, which contains archives and more.