Computers are so named because they were first designed to deal with numbers, that is, to compute. But modern computers also process words, draw, reproduce sound, and perform many other functions. The early history of the computer can be traced to Charles Babbage, an English inventor who designed an "Analytical Engine" that theoretically could do some of the things a modern computer does. However, it was never built. Had it been, it would have covered an area equal to a football field and required the power of five steam engines. A more practical plan came from the American inventor Herman Hollerith, who patented a calculating machine in 1889. His machine, which relied on punched cards, was used the following year to compute census data. Hollerith's Tabulating Machine Company went through several mergers and was absorbed into a company that in 1924 adopted the name International Business Machines Corporation (IBM). Punched cards were used for many data-sorting operations into the 1960s, but they were slow and cumbersome compared to the modern computer.

An essential piece of the computer puzzle was supplied in the 19th century by the English mathematician George Boole. His system of mathematics, called Boolean algebra, used only the digits 0 and 1 instead of the ten digits in common use, yet with these two digits any number can be represented. In the binary system, for example, the number six is written with three digits, 110. Boole's work proved particularly useful because the modern computer would consist of thousands, then millions, of tiny switches, each either on or off, two possibilities matching the 0 and 1 of Boolean algebra. These on-off values came to be known as bits, for binary digits. The bit is the fundamental building block of the modern computer, and a group of eight bits is known as a byte. In time bits were "taught" by computer programmers to store, manipulate, and reproduce various kinds of data, such as letters, drawings, music, and motion pictures. But before these marvels could be achieved, and made available to the general public, computer hardware had to be designed and improved.

World War II provided a stimulus to computer development. In the United States, Howard Aiken led the development of a computer known as the Mark I. Using 3,304 on-off switches, it created ballistic tables used by naval artillery. The British developed a computer that used vacuum tubes instead of switches and used it to decode German messages. Shortly after the war, Americans built ENIAC, the most sophisticated computer of its time. It occupied about 140 sq m (1,500 sq ft) and contained 17,468 vacuum tubes, yet its capacity, though impressive at the time, was less than that of a modern laptop computer.

A crucial breakthrough in computing came in 1947 with the invention of the transistor at Bell Laboratories in the United States; the transistor was much faster, smaller, and cheaper than the vacuum tube. The integrated circuit, or microchip, followed in the late 1950s, and the microprocessor in the early 1970s, allowing information to be stored and manipulated in a very small area. In 1974 a company in Albuquerque, New Mexico, called Micro Instrumentation Telemetry Systems (MITS), released the Altair 8800, a personal computer sold as an unassembled kit for less than $400. The Altair had no keyboard, only a panel of switches with which to enter information, and its capacity was less than 1 percent of that of a 1991 Hewlett-Packard handheld computer. But the Altair inaugurated a revolution in computer electronics that continues today.
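The binary notation described above can be made concrete with a minimal sketch; Python is used here purely for illustration and is not part of the historical account:

    # Six is written in binary with three digits, 110,
    # because 1*4 + 1*2 + 0*1 = 6.
    assert format(6, "b") == "110"

    # A bit holds one of two values (0 or 1); eight bits form a byte,
    # which can therefore take on 2**8 = 256 distinct values.
    bits_per_byte = 8
    assert 2 ** bits_per_byte == 256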
Hardware manufacturers soon introduced personal computers, and software manufacturers began developing programs that allowed the computers to process words, manipulate data, and draw. During the 1980s, computers became progressively smaller, better, and cheaper. As the hardware became more powerful, software became more sophisticated; it pushed the limits of the hardware, encouraging the building of new machines with bigger drives, faster processors, and larger memories. In 1992 the computer industry was the fastest-growing industry in the world, and experts predicted that by the year 2000 its worldwide revenues would be second only to those of agriculture. Computers guided airplanes, controlled traffic, processed words and numbers, and kept track of appointments. They became the heart of modern business, medical research, and academic life.