The Evolution Of Computer Science

The birth of computers and information technology goes back many centuries. The development of mathematics led to the development of tools to help in computation. Blaise Pascal, in 17th-century France, is credited with building the first calculating machine. In the 19th century, the Englishman Charles Babbage, generally considered the father of computing, designed the first "analytical engine." This machine had a mechanical computing "mill" and, like the Jacquard loom of the early 19th century, used punch cards to store the numbers and processing requirements. Ada Lovelace worked on the design with him and developed the idea of a sequence of instructions: a program. The machine was still incomplete at Babbage's death in 1871.

Decades later, the ideas re-emerged with the development of electro-mechanical calculating machines. In 1890, Herman Hollerith used punch cards to help classify information for the United States Census Bureau. At about the same time, the invention of the telegraph and telephone laid the groundwork for telecommunications and for the development of the vacuum tube, an electronic device that could store information represented as binary patterns: on or off, one or zero.

The first electronic digital computer, ENIAC (Electronic Numerical Integrator and Computer), was developed for the U.S. Army and completed in 1946. John von Neumann, a mathematics professor at Princeton, developed the idea further by adding the stored program: a set of instructions held in the memory of the computer, which the machine obeyed to complete the programmed task.

From this stage, computers and computer programming evolved rapidly. The move from vacuum tubes to transistors significantly reduced the size and cost of the machines and increased their reliability. Then came integrated circuit technology, which reduced the size and cost of computers still further.
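The stored-program idea described above can be illustrated with a small sketch. This is a hypothetical toy machine invented for illustration, not a model of ENIAC or any historical design: a "program" is simply data held in memory, and the machine fetches and obeys one instruction at a time.

```python
# Toy illustration of the stored-program concept (hypothetical machine,
# not any historical design): instructions live in memory as data, and
# the machine fetches and obeys them in sequence.

def run(program):
    """Execute a list of (opcode, operand) instructions; return the accumulator."""
    acc = 0          # a single accumulator register
    pc = 0           # program counter: index of the next stored instruction
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":
            acc = arg        # place a value in the accumulator
        elif op == "ADD":
            acc += arg       # add a value to the accumulator
        elif op == "HALT":
            break            # stop obeying instructions
        pc += 1              # fetch the next instruction from memory
    return acc

# A short program stored as data: compute 2 + 3.
print(run([("LOAD", 2), ("ADD", 3), ("HALT", 0)]))  # prints 5
```

Because the program is ordinary data in memory, it can be changed without rewiring the machine, which is the key advance von Neumann's stored-program design brought over fixed-purpose machines.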
In the 1960s, the typical computer was a transistor-based machine that cost half a million dollars and needed a large, air-conditioned room and an on-site engineer. The same computing power now costs $2,000 and sits on a desk. As computers became smaller and cheaper, they also became faster, a change made possible by the single integrated circuit, or chip. The evolution of microcomputers follows the evolution of chip technology, which allows computer logic to be 'burnt into' the layers of a chip. A 5-millimetre-square chip can contain all the logic needed for a computer processor to run programs. This breakthrough made possible a massive reduction in the size of computers compared to transistor-based logic, where individual components were wired onto boards, and that size reduction in turn allowed logic to switch many millions of times a second.