Monday, April 14, 2014

The Fourth Generation: The Microprocessor



The fourth generation of computers started in 1971 and stretches to the present. The fourth generation saw the development of the Personal Computer (PC).
Following the development of the IC (integrated circuit), the next step was to shrink the hardware further while expanding its capabilities. Thus, the microprocessor was born: a chip about half a centimeter wide that contained the computer's entire central processing unit.
The pioneers of the PC were Apple and IBM, which introduced their own takes on the PC in 1976 and 1981, respectively. As time went along, computers became smaller and smaller, making way for the laptop, and faster and faster. Fans were built into the computers to prevent overheating. Eventually more advanced operating systems (OSes) were introduced, and prices dropped, making it standard for every household to have at least one PC.




Third Generation: Integrated Circuits




The third generation of computers, lasting from 1965 to 1971, was brief but highlighted by a monumental breakthrough: the invention of the integrated circuit.

Invented by Texas Instruments engineer Jack Kilby, the integrated circuit, known plainly as the IC, took the place of the hallmark of the second generation of computers: the transistor. ICs were miniature silicon chips that contained hundreds of transistors.
The invention of integrated circuits allowed computers to be much smaller than those of the preceding generations. ICs also made computers functionally faster. Still, there were downsides to the third generation of computers that we take for granted as solved today; chief among them were the lingering issues of overheating and cost.
    

Jack Kilby and the IC Invention

Second Generation Computers: Transistors




The second generation of computers stretched from the mid-1950s to the mid-1960s, and had a huge impact on modern hardware. The biggest innovation behind the second generation of computers was the invention of the transistor. Replacing the vacuum tubes that were the hallmark of first-generation computers, transistors took up far less space: a lone transistor could store as much data as hundreds of vacuum tubes. Transistors had other advantages over tubes besides being smaller and better equipped to store data: they were considerably faster in operation and did not produce as much heat as the ENIAC and other first-generation computers. Second-generation computers also introduced features taken for granted in modern-day computers, such as disk drives, memory, and inputs for printers.
These advantages paved the way for the production of more accessible computers, which were eventually adopted by the business world. For example, IBM created the IBM 1401, a forerunner of current computing devices. The 1401 was relatively small compared to previous computers and was adopted by businesses to help sort through financial information.
       

First Generation Computers: Tubes and the ENIAC



It is debatable exactly what year one could point to as the official genesis of computers. Some would say computers truly began with Alan Turing, who in 1936 designed the Turing machine, a theoretical model of a machine that could work through complex mathematical algorithms. Yet that design never came to concrete fruition. The first true facsimile of a computer as we know it was the ENIAC.

Constructed beginning in 1943 at the University of Pennsylvania by John Mauchly and J. Presper Eckert, the ENIAC (Electronic Numerical Integrator and Computer) consisted of 18,000 vacuum tubes and 70,000 resistors, and occupied well over 1,000 square feet. It was wholly impractical: not only was it enormous, it was also extremely expensive, required massive amounts of power, and generated so much heat that it had to be kept in a constantly air-conditioned room. Due to this readily apparent impracticality, the ENIAC, among other first-generation computers, did not catch on in the mainstream.
