Generation | Period | Main Electronic Components | Main Computers | Operating System
1 | 1945-56 | Vacuum tubes, flip-flops | UNIVAC, ENIAC | Mainly Batch Operating System
2 | 1956-63 | Transistors | IBM-700, IBM-1401 | Time Sharing Operating System
3 | 1964-71 | Integrated Circuits (ICs) | IBM-1620, ATLAS, ICL-1901, IBM-360, IBM-370 | Real Time and Time Sharing Operating System
4 | 1971-Present | Microprocessors (LSI, Large Scale Integration) | NCR-395, CDC-1700, APPLE, DCM | Time Sharing Network
5 | Present & beyond | Optical Fibre, Artificial Intelligence (VLSI and VVLSI, Very Very Large Scale Integration) | ------------ | Windows (XP, 7, 8, 8.1, 10), Fedora, Macintosh OS X, Mac OS and many more…
First Generation (1940-1956) Vacuum Tubes
The first computers used vacuum tubes for circuitry and
magnetic drums for memory, and were often enormous, taking up entire rooms.
They were very expensive to operate and in addition to using a great deal of
electricity, generated a lot of heat, which was often the cause of
malfunctions.
First generation computers relied on machine language, the
lowest-level programming language understood by computers, to perform
operations, and they could only solve one problem at a time. Input was based on
punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a client, the U.S. Census Bureau, in 1951.
Second Generation (1956-1963) Transistors
Transistors replaced vacuum tubes and ushered in the second
generation of computers. The transistor was invented in 1947 but did not see
widespread use in computers until the late 1950s. The transistor was far
superior to the vacuum tube, allowing computers to become smaller, faster,
cheaper, more energy-efficient and more reliable than their first-generation
predecessors. Though the transistor still generated a great deal of heat, which could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and
printouts for output.
Second-generation computers moved from cryptic binary
machine language to symbolic, or assembly, languages, which allowed programmers
to specify instructions in words. High-level programming languages were also
being developed at this time, such as early versions of COBOL and FORTRAN.
These were also the first computers that stored their instructions in their
memory, which moved from a magnetic drum to magnetic core technology.
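To make the difference between these language levels concrete, here is a small illustrative sketch. It is a modern analogy only (C with x86-64 mnemonics and bytes shown in comments, both my own choices, not code for any second-generation machine): the same addition written as one high-level statement, as the symbolic assembly a compiler produces, and as the raw machine language that first-generation programmers had to write by hand.

/* Illustrative sketch only: a modern C/x86-64 analogy of the three
   language levels described above, not code for any 1950s computer. */
#include <stdio.h>

int add(int a, int b)
{
    /* High-level language (the role early FORTRAN and COBOL played):
       one readable statement. */
    return a + b;
}

/* A compiler translates that statement into symbolic assembly, roughly:
 *     mov eax, edi    ; copy the first argument into a register
 *     add eax, esi    ; add the second argument to it
 *     ret             ; return the result
 * The assembler then emits machine language, the raw bytes the CPU
 * actually executes (here 89 F8 01 F0 C3 in hexadecimal).
 * First-generation programmers worked at this lowest level directly. */

int main(void)
{
    printf("%d\n", add(2, 3));   /* prints 5 */
    return 0;
}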
The first computers of this generation were developed for
the atomic energy industry.
Third Generation (1964-1971) Integrated Circuits
The development of the Integrated Circuit was the hallmark
of the third generation of computers. Transistors were miniaturized and placed
on silicon chips, called semiconductors, which drastically increased the speed
and efficiency of computers.
Instead of punched cards and printouts, users interacted
with third generation computers through keyboards and monitors and interfaced
with an operating system, which allowed the device to run many different
applications at one time with a central program that monitored the memory.
Computers for the first time became accessible to a mass audience because they
were smaller and cheaper than their predecessors.
Fourth Generation (1971-Present) Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be
linked together to form networks, which eventually led to the development of
the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.
Fifth Generation (Present and Beyond) Artificial Intelligence
Fifth generation computing devices, based on Artificial
Intelligence, are in development, though there are some applications, such as
voice recognition, that are being used today. The use of Parallel Processing
and superconductors is helping to make artificial intelligence a reality.
Quantum computation and molecular nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organisation.