The evolution of the computer is an interesting topic that has been described in different ways over the years by many authors. According to The Computational Science Education Project, US, the computer has evolved through the following stages:
The Mechanical Era (1623-1945)
Attempts to use machines to solve mathematical problems can be traced to the early 17th century. Wilhelm Schickard, Blaise Pascal, and Gottfried Leibniz were among the mathematicians who designed and implemented calculators capable of addition, subtraction, multiplication, and division.
The first multi-purpose, or programmable, computing device was probably Charles Babbage's Difference Engine, which was begun in 1823 but never completed. In 1842, Babbage designed a more ambitious machine, called the Analytical Engine, but unfortunately it too was only partially completed. Babbage, together with Ada Lovelace, recognized several important programming techniques, including conditional branches, iterative loops, and index variables; the Analytical Engine is arguably the first machine designed for use in computational science. In 1833, George Scheutz and his son Edvard began work on a smaller version of the difference engine, and by 1853 they had constructed a machine that could process 15-digit numbers and calculate fourth-order differences.
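The principle these engines mechanized, the method of finite differences, is easy to see in code: once a difference table for a polynomial has been seeded, every further value follows from additions alone. The C sketch below tabulates a degree-4 polynomial the way a fourth-order difference engine would, one addition per column per step; the polynomial itself is illustrative, not taken from any historical table.

```c
#include <stdio.h>

/* Example degree-4 polynomial (illustrative, not historical). */
static double p(double x) { return x * x * x * x + 2.0 * x + 1.0; }

int main(void) {
    /* Seed a difference table from five initial values; a degree-4
       polynomial has constant fourth-order differences. */
    double d[5];
    for (int i = 0; i < 5; i++) d[i] = p((double)i);
    for (int order = 1; order <= 4; order++)
        for (int i = 4; i >= order; i--) d[i] -= d[i - 1];

    /* "Turn the crank": each further value costs only four additions. */
    printf("p(0) = %g\n", d[0]);
    for (int x = 1; x <= 10; x++) {
        for (int order = 0; order < 4; order++) d[order] += d[order + 1];
        printf("p(%d) = %g\n", x, d[0]);
    }
    return 0;
}
```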
The US Census Bureau was one of the first organizations to use mechanical computers: it used punched-card equipment designed by Herman Hollerith to tabulate data for the 1890 census. In 1911, Hollerith's company merged with its competitors to form the corporation that in 1924 became International Business Machines (IBM).
First Generation Electronic Computers (1937-1953)
These devices used electronic switches, in the form of
vacuum tubes, instead of
electromechanical relays.
The earliest attempt to build an electronic computer was by J. V. Atanasoff, a professor of physics and mathematics at Iowa State, in 1937. Atanasoff set out to build a machine that would help his graduate students solve systems of partial differential equations. By 1941, he and graduate student Clifford Berry had succeeded in building a machine that could solve 29 simultaneous equations with 29 unknowns. However, the machine was not programmable; it was more of an electronic calculator.
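The kind of problem the Atanasoff-Berry machine targeted, a system of simultaneous linear equations, is classically solved by Gaussian elimination. The C sketch below solves a 3x3 system standing in for the machine's 29x29 capacity; it is modern code illustrating the computation, not the machine's actual procedure.

```c
#include <math.h>
#include <stdio.h>

#define N 3

int main(void) {
    /* Augmented matrix [A|b] for: 2x+y-z=8, -3x-y+2z=-11, -2x+y+2z=-3 */
    double a[N][N + 1] = {
        { 2.0,  1.0, -1.0,   8.0},
        {-3.0, -1.0,  2.0, -11.0},
        {-2.0,  1.0,  2.0,  -3.0},
    };

    /* Forward elimination with partial pivoting. */
    for (int k = 0; k < N; k++) {
        int piv = k;
        for (int i = k + 1; i < N; i++)
            if (fabs(a[i][k]) > fabs(a[piv][k])) piv = i;
        for (int j = k; j <= N; j++) {          /* swap rows k and piv */
            double t = a[k][j]; a[k][j] = a[piv][j]; a[piv][j] = t;
        }
        for (int i = k + 1; i < N; i++) {       /* eliminate column k  */
            double f = a[i][k] / a[k][k];
            for (int j = k; j <= N; j++) a[i][j] -= f * a[k][j];
        }
    }

    /* Back substitution. */
    double x[N];
    for (int i = N - 1; i >= 0; i--) {
        x[i] = a[i][N];
        for (int j = i + 1; j < N; j++) x[i] -= a[i][j] * x[j];
        x[i] /= a[i][i];
    }
    printf("x = %g, y = %g, z = %g\n", x[0], x[1], x[2]);  /* 2, 3, -1 */
    return 0;
}
```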
A second early electronic machine was Colossus, designed by Tommy Flowers and the codebreakers at Bletchley Park for the British military in 1943 (it is often misattributed to Alan Turing, whose wartime work involved other machines). The
first general purpose programmable
electronic computer was the Electronic Numerical Integrator and Computer
(ENIAC), built by J. Presper Eckert and John V. Mauchly at the University of Pennsylvania. Research work
began in 1943, funded by the Army Ordnance Department, which needed a way to compute
ballistics during World War II. The machine was completed in 1945 and it was used
extensively for calculations during the design of the hydrogen bomb. Eckert, Mauchly, and John von Neumann, a
consultant to the ENIAC project, began work on a new machine before ENIAC
was finished. The main contribution of EDVAC, their new project, was the notion of
a stored program. ENIAC was controlled by a set of external switches and dials; to change
the program required physically altering the settings on these controls. EDVAC was able to run orders of magnitude faster than ENIAC, and by storing instructions in the same medium as data, designers could concentrate on improving the internal structure of the machine without worrying about matching it to the speed of an external control. Eckert and Mauchly later designed what was arguably the first commercially successful computer, the UNIVAC, first delivered in 1951. Software technology during this period was very primitive.
Second Generation (1954-1962)
The second generation witnessed several important
developments at all levels of computer
system design, ranging from the technology used to build the
basic circuits to the
programming languages used to write scientific
applications. Electronic switches in
this era
were based on discrete diode and transistor technology with
a switching time of
approximately 0.3 microseconds. The first machines to be
built with this technology include
TRADIC at Bell Laboratories in 1954 and TX-0 at MIT's
Lincoln Laboratory. Index registers were introduced for controlling loops, and floating-point units for calculations on real numbers.
A number of high-level programming languages were introduced, including FORTRAN (1956), ALGOL (1958), and COBOL (1959). Important commercial machines of this era include the IBM 704 and its successors, the 709 and 7094. In the 1950s, the first two supercomputers, the UNIVAC LARC and the IBM 7030 "Stretch", were designed specifically for numeric processing in scientific applications.
Third Generation (1963-1972)
Technology changes in this generation include the use of integrated circuits, or ICs (semiconductor devices with several transistors built into one physical component), semiconductor memories, microprogramming as a technique for efficiently designing complex processors, and the introduction of operating systems and time-sharing. The first ICs were based on small-scale integration (SSI) circuits, which had around 10 devices per circuit (or 'chip'), and evolved to medium-scale integration (MSI) circuits, which had up to 100 devices per chip. Multilayered printed circuits were developed, and core memory was replaced by faster, solid-state memories.
In 1964, Seymour Cray developed the CDC 6600, which was the
first architecture to use
functional parallelism. By using 10 separate functional
units that could operate
simultaneously and 32 independent memory banks, the CDC 6600 was able to attain a
computation rate of one million floating point operations
per second (Mflops). Five years
later CDC released the 7600, also developed by Seymour Cray. The CDC 7600, with its
pipelined functional units, is considered to be the first
vector processor and was capable of
executing at ten Mflops. The IBM 360/91, released during the same period, was roughly twice as fast as the CDC 6600.
Early in this third generation, Cambridge University and the University of London
cooperated in the development of CPL (Combined Programming
Language, 1963). CPL was,
according to its authors, an attempt to capture only the
important features of the complicated and sophisticated ALGOL. However, like
ALGOL, CPL was large with many features that
were hard to learn. In an attempt at further simplification, Martin Richards of Cambridge developed a subset of CPL called BCPL (Basic Combined Programming Language, 1967). In 1970, Ken Thompson of Bell Labs developed yet another simplification of CPL called simply B, in connection with an early implementation of the UNIX operating system.
Fourth Generation (1972-1984)
Large scale integration (LSI - 1000 devices per chip) and
very large scale integration (VLSI -
100,000 devices per chip) were used in the construction of
the fourth generation computers.
Whole processors could now fit onto a single chip, and for
simple systems the entire
computer (processor, main memory, and I/O controllers) could
fit on one chip. Gate delays
dropped to about 1ns per gate. Core memories were replaced by semiconductor
memories.
Machines with large main memories, such as the CRAY 2, began to replace the older high-speed vector processors, such as the CRAY 1, CRAY X-MP, and CYBER 205.
In 1972, Dennis Ritchie developed the C language, drawing on the designs of CPL and Thompson's B. Thompson and Ritchie then used C to write a
version of UNIX for the DEC
PDP-11. Other
developments in software include very high level languages such as FP
(functional programming) and Prolog (programming in logic).
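For flavor, here is the canonical first C program. It is written in modern standard C rather than the 1972 dialect, and it is the traditional example associated with the language, not code from the original UNIX source.

```c
#include <stdio.h>

int main(void) {
    printf("hello, world\n");   /* the classic first C program */
    return 0;
}
```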
IBM worked with Microsoft during the 1980s to launch what we now call the Personal Computer (PC) era. The IBM PC was introduced in August 1981 and ran an operating system called Microsoft Disk Operating System (MS-DOS) 1.0. Development of MS-DOS began in October 1980, when IBM began searching the market for an operating system for the then-proposed IBM PC; major contributors were Bill Gates, Paul Allen, and Tim Paterson. In 1983, Microsoft Windows was announced, and it has undergone numerous improvements and revisions in the years since.
Fifth Generation (1984-1990)
This generation brought about the introduction of machines with hundreds of processors that could all be working on different parts of a single program. The scale of integration in semiconductors continued at a great pace, and by 1990 it was possible to build chips with a million components; semiconductor memories became standard on all computers. Computer networks and single-user workstations also became popular.
Parallel processing started in this generation. The Sequent Balance 8000 connected up to 20 processors to a single shared memory module, though each processor had its own local cache. The machine was designed to compete with the DEC VAX-780 as a general-purpose UNIX system, with each processor working on a different user's job. However, Sequent also provided a library of subroutines that allowed programmers to write programs that would use more than one processor, and the machine was widely used to explore parallel algorithms and programming techniques.
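Sequent's subroutine library itself is not reproduced here; as a modern stand-in, the C sketch below uses POSIX threads to show the same shared-memory idea: several workers operate on different slices of one array that all of them see in common memory.

```c
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4
#define N 1000000

static double data[N];            /* shared memory, visible to all threads */
static double partial[NTHREADS];  /* one result slot per thread            */

static void *sum_slice(void *arg) {
    long id = (long)arg;
    long lo = id * (N / NTHREADS), hi = lo + N / NTHREADS;
    double s = 0.0;
    for (long i = lo; i < hi; i++) s += data[i];
    partial[id] = s;              /* each thread writes only its own slot */
    return NULL;
}

int main(void) {                  /* build with: cc file.c -lpthread */
    for (long i = 0; i < N; i++) data[i] = 1.0;

    pthread_t t[NTHREADS];
    for (long id = 0; id < NTHREADS; id++)
        pthread_create(&t[id], NULL, sum_slice, (void *)id);

    double total = 0.0;
    for (long id = 0; id < NTHREADS; id++) {
        pthread_join(t[id], NULL);
        total += partial[id];
    }
    printf("sum = %g\n", total);  /* prints 1e+06 */
    return 0;
}
```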
The Intel iPSC-1, also known as 'the hypercube', connected each processor to its own memory and used a network interface to connect the processors. This distributed-memory architecture meant memory was no longer a problem, and large systems with more processors (as many as 128) could be built. Also introduced were data-parallel, or SIMD, machines in which several thousand very simple processors work under the direction of a single control unit. Both wide area network (WAN) and local area network (LAN) technology developed rapidly.
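Message passing on the iPSC-1 used Intel's own primitives, which are not shown here; the C sketch below uses MPI, a later standard, to illustrate the distributed-memory style the hypercube pioneered: each process owns its own memory, and data moves only through explicit communication over the network.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {   /* build/run: mpicc, then mpirun -np 4 */
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each node computes on purely local data... */
    double local = (double)(rank + 1);

    /* ...and results are combined only through explicit communication. */
    double total = 0.0;
    MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum over %d nodes = %g\n", size, total);

    MPI_Finalize();
    return 0;
}
```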
Sixth Generation (1990 - Now)
Most of the developments in computer systems since 1990 have not been fundamental changes but gradual improvements over established systems. This generation brought gains in parallel computing, both in hardware and in an improved understanding of how to develop algorithms that exploit parallel architectures. Workstation technology continued to improve, with processor designs now using a combination of RISC, pipelining, and parallel processing. Wide area networks, network bandwidth, speed of operation, and networking capabilities have kept developing tremendously. Personal computers (PCs) now operate with gigahertz processors, multi-gigabyte disks, hundreds of megabytes of RAM, colour printers, high-resolution graphics monitors, stereo sound cards, and graphical user interfaces. Thousands of software products (operating systems and application software) exist today, and Microsoft has been a major contributor; Microsoft is said to be one of the biggest companies ever, and its chairman, Bill Gates, was rated the richest man in the world for several years.
Finally, this generation has brought about microcontroller technology. Microcontrollers are 'embedded' inside other devices (often consumer products) so that they can control the features or actions of the product. They work as small computers inside devices and now serve as essential components in most machines.
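A microcontroller typically runs a single endless control loop with no operating system. The C sketch below shows that style for a toy thermostat; the memory-mapped register addresses and bit layout are hypothetical, invented for illustration rather than taken from any particular chip, and the code would be cross-compiled for the target device rather than run on a desktop.

```c
#include <stdint.h>

/* Hypothetical memory-mapped I/O registers (not from any real chip). */
#define TEMP_SENSOR (*(volatile uint8_t *)0x4000)  /* current temperature */
#define HEATER_CTRL (*(volatile uint8_t *)0x4001)  /* 1 = heater on       */

#define TARGET_TEMP 70

int main(void) {
    for (;;) {                        /* the embedded "superloop": run forever */
        uint8_t t = TEMP_SENSOR;      /* read the product's sensor            */
        HEATER_CTRL = (t < TARGET_TEMP) ? 1 : 0;   /* act on the device       */
    }
}
```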