Sunday, June 27, 2010

The 5 generations of computers

In the beginning ...
A generation refers to a stage of improvement in the development of a product. The term is also used for the successive advances in computer technology. With each new generation, the circuitry has become smaller and more advanced than in the generation before it. As a result of this miniaturization, the speed, power, and memory of computers have increased proportionally. New discoveries are constantly being made that affect the way we live, work and play.

The First Generation: 1946-1958 (The Vacuum Tube Years)


The first generation computers were huge, slow, expensive, and often undependable. In 1946, two Americans, J. Presper Eckert and John Mauchly, built the ENIAC, an electronic computer that used vacuum tubes instead of the mechanical switches of the Mark I. The ENIAC used thousands of vacuum tubes, which took up a lot of space and gave off a great deal of heat, just as light bulbs do. The ENIAC led to other vacuum-tube computers such as the EDVAC (Electronic Discrete Variable Automatic Computer) and the UNIVAC I (UNIVersal Automatic Computer).

The vacuum tube was an extremely important step in the advancement of computers. Vacuum tubes were developed in the early 1900s as an offshoot of Thomas Edison's light bulb and worked much like light bulbs. Their purpose was to act as both an amplifier and a switch. Without any moving parts, vacuum tubes could take very weak signals and make them stronger (amplify them). Vacuum tubes could also stop and start the flow of electricity instantly (switch). These two properties made the ENIAC computer possible.

The ENIAC gave off so much heat that it had to be cooled by gigantic air conditioners. Even with these huge coolers, however, vacuum tubes still overheated regularly. It was time for something new.

The Second Generation: 1959-1964 (The Era of the Transistor)


The transistor computer did not last as long as the vacuum tube computer, but it was no less important in the advancement of computer technology. In 1947, three scientists working at AT&T's Bell Labs, John Bardeen, William Shockley, and Walter Brattain, invented what would replace the vacuum tube forever: the transistor, which functions like a vacuum tube in that it can be used to relay and switch electronic signals.

There were obvious differences between the transistor and the vacuum tube. The transistor was faster, more reliable, smaller, and much cheaper to build than a vacuum tube. One transistor replaced the equivalent of 40 vacuum tubes. Transistors were made of solid material, chiefly silicon, an abundant element (second only to oxygen) found in beach sand and glass, so they were very cheap to produce. Transistors were found to conduct electricity faster and better than vacuum tubes. They were also much smaller and gave off virtually no heat by comparison. Their use marked a new beginning for the computer. Without this invention, space travel in the 1960s would not have been possible. However, a new invention would advance our ability to use computers even further.

The Third Generation: 1965-1970 (Integrated Circuits - Miniaturizing the Computer)


Transistors were a tremendous breakthrough in advancing the computer. However, no one could have predicted that thousands, and now millions, of transistors (circuits) could be packed into such a small space. The integrated circuit, sometimes called a semiconductor chip, packs a huge number of transistors onto a single wafer of silicon. Robert Noyce of Fairchild Semiconductor and Jack Kilby of Texas Instruments independently discovered the amazing attributes of integrated circuits. Placing such large numbers of transistors on a single chip vastly increased the power of a single computer and lowered its cost considerably.

Since the invention of integrated circuits, the number of transistors that can be placed on a single chip has doubled every two years, shrinking both the size and cost of computers while further enhancing their power. Most electronic devices today use some form of integrated circuits placed on printed circuit boards, thin pieces of bakelite or fiberglass that have electrical connections etched onto them; the main board in a computer is called the motherboard.

These third generation computers could carry out instructions in billionths of a second, and the machines themselves shrank to the size of small file cabinets. Yet the single biggest advancement in the computer era was still to come.

The Fourth Generation: 1971-Today (The Microprocessor)


This generation can be characterized by both the jump to monolithic integrated circuits (millions of transistors put onto one integrated circuit chip) and the invention of the microprocessor (a single chip that could do all the processing of a full-scale computer). By putting millions of transistors onto one chip, computers could perform more calculations at greater speeds. Because electricity travels about a foot in a billionth of a second, the smaller the distances inside the computer, the greater its speed.
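
To put that rule of thumb in perspective, here is a minimal sketch in Python (the path lengths are illustrative values chosen for this example, not measurements of any particular machine):

    # Signal propagation times, using the rule of thumb that an
    # electrical signal travels about 1 foot per nanosecond.
    FEET_PER_NANOSECOND = 1.0

    # Illustrative path lengths: a room-sized computer, a circuit
    # board, and the inside of a chip (example values only).
    paths = {
        "room-sized computer (30 ft)": 30.0,
        "circuit board (1 ft)": 1.0,
        "inside a chip (0.003 ft)": 0.003,
    }

    for label, feet in paths.items():
        delay_ns = feet / FEET_PER_NANOSECOND
        print(f"{label}: ~{delay_ns:g} ns per signal trip")

Shrinking a signal path from a foot of board wiring to a few thousandths of a foot inside a chip cuts the travel time by the same factor, which is why miniaturization translates directly into speed.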

However, what really triggered the tremendous growth of computers and their significant impact on our lives was the invention of the microprocessor. Ted Hoff, employed by Intel (Robert Noyce's new company), invented a chip the size of a pencil eraser that could do all the computing and logic work of a computer. The microprocessor was made to be used in calculators, not computers. It led, however, to the invention of personal computers, or microcomputers.

It wasn't until the 1970s that people began buying computers for personal use. One of the earliest personal computers was the Altair 8800 computer kit: in 1975 you could purchase this kit and put it together to make your own personal computer. In 1977 the Apple II went on sale to the public, and in 1981 IBM entered the PC (personal computer) market.

The Fifth Generation: Present and Beyond (Artificial Intelligence)


Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

Wednesday, June 23, 2010

History of the Computer

Computer History
The history of the computer owes its existence to the fact that people, who are lazy by nature, have always sought to improve their ability to calculate, in order to reduce errors and save time.
Origins: The abacus
The "abacus" was invented in the year 700; it was in use for a long time, and still is in some countries.


Then came the logarithm
The invention of the logarithm is generally credited to the Scotsman John Napier (1550-1617). In 1614, he showed that multiplication and division could be performed using a series of additions. This discovery led, in 1620, to the invention of the slide rule.
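
As a quick illustration of Napier's idea, here is a minimal sketch in Python showing a multiplication carried out as an addition of logarithms (a slide rule performs the same trick mechanically, with printed log scales doing the work of math.log and math.exp):

    import math

    a, b = 37.0, 54.0

    # Napier's insight: log(a * b) = log(a) + log(b), so a
    # multiplication can be replaced by an addition followed by
    # a reverse lookup (exp undoes the log).
    product_via_logs = math.exp(math.log(a) + math.log(b))

    print(product_via_logs)  # ~1998.0
    print(a * b)             # 1998.0
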
The word "algorithm", however, comes from a much older source: Muhammad ibn Musa al-Khwarizmi, a Persian scholar from the town of Khwarezm, whose Latinized name gave us the term (often confused with "logarithm"). This scholar also developed algebra, a term which comes from the Arabic "al-jabr", meaning compensation, with the implication being "looking for the unknown variable X in order to compensate by balancing the results of the calculations."
The first calculating machines
In 1623, Wilhelm Schickard invented the first mechanical calculating machine.
In 1642, Blaise Pascal created the arithmetic machine (called the Pascaline), a machine that could add and subtract, intended to help his father, a tax collector.
In 1673, Gottfried Wilhelm von Leibniz added multiplication and division to the Pascaline.
In 1822, Charles Babbage designed the difference engine, which could evaluate polynomial functions.
However, once he learned that a weaving machine (the Jacquard loom) was programmed with perforated cards, he started designing the analytical engine (1834), a calculating machine that could take advantage of this revolutionary idea.

In 1820, the first four-function mechanical calculators debuted. They could:
• add
• subtract
• multiply
• divide

By 1885, they were being built with keyboards for entering data. Electric motors quickly supplanted hand cranks.


Programmable computers
In 1941, Konrad Zuse completed a computer based around electromechanical relays: the Z3. This computer was the first to use binary instead of decimal numbers.
In 1944, Howard Aiken completed a programmable computer 17 metres long and 2.5 metres high, which could calculate 5 times faster than a human.
It was IBM's Mark I (the Harvard Mark I).
It was built using 3,300 gears and 1,400 switches linked by 800 km of electrical wiring.
In 1947, the Mark II appeared, with its predecessor's gears replaced by electromagnetic relays.


Vacuum tube computers
In 1942, the ABC (Atanasoff Berry Computer), named after its designers, J. V. Atanasoff and C. Berry, was introduced.
Between 1943 and 1945, the first computer with no mechanical parts was created by J. Mauchly and J. Presper Eckert: the ENIAC (Electronic Numerical Integrator And Computer). It was made using nearly 18,000 vacuum tubes and took up about 170 m2 of floor space. It was used for calculations required for designing the H-bomb.

The ENIAC's main drawback was its programming:
It could only be programmed manually, by flipping switches or plugging in cables.
A famous story holds that the first computer "bug" was an actual insect: in 1947, a moth became lodged in a relay of the Harvard Mark II, creating a malfunction. Thus, the name "bug" came to mean a computer error.

Indeed, the tubes consumed a great deal of electrical energy, which they released as heat. The programming problem, meanwhile, was addressed by the EDVAC (Electronic Discrete Variable Automatic Computer), designed in the mid-1940s, which could store programs in memory (1,024 words in central memory and 20,000 words in magnetic memory).


The transistor
In December 1947, the transistor was created at Bell Labs (thanks to the work of the engineers John Bardeen, Walter Brattain and William Shockley). With transistors, the computers of the 1950s could be made less bulky, less energy-hungry, and therefore less expensive: this marked a turning point in computing history.
The integrated circuit
The integrated circuit was perfected in 1958 by Jack Kilby at Texas Instruments. It made even smaller and cheaper computers possible by integrating multiple transistors on the same chip, without wiring between separate components.
The first transistor computers
In 1960, the IBM 7000 series became IBM's first transistorized computers.
In 1964, the IBM System/360 appeared, followed in 1965 by the DEC PDP-8.


Microcomputers
In 1971, the first microcomputer came out: the Kenbak-1, with a 256-byte memory.
Microprocessors
In 1971, the first microprocessor, the Intel 4004, appeared. It could process data 4 bits at a time.
Around the same time, Hewlett Packard put out the HP-35 calculator.
The Intel 8008 processor (which could process 8 bits at a time) was released in 1972.
In 1973, the Intel 8008 was used in one of the first microcomputers: the Micral. In 1974, Intel came out with the 8080, a processor roughly 10 times faster than its predecessor (the 8008) that could address 64 KB of memory; it powered the Altair 8800 of 1975, which shipped with 256 bytes of memory.
In 1976, Steve Wozniak and Steve Jobs created the Apple I in a garage. This computer had a keyboard, a 1 MHz microprocessor, 4 KB of RAM and 1 KB of video memory.
The story goes that the two friends didn't know what to name the computer; Steve Jobs, seeing an apple tree in the garden, decided he would call the computer "apple" if he couldn't think up another name in the next five minutes.
In 1981, IBM sold the first "PC", built around an 8088 processor with a clock speed of 4.77 MHz.


Computers today
It is very difficult today to tell where computers are going. Their development has followed Moore's Law, commonly stated as: the number of transistors that can be placed on a chip doubles roughly every two years.
By that rule, there would be around 1 billion transistors on a chip by about the year 2010.
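
As a back-of-the-envelope check, here is a minimal sketch in Python; the 1971 starting point (the Intel 4004 with about 2,300 transistors) is the commonly cited figure, and the doubling period is the rule-of-thumb value, not a precise model:

    # Moore's Law projection: transistor count doubling every two
    # years, starting from the Intel 4004 of 1971 (~2,300 transistors).
    START_YEAR, START_COUNT = 1971, 2300
    DOUBLING_PERIOD = 2  # years

    for year in (1971, 1981, 1991, 2001, 2010):
        doublings = (year - START_YEAR) / DOUBLING_PERIOD
        count = START_COUNT * 2 ** doublings
        print(f"{year}: ~{count:,.0f} transistors per chip")

The 2010 figure comes out around 1.7 billion, in line with the billion-transistor estimate above.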

http://en.kioskea.net/contents/histoire/ordinateur.php3


Computer History

1936 - Konrad Zuse: Z1 Computer. The first freely programmable computer.

1942 - John Atanasoff & Clifford Berry: ABC Computer. Who was first in the computing biz is not always as easy as ABC.

1944 - Howard Aiken & Grace Hopper: Harvard Mark I Computer.

1946 - John Presper Eckert & John W. Mauchly: ENIAC 1 Computer. 20,000 vacuum tubes later...

1948 - Frederic Williams & Tom Kilburn: Manchester Baby Computer & the Williams Tube. Baby and the Williams Tube turn on the memories.

1947/48 - John Bardeen, Walter Brattain & William Shockley: The Transistor. No, a transistor is not a computer, but this invention greatly affected the history of computers.

1951 - John Presper Eckert & John W. Mauchly: UNIVAC Computer. The first commercial computer, able to pick presidential winners.

1953 - International Business Machines: IBM 701 EDPM Computer. IBM enters 'The History of Computers'.

1954 - John Backus & IBM: FORTRAN Computer Programming Language. The first successful high-level programming language.

1955 (in use 1959) - Stanford Research Institute, Bank of America, and General Electric: ERMA and MICR. The first bank-industry computer, along with MICR (magnetic ink character recognition) for reading checks.

1958 - Jack Kilby & Robert Noyce: The Integrated Circuit. Otherwise known as 'The Chip'.

1962 - Steve Russell & MIT: Spacewar Computer Game. The first computer game.

1964 - Douglas Engelbart: Computer Mouse & Windows. Nicknamed the mouse because the tail came out the end.

1969 - ARPAnet. The original Internet.

1970 - Intel 1103 Computer Memory. The world's first commercially available dynamic RAM chip.

1971 - Faggin, Hoff & Mazor: Intel 4004 Computer Microprocessor. The first microprocessor.

1971 - Alan Shugart & IBM: The "Floppy" Disk. Nicknamed the "floppy" for its flexibility.

1973 - Robert Metcalfe & Xerox: Ethernet Computer Networking.

1974/75 - Scelbi, Mark-8 Altair & IBM 5100 Computers. The first consumer computers.

1976/77 - Apple I and II, TRS-80 & Commodore PET Computers. More early consumer computers.

1978 - Dan Bricklin & Bob Frankston: VisiCalc Spreadsheet Software. Any product that pays for itself in two weeks is a surefire winner.

1979 - Seymour Rubenstein & Rob Barnaby: WordStar Software. Word processing.

1981 - IBM: The IBM PC Home Computer. From an "Acorn" grows a personal computer revolution.

1981 - Microsoft: MS-DOS Computer Operating System. From "Quick And Dirty" comes the operating system of the century.

1983 - Apple Lisa Computer. The first home computer with a GUI (graphical user interface).

1984 - Apple Macintosh Computer. The more affordable home computer with a GUI.

1985 - Microsoft Windows. Microsoft begins the friendly war with Apple.
SERIES TO BE CONTINUED

http://inventors.about.com/library/blcoindex.htm