Apr 29, 2012

HISTORY OF THE COMPUTER

Beginning of the Computer

The word "computer" comes from the Latin "computare", which means to compute, because the earliest computers were designed and used for calculation purposes. They drew inspiration from the abacus, the oldest calculating device, better known as the sipoa, which originated in China.


Computer Generation
The First Generation
Howard G. Aiken of Harvard University (1937), in cooperation with IBM (International Business Machines Corp.), successfully created a machine that worked with electromechanical power, named the Harvard Mark I. This first-generation computer weighed about 5 tons and could perform calculations on 23-digit numbers in 6 seconds. Early computers largely followed the von Neumann architecture, proposed in the mid-1940s by John Louis von Neumann (1903-1957). The concept divides the computer, in broad outline, into two parts: memory and processor, where data resides in memory and the processor manipulates that data. This architecture describes a computer with three sections: a CPU (Central Processing Unit) containing an Arithmetic Logic Unit (ALU) and a Control Unit; Memory; and Input and Output Devices (connected through I/O ports).
These parts are connected by a bus system.
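The stored-program idea behind the von Neumann architecture can be sketched in a few lines of Python. This is a hypothetical illustration, not part of the original article: a single memory array holds both instructions and data, and the CPU repeatedly fetches, decodes, and executes. The tiny instruction set (LOAD/ADD/STORE/HALT) is invented for the example.

```python
def run(memory):
    """Simulate a minimal von Neumann machine over one shared memory."""
    acc = 0   # accumulator, the working register of the ALU
    pc = 0    # program counter, kept by the control unit
    while True:
        op, arg = memory[pc]     # fetch the next instruction over the "bus"
        pc += 1
        if op == "LOAD":         # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program and data share the same memory: cells 0-3 hold instructions,
# cells 4-6 hold data (two inputs and an output slot).
memory = [
    ("LOAD", 4),
    ("ADD", 5),
    ("STORE", 6),
    ("HALT", 0),
    2, 3, 0,
]
print(run(memory)[6])  # prints 5
```

Because the program lives in ordinary memory cells, it could itself be read or rewritten like data, which is exactly what lets a stored-program computer stop, be reprogrammed, and resume.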
                                          Image: Structure of a simple computer

A physicist named John V. Atanasoff, in collaboration with Clifford Berry, is credited with creating the first electronic digital computer (completed in 1942), named the Atanasoff-Berry Computer (ABC). The Atanasoff-Berry Computer was the first computer to use modern digital switching techniques, with vacuum tubes as its switching elements. This computer introduced the concepts of binary arithmetic and logic circuits.

In 1943, development began on ENIAC (Electronic Numerical Integrator And Computer), the first fully electronic computer ever designed. ENIAC was built through a partnership between the United States government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors, 20 accumulators each holding a 10-digit decimal number, and 5 million soldered joints, the computer was an enormous machine that consumed 160 kW of power. It could be reprogrammed by rearranging its wiring to solve a wide range of calculations. Designed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC was a general-purpose computer that worked 1,000 times faster than the Mark I. ENIAC still performed its calculations in decimal form, not binary.

                                         Image: ENIAC

On October 19, 1973, U.S. Federal Judge Earl R. Larson ruled that Atanasoff's ABC was the first electronic digital computer, invalidating the ENIAC patent of Mauchly and Eckert and recognizing Atanasoff as the inventor of the first electronic digital computer.

In the mid-1940s, John von Neumann (1903-1957) joined the University of Pennsylvania team, initiating concepts in computer design that would remain in use in computer engineering for the next 40 years. In 1945, von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory that could hold both programs and data. This technique allowed the computer to stop at some point and then resume its work. The main key to the von Neumann architecture is the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer I), built by Remington Rand, became the first commercial computer to utilize the von Neumann architecture model.

Both the United States Census Bureau and General Electric owned UNIVACs. One of the UNIVAC's impressive achievements was its success in predicting Dwight D. Eisenhower's victory in the 1952 presidential election.


Second Generation
In 1948, the invention of the transistor greatly influenced the development of computers. Transistors replaced vacuum tubes in televisions, radios, and computers, and as a result the size of electric machines shrank drastically. Transistors came into use in computers in 1956. Combined with another invention, magnetic-core memory, they made second-generation computers smaller, faster, more reliable, and more energy-efficient than their first-generation predecessors. The first machines to take advantage of this new technology were supercomputers: IBM made a supercomputer named Stretch, and Sperry-Rand made one named LARC. These computers, developed for atomic energy laboratories, could handle large amounts of data, a capability much in demand by atomic scientists.

                                           Image: LARC

Only two LARCs were ever installed and used: one at the Lawrence Radiation Labs in Livermore, California, and the other at the U.S. Navy Research and Development Center in Washington, D.C. Second-generation computers replaced machine language with assembly language, a language that uses abbreviations in place of binary code. Several programming languages began to appear at the time. The Common Business-Oriented Language (COBOL) and FORTRAN (Formula Translator) programming languages came into common use. These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas more easily understood by humans, making it possible for a person to program and configure a computer. A wide range of new careers emerged (programmer, systems analyst, and computer systems expert). The software industry also began to appear and grow during this second generation of computers.


Third Generation
Although the transistor surpassed the vacuum tube in many ways, it still generated substantial heat. Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components on a small silicon disc made from quartz sand. Scientists later managed to fit more components onto a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto each chip. Another third-generation development was the use of the operating system, which enabled a machine to run many different programs at once, with a central program that monitored and coordinated the computer's memory.

Fourth Generation 
After the IC, the only direction left was to shrink the size of circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components onto one chip. In the 1980s, Very Large Scale Integration (VLSI) put thousands of components on a single chip, and Ultra-Large Scale Integration (ULSI) increased that number into the millions. The ability to put so many components on a chip half the size of a coin drove computer prices and sizes down, while increasing power, efficiency, and reliability.

The Intel 4004 chip, made in 1971, brought a breakthrough in ICs by putting all the components of a computer (central processing unit, memory, and input/output control) on one very small chip. Previously, ICs had been made to perform one specific task. Now, a microprocessor could be manufactured and then programmed to meet any requirement. Soon, everyday devices such as microwave ovens, televisions, and automobiles with electronic fuel injection (EFI) were equipped with microprocessors.

Such developments allowed ordinary people to use computers. The computer was no longer the domain of large companies or government agencies. In the mid-1970s, computer assemblers began offering their products to the general public. These computers, called minicomputers, were sold with software packages that laymen could easily use. The most popular software at the time were word processors and spreadsheet programs. In the early 1980s, video games such as the Atari 2600 stirred consumer interest in more sophisticated, programmable home computers.

In 1981, IBM introduced the Personal Computer (PC) for use in homes, offices, and schools. The number of PCs in use jumped from 2 million units in 1981 to 5.5 million units in 1982. Ten years later, 65 million PCs were in use.
Computers continued to evolve toward smaller sizes, from computers that sit on a table (desktop computers) to computers that fit into a bag (laptops), and even handheld computers (palmtops). The IBM PC competed with the Apple Macintosh for the computer market. The Apple Macintosh became famous for popularizing a graphical system on its computers, while its rival still used text-based computers; the Macintosh also popularized the use of the mouse. Today we know the lineage of IBM-compatible CPUs: the IBM PC/486, Pentium, Pentium II, Pentium III, and Pentium IV (a series of CPUs made by Intel), as well as the AMD K6, Athlon, and others. All of these belong to the class of fourth-generation computers. Along with the proliferation of computer use in the workplace, new ways to harness their potential were developed. As small computers grew more powerful, they could be linked together in networks to share memory, software, and information, and to communicate with one another. Computer networks allow individual computers to form an electronic cooperation to complete a processing task. Using direct cabling (called a Local Area Network, or LAN) or telephone lines, such networks can become very large.


Source : http://id.wikipedia.org/wiki/Sejarah_komputer





 
