Computers today are very useful: they make research work easier and faster. Advances in technology have made computers increasingly powerful and sophisticated. It is important to know how computers were invented and developed in order to understand the necessity of each component that makes up a computer.
Early forerunners of the computer were the abacus, developed in ancient times in the Far East, and an adding machine invented in 1641 by Blaise Pascal of France. The principle of the punched card was developed about 1801 by Joseph Marie Jacquard, also of France. His cards were used by a loom to control the pattern woven into textiles.
All of the basic principles of the modern digital computer (input and output devices, storage and arithmetic units, and the sequencing of instructions) were conceived in the 1820's and 1830's by Charles Babbage, an English mathematician. He completed a small computer, called a difference engine, in 1822. It consisted primarily of gears and levers and was similar to a modern mechanical desk calculator.
In about 1836, with aid from the British government, Babbage completed plans for what he called an analytical engine. This was a mechanical device, controlled by punched cards, that was comparable in many respects to a modern computer. Babbage was ahead of his time, however, because building such a machine required more precision than the technology of his day could provide.
The next development occurred late in the 19th century, when Herman Hollerith, an American, developed the modern punched card. Hollerith devised the 12-row code, still used today, for the 1890 census.
In 1925, Vannevar Bush, an American physicist, began work on an analog computer, and by 1931 he had developed a practical machine. The first large digital computer, an electrical device using relays for switching, was the Mark I, or Automatic Sequence Controlled Calculator, installed at Harvard University in 1944. It was developed by Howard Aiken, an American, with the support of the International Business Machines Corporation.
The digital computer operates by counting. The term digital refers to digits; that is, to the individual numerals that make up numbers. Most digital computers use the binary, or base-2, number system, which is made up of only two digits, 1 and 0. Under this system, the decimal number 1 is written as 1; 2 as 10; 3 as 11; 4 as 100; 5 as 101; 6 as 110; 7 as 111; 8 as 1000; etc.
By using binary digits, the computer has to differentiate between only two possibilities. For example, if a certain spot on a magnetic tape is magnetized in one direction, it represents 1; if magnetized in the opposite direction, it represents 0. Similarly, if a switch is open, it represents 0; if closed, it represents 1; thus one closed switch followed by two open switches represents the binary number 100, or the number 4 in the decimal system.
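The decimal-to-binary correspondence described above can be sketched in a few lines of code. This is an illustrative example, not part of the original essay; the function name `to_binary` is our own, and the method used (repeated division by 2, collecting remainders) is one standard way to produce the binary digits.

```python
def to_binary(n):
    """Convert a non-negative decimal integer to its binary string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # the remainder is the next binary digit
        n //= 2                    # divide by 2 and repeat
    return "".join(reversed(digits))  # digits were collected low-to-high

# Reproduce the table from the text: 1 -> 1, 2 -> 10, 3 -> 11, 4 -> 100, ...
for n in range(1, 9):
    print(n, "=", to_binary(n))
```

Each binary digit corresponds to one two-state element, such as the open or closed switch mentioned above, so the string "100" can be read directly as one closed switch followed by two open ones.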
The computer performs its functions through the use of signals in the form of electrical pulses. The pulses can magnetize objects; they can be routed along various paths by mechanical or electronic switches; and they can control switches that operate various kinds of machines.
The next step was the replacement of the relatively slow relay with a faster switch, the electron tube. The first computer to employ this device was ENIAC, the Electronic Numerical Integrator and Computer, developed by J. Presper Eckert and John W. Mauchly during World War II and put into operation in 1946.
The concept of the stored program was developed in the 1940's and first put into use in the late 1940's. Earlier computers had to be programmed step by step by inserting and removing circuit boards. A pioneer in this field was John von Neumann, an American mathematician.
Electron-tube computers had several disadvantages: the tubes generated large amounts of heat, were bulky, and required large power-supply units. Also, the tubes, even though they could switch on or off in a millionth of a second, were too slow for many potential applications. The transistor, invented in 1948, made possible the smaller and faster computers called second-generation computers.
The next step was the development of even smaller circuits, resulting in the introduction of third-generation computers about 1963. Continued improvements in techniques for placing more and more circuits on a single chip resulted in reductions in both the size and cost of computers through the 1960’s and 1970’s. Minicomputers came on the market around 1967 and the even smaller microcomputer became practical about 1971.
Since the early 1950's the computer industry has been one of the most rapidly growing industries in the United States. Its growth was spurred in the mid-1960's by the introduction of time-sharing. By 1980 improvements in microcomputers had sparked a fast-growing demand for their use as personal computers in the home, school, office, and elsewhere.
The rapidly increasing number of personal computers promoted the development of computer networks that provide subscribers with a variety of information and communication services, such as access to instructional material, data banks, and news reports. In the early 1980's microcomputers were being used as controlling devices in an ever-growing number of products, from games and toys to typewriters, automobiles, and industrial robots. In the 1990's personal computers became even more popular and widely used. Instant access to the Internet, along with online games, chatting, and surfing, and even talking to someone far away, became possible thanks to rapidly developing technology. Today palmtops, laptops, and other smaller, portable computers are being made and are used by the majority for business, schoolwork, and more. Life's work has been made simpler by the development of computers and by the technology that continues to be applied to them.