Future of Computer Hardware

Table of Contents

Introduction
Probable Alternative Technologies
Comparison of Alternative Computer Technology
Conclusion
References

Introduction

            Although there has been tremendous growth in the miniaturization and processing speed of computer hardware, the underlying logic of computers is no different from that of the first programmable machines, such as the Z3 built in 1941 by German engineer Konrad Zuse. Today's processors are merely the smaller transistor equivalents of the first computers built from relays and large vacuum tubes. Moore's law tells us that data density and processing speed roughly double every 18 months, and the major microprocessor manufacturers, namely Intel and AMD, have so far kept up with it. Recently, however, their efforts to increase processing speed by raising the clock frequency have hit a major roadblock: the tremendous power dissipation caused by high-frequency operation. This led to the recent development of dual-core processors. Yet if Moore's law is to continue, even the progression from dual-core to quad-core and eventually n-core processors will not be enough; multi-core technology will reach a point where the common data bus shared by n cores becomes a bottleneck. Demand from businesses and consumers will push future developments in computer hardware to meet the exponential increase in digital data and the globalization of the Internet.
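To put the 18-month doubling claim in concrete terms, the short sketch below projects transistor counts under that reading of Moore's law. The baseline count and the time horizons are illustrative assumptions, not figures from the sources cited here.

```python
# Minimal sketch of Moore's-law scaling: density doubles every 18 months.
# The baseline transistor count and the horizons below are illustrative
# assumptions, not figures taken from the essay's sources.

DOUBLING_PERIOD_MONTHS = 18

def projected_transistors(baseline: int, months: float) -> int:
    """Projected count after `months`, assuming doubling every 18 months."""
    return round(baseline * 2 ** (months / DOUBLING_PERIOD_MONTHS))

if __name__ == "__main__":
    baseline = 100_000_000  # hypothetical chip with 100 million transistors
    for years in (1.5, 3, 6, 9):
        count = projected_transistors(baseline, years * 12)
        print(f"After {years:>4} years: ~{count:,} transistors")
```

Even under this simple model the count grows by a factor of 64 in nine years, which is why raw scaling alone eventually runs into the power and interconnect limits described above.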

Probable Alternative Technologies

There are several candidates for the future of computer hardware. One of these is the replacement of electrical signals with optical signals in order to exploit the maximum speed of light. NASA scientists are working to solve the need for computer speed by using light itself to accelerate calculations and increase data bandwidth (NASA, 2000). According to NASA, organic materials can perform functions such as switching, signal processing, and frequency doubling using less power than inorganic materials. Because of the blazing speed of light, this optical technology could relieve the bandwidth problems imposed by electronics; optical data processing can perform several operations simultaneously (in parallel) much faster and more easily than electronics (NASA, 2000). In addition, scientists at Harvard University have shown how ultra-cold atoms can be used to freeze and control light to form the "core", or central processing unit, of an optical computer (Reid, 2005). These results could be a major breakthrough that gives rise to affordable optical computers in the near future.
Another alternative uses essentially the same electronic technology but pushes the operating frequency further, using nanofluids for cooling to prevent the tremendous heat build-up. It is projected that if operating frequency continues to be the means of increasing processing power, the next generation of computer chips will produce localized heat fluxes of over 10 MW/m², with total power exceeding 300 W. No existing low-cost cooling device can effectively manage heat at this level except the use of nanofluids for ultrahigh-heat-flux electronic systems (Marquit, 2006). With nanofluid cooling, the operating frequency of computer chips could be increased further without the heat build-up from power dissipation burning out the chip; a rough check of these figures appears below.
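As a sanity check on the cited figures, the sketch below relates total chip power, die area, and heat flux. The 300 W and 10 MW/m² values are the projections cited above; the 2 cm² die area is an assumed illustrative value.

```python
# Rough check relating chip power, area, and heat flux: flux = power / area.
# 300 W and 10 MW/m^2 are the projected figures cited in the essay;
# the 2 cm^2 die area is an illustrative assumption.

TOTAL_POWER_W = 300.0            # projected total chip power
HOTSPOT_FLUX_W_PER_M2 = 10e6     # projected localized heat flux (10 MW/m^2)
DIE_AREA_M2 = 2e-4               # assumed die area of 2 cm^2

# Average flux if the total power were spread uniformly over the die.
average_flux = TOTAL_POWER_W / DIE_AREA_M2
print(f"Average heat flux over the die: {average_flux / 1e6:.2f} MW/m^2")

# Area that would dissipate the full 300 W at the localized hotspot flux.
hotspot_area_mm2 = TOTAL_POWER_W / HOTSPOT_FLUX_W_PER_M2 * 1e6
print(f"Area dissipating 300 W at 10 MW/m^2: {hotspot_area_mm2:.0f} mm^2")
```

The gap between the uniform average (about 1.5 MW/m²) and the localized hotspot flux is what makes conventional low-cost cooling insufficient and motivates the nanofluid approach.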
Another alternative is quantum computing. In a quantum computer, the fundamental unit of information is called a quantum bit, or qubit. It is not binary but rather more quaternary in nature, and this makes its processing faster relative to binary processing. For example, a system of 500 qubits, which is impossible to simulate classically, represents a quantum superposition of as many as 2^500 states (West, 2000). With this tremendous processing power, scientists are now interested in identifying useful applications. The field of quantum computing has made numerous promising advances since its conception, including the building of two- and three-qubit quantum computers capable of some simple arithmetic and data sorting. However, there are potentially large obstacles that prevent us from building a quantum computer comparable to today's modern digital computers; among the most formidable difficulties are error correction, decoherence, and hardware architecture (West, 2000). The sketch below illustrates why classical simulation becomes impossible at this scale.
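To illustrate the scaling claim, the sketch below computes the classical memory needed to store a full n-qubit state vector, assuming one 16-byte complex amplitude per basis state. The per-amplitude size is an assumption made for illustration; the exponential growth itself is the point.

```python
# Memory needed to hold a full n-qubit state vector on a classical machine,
# assuming 16 bytes (one double-precision complex amplitude) per basis state.
# Illustrative arithmetic only; the per-amplitude size is an assumption.

BYTES_PER_AMPLITUDE = 16

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes required to store all 2**n amplitudes of an n-qubit state."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

if __name__ == "__main__":
    for n in (10, 30, 50, 500):
        print(f"{n:>3} qubits -> 2**{n} amplitudes, "
              f"about {float(state_vector_bytes(n)):.2e} bytes")
```

A 30-qubit state already requires on the order of 16 GB, and at 500 qubits the figure far exceeds the number of atoms in the observable universe, which is why quantum hardware rather than classical simulation is needed to exploit such systems.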

Comparison of Alternative Computer Technology

Quantum computers are still far from reality, since most of their large-scale applications remain theoretical. The choice for the computer technology of the future therefore lies between optical computers and nanofluid-cooled processors. Using nanofluids to cool microprocessors operating at extremely high power dissipation would still mean that these computers consume large amounts of power, and as the recent move to dual-core and multi-core technology has shown, power consumption is already becoming an issue. It is therefore more probable that optical computers will be favored over nanofluid-cooled processors in the future.

Conclusion

With developments in optical computer technology advancing worldwide, there is a strong indication that it will become the future of computer hardware. It is the most feasible alternative to the present silicon-based computers, and the ever-increasing data-traffic demands of digital electronics can be met by the bandwidth this technology offers.

References

NASA. (2000). Now, Just a Blinkin' Picosecond. Retrieved January 7, 2007, from the NASA web site: http://science.nasa.gov/headlines/y2000/ast28apr_1m.htm

Reid, D. (2005). Optical Computer Made from Frozen Light. Retrieved January 7, 2007, from the Innovations Report web site: www.innovations-report.com/html/reports/physics_astronomy/report-42906.html

Marquit, M. (2006). Future Computer Chips Could Be Cooled With Nanofluid. Retrieved January 7, 2007, from the PhysOrg.com web site: http://www.physorg.com/news64771086.html

West, J. (2000). The Quantum Computer: An Introduction. Retrieved January 7, 2007, from the Caltech web site: http://www.cs.caltech.edu/~westside/quantum-intro.html
