According to recent scientific predictions, unless vast improvements are made to the design of computer systems, the world will not generate enough electricity to power its computers by 2040. This revelation casts fresh doubt on Moore's Law, the observation that the number of transistors in an integrated circuit doubles roughly every two years.
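To see why that doubling is so demanding, it helps to run the arithmetic. The sketch below is illustrative only: the starting count and years are hypothetical round numbers, not figures from the SIA report.

```python
def transistors(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count forward under an idealised Moore's Law,
    doubling once every `doubling_period` years."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# A chip with 1 billion transistors in 2010 would, on this idealised
# trend, carry about 32 billion by 2020 (five doublings).
print(transistors(1e9, 2010, 2020))
```

Exponential growth of this kind is exactly what makes the power-demand projections so stark: each doubling of transistors brings more interconnect and more heat to manage.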
These predictions come from a report released by the Semiconductor Industry Association (SIA) in late 2015, which has only come to light now because the SIA waited for the final roadmap assessment to be completed before publicising it. The report's argument is that as computers become ever more powerful, they will inevitably demand more power to function effectively, and the research suggests that this demand will considerably exceed what the energy industry can produce on a global scale.
The only way to combat this is to improve the world’s energy efficiency and develop greater power density.
"Industry's ability to follow Moore's Law has led to smaller transistors but greater power density and associated thermal management issues," the 2015 report explains. "More transistors per chip mean more interconnects – leading-edge microprocessors can have several kilometres of total interconnect length. But as interconnects shrink they become more inefficient."
The SIA calculates that if the supply of electricity continues to grow at today's pace, using the same approaches currently in place, powering computers will not be sustainable by 2040: the demand for electricity will ultimately outweigh the supply.