Scientists have predicted that unless radical improvements are made in the way we design computers, by 2040 computer chips will need more electricity than our global energy production can deliver.
The projection could mean that our ability to keep pace with Moore’s Law – the idea that the number of transistors in an integrated circuit doubles approximately every two years – is about to slide out of our grasp.
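To see how steep that curve is, here's a minimal sketch of a strict two-year doubling (the starting year and transistor count are illustrative assumptions, not industry figures):

```python
# Illustrative sketch of Moore's Law: transistor counts double
# every two years. The starting point is a hypothetical chip with
# 10 billion transistors in 2020 (an assumption, not real data).
def transistors(year, base_year=2020, base_count=10e9):
    doublings = (year - base_year) / 2  # one doubling per two years
    return base_count * 2 ** doublings

for year in (2020, 2030, 2040):
    print(year, f"{transistors(year):.2e}")
# 2040 is ten doublings out, so roughly 1,000x the 2020 count.
```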
The prediction that computing's energy demands will outstrip electricity production was originally contained in a report released late last year by the Semiconductor Industry Association (SIA), but it's hit the spotlight now because the group has issued its final roadmap assessment of the outlook for the semiconductor industry.
The basic idea is that as computer chips become ever more powerful thanks to their greater transistor counts, they'll need to suck up more power in order to function (unless efficiency improves).
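A rough way to see why comes from the standard first-order model for the dynamic power of CMOS logic, P ≈ αCV²f per gate. Every value in the sketch below is a made-up placeholder, not a measured figure:

```python
# First-order CMOS dynamic power: P ≈ alpha * C * V^2 * f per gate,
# summed over the gates on the chip. All values are illustrative
# assumptions, not measurements from any real processor.
def dynamic_power(n_gates, c_farads, v_volts, f_hz, alpha=0.1):
    return n_gates * alpha * c_farads * v_volts ** 2 * f_hz

p1 = dynamic_power(1e9, 1e-15, 0.8, 2e9)  # 1 billion gates -> ~128 W
p2 = dynamic_power(2e9, 1e-15, 0.8, 2e9)  # double the gates -> ~256 W
print(f"{p1:.0f} W -> {p2:.0f} W")
# With supply voltage no longer shrinking much each generation,
# power grows roughly in step with the transistor count.
```

The V² term is why shrinking supply voltages once offset rising transistor counts; when voltage scaling stalled, extra transistors began translating more directly into extra watts.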
Semiconductor manufacturers can counter this power draw with clever engineering, but the SIA says there's a limit to how far current methods can take it.
“Industry’s ability to follow Moore’s Law has led to smaller transistors but greater power density and associated thermal management issues,” the 2015 report explains.
“More transistors per chip mean more interconnects – leading-edge microprocessors can have several kilometres of total interconnect length. But as interconnects shrink they become more inefficient.”
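The interconnect point follows from basic wire physics: resistance scales as R = ρL/A, so shrinking a wire's cross-section drives its resistance, and with it the delay and energy cost of moving each bit, sharply upward. A minimal sketch, using copper's bulk resistivity and hypothetical wire dimensions:

```python
# Wire resistance R = rho * L / A: shrinking the cross-section A
# raises resistance. Copper's bulk resistivity is real; the wire
# dimensions below are hypothetical, chosen only for illustration.
RHO_COPPER = 1.7e-8  # ohm * metre (bulk value)

def wire_resistance(length_m, width_m, height_m, rho=RHO_COPPER):
    return rho * length_m / (width_m * height_m)

# The same 1 mm run at two widths (square cross-section assumed):
for width in (100e-9, 50e-9):
    r = wire_resistance(1e-3, width, width)
    print(f"{width * 1e9:.0f} nm wide: {r / 1e3:.1f} kilo-ohms")
# Halving both dimensions quadruples resistance; real nanoscale wires
# fare even worse, since resistivity climbs above the bulk value as
# dimensions approach the electron mean free path.
```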
In the long run, the SIA calculates that if chips continue to be engineered with today's approaches, "computing will not be sustainable by 2040, when the energy required for computing will exceed the estimated world's energy production".
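The arithmetic behind a projection like that is easy to sketch: compound a demand curve faster than a supply curve and find the crossing year. The starting values and growth rates below are pure placeholders (the SIA's actual model and figures are in the report), but they show the shape of the calculation:

```python
# Toy extrapolation: find the year a compounding energy demand
# overtakes slowly growing supply. Every number here is a made-up
# placeholder, not a figure from the SIA roadmap.
def crossing_year(demand, supply, demand_growth, supply_growth, start=2016):
    year = start
    while demand < supply:
        demand *= 1 + demand_growth
        supply *= 1 + supply_growth
        year += 1
    return year

# Demand starting at 5% of supply, growing 20%/yr against 2%/yr:
print(crossing_year(demand=5.0, supply=100.0,
                    demand_growth=0.20, supply_growth=0.02))
# -> 2035 with these illustrative inputs
```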