The success of the internet and computing industry over the last 25 years has brought with it unintended consequences. A big issue for modern economies is the spiralling share of GDP spent on computing and on dealing with its growing environmental impact, which is comparable to that of the airline industry in terms of carbon footprint. The US spends an estimated 3% of its budget on the power needed for information processing and storage (some say this is an underestimate).
The problem isn’t going away as the world grows ever more interconnected and digital. As scientists, we must ask whether we can bring technological solutions to what is essentially a technical problem. So our big question is how we can make number-crunching more efficient, to reduce or at least offset the growing power demands of the information age. Doing so would enable us to cut both costs and carbon emissions.
Now for the slightly (and only) technical bit in this article. Thermodynamics, the branch of physics that deals with heat flows, suggests that some biological systems are still much more efficient information processors than man-made computers. Computers need between 1,000 and 10,000 units of energy per computational step (measured in a tiny unit called ‘kT’, equivalent to roughly 10⁻²¹ joules), compared with, say, a strand of DNA, which uses only around 100 kT per base pairing – the equivalent of a computational step, since it can be thought of as a logic-gate operation.
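For readers who like to see the arithmetic, the figures above can be checked with a few lines of Python. This is an illustrative back-of-envelope sketch: the per-step counts are simply the rough values quoted in the text, and room temperature is assumed to be 300 K.

```python
# Back-of-envelope check of the kT figures quoted in the text.
k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin
T = 300             # assumed room temperature, in kelvin

kT = k_B * T  # the 'tiny unit' of thermal energy, ~4e-21 J

conventional_step = 10_000 * kT  # upper end of the quoted range
dna_base_pairing = 100 * kT      # rough biological benchmark

print(f"kT at 300 K:        {kT:.2e} J")
print(f"Conventional step:  {conventional_step:.2e} J")
print(f"DNA base pairing:   {dna_base_pairing:.2e} J")
print(f"Ratio: {conventional_step / dna_base_pairing:.0f}x")
```

So at the upper end, a conventional logic operation costs about a hundred times more energy than nature’s equivalent step.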
There is a real scientific opportunity here to mimic nature in order to reduce the power consumed by computation. Even a small efficiency gain could have a multiplier effect on power reduction, and hence on economic and environmental cost. The problem is an urgent one because the number and power of processors in vast server farms (needed for information storage and retrieval) brought into service year on year by organizations such as Google, Facebook and the NSA, as well as the banking industry, will continue to rise dramatically.
The first step is to understand in detail specific biological information processes and their efficiency. As we are learning in our current programme on bio-inspired quantum technologies, quantum effects (where particles also behave as waves) could play a big role in efficient information and energy transfer. These quantum effects help make information processing more reversible, and as we know from thermodynamics, a perfectly reversible physical process loses no heat. Easy enough, in theory.
Now for the harder bit. The second step requires us to reverse-engineer the biological efficiency to make our basic computational steps more efficient. This does not necessarily mean making full-blown quantum computers – though that is the ultimate goal – but the key would be to embed features of quantum behaviour into classical information processors to make them more efficient.
This would have a disproportionately high impact since even a single-digit efficiency gain could bring about a large overall reduction in energy expenditure. Indeed, there is evidence that such a hybrid quantum-classical approach is also the route that nature takes.
An interesting question for economists, environmentalists and cost-benefit analysts would be to estimate the economic and environmental impact of doing computations at, say, 100 kT of energy per gate instead of the current thousands of kT. Would it indeed follow that GDP expenditure and carbon footprint would drop ten-fold as a result? Or would other non-linear multiplier effects kick in that could leverage even a seemingly modest 1% energy saving per computation step?
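The naive version of that estimate is simple to sketch. The figures below are purely hypothetical and for illustration only: the assumed fraction of total power that goes on switching, as opposed to cooling, memory and idle draw, is invented for the example, which is precisely why the real answer needs the economists and cost-benefit analysts.

```python
# Naive linear-scaling estimate (all figures hypothetical).
current_kT_per_gate = 1_000  # low end of the range quoted above
target_kT_per_gate = 100     # the bio-inspired target

naive_saving = 1 - target_kT_per_gate / current_kT_per_gate
print(f"Naive reduction in switching energy: {naive_saving:.0%}")

# If switching accounts for only part of total power, the overall
# saving shrinks (the 40% fraction below is an assumed example):
switching_fraction = 0.4
overall_saving = switching_fraction * naive_saving
print(f"Overall power reduction: {overall_saving:.0%}")
```

Even this toy calculation shows why the answer need not be a clean ten-fold drop: the saving depends on how much of total power the gate energy actually accounts for.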
Such research is inherently interdisciplinary and requires fresh thinking on the part of scientists, but equally, from policy makers and funders to cut across the traditional boundaries between the disciplines.