Computing isn’t just getting cheaper. It’s becoming more energy efficient. That means a world populated by ubiquitous sensors and streams of nanodata.
The performance of computers has shown remarkable and steady growth, doubling every year and a half since the 1970s. What most folks don’t know, however, is that the electrical efficiency of computing (the number of computations that can be completed per kilowatt-hour of electricity used) has also doubled every year and a half since the dawn of the computer age.
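The arithmetic behind that doubling claim is easy to check: anything that doubles every year and a half grows by a factor of 2^(t/1.5) after t years, which compounds to roughly a hundredfold gain per decade. A minimal sketch (the function name is my own, for illustration):

```python
def growth_factor(years, doubling_period=1.5):
    """Multiplier after `years` for a quantity that doubles
    every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Doubling every year and a half compounds to roughly a 100-fold
# gain in computations per kilowatt-hour over a single decade.
print(f"Gain over 10 years: {growth_factor(10):.0f}x")
```

The same formula says efficiency improves about forty million-fold over the four decades the trend has held, which is why battery-powered sensors that would have been unthinkable in the 1970s are now routine.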