Currently, the world's most powerful supercomputers can ramp up to more than a thousand trillion operations per second, or a petaflop. But computing power is no longer growing as fast as it once did. On Monday, the June 2015 listing of the Top 500 most powerful supercomputers in the world revealed the beginnings of a plateau in performance growth.
...
The development rate began tapering off around 2008. Between 2010 and 2013, increases in the list's aggregate performance ranged from 26 percent to 66 percent. And this June's list showed a mere 17 percent increase over last November's.
...
Despite the slowdown, many computational scientists expect performance to reach exascale, or more than a billion billion operations per second, by 2020.
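A quick back-of-the-envelope check puts those growth rates in perspective. The sketch below uses only figures from the article (a petaflop, an exaflop, and the 17 and 66 percent per-list growth rates); the extrapolation is purely illustrative, not a forecast, and ignores that today's top machines are already well past one petaflop.

```python
# Back-of-the-envelope: how long does it take to gain a factor of 1000
# (petascale -> exascale) at the growth rates the article cites?
import math

PETAFLOP = 1e15  # a thousand trillion operations per second
EXAFLOP = 1e18   # a billion billion operations per second

factor = EXAFLOP / PETAFLOP  # a full 1000x jump

# The Top500 list comes out twice a year. At the latest 17% growth
# per list, the number of six-month periods needed for 1000x:
slow_periods = math.log(factor) / math.log(1.17)

# At the healthier 66% growth per list seen in earlier years:
fast_periods = math.log(factor) / math.log(1.66)

print(f"factor needed: {factor:.0f}x")
print(f"at 17% per list: ~{slow_periods:.0f} lists (~{slow_periods / 2:.0f} years)")
print(f"at 66% per list: ~{fast_periods:.0f} lists (~{fast_periods / 2:.1f} years)")
```

At 17 percent per list the 1000x gain takes roughly 22 years, while the older 66 percent pace would get there in about 7, which is why a continued slowdown makes the 2020 exascale target look optimistic.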
Hmm, if they reach exascale computing will the weatherman finally be able to predict if it's going to rain this afternoon? Because he sucks at that now.
(Score: 4, Interesting) by hash14 on Tuesday July 21 2015, @02:15AM
I recall reading that Tianhe in China isn't getting much use because it consumes too much power. Researchers are hesitant to use it because compute time costs so much, so they run their jobs on a slower machine instead, saving money while taking only a small hit to runtime.
I don't know much about building supercomputers, but from vague memories of what I read, it's not as difficult to build a super-powerful computer as it is to make one that's economically reasonable (i.e., low-power enough that researchers actually want to use it).
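The commenter's point about power economics can be made concrete with some rough arithmetic. A sketch, assuming a machine drawing about 17.8 MW (the figure commonly reported for Tianhe-2, excluding cooling) and an assumed electricity price of $0.10/kWh; both numbers are ballpark assumptions, not from the article:

```python
# Rough illustration of why power draw dominates supercomputer economics.
# ASSUMPTIONS: ~17.8 MW draw (ballpark reported for Tianhe-2, without
# cooling) and $0.10/kWh electricity. Neither figure is from the article.
POWER_MW = 17.8
PRICE_PER_KWH = 0.10
HOURS_PER_YEAR = 24 * 365

kwh_per_year = POWER_MW * 1000 * HOURS_PER_YEAR
annual_cost = kwh_per_year * PRICE_PER_KWH
print(f"~{kwh_per_year:,.0f} kWh/year -> ${annual_cost / 1e6:.1f}M/year in electricity")
```

Even under these rough assumptions the electricity bill alone lands in the tens of millions of dollars per year, which is consistent with researchers preferring slower, cheaper machines.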