We are witnessing the beginning of a new age in computing, one in which computers will perform thousands of times faster than today's fastest supercomputers. These machines will operate at the exaFLOP level and are expected to appear by the end of the decade.
Dimitrios Nikolopoulos, professor at the School of Electronics at Queen's University Belfast, gave an interview to CNN in which he described exaFLOP computers as the next frontier of high-performance computing. According to the same source, the world's biggest powers, such as the U.S., China, and Japan, have already invested millions in supercomputer research.
Professor Nikolopoulos says that today's supercomputers operate at the petaFLOP level, performing one quadrillion operations per second.
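To put those prefixes side by side: a petaFLOP machine performs 10^15 floating-point operations per second, while an exaFLOP machine performs 10^18, a thousandfold jump. A minimal illustrative calculation (not from the interview, just the arithmetic behind the terms):

```python
# Illustrative only: comparing the FLOPS scales mentioned in the article.
PETA = 10**15  # petaFLOP machine: one quadrillion operations per second
EXA = 10**18   # exaFLOP machine: one quintillion operations per second

speedup = EXA // PETA
print(f"An exaFLOP machine is {speedup}x faster than a petaFLOP machine.")
# prints: An exaFLOP machine is 1000x faster than a petaFLOP machine.
```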
According to TOP500, the most powerful supercomputer is Japan's K computer, which is as big as a football field, says Nikolopoulos. "You're talking about many, many lanes of computer racks and thousands of processors," the professor continues. The K computer, for example, features over 88,000 processors, while the upcoming exaFLOP computers will have at least 1 million. Even so, next-generation supercomputers might get smaller, says Nikolopoulos.
Current projections suggest that exascale computers will consume 100 megawatts of power. "It's impossible to build a suitable facility and have enough power (...) Changing materials and also the architecture of processors and memories is critical to exascale's success," the professor said.
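To put that 100-megawatt projection in perspective, a quick back-of-the-envelope sketch of the annual energy such a facility would draw (illustrative only, not a figure from the interview):

```python
# Illustrative only: annual energy use of a hypothetical 100 MW exascale facility.
power_mw = 100             # projected power draw in megawatts
hours_per_year = 24 * 365  # ignoring leap years for a rough estimate

energy_mwh = power_mw * hours_per_year
print(f"Roughly {energy_mwh:,} MWh per year.")
# prints: Roughly 876,000 MWh per year.
```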
Will Exascale Computing Bring Something Different?
Nikolopoulos says exascale computing will accelerate research in several fields of science, such as aerospace engineering, biology, astrophysics, national security, and even the social sciences.
"More and more people are interested in understanding the behavior of societies as a whole. These require simulations - how people interact, communicate, how they move. That will require exascale computing," he said.
(reported by Laurentiu Stan, edited by Surojit Chatterjee)