The Fugaku supercomputer has become the most powerful in the world

Japan now has the most powerful supercomputer in the world, according to the 55th edition of the Top500 ranking, which has just been published. Called Fugaku, in reference to Mount Fuji, it was built by Riken and Fujitsu and is installed in the city of Kobe. On the HPL benchmark, the new machine reaches 415.5 petaflops, or 415.5 × 10^15 operations per second. It far outperforms the American Summit, until now the fastest machine at 148 petaflops, roughly 2.8 times less.
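For readers who want to check the headline numbers, here is a minimal sketch using only the figures quoted in the article (Summit rounded to 148 petaflops); it converts petaflops to operations per second and verifies the roughly 2.8× gap:

```python
# Sanity check of the figures quoted above (values taken from the article).
fugaku_pflops = 415.5   # Fugaku, HPL result in petaflops
summit_pflops = 148.0   # Summit, previous leader (rounded as in the article)

print(f"Fugaku: {fugaku_pflops * 1e15:.3e} operations per second")  # ~4.155e+17
print(f"Fugaku / Summit: {fugaku_pflops / summit_pflops:.1f}x")     # ~2.8x
```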

It is the first time in nine years that Japan has topped the Top500 ranking, the last time being with the K computer. Fugaku also took first place in the Graph500, HPL-AI and HPCG benchmarks, a sweep never before achieved by a single supercomputer.

A supercomputer made up of 7.3 million cores
The builders opted for the A64FX, a 48-core SoC produced by Fujitsu. For the first time, a supercomputer based on ARM processors tops the Top500 ranking. The benchmark run used 396 racks containing 152,064 A64FX processors, or about 7.3 million ARM cores.
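The core count follows directly from the figures above; a minimal back-of-the-envelope sketch, assuming every processor in the run contributes all 48 cores:

```python
# Back-of-the-envelope check of the configuration quoted above.
racks = 396
processors = 152_064      # A64FX chips used for the HPL run
cores_per_chip = 48       # each A64FX is a 48-core SoC

print(processors * cores_per_chip)   # 7299072 -> "7.3 million ARM cores"
print(processors / racks)            # 384.0 processors per rack
```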

Fugaku has already been put to work, well ahead of the original schedule. Its launch was planned for 2021 but was brought forward by a year to assist in the fight against Covid-19. The supercomputer is working along two research axes: the first, at the molecular level, examines the effect of existing drugs on the virus; the second, at the macroscopic level, concerns transmission routes and the effects on society. Fugaku is not expected to reach full power until next year, however.

Summit, the most powerful supercomputer in the world
Posted by Marc Zaffagni on 06/11/2018

The US Department of Energy has just unveiled its new supercomputer presented as the most powerful in the world. Delivering 200 petaflops, it is above all the very first of its kind to have performed an “exascale” scientific calculation.

When it comes to supercomputers, the race for power among the great nations has a precise goal: reaching so-called “exascale” computing capacity, that is, one billion billion operations per second. In the United States, Oak Ridge National Laboratory has taken an important step in this direction with its new supercomputer.

Called Summit, this powerhouse, made up of 4,608 IBM servers each containing two 22-core Power9 processors and six Nvidia Tesla V100 graphics processors, delivers a computing capacity of 200 petaflops, that is, 200 × 10^15 floating-point operations per second. This makes it the most powerful supercomputer in the world. But most importantly, Summit has carried out the very first exascale calculation, in this case a comparative genomics computation for research in bioenergy and health. According to the press release, the peak computing speed reached 1.88 exaops, or 1.88 × 10^18 operations per second, although on integer operations rather than floating point. The researchers said the results were identical to those obtained on Titan, Oak Ridge's 17.59-petaflop supercomputer commissioned in 2012, which had taken much longer to produce them. For some operations, the statement said, the researchers hope to climb to 3.3 exaops.
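To keep the units straight (peta = 10^15, exa = 10^18), here is a minimal sketch comparing the two figures quoted in this paragraph; note that they are not directly comparable, since the exaops figure counts integer operations rather than standard floating point:

```python
# Units: peta = 10**15, exa = 10**18. Figures as quoted in the paragraph above.
summit_flops = 200e15     # 200 petaflops, 64-bit floating-point operations per second
genomics_ops = 1.88e18    # 1.88 exaops, counted on integer operations

print(f"{genomics_ops:.2e} ops/s vs {summit_flops:.2e} flop/s")
# Not directly comparable: the exaops figure counts integer operations
# (per the article), which is why it exceeds Summit's flops rating.
```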

China and France target exascale supercomputers for 2020
Thanks to this advance, the United States believes it will have its first exaflop supercomputer in 2021. In this race, China has announced that it will launch the first exascale-class supercomputer in 2020. The French company Bull (Atos) has announced the same date for its exaflop-capable Sequana supercomputer. CEA, GENCI, Intel and the University of Versailles-Saint-Quentin-en-Yvelines have also created a joint Exascale research laboratory.

In the meantime, Summit will help advance research in astrophysics, in particular the study of how supernovae create heavy elements such as gold and iron. The machine will also be used for atomic-scale simulations to develop new materials (for energy storage, conversion and production), and for the analysis of public health data to better understand the development of cancer in the US population. Finally, machine learning and deep learning algorithms will be applied to genetic and biomedical analysis of human diseases.
