By Tom Simonite

Plenty of people around the world got new gadgets Friday, but one in eastern Tennessee stands out. Summit, a new supercomputer unveiled at Oak Ridge National Lab, is, unofficially for now, the most powerful calculating machine on the planet. It was designed in part to scale up the artificial-intelligence techniques that power some of the recent tricks in your smartphone.
America hasn’t possessed the world’s most powerful supercomputer since June 2013, when a Chinese machine first claimed the title. Summit is expected to end that run when the official ranking of supercomputers, from an organization called Top500, is updated later this month.
Supercomputers have lost some of their allure in the era of cloud computing and humongous data centers. But many thorny computational problems require the giant machines. A US government report last year said the nation should invest more in supercomputing, to keep pace with China on defense projects such as nuclear weapons and hypersonic aircraft, and commercial innovations in aerospace, oil discovery, and pharmaceuticals.
Summit, built by IBM, occupies floor space equivalent to two tennis courts and slurps 4,000 gallons of water a minute through a circulatory system to cool its 37,000 processors. Oak Ridge says its new baby can deliver a peak performance of 200 quadrillion calculations per second (that's a 2 followed by 17 zeros), or 200 petaflops, by the standard measure used to rate supercomputers. That's about a million times faster than a typical laptop, and nearly twice the peak performance of China's top-ranked Sunway TaihuLight.
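The unit arithmetic behind those comparisons is straightforward; a quick sketch (the laptop figure below is an assumed ballpark, not a number from Oak Ridge):

```python
# Rough flops comparison. Only Summit's 200-petaflop peak comes from Oak Ridge;
# the laptop figure is an assumed ballpark for a modern consumer machine.
PETA = 1e15

summit_peak_flops = 200 * PETA   # 200 petaflops = 2e17 calculations per second
laptop_flops = 200e9             # assumed ~200 gigaflops for a typical laptop

ratio = summit_peak_flops / laptop_flops
print(f"Summit is roughly {ratio:,.0f}x a typical laptop")  # ~1,000,000x
```

Under that assumed laptop figure, the ratio works out to exactly the "about a million times faster" cited above.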
During early testing, researchers at Oak Ridge used Summit to perform more than a quintillion calculations per second in a project analyzing variation between human genome sequences. They claim that’s the first time a scientific calculation has reached that computational scale.
America’s new best computer is significant for more than just the geopolitics of computational brawn. It’s designed to be more suited than previous supercomputers to running the machine learning techniques popular with tech companies such as Google and Apple.
One reason computers have lately got much better at recognizing our voices and beating us at board games is that researchers discovered that graphics chips could put more power behind an old machine learning technique known as deep neural networks. Facebook recently disclosed that a single AI experiment using billions of Instagram photos occupied hundreds of graphics chips for almost a month.
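The reason graphics chips suit this work is that a deep neural network is, computationally, a stack of large matrix multiplications, the exact operation GPUs were built to parallelize. A minimal sketch of one such layer, in plain NumPy on a CPU (the sizes and names are illustrative, not from the article):

```python
import numpy as np

def dense_layer(x, weights, bias):
    """One neural-network layer: a matrix multiply, then a ReLU nonlinearity.
    The matmul dominates the cost, which is why GPUs speed up deep learning."""
    return np.maximum(0.0, x @ weights + bias)

rng = np.random.default_rng(0)
batch = rng.standard_normal((32, 1024))        # 32 inputs, 1,024 features each
w = rng.standard_normal((1024, 512)) * 0.01    # illustrative layer weights
b = np.zeros(512)

out = dense_layer(batch, w, b)
print(out.shape)  # (32, 512)
```

A deep network simply chains many such layers; on a GPU the same code, expressed through a framework's GPU arrays, runs the multiplications across thousands of cores at once.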
Summit has nearly 28,000 graphics processors made by Nvidia, alongside more than 9,000 conventional processors from IBM. Such heavy use of graphics chips is unusual for a supercomputer, and it should enable breakthroughs in deploying machine learning on tough scientific problems, says Thomas Zacharia, director of Oak Ridge National Lab. "We set out to build the world's most powerful supercomputer," he says, "but it's also the world's smartest supercomputer."
Eliu Huerta, a researcher at the National Center for Supercomputing Applications, at the University of Illinois at Urbana-Champaign, describes Summit’s giant GPU pool as “like a dreamland.” Huerta previously used machine learning on a supercomputer called Blue Waters to detect signs of gravitational waves in data from the LIGO observatory that won its founders the 2017 Nobel Prize in physics. He hopes Summit’s might will help analyze the roughly 15 terabytes of imagery expected to arrive each night from the Large Synoptic Survey Telescope, due to switch on in 2019.
Summit will also be used to apply deep learning to problems in chemistry and biology. Zacharia says it could contribute to an Energy Department project using medical records from 22 million veterans, about a quarter-million of which include full genome sequences.
Some people worried about US competitiveness in oversized calculating machines hope that the hoopla around Summit will inspire more interest in building its successors.
The US, China, Japan, and the European Union have all identified the first "exascale" computer, one with more than 1,000 petaflops of computing power, as the next big milestone in large-scale computing. China claims it will achieve that milestone by 2020, says Stephen Ezell, vice president for global innovation policy at the Information Technology and Innovation Foundation. The US may get there in 2021 if Aurora, a machine planned for Argonne National Lab, is completed on schedule, but that program has been delayed before.
The Trump administration’s budget this spring asked for $376 million in extra funding to help meet the 2021 target. It’s now up to the nation’s legislators to approve it. “High-performance computing is absolutely essential for a country’s national security, economic competitiveness, and ability to take on scientific challenges,” Ezell says.