
08Jun18


Move Over, China: U.S. Is Again Home to World's Speediest Supercomputer


The United States just won bragging rights in the race to build the world's speediest supercomputer.

For five years, China had the world's fastest computer, a symbolic achievement for a country trying to show that it is a tech powerhouse. But the United States retook the lead thanks to a machine, called Summit, built for the Oak Ridge National Laboratory in Tennessee.

Summit's speeds, announced on Friday, boggle the mind. It can do mathematical calculations at the rate of 200 quadrillion per second, or 200 petaflops. To put it in human terms: A person doing one calculation a second would have to live for more than 6.3 billion years to match what the machine can do in a second.

Still stupefying? Here is another analogy. If a stadium built for 100,000 people were full, and everyone in it had a modern laptop, it would take 20 stadiums to match the computing firepower of Summit.
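The arithmetic behind both comparisons is easy to check. Here is a minimal back-of-the-envelope sketch; the roughly 100-gigaflops-per-laptop figure is an assumption used for illustration, not a number from the article:

    # Rough check of the two analogies above.
    SUMMIT_FLOPS = 200e15            # 200 petaflops = 200 quadrillion calculations per second

    # Analogy 1: one calculation per second, sustained for how many years?
    SECONDS_PER_YEAR = 365.25 * 24 * 3600
    years = SUMMIT_FLOPS / SECONDS_PER_YEAR
    print(f"{years / 1e9:.1f} billion years")    # about 6.3 billion years

    # Analogy 2: stadiums full of laptop users.
    LAPTOP_FLOPS = 100e9             # assumed: a modern laptop sustains ~100 gigaflops
    SEATS_PER_STADIUM = 100_000
    stadiums = SUMMIT_FLOPS / (LAPTOP_FLOPS * SEATS_PER_STADIUM)
    print(f"{stadiums:.0f} stadiums")            # 20 stadiums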

China still has the world's most supercomputers overall. And China, Japan and Europe are developing machines that are even faster, which could mean the American lead is short-lived.

Supercomputers like Summit, which cost $200 million in government money to build, can accelerate the development of technologies at the frontier of computing, like artificial intelligence and the ability to handle vast amounts of data.

Those skills can be used to help tackle daunting challenges in science, industry and national security – and are at the heart of an escalating rivalry between the United States and China over technology.

For years, American tech companies have accused China of stealing their intellectual property. And some Washington lawmakers say that Chinese companies like ZTE and Huawei pose a national security risk.

Supercomputers now perform tasks that include simulating nuclear tests, predicting climate trends, finding oil deposits and cracking encryption codes. Scientists say that further gains and fresh discoveries in fields like medicine, new materials and energy technology will rely on the approach that Summit embodies.

"These are big data and artificial intelligence machines," said John E. Kelly, who oversees IBM Research, which helped build Summit. "That's where the future lies."

The global supercomputer rankings have been compiled for more than two decades by a small team of computer scientists who put together a Top 500 list. It is led by Jack Dongarra, a computer scientist at the University of Tennessee. The newest list will not be released until later this month, but Mr. Dongarra said he was certain that Summit was the fastest.

At 200 petaflops, the new machine achieves more than twice the speed of the leading supercomputer in November, when the last Top 500 list was published. That machine is at China's National Supercomputing Center in Wuxi.

Summit is made up of rows of black, refrigerator-size units that weigh a total of 340 tons and are housed in a 9,250-square-foot room. It is powered by 9,216 central processing chips from IBM and 27,648 graphics processors from Nvidia, another American tech company, that are lashed together with 185 miles of fiber-optic cable.

Cooling Summit requires 4,000 gallons of water a minute, and the supercomputer consumes enough electricity to light up 8,100 American homes.

The global supercomputer sprint comes as internet giants like Amazon, Facebook and Google in the United States and Alibaba, Baidu and Tencent in China take the lead in developing technologies like cloud computing and facial recognition.

Supercomputers are a measure of a nation's technological prowess. It is a narrow measure, of course, because raw speed is only one ingredient in computing performance. Software, which brings the machines to life, is another.

Scientists at government labs like Oak Ridge are doing exploratory research in areas like new materials to make roads more robust, designs for energy storage that might apply to electric cars or energy grids, and potential power sources like harnessing fusion. All of those areas can benefit from supercomputing.

Modeling the climate, for example, can require running code on a supercomputer for days, processing huge amounts of scientific data like moisture and wind patterns, and modeling all the real-world physics of the environment. It is not the sort of task that can run efficiently on the cloud computing services supplied by internet companies, said Ian Buck, a computer scientist and general manager of Nvidia's data center business.

"Industry is great, and we work with them all the time," said Rick Stevens, an associate director of the Argonne National Laboratory in Illinois. "But Google is never going to design new materials or design a safe nuclear reactor."

At Oak Ridge, Thomas Zacharia, the lab director, cites a large health research project as an example of the future of supercomputing. Summit has begun ingesting and processing data generated by the Million Veteran Program, which enlists volunteers to give researchers access to all of their health records, contribute blood tests for genetic analysis, and answer survey questions about their lifestyles and habits. To date, 675,000 veterans have joined; the goal is to reach one million by 2021.

The eventual insights, Mr. Zacharia said, could "help us find new ways to treat our veterans and contribute to the whole area of precision medicine."

Dr. J. Michael Gaziano, a principal investigator on the Million Veteran Program and a professor at Harvard Medical School, said that the potential benefit might well be a modern, supercharged version of the Framingham Heart Study. That project, begun in 1948, tracked about 5,000 people in a Massachusetts town.

Over a couple of decades, the Framingham study found that heart disease – far from previous single-cause explanations of disease – had multiple contributing causes, including blood cholesterol, diet, exercise and smoking.

Today, given the flood of digital health data and supercomputers, Dr. Gaziano said that population science might be entering a new golden age.

"We have all this big, messy data to create a new field – rethinking how we think about diseases," he said. "It's a really exciting time."

Although impressive, Summit can be seen as a placeholder. Supercomputers that are five times faster – 1,000 petaflops, or an exaflop – are in the works, both abroad and in the United States. The Energy Department's budget for its advanced computing program is being increased by 39 percent in the two fiscal years ending September 2019, said Paul M. Dabbar, the Energy Department's under secretary for science.

"We're doing this to help drive innovation in supercomputing and beyond," Mr. Dabbar said.

[Source: By Steve Lohr, The New York Times, 08Jun18]


This document has been published on 11Sep18 by the Equipo Nizkor and Derechos Human Rights. In accordance with Title 17 U.S.C. Section 107, this material is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes.