Can Supercomputers Help Japan Predict Earthquakes?

In the wake of a natural disaster, aid often comes in the form of food, water and medical supplies. Several universities are offering Japan another form of assistance that will help in the effort to rebuild: supercomputing capacity.
High-powered computers let researchers create simulations that can reveal important clues as to what may happen next, including where more earthquakes are likely to occur and the potential environmental impact of the radiation released into the air and water from the Fukushima Dai-Ichi nuclear power plant. The magnitude-7.1 aftershock that struck Japan today at 10:30 a.m. Eastern time, about 215 miles northeast of Tokyo, underscores the need to run these simulations.
The rolling blackouts in Japan after the earthquake have made it difficult for researchers to use their own supercomputers, since simulations can take several days to run. “They’ve got this actual data from the earthquake that they could be putting through models to think about things like aftershocks, tsunamis, as well as some of the climatological impact related to the water or the air,” says Tim Carroll, director and global leader for high performance computing at Dell.
After the March 11 earthquake, when Japanese researchers told Dell about the power problems, the company coordinated with computing facilities, including the Texas Advanced Computing Center at the University of Texas, Florida State University, Lawrence Livermore National Laboratory and Cambridge University, to donate capacity.
Currently, the Texas Advanced Computing Center has six researchers from Japan on its high-performance computing system. Three of those researchers are from the University of Tokyo Earthquake Research Institute and three are from RIKEN, a large natural sciences research institute in Japan, says Colleen Ryan, a Dell spokesperson.
Those researchers have so far used 117,000 of the 500,000 compute hours donated by the Texas Advanced Computing Center, she says. That is the equivalent of roughly 406 days of computation on a single server, yet the researchers have completed it in less than two weeks. “We don’t know precisely what they are doing, but the researchers from the University of Tokyo ERI are most likely doing some research around what happened during the initial earthquake, what changed in the earth’s structure and what might happen in the future,” she says.
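The 406-day figure is consistent with the donated “compute hours” being core-hours on 12-core servers, though that is an assumption on our part rather than something stated in the article. A quick back-of-the-envelope check in Python:

    # Back-of-the-envelope check of the "406 days on a single server" figure.
    # Assumption (not stated in the article): "compute hours" means core-hours,
    # and a single server contributes 12 cores.
    core_hours_used = 117_000
    cores_per_server = 12   # assumed
    hours_per_day = 24

    server_hours = core_hours_used / cores_per_server
    days_on_one_server = server_hours / hours_per_day
    print(f"{days_on_one_server:.0f} days")  # prints "406 days"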
Supercomputers are often used for complex calculations such as climate modeling or simulating nuclear reactions. After the initial magnitude-9.0 earthquake in Japan on March 11, Dell computers running at a National Oceanic and Atmospheric Administration lab calculated where the tsunami waves would hit.
“It correctly calculated wave height and distance of where those waves would actually hit, but it took roughly 12 hours to do the total computation, which means they got the answer right, but if we’d been able to do the computation faster it might have been useful to the people on the ground,” says Dell’s Carroll. The tsunami waves hit Japan’s coast within minutes after the earthquake.
Other areas where researchers might use high-performance computing power include seismic analysis of nuclear reactors and other buildings, as well as modeling how long it may take for radiation to dissipate from seawater, groundwater and the atmosphere, says Dell’s Carroll.
On April 6, fishermen in Ibaraki prefecture, Japan’s fifth-largest seafood producer, stopped operations after tainted fish were found south of the location where radioactive water from a nuclear reactor at Fukushima Dai-Ichi contaminated the sea, reported Bloomberg News.
Researchers will likely run data from today’s earthquake. Says Carroll, “Each time they get real data from actual quakes, it makes the predictions better and better.” (businessweek.com)