by Tim Hornyak
This is Kageyama's conception of the engine behind our planet's magnetic fields. He thumbs a button and another particle comes into being, tracing helical convection currents through the molten outer core as the perspective soars to a point several kilometers above the North Pole. Suddenly he flicks a laser wand and the mesh sphere disappears. The virtual reality chamber's walls go completely blank.
"The human eye is a nice tool to extract information from images," Kageyama notes as he removes his goggles. "We use this facility to visualize the 3D structure hidden in a sea of numerical data. By following the virtual particle's motion, we can visualize or understand fluid motion. It's very complicated, but we can extract some organization."
Kageyama is trying to understand the subterranean forces that make compasses point north, cause auroras to appear near the poles, and ultimately shield our atmosphere from being stripped away by the solar wind.
Exactly how the magnetosphere is produced is a mystery being unraveled by scientists like Kageyama, a physicist at the Japan Marine Science and Technology Center (JAMSTEC). One of the reasons Kageyama and others like him are ahead of the game is that they work on the most advanced computer in the world.
Power tool
Outside the fence surrounding the government-run Earth Simulator Center, you can hear the air conditioners humming around the massive hangar as the JPY40 billion NEC-built behemoth begins its work. Its primary mission is to render our world in digital form so that scientists can better understand environmental changes and processes, from the magnetosphere's origins to tectonic motion to sea temperature fluctuations.
"The primary objective is to make reliable prediction data for Earth's environmental changes, such as global warming and earthquake dynamics," says Tetsuya Sato, director general of the ESC, which is under JAMSTEC. "Already, global ocean circulation simulations have confirmed that mesoscale phenomena such as typhoons and rain fronts can be nicely reproduced."
Just over a year has passed since research projects on the Earth Simulator began in September 2002; roughly 40 studies are now under way. While it's still too early for definitive conclusions from such complex analyses, the results so far are promising.
Takahashi is doing unprecedented weather simulations with a "coupled" ocean-atmosphere model that incorporates the creation and depletion of sea ice to understand the mechanisms of climate change and what role the periodic El Nino effect plays in the process. The work is especially timely amid the high number of extreme weather incidents around the world this year, from tornadoes in the US in May that killed 41, to the summer heat wave in France that is believed to have caused a staggering 15,000 deaths.
Such events have led the UN World Meteorological Organization, which usually occupies itself with compiling statistics, to issue a warning that hazardous weather incidents could continue to increase in number and intensity, squarely blaming global warming and climate change. It added that land temperatures for May 2003 were the warmest on record.
"If we continue with similar lifestyles, we may see more severe conditions," Takahashi says, noting that the task of accurately unraveling the relationships between local weather conditions, climate and global warming is an extremely complex task and would require a next-generation Earth Simulator.
For now, she is preparing for the future by turning to the past to shed light on the present: Another focus of her work is the world's paleoclimatic conditions tens of thousands of years ago. Observational data on the chemical makeup of corals, ice cores and marine sediments from cold and warm eras can provide a picture of the weather conditions early humans faced, and can contribute to our understanding of modern predicaments.
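The coupled approach behind Takahashi's simulations is, at its core, two component models marching in lockstep and exchanging boundary conditions at every step: the ocean passes its surface temperature and sea-ice state up to the atmosphere, and the atmosphere passes heat and momentum fluxes back down. Here is a toy sketch of that exchange loop, with invented component names and drastically simplified physics that bear no relation to the ESC's actual model.

```python
# Toy coupling loop: not a real climate model, just the data-exchange pattern.
class Atmosphere:
    def __init__(self):
        self.air_temp = 15.0  # global-mean surface air temperature, degrees C

    def step(self, sea_surface_temp):
        # Relax air temperature toward the ocean surface beneath it.
        self.air_temp += 0.1 * (sea_surface_temp - self.air_temp)
        return 0.05 * (self.air_temp - sea_surface_temp)  # heat flux into ocean


class Ocean:
    def __init__(self):
        self.sst = 14.0        # sea-surface temperature, degrees C
        self.ice_fraction = 0.1

    def step(self, heat_flux):
        # Sea ice damps the exchange; it grows when the surface cools.
        self.sst += heat_flux * (1.0 - self.ice_fraction)
        self.ice_fraction = min(1.0, max(0.0, 0.1 - 0.02 * (self.sst - 14.0)))
        return self.sst


atm, ocean = Atmosphere(), Ocean()
sst = ocean.sst
for step in range(100):          # each iteration is one coupling interval
    flux = atm.step(sst)         # atmosphere sees the ocean surface, returns a flux
    sst = ocean.step(flux)       # ocean absorbs the flux, returns a new surface state
print(atm.air_temp, sst, ocean.ice_fraction)
```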
Of catfish and magic stones
"The Earth Simulator is a bit small for that, but by using it we'll be able to make preparations and some progress toward predicting earthquakes," says Sato. "Also: we'll be able to know whether true prediction is impossible by simulation or scientific efforts."
Despite inquiries into the electrosensory abilities catfish use to catch prey in murky water, lab studies throughout the country have produced no firm conclusions about whether the fish can sense coming tremors.
Historical records dating to the 7th century indicate that thrust faulting of the Philippine Sea Plate has caused earthquakes along the Nankai Trough off Shikoku every 90 to 140 years, including quakes in 1944 and 1946 that together claimed more than 2,500 lives. Some experts have forecast several tremors of magnitude 8 or higher in the area between 2020 and 2050. Meanwhile, Furumura has used the supercomputer to estimate the potential effects of the next quake in terms of the seven-point Japanese seismic intensity scale.
The machine can also be used in fields outside earth science. For instance, scientists are using its tremendous computing power to study the conductivity and strength of carbon nanotubes, microscopic tubular structures a billionth of a meter in diameter that could be 30 to 100 times stronger than steel at only one-sixth its weight. Potential applications include electronics with much higher transistor densities, atomic-scale mechanisms, new aerospace materials and new energy storage devices.
Hello, Computenik
Linking the machine's cabinets is 2,400 km of cable, enough to join Hokkaido to Okinawa, and surging through it all (including a lighting system) is roughly 7 megawatts of electricity, at an annual cost of about $7 million.
The water-cooled vector supercomputer, based on NEC SX technology, has a theoretical peak speed of 40 teraflops, or 40 trillion floating-point operations per second, and a main memory of 10 terabytes, about 10 trillion bytes. Visitors might expect to be greeted by the silky voice of the HAL 9000 from 2001: A Space Odyssey. But the sealed chamber, where the blue-green cabinets are arrayed in a circular pattern that evokes the Earth itself, is quiet save for the endless drone of air conditioners.
Meuer and his colleagues publish their rankings twice a year. The Earth Simulator remained tops in the latest edition, with a certified speed of 35.86 teraflops, far ahead of second-place Hewlett-Packard's ASCI Q, which clocked in at 13.88 teraflops. Despite fierce competition in the small but strategically important $5 billion supercomputer market, Meuer says he expects the NEC machine to retain its lead until June 2005, when IBM's ASCI Purple, a 100-teraflop machine commissioned by the US Department of Energy for nuclear weapons simulation, will enter the scene.
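The headline numbers are straightforward to reconcile. The 40-teraflop peak follows from the machine's published configuration of 640 nodes, each with eight vector processors rated at 8 gigaflops (figures taken from NEC's public specifications, not from this article), and the Top500 figure is the speed sustained on the Linpack benchmark, for an efficiency of nearly 90 percent:

```python
nodes = 640                 # published Earth Simulator configuration
processors_per_node = 8     # vector processors per node
gflops_per_processor = 8.0  # peak gigaflops per processor

peak_tflops = nodes * processors_per_node * gflops_per_processor / 1000.0
linpack_tflops = 35.86      # certified Top500 result cited above

print(f"theoretical peak: {peak_tflops:.2f} teraflops")           # 40.96
print(f"Linpack efficiency: {linpack_tflops / peak_tflops:.0%}")  # roughly 88%
```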
In April 1997, the project received funding from the government, and Miyoshi spent many late nights over the next five years overseeing design and construction, which began in March 2000. The initial design, based on the processor technology of the time, called for a machine the size of a baseball stadium. But his dream became reality when the Earth Simulator was switched on two years later, its key advances being a highly efficient data-transfer rate between switches and low transfer latency.
Miyoshi died four months before the Earth Simulator was completed. In June 2002, it was ranked the fastest machine in the world.
"The arrival of the Japanese supercomputer evokes the type of alarm raised by the Soviet Union's Sputnik satellite in 1957," says Top500 coauthor Jack Dongarra of the University of Tennessee. "In some sense, we have a Computenik on our hands."
US manufacturers have focused on building machines from large numbers of cheaper, off-the-shelf components, reflecting the rising popularity of smaller, cost-effective computing led by the PC market.
Grid computing, for example, lets users access machines on the Internet or through private networks so that programs can solve complex problems by farming out small pieces of a task to a large number of machines. The SETI@home project attempts to sift through radio telescope monitoring data for signs of extraterrestrial life by harnessing the power of individual computers owned by over 4 million volunteers.
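The pattern behind such projects is easy to sketch: a coordinator slices a large job into independent work units, hands them to whatever machines are available, and gathers the partial results as they return. Here is a minimal illustration using Python's standard process pool; the "work unit" is invented and has nothing to do with SETI@home's actual signal analysis.

```python
from concurrent.futures import ProcessPoolExecutor

def analyze_chunk(chunk):
    """Stand-in work unit: score one slice of data (a real project would run
    its signal analysis here and return candidate detections)."""
    return sum(x * x for x in chunk)

def farm_out(data, chunk_size=1000, workers=4):
    # Slice the job into independent pieces ...
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # ... and distribute them across worker processes, collecting results.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze_chunk, chunks))

if __name__ == "__main__":
    results = farm_out(list(range(10_000)))
    print(len(results), "work units completed")
```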
"It is very important for our mission to ensure that the Earth Simulator is used only for scientific or peaceful purposes," Kiyoshi Otsuka, a JAMSTEC official who worked at the ESC's Research Exchange Group said of visiting scholars. "We must check their research carefully."
The algorithm of nature
"The dominance of the Earth Simulator will continue for six or seven years in terms of practical performance," says Sato, who has already started lobbying Tokyo to fund the construction of a more powerful successor in seven years with the same budget as the first, a general-purpose "holistic simulator" that would be faster by a factor of 10 to the power of 3 or 4, and geared at tackling the two fronts of micro and macro phenomena, from protein structures to predicting climate change and, perhaps, earthquakes.
"The next-generation computer should be based on the algorithm of nature," Sato adds, "by adapting natural structure into the architecture of the Earth Simulator itself."
"In order to change humans' way of thinking, we need bigger general-purpose supercomputers," he says. "I mean something that can cover all of nature, from the microscopic to the macroscopic. We need the tools to deal with entire systems." He pauses thoughtfully. "Otherwise, we cannot change real lives." @