There's an old saying: "Everything's bigger in Texas."
That now applies to supercomputers as well.
Sun Microsystems announced today that its hardware will power the largest supercomputer ever built, weighing in at 62,976 CPU cores, 125
terabytes of memory, 1.7 petabytes of disk space, and 504 teraflops of peak performance.
The computer, dubbed "Ranger," will be hosted at the Texas Advanced Computing Center at the University of Texas at Austin. It is due to go online on January
1, 2008.
Ranger costs $30 million in hardware alone, plus an additional $29 million for staffing and maintenance -- all of it
funded by a grant from the National Science Foundation.
Still, Sun officials say that’s a bargain.
"(We have reached) unprecedented cost performance for scientific computing -- we are at sub-hundred thousand dollars per
teraflop," said Andy Bechtolsheim, chief architect and co-founder of Sun Microsystems.
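Bechtolsheim's "sub-hundred thousand dollars per teraflop" claim checks out against the article's own figures, at least for the hardware budget. A quick back-of-the-envelope calculation:

```python
# Cost-per-teraflop check using the figures quoted in the article.
hardware_cost = 30_000_000                  # dollars, hardware alone
total_cost = hardware_cost + 29_000_000     # plus staffing and maintenance
peak_teraflops = 504

print(f"Hardware only: ${hardware_cost / peak_teraflops:,.0f} per teraflop")
print(f"Total budget:  ${total_cost / peak_teraflops:,.0f} per teraflop")
# Hardware only: $59,524 per teraflop
# Total budget:  $117,063 per teraflop
```

So the hardware alone comes to roughly $60,000 per teraflop; fold in the five-year staffing and maintenance grant and the figure roughly doubles.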
Under the hood, Ranger's brain will be built from 15,744 quad-core AMD Opteron processors -- four cores apiece, for the 62,976-core total. The machine's production timeline
depends on how fast AMD can crank out the as-yet-unreleased chips, Bechtolsheim said.
At the time of its completion, Ranger will likely be the largest and fastest supercomputer in the world, beating out the
reigning champion, IBM's BlueGene computer, which comes in at a "paltry" 327 teraflops.
As if that weren't enough, the new cluster will draw three megawatts of power and will cost the university one
million dollars a year to keep humming.
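That million-dollar figure is plausible for a three-megawatt machine running around the clock. A rough sanity check, assuming a hypothetical industrial electricity rate of about $0.04 per kilowatt-hour (the rate is my assumption, not from the article):

```python
# Rough check on the $1 million/year operating cost for a 3 MW cluster.
power_kw = 3 * 1_000          # three megawatts, expressed in kilowatts
hours_per_year = 24 * 365     # 8,760 hours of continuous operation
rate_per_kwh = 0.04           # assumed industrial rate, dollars per kWh

annual_kwh = power_kw * hours_per_year      # 26,280,000 kWh
annual_cost = annual_kwh * rate_per_kwh
print(f"≈ ${annual_cost:,.0f} per year")
# ≈ $1,051,200 per year
```

At that assumed rate, electricity alone lands right around the quoted one million dollars a year.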
Beyond Ranger's sheer scale, the real advantage, scientists say, is that it will be entirely open to the scientific community.
Scientists nationwide will be able to conduct research on it at an unprecedented scale, whereas BlueGene is for classified
work only.
"To give you an idea, the system will be about six or seven times larger than any of the existing systems that the researchers
have access to," said Tommy Minyard, assistant director at the computing center.
Scientists expect that research in astrophysics, genomics, nanotechnology and meteorology will be carried out on the Ranger
system.
"A bigger supercomputer (not only) lets you run more forecast models. It can also allow you to run higher-resolution models,"
said Jay Boisseau, the center’s director. "You'll also be able to run models on much larger scales at high fidelity. As they build bigger
and bigger supercomputers, you can do nationwide (weather) forecasting at a higher resolution."
Other computer scientists say that Ranger marks the beginning of a new generation of machines approaching the petaflop
mark, with similar systems slated for installation around the country in 2008.
"Five to ten years from now, a machine of this scale will be routine in a modest-sized cluster," said Rick Stevens, associate laboratory director for Computing and Life Sciences at Argonne National Laboratory. "In some ways it's like giving
us a time machine to look forward five to ten years as to what general purpose machines will look like. The smart developers
will take advantage of that (today)."