Rice building Texas' fastest academic supercomputer

August 05, 2002

HOUSTON-- Aug. 5, 2002 -- Rice University has secured grant funding from the National Science Foundation and Intel Corporation to build a supercomputer that will rank among the world's fastest.

When fully operational next year, the Rice Terascale Cluster (RTC) will be approximately three times faster than any university computer in Texas. The RTC will consist of at least 70 interconnected servers containing the powerful new Intel Itanium 2 processor.

Housed at Rice's Computer and Information Technology Institute (CITI), RTC will be the first university computer in Texas with a peak performance of 1 teraflop, or 1 trillion floating-point operations per second (FLOPS), the standard measure of supercomputer performance. The total cost of RTC has not yet been determined; funding includes $1.15 million from the NSF.
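To put that rate in perspective, here is a back-of-the-envelope calculation of what one teraflop implies for a fixed workload; the operation count below is a made-up figure for illustration, not anything from the release.

    # Illustrative only: time to finish a fixed workload at one teraflop.
    operations = 1e17   # hypothetical operation count for a large simulation
    rate = 1e12         # 1 teraflop = 10^12 floating-point operations/second
    hours = operations / rate / 3600
    print(f"{hours:.1f} hours")   # -> 27.8 hours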

Were it operational today, RTC would rank among the 10 fastest academic supercomputers in the country and the top 25 university computers worldwide, according to www.top500.org, a semi-annual ranking of the world's top supercomputers that is compiled by researchers at the University of Tennessee and Germany's University of Mannheim.

The fastest supercomputer in Texas, according to the list, is the University of Texas at Austin's IBM Regatta-HPC Cluster, which has a peak performance of one-third of a teraflop.

Scientists need faster computers to tackle increasingly complex mathematical problems that would require weeks or months to compute on existing machines. For example, to precisely map the movements of every atom in a large molecule, researchers must develop a mathematical model containing thousands of variables. Such models are useful to drug designers and biomedical researchers, and an entire scientific discipline, bioinformatics, has emerged to tackle these and other complex biological computations. Increasingly, research across academic disciplines demands a similar level of computation, along with a new generation of distributed software.
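For a sense of scale, the number of variables in such a model grows directly with the atom count: each atom contributes three position coordinates, and usually three velocity components as well. A minimal sketch, using a hypothetical atom count rather than any figure from the release:

    # Illustrative only: variable count for an all-atom molecular model.
    atoms = 5000                   # hypothetical mid-sized protein
    positions = 3 * atoms          # x, y, z per atom
    velocities = 3 * atoms         # vx, vy, vz per atom
    print(positions + velocities)  # -> 30000 variables updated every time step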

"Rice faculty from disciplines as diverse as biochemistry, earth science, economics, neuroscience, computer science and political science will use RTC in their research," said Moshe Vardi, RTC principal investigator and CITI director. "It will also be a vital tool for basic computational research aimed at better designing software than can run on hundreds or even thousands of processors simultaneously."

Rice's proposal for NSF funding for RTC faced stiff competition in a process that awarded grants to just one in three applicants. Rice won on the strength of independent evaluations by reviewers who praised CITI's expertise in high-performance computing, the interdisciplinary nature of CITI research, and the caliber of the faculty involved.

Complex research already slated for RTC includes simulations of biomolecular interactions, the physics of heavy ion collisions, simulations of Internet-based computer applications running on hundreds of computers, and simulations that aim to better understand and predict international conflicts.

RTC is slated to begin operation in early 2003. Tentative plans call for the cluster to include 70 interconnected HP servers, each containing four 900-megahertz Itanium 2 processors. The cluster will have more than 500 gigabytes of RAM and will be linked to a dedicated 1-terabyte array of hard drives.
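The one-teraflop figure follows directly from those hardware counts. The sketch below assumes the commonly cited rate of four floating-point operations per clock cycle for the Itanium 2; that per-cycle figure is not stated in the release.

    # Back-of-the-envelope peak performance for the planned RTC configuration.
    servers = 70
    cpus_per_server = 4
    clock_hz = 900e6        # 900 megahertz
    flops_per_cycle = 4     # assumed Itanium 2 per-cycle rate (not from the release)

    peak = servers * cpus_per_server * clock_hz * flops_per_cycle
    print(f"Theoretical peak: {peak / 1e12:.2f} teraflops")   # -> 1.01 teraflops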

According to www.top500.org, most of the fastest supercomputers, including the world's fastest -- the 35.9-teraflop NEC Earth Simulator in Japan -- and the United States' fastest -- the 7.2-teraflop ASCI White-Pacific at Lawrence Livermore National Laboratory in California -- are operated by private or government-run research laboratories. See http://www.top500.org/list/2002/06/ for the Top 500 list of supercomputers.
-end-


Rice University

