Distributed terascale facility to commence with $53 million NSF award

August 09, 2001

The world's first multi-site supercomputing system -- the Distributed Terascale Facility (DTF) -- will be built and operated with $53 million from the National Science Foundation (NSF). The DTF will perform 11.6 trillion calculations per second and store more than 450 trillion bytes of data, with a comprehensive infrastructure called the "TeraGrid" linking computers, visualization systems and data at four sites through a 40-billion-bits-per-second optical network.

The National Science Board (NSB) today approved a three-year NSF award, pending negotiations between NSF and a consortium led by the National Center for Supercomputing Applications (NCSA) in Illinois and the San Diego Supercomputer Center (SDSC) in California, the two leading-edge sites of NSF's Partnerships for Advanced Computational Infrastructure (PACI). NCSA and SDSC will be joined in the DTF project by Argonne National Laboratory (ANL) in suburban Chicago and the California Institute of Technology (Caltech) in Pasadena.

"The DTF will be a tremendous national resource," said NSF director Rita Colwell. "With this innovative facility, NSF will demonstrate a whole new range of capabilities for computer science and fundamental scientific and engineering research, setting high standards for 21st Century deployment of information technology."

"Terascale" refers to computers that perform more than one trillion floating-point operations per second, called "teraflops." The DTF would begin operation in mid-2002, reaching peak performance of 11.6 teraflops by April 2003. The facility will support research such as storm, climate and earthquake predictions; more-efficient combustion engines; chemical and molecular factors in biology; and physical, chemical and electrical properties of materials.

"This facility will stretch the boundaries of high-performance computing and give U.S. computer scientists and other researchers in all science and engineering disciplines access to a critical new resource," said NSB chair Eamon Kelly.

Adds Ruzena Bajcsy, NSF assistant director for Computer and Information Science and Engineering, "The DTF can lead the way toward a ubiquitous 'Cyber-Infrastructure' in which the national Grid of research networks will permit calculations, storage and throughput at tera levels. This facility will serve the high-end computational science community, help train the next generation of information-technology professionals and propagate the latest technology for maximum public benefit."

The partnership will work primarily with IBM, Intel Corporation and Qwest Communications to build the facility, along with Myricom, Oracle Corporation and Sun Microsystems. "The DTF will be the most comprehensive information infrastructure ever deployed for open scientific research, and we feel privileged to have a leadership role in this historic effort," said NCSA director Dan Reed and SDSC director Fran Berman in a joint statement. "The TeraGrid will integrate the most powerful computers, software, networks, data-access systems and applications, creating a unique national resource that will catalyze new breakthroughs and yield unforeseen benefits for all of society." Berman and Reed are DTF co-principal investigators.

Each of the four DTF sites will play a unique role in the project:

* NCSA will lead the project's computational aspects with an IBM Linux cluster powered by Intel's second-generation 64-bit Itanium-family processor, code-named "McKinley." The cluster will have a peak performance of 6.1 teraflops and, working in tandem with existing hardware, will reach 8 teraflops, with 240 terabytes of secondary storage.

* SDSC will lead the project's data- and knowledge-management effort with a 4-teraflops IBM Linux cluster based on Intel's McKinley processor, with 225 terabytes of storage and a next-generation Sun high-end server for managing access to Grid-distributed data.

* Argonne will have a 1-teraflop IBM Linux cluster to host advanced software for high-resolution rendering, remote visualization and advanced Grid software.

* Caltech will focus on scientific data, with a 0.4-teraflop McKinley cluster and a 32-node IA-32 cluster that will manage 86 terabytes of on-line storage. (A rough tally of the per-site peak figures follows this list.)
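For readers tallying the per-site figures, the minimal sketch below sums the peak-performance numbers quoted above. It is only an illustration of the arithmetic, not project documentation; the 11.6-teraflop aggregate remains the figure cited by NSF.

```python
# Tally of the per-site peak-performance figures quoted in this release.
# Illustrative only; 11.6 teraflops is the aggregate cited by NSF.

site_peak_tflops = {
    "NCSA (new IBM Linux cluster)": 6.1,
    "SDSC": 4.0,
    "Argonne": 1.0,
    "Caltech": 0.4,
}

total = sum(site_peak_tflops.values())
for site, tflops in site_peak_tflops.items():
    print(f"{site:32s} {tflops:4.1f} teraflops")
print(f"{'Total of quoted per-site peaks':32s} {total:4.1f} teraflops")

# The quoted figures sum to about 11.5 teraflops, close to the
# 11.6-teraflop peak cited for the full facility; NCSA's cluster
# alone reaches 8 teraflops when combined with existing hardware.
```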

The DTF project director will be Rick Stevens, who is a computer science faculty member at the University of Chicago and director of the mathematics and computer science division of ANL, a U.S. Department of Energy laboratory. "I'm excited by this opportunity to help build on prior NSF and PACI successes," said Stevens, "and it is a wonderful example of interagency cooperation."

The DTF will join a previous terascale facility commissioned by NSF in 2000. That system, located at the Pittsburgh Supercomputing Center, came on-line ahead of schedule in early 2001 and is expected to reach peak performance of 6 teraflops in October.
-end-
NSF is an independent federal agency that supports fundamental research and education across all fields of science and engineering, with an annual budget of about $4.5 billion. NSF funds reach all 50 states, through grants to about 1,800 universities and institutions nationwide. Each year, NSF receives about 30,000 competitive requests for funding, and makes about 10,000 new funding awards.

Receive official NSF news electronically through the e-mail delivery system, NSFnews. To subscribe, send an e-mail message to listmanager@nsf.gov. In the body of the message, type "subscribe nsfnews" and then type your name. (Ex.: "subscribe nsfnews John Smith")

Program contact:
Bob Borchers
703-292-8970/rborcher@nsf.gov

National Science Foundation
