Brookhaven Lab demonstrates participation in worldwide computing efforts at SC2003

November 13, 2003

UPTON, NY - For a glimpse of how the U.S. Department of Energy's Brookhaven National Laboratory manages millions of gigabytes of data produced by one of the world's premier nuclear physics experiments, visit the Laboratory's exhibit at the SC2003 (Supercomputing) conference in Phoenix, Arizona, from November 15 to 21.

The exhibit will demonstrate Brookhaven's computing services for the Relativistic Heavy Ion Collider (RHIC), a large-scale nuclear physics facility that accelerates gold ions to nearly the speed of light in opposite directions around a 2.4-mile circular path, then forces them to collide. RHIC scientists believe the collisions will "melt" the ions to form quark-gluon plasma, a type of matter thought to have existed just after the Big Bang, which may offer insight into the basic structure of matter and how it has evolved over time.

RHIC generates so much data that a global computing system is required to collect, process, and analyze it. Brookhaven Lab operates the main computing facility for RHIC, with several petabytes -- several million gigabytes -- of data-storage capacity. By comparison, a new home computer has just a few dozen gigabytes of storage.
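
To put that scale in perspective, here is a minimal back-of-the-envelope sketch in Python. The specific capacities are illustrative assumptions (the article says only "several petabytes" and "a few dozen gigabytes"), not official figures:

```python
# Rough comparison of RHIC's storage capacity to a 2003-era home computer,
# using round numbers based on the figures quoted in this article.

PETABYTE_IN_GB = 1_000_000              # 1 petabyte = one million gigabytes

rhic_capacity_gb = 4 * PETABYTE_IN_GB   # "several petabytes" -- 4 PB assumed
home_pc_gb = 40                         # "a few dozen gigabytes" assumed

print(f"RHIC capacity: {rhic_capacity_gb:,} GB")
print(f"Equivalent home computers: {rhic_capacity_gb // home_pc_gb:,}")
# -> about 100,000 home computers' worth of storage
```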

The Laboratory is also developing a computing system for an upcoming world-leading particle-physics facility, the Large Hadron Collider (LHC) in Geneva, Switzerland, which is under construction and scheduled to begin operating in 2007.

Brookhaven will be the primary U.S. computing site for ATLAS, one of the LHC's largest experiments, and is creating a system of more than 1,000 computers with a combined storage capacity of more than one million gigabytes. By building this system, one of several similar facilities within the larger LHC global computing infrastructure, the Laboratory will be taking part in a worldwide computing effort of unprecedented size and scope.
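
Those round numbers imply roughly one terabyte of storage per machine, as the short sketch below shows (both inputs are the article's lower bounds, not exact specifications):

```python
# Average storage per node implied by the article's figures for the
# planned LHC computing system at Brookhaven.

nodes = 1_000           # "more than 1,000 computers" (lower bound)
total_gb = 1_000_000    # "more than one million gigabytes" (lower bound)

print(f"Average per node: {total_gb / nodes:,.0f} GB (about one terabyte)")
```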

The exhibit will also present the current status of that work, along with Brookhaven's efforts to integrate both its RHIC and LHC computing facilities into a more comprehensive data grid. The grid will give scientists across the globe access to data generated by large-scale physics and astronomy experiments.
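
The article does not describe the grid software itself, but the core idea can be sketched conceptually: a single logical dataset name maps to physical copies at many sites, and a catalog tells a scientist where the replicas live. The sketch below is purely illustrative; the class name, sites, dataset name, and URLs are all invented for this example and are not Brookhaven's actual middleware:

```python
# Conceptual illustration of a data-grid replica lookup. All names here
# (ReplicaCatalog, sites, dataset IDs, URLs) are hypothetical; the point
# is the pattern: one logical dataset, many physical copies worldwide.

class ReplicaCatalog:
    """Toy catalog mapping logical dataset names to physical locations."""

    def __init__(self):
        self._replicas = {}

    def register(self, dataset, site, url):
        """Record that a copy of `dataset` exists at `site` under `url`."""
        self._replicas.setdefault(dataset, []).append((site, url))

    def locate(self, dataset):
        """Return every known (site, url) pair holding `dataset`."""
        return self._replicas.get(dataset, [])


catalog = ReplicaCatalog()
# A RHIC-style dataset replicated at the main facility and a remote site.
catalog.register("rhic/au-au-200gev/run3", "BNL",  "gsiftp://bnl.example/run3")
catalog.register("rhic/au-au-200gev/run3", "CERN", "gsiftp://cern.example/run3")

# A scientist anywhere in the world can ask where the data lives.
for site, url in catalog.locate("rhic/au-au-200gev/run3"):
    print(f"{site}: {url}")
```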

Brookhaven's Center for Data Intensive Computing, which performs research in advanced scientific computing, will highlight the Terascale Simulation Tools and Technology project, a multi-laboratory effort through which Brookhaven studies droplet formation in diesel fuel injectors in order to improve combustion efficiency.

Additionally, Robert Mawhinney of Columbia University will present developments in the design of the Laboratory's new 10,000-processor supercomputer, which will aid high-energy physics calculations, at 1:30 p.m. on Tuesday, November 18, in the Brookhaven booth.

Also at the booth, Brookhaven will show a stereoscopic visualization of RHIC collision data, results from a collaboration with the National Institutes of Health (NIH) on early cancer detection using proteomic data from blood samples, a visualization of atmospheric sulfate studies, and a visualization of protein folding related to new drug-design research. Computational research using graphics cards and clusters will also be discussed.

The conference takes place at the Phoenix Civic Plaza Convention Center. More information is available at:
http://www.sc-conference.org/sc2003/.
-end-


DOE/Brookhaven National Laboratory
