Sandia, Compaq Smash World Record In Large Database Sorting

November 10, 1998

ALBUQUERQUE, N.M. -- Just as scientists need very fast number-crunching computers to replace physical testing with computer models, business people need an analogous capability: very fast data sorting, the ability to manipulate huge amounts of data rapidly.

Faster, cheaper sorting could make it possible to mine huge amounts of data to identify patterns of medical fraud among millions of transactions, or better locate threats to the vast on-line banking or communications networks.

Now the Department of Energy's Sandia National Laboratories has teamed with Compaq Computer Corporation, the largest global supplier of personal computers, to sort information three times faster than the previous record, at approximately two-thirds the installed cost of current techniques.

"For big-data problems, this is the fastest sorting machine known to man," said Sandia computer researcher Carl Diegert of the Sandia-located computer. "The technology to do this came from Compaq. The will and the vision came from Sandia."

The Sandia cluster of computers now has sorted a terabyte of data -- about the information contained in a million unabridged dictionaries -- in under 50 minutes. The previous record, on a shared-memory supercomputer rather than a cluster of industry-standard computers, was 2.5 hours.
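The release's headline numbers can be checked with a little arithmetic. Going from 2.5 hours to under 50 minutes is a factor of three, and moving a terabyte in that time implies a sustained rate on the order of hundreds of megabytes per second. A quick back-of-the-envelope calculation (the figures are the release's approximations, not measured values):

```python
# Quick check of the claims in the release (all sizes approximate).
TB = 10**12                    # one terabyte, in bytes (decimal)
sandia_s = 50 * 60             # "under 50 minutes", in seconds
prev_s = 2.5 * 3600            # previous record: 2.5 hours, in seconds

speedup = prev_s / sandia_s            # matches "three times faster"
throughput_mb_s = TB / sandia_s / 1e6  # sustained rate in MB/s

print(round(speedup, 1), round(throughput_mb_s))  # → 3.0 333
```

Roughly 333 MB/s of sustained, end-to-end sorted throughput across 72 servers works out to under 5 MB/s per node, which is what makes a commodity-cluster approach plausible for the era's disks and interconnect.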

"Our joint work on the 144-processor system was driven by Sandia's need to process and visualize simulation data from other large, massively parallel computers, while Compaq's interest was the commercially important sorting application," said George Davidson, Sandia project manager.

The importance of sorting, already high, should increase as large-scale data storage (called warehousing) grows. According to a marketing report from the Palo Alto Management Group, a consulting firm, the average data warehouse is predicted to grow from 272 gigabytes to 6.5 terabytes within three years, and the warehousing market from $15 billion to $113 billion by 2002.

Said John Rose, Compaq Senior Vice President and General Manager, "Compaq, along with key partners like Sandia, is demonstrating that clustered Windows NT systems can deliver the performance, reliability and cost-effectiveness required by high-end applications in technical and commercial markets. This particular demonstration is striking evidence of how far and how fast we've progressed."

According to Amy Behle, business development manager of the U.S. database market for MUSE Technologies, a small Albuquerque-based company, "Database companies will be interested because sorting is absolutely fundamental to database queries. For decision support, running more queries in the same amount of time or getting results faster yields real business value."

"Our system is a cluster of 72 off-the-shelf dual Pentium II Proliant servers running Windows NT and linked by ServerNet," says Diegert. "We call it Kudzu because of its propensity (like the plant) to grow in any direction." For Sandia, he says, the current arrangement is a prototype used to verify the correct operation of a larger platform by setting up, completing, and verifying a terabyte sort.

"Unique to Kudzu," he says, "is its high-performance communications architecture. This, based today on Compaq's first-generation ServerNet hardware, allows Kudzu's 72 component computers to work as one and is, we believe, the world's largest of its kind. As costs drop over time, such arrangements will become commonplace in commercial and scientific operations."
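The release does not describe the sorting algorithm, but terabyte sorts on shared-nothing clusters of this era typically followed a sample-sort pattern: pick splitter keys from a random sample, route each record over the interconnect to the node that owns its key range, sort each partition locally in parallel, and concatenate. The sketch below illustrates that pattern in a single process; the function names are illustrative, not from the Sandia/Compaq code, and real implementations stream records from disk rather than holding them in memory:

```python
import bisect
import random

def sample_sort(records, num_nodes, oversample=8):
    """Sketch of distributed sample sort on a shared-nothing cluster.

    Splitters are drawn from a random oversampled subset so that key
    ranges are roughly balanced across nodes; concatenating the sorted
    partitions then yields a globally sorted result.
    """
    # Pick num_nodes - 1 splitter keys from a sorted random sample.
    size = min(len(records), num_nodes * oversample)
    sample = sorted(random.sample(records, size))
    splitters = [sample[i * len(sample) // num_nodes]
                 for i in range(1, num_nodes)]

    # "Route" each record to the node owning its key range
    # (over ServerNet, in Kudzu's case; here, just Python lists).
    partitions = [[] for _ in range(num_nodes)]
    for r in records:
        partitions[bisect.bisect_right(splitters, r)].append(r)

    # Each node sorts its partition locally, in parallel on real hardware.
    result = []
    for p in partitions:
        result.extend(sorted(p))
    return result
```

For example, `sample_sort(shuffled_keys, 72)` mirrors Kudzu's 72-server layout: no node ever needs the whole data set, which is what lets a system-network cluster substitute for shared memory.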

Diegert believes it was Sandia's drive for faster, cheaper, scalable machines that "pushed vendors to achieve this goal sooner than otherwise."

For Sandia, "The overall purpose of this particular cluster is to develop faster, better ways of visualizing the results of complex simulations," not just sorting, says Sandia physicist Milt Clauser. "Visualization of terabyte data sets is crucial to our understanding of large, complex simulations, and the terabyte sort is an important first step towards our goal of developing the next generation of visualization computers."

The program is part of a DOE long-range plan to revolutionize computing and information technology at the Labs, and was the result of a two-year Cooperative Research and Development Agreement between Sandia and Compaq. The core of the mission was to develop powerful distributed computing, data manipulation and visualization technologies.

For the non-military customer, the project's goal was to develop a technology that would reduce complexity, increase flexibility, and improve service, all at large cost savings. Potential markets for cluster-based technologies would benefit from Compaq's leadership in the computer industry and Sandia's decades-long leadership in parallel computing, including its development with Intel of the world's first teraflop computer, says Davidson. To these ends, each partner funded its CRADA research with matching investments of $400,000 annually. All components of the cluster are currently available on the commercial market.

"The clustering architecture that scaled to this unprecedented sort performance uses the same key to scalability that Sandia pioneered in 1987 and which evolved to become the planet's first teraflops machine with over 9,000 processors, put into operation by Sandia in 1996," says Diegert. "The implications for commercial markets are significant, in that it demonstrates that large clusters that share a system network rather than a common memory have the power and sophistication to handle the big jobs at much lower costs."

Sandia is a multiprogram DOE laboratory, operated by a subsidiary of Lockheed Martin Corp. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major research and development responsibilities in national security, energy, and environmental technologies.
Media contact: Neal Singer, 505-845-7078
Technical contacts: Carl Diegert, 505-845-7193;
George Davidson, 505-844-7902

DOE/Sandia National Laboratories
