Columbia Unveils Supercomputer That Will Simulate Birth Of The Universe

April 21, 1998

Physicists at Columbia University have constructed one of the world's fastest supercomputers, one that can perform 400 billion calculations per second to simulate the three-trillion-degree conditions that existed at the birth of the universe, when the components of atomic nuclei boiled free into an ultra-hot plasma. The supercomputer, dubbed QCDSP, is the latest in a series of relatively inexpensive parallel supercomputers, now in use at universities around the world, that have been built at Columbia in what has become a burgeoning cottage industry for the physics department.

Run in tandem with a sister machine capable of 600 billion operations per second, which Columbia is also completing at Brookhaven National Laboratory on Long Island, the supercomputer will be capable of peak performance at trillion-calculation-per-second levels at a cost of less than $4 million, less than a tenth the typical $50 million price tag for a commercial supercomputer of that speed. Comparable, but far more expensive, machines are operating at the National Security Agency in Fort Meade, Md.; Sandia National Laboratories in Albuquerque, N.M.; and the University of Tsukuba in Japan. A billion floating-point operations, or adds and multiplies, per second is commonly called a gigaflop; a trillion such operations per second is a teraflop.
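The units and the price comparison above work out as plain arithmetic; this short sketch (not from the original release) just makes the ratios explicit:

```python
# Units used in the article, expressed as plain arithmetic.
gigaflop = 1e9   # floating-point operations (adds/multiplies) per second
teraflop = 1e12  # one trillion such operations per second

print(teraflop / gigaflop)  # a teraflop is 1,000 gigaflops

# Price/performance quoted: under $4 million homebuilt versus a typical
# $50 million commercial machine for roughly the same teraflop speed.
print(4e6 / 50e6)           # 0.08 -- i.e. less than a tenth of the cost
```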

The Columbia machine's vast computing power, which, together with others of its generation, offers a 30-fold improvement over the previous generation of supercomputers, is needed to simulate the interactions between quarks and gluons, the tiny constituents of neutrons and protons, predicted by quantum chromodynamics (QCD) theory. Physicists believe that when normal matter is heated to three trillion degrees Fahrenheit, quarks and gluons -- never before observed outside an atomic nucleus -- boil free into an ultra-hot gas, called a quark-gluon plasma. The supercomputer will simulate this state, which scientists believe existed at the time of the Big Bang, perhaps 10 billion years ago, and hope to recreate at a particle accelerator under construction at Brookhaven.

"These computers should be able to make major contributions to our understanding of the properties of the known strongly interacting particles, allowing accurate calculation of particle masses and decay rates," said Norman Christ, chairman and professor of physics and one of the project's principal investigators. "By building them ourselves, we're able to acquire world-class machines at a price within the level of funding available to U.S. science."

The Columbia supercomputer will also be used to simulate interactions within atomic nuclei under other conditions, and could if necessary be programmed to carry out any of a number of highly complex calculations in fields such as weather forecasting, plasma physics and oil exploration. The only requirements in parallel supercomputing are that the problem be divisible among a large number of processors and that the data to be exchanged between these parts of the calculation be routed only between neighboring processors.
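The two requirements above can be illustrated with a toy example (a Python sketch, not the machine's actual code): a one-dimensional lattice is divided among "processors," and each update step needs only the boundary values of the two adjacent chunks, so all communication is between neighboring processors.

```python
# Toy nearest-neighbor parallel update: divide a lattice among
# processors; each step exchanges only boundary sites with neighbors.

def split(data, nprocs):
    """Divide the global lattice evenly among nprocs processors."""
    n = len(data) // nprocs
    return [data[i * n:(i + 1) * n] for i in range(nprocs)]

def step(chunks):
    """One update: each chunk sees only its neighbors' edge sites."""
    new_chunks = []
    for rank, chunk in enumerate(chunks):
        # "Halo exchange": fetch one boundary value from each neighbor
        # (periodic boundaries, as on a mesh wrapped into a torus).
        left = chunks[(rank - 1) % len(chunks)][-1]
        right = chunks[(rank + 1) % len(chunks)][0]
        ext = [left] + chunk + [right]
        # Purely local nearest-neighbor average; no long-range data needed.
        new_chunks.append([(ext[i - 1] + ext[i] + ext[i + 1]) / 3.0
                           for i in range(1, len(ext) - 1)])
    return new_chunks

lattice = [0.0] * 12
lattice[0] = 12.0               # a single "hot" site
chunks = step(split(lattice, 4))  # 4 processors, 3 sites each
flat = [x for c in chunks for x in c]
print(round(sum(flat), 6))      # the total is conserved: 12.0
```

On the real machine the same pattern runs in four dimensions, with each node exchanging lattice boundary data with its eight mesh neighbors.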

Such huge computing power can be obtained at such a low cost because of a series of design decisions made by Professor Christ's research group, which pioneered the construction of highly parallel supercomputers dedicated to QCD calculations in 1982. The supercomputer alone incorporates 8,192 individual nodes that are linked together to undertake computations in parallel rather than sequentially. Each node includes a communications controller, designed by Columbia physicists and built to their specifications, and a digital signal processor (DSP) manufactured by Texas Instruments, both mounted on a circuit board with 2 megabytes of memory. A combination of the physics topic and the type of processor used led to the supercomputer's name: QCDSP.

The Texas Instruments DSP is a unique choice of fast, inexpensive processor; no parallel computer with even a tenth as many processors has ever been constructed. Every node, or daughter board -- processor plus controller -- is a computer in its own right, with the computational performance of a Pentium PC but simplified to cost $80 and fit on a circuit board that measures 1-3/4 by 2-1/2 inches.

A mother board holds 64 of the daughter boards, and a portable crate holds 8 mother boards. Crates can be cabled together using a four-dimensional mesh of communications wires to create even larger computers. Adding a host workstation to talk to the machine completes the computer, since all other required circuits are supplied in each crate.
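The counts in the last two paragraphs can be cross-checked with a little arithmetic. The crate total of 16 is an inference from the article's figures (8,192 nodes at 64 daughter boards per mother board and 8 mother boards per crate); the per-node speed and board cost follow directly:

```python
# Back-of-the-envelope check of the figures quoted in the article.
daughter_boards_per_mother = 64
mother_boards_per_crate = 8
crates = 16                         # inferred: 8,192 / (64 * 8)

nodes = crates * mother_boards_per_crate * daughter_boards_per_mother
print(nodes)                        # 8192 nodes in the Columbia machine

peak_flops = 400e9                  # 400 billion calculations per second
print(round(peak_flops / nodes / 1e6, 1))  # ~48.8 megaflops per node

board_cost = 80                     # dollars per daughter board
print(nodes * board_cost)           # 655360 -- processors well under $1M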

Smaller machines built by the physics department's staff are at work at other universities: a 50-gigaflop computer at Florida State University, a 6-gigaflop machine at Ohio State University and a 3-gigaflop machine at the University of Wuppertal in Germany, all built within the last year. Professor Christ said he would prefer to license the technology to a computer company that would build the machines, rather than tie up the department's staff with such projects.

The other principal investigators in Professor Christ's research group are Robert Mawhinney, associate professor of physics, who has developed a UNIX-like operating system, called QOS, for the supercomputers, and Alan Gara, associate research scientist, who has played a leading role in designing the supercomputers. Pavlos Vranas, a postdoctoral fellow, has been instrumental in creating the code that carries out the actual physics calculations on the machines. One former undergraduate and a number of present or former physics Ph.D. students have made major contributions to the project, in addition to collaborators at Columbia University's Nevis Labs, Fermilab, Florida State University, Trinity College Dublin and Ohio State University.

Quarks and gluons, also known as strongly interacting particles, are the least understood elements of the Standard Model, the prevailing physics theory that relates all known or predicted forces and particles. Quarks and gluons are so tightly bound that they have never been observed outside atomic nuclei. Physicists have thus turned to numerical simulations to study them, an approach that has demonstrated properties -- such as the impossibility of producing isolated quarks or gluons -- that could only be guessed from experiment.

Creating a quark-gluon plasma is one of the principal goals of the Relativistic Heavy Ion Collider being constructed at Brookhaven and due for completion next year. Until experimental efforts to create this quark-gluon plasma are successful, large-scale numerical simulations are the only source of information about the new form of matter, Professor Christ said. The numerical approach to the study of nuclear forces was invented nearly 25 years ago and is now an active segment of theoretical particle physics.

With the new Columbia and Tsukuba machines and a similar project in Rome, physicists expect dramatic progress in quantum chromodynamics theory. The increased capacity allows more complete and realistic QCD calculations and thus more precise tests of the theory and more accurate predictions of as-yet unobserved phenomena, such as the quark-gluon plasma. But even more exciting, Professor Christ said, is that this level of computer resources will allow computer experiments that explore new schemes for combining Einstein's relativity and quantum mechanics to create a unified physics theory, what some physicists call a "theory of everything."

"Such new quantum field theories may show further unexpected behavior that could provide clues to the origin of the Standard Model and examples of what may lie beyond it," Professor Mawhinney said.

Funding for the computers, and the Relativistic Heavy Ion Collider, has been provided by the Department of Energy. The Japanese Institute of Physical and Chemical Research, known by its Japanese acronym, RIKEN, has agreed to contribute $20 million to equip the Brookhaven collider to study high-energy protons, a key element in QCD theory. A description of the QCDSP can be found at http://phys.columbia.edu/~cqft.

This document is available at http://www.columbia.edu/cu/pr/. Working press may receive science and technology press releases via e-mail by sending a message to rjn2@columbia.edu.
-end-

