
'Charliecloud' simplifies Big Data supercomputing

June 07, 2017

LOS ALAMOS, N.M., June 7, 2017 -- At Los Alamos National Laboratory, home to more than 100 supercomputers since the dawn of the computing era, elegance and simplicity of programming are highly valued but not always achieved. In the case of a new product, dubbed "Charliecloud," a crisp 800-line code helps supercomputer users operate in the high-performance world of Big Data without burdening computer center staff with the peculiarities of their particular software needs.

"Charliecloud lets users easily run crazy new things on our supercomputers," said lead developer Reid Priedhorsky of the High Performance Computing Division at Los Alamos. "Los Alamos has lots of supercomputing power, and we do lots of simulations that are well supported here. But we've found that Big Data analysis projects need to use different frameworks, which often have dependencies that differ from what we have already on the supercomputer. So, we've developed a lightweight 'container' approach that lets users package their own user defined software stack in isolation from the host operating system."

To build container images, Charliecloud sits atop the open-source Docker product, which users install on their own systems to customize their software stack as they wish. Users then import the image to the designated supercomputer and execute their application with the Charliecloud runtime, which is independent of Docker. This maintains a "convenience bubble" of administrative freedom while protecting the security of the larger system. "This is the easiest container solution for both system administrators and users to deal with," said Tim Randles, co-developer of Charliecloud, also of the High Performance Computing Division. "It's not rocket science; it's a matter of putting the pieces together in the right way. Once we did that, a simple and functional solution emerged."
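
In practice, that workflow amounts to three stages: build an image with Docker on a machine the user controls, flatten and copy it to the supercomputer, and launch it there with the Charliecloud runtime. The short Python sketch below strings those stages together with the subprocess module purely as an illustration; the ch-* command names follow Charliecloud's documentation from this period, while the image tag, file paths, and cluster hostname are hypothetical placeholders, so exact invocations on a given system may differ.

    # Illustrative sketch of the workflow described above. The ch-*
    # command names follow Charliecloud's documentation of this period;
    # the image tag, paths, and hostname are placeholders.
    import subprocess

    def run(cmd):
        # Echo each command before running it; stop on any failure.
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # On the user's own machine: build a Docker image from ./myproject
    # (ch-build wraps `docker build`), flatten it into a tarball that
    # needs no Docker on the receiving end, and copy it to the cluster.
    run(["ch-build", "-t", "myproject", "./myproject"])
    run(["ch-docker2tar", "myproject", "/tmp"])
    run(["scp", "/tmp/myproject.tar.gz", "cluster:/scratch/"])

    # On the supercomputer, only the Docker-independent Charliecloud
    # runtime is needed:
    #   ch-tar2dir /scratch/myproject.tar.gz /scratch/images
    #   ch-run /scratch/images/myproject -- ./my_analysis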

The open-source product is currently being used on two supercomputers at Los Alamos, Woodchuck and Darwin, and at-scale evaluation on dozens of nodes shows the same operational performance as programs running natively on the machines without a container. "Not only is Charliecloud efficient in compute time, it's efficient in human time," said Priedhorsky. "What costs the most money is people thinking and doing. So we developed simple yet functional software that's easy to understand and costs less to maintain."

Charliecloud is very small, only 800 lines of code, and built on two bedrock principles of computing: least privilege and the Unix philosophy of "make each program do one thing well." Competing products range from 4,000 to over 100,000 lines of code. Charliecloud is described in detail in a technical report online.
-end-

Los Alamos National Laboratory and supercomputing have a long, entwined history. Los Alamos holds many "firsts," from bringing the first problem to the nation's first computer to building the first machine to break the petaflop barrier. Supercomputers are integral to stockpile stewardship and the national security science mission at Los Alamos.

About Los Alamos National Laboratory

Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, BWX Technologies, Inc. and URS Corporation for the Department of Energy's National Nuclear Security Administration.

Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health and global security concerns.

DOE/Los Alamos National Laboratory
