Big Data and Simulation Applications for Extra-Super Supercomputers

Exascale supercomputers will be up to 100 times faster than their current counterparts.

The Cori supercomputer at Lawrence Berkeley National Laboratory will be used in development and testing of future exascale computing tools for X-ray laser data analysis and the simulation of plasma wakefield accelerators. (Image courtesy of National Energy Research Scientific Computing Center.)

Two recently funded computing projects aim to develop scientific applications for the next generation of supercomputers. Exascale supercomputers will perform a billion billion (10^18) operations per second, up to a hundred times more than the most powerful supercomputers today. The projects are among 22 proposals that have received either full or seed funding from the U.S. Department of Energy’s (DOE’s) Exascale Computing Project.
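For a rough sense of scale, here is the arithmetic behind those figures; the petascale baseline is an assumed order of magnitude for today’s top machines, not an official specification.

```python
# Back-of-the-envelope scale comparison (illustrative, not official specs).
exascale_ops = 10**18    # "a billion billion" operations per second
todays_top_ops = 10**16  # ASSUMED order of magnitude for current top machines

print(f"Speedup over today's machines: ~{exascale_ops // todays_top_ops}x")  # ~100x
```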

Getting a Handle on Big X-Ray Data

There’s a lot of data being generated nowadays, and managing it isn’t always easy. One particularly large source of data can be found at the DOE’s SLAC National Accelerator Laboratory (formerly the Stanford Linear Accelerator Center): X-ray lasers. SLAC’s Linac Coherent Light Source (LCLS) X-ray laser typically flashes 120 times per second, generating hundreds of thousands of gigabytes of data per experiment.

It’s already extremely difficult to sift through such a large amount of data, but that’s only the tip of the iceberg: the next-generation LCLS-II will deliver 8,000 times more X-ray pulses per second, with a comparable increase in generated data. The data flow from LCLS-II is estimated to exceed a trillion bits per second.
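To see how those figures compound, here is a minimal sketch of the data-rate arithmetic; the per-pulse payload is an assumed value chosen for illustration, not a published detector specification.

```python
# Data-rate arithmetic from the figures quoted above.
lcls_pulses_per_sec = 120                          # LCLS today
lcls2_pulses_per_sec = lcls_pulses_per_sec * 8000  # next-gen LCLS-II

bits_per_pulse = 1.2e6  # ASSUMED detector payload per pulse, for illustration
data_rate = lcls2_pulses_per_sec * bits_per_pulse

print(f"LCLS-II: {lcls2_pulses_per_sec:,} pulses/s")
print(f"Estimated data flow: {data_rate:.1e} bits/s")  # ~1.2e12, over a trillion
```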

“This is a real problem because you might only find out days or weeks after your experiment that you should have made certain changes,” said Berkeley Lab’s Peter Zwart. “If we were able to look at our data on the fly, we could often do much better experiments.”

Data generated with the LCLS X-ray laser (pattern in background). (Image courtesy of N. Sauter/Berkeley Lab.)

This is where Project ExaFEL (Exascale for Free Electron Lasers) comes in. Its goal is to utilize exascale supercomputers to analyze the massive amounts of X-ray data in real time and provide immediate experimental feedback. However, achieving this goal will take some innovation, as X-ray data processing isn’t a typical supercomputer task.
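As a loose illustration of what on-the-fly feedback could look like, the sketch below scores synthetic detector frames with a cheap per-frame metric as they stream in. This is a hypothetical toy, not ExaFEL’s actual pipeline; `frame_stream` and `quick_quality` are invented stand-ins.

```python
import numpy as np

# Hypothetical sketch of on-the-fly feedback on a stream of detector frames.
# The idea: compute a cheap quality metric per frame so experimenters can
# react immediately instead of days or weeks later.

def frame_stream(n_frames, shape=(64, 64), seed=0):
    """Stand-in for the detector: yields synthetic noise frames."""
    rng = np.random.default_rng(seed)
    for _ in range(n_frames):
        yield rng.poisson(lam=2.0, size=shape)

def quick_quality(frame):
    """Cheap per-frame metric: count of unusually bright pixels."""
    return int((frame > frame.mean() + 3 * frame.std()).sum())

hits = 0
for frame in frame_stream(1000):
    if quick_quality(frame) > 10:  # assumed threshold; tuned per experiment
        hits += 1

print(f"Hit fraction: {hits / 1000:.1%}")  # immediate feedback on data quality
```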

“Traditionally these high-performance machines have mostly been used for complex simulations, such as climate modeling, rather than processing real-time data,” said principal investigator Amedeo Perazzo. “So we’re breaking completely new ground with our project, and foresee a number of important future applications of the data processing techniques being developed.”

The Future of Particle Accelerators

The second exascale computing project seeks to simulate a particle acceleration technology called plasma wakefield acceleration, in which charged particles gain energy by “surfing” on a wave of plasma created with a powerful laser beam. The project will take an existing accelerator simulation code called “Warp” and combine it with an adaptive mesh refinement tool called “BoxLib” to create “WarpX,” designed specifically for exascale computing architectures.
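To make the “surfing” picture concrete, here is a toy one-dimensional model of a charged particle riding a prescribed plasma wave. It is a drastically simplified sketch in normalized units, nothing like the full particle-in-cell treatment WarpX will perform, and all parameter values below are arbitrary.

```python
import math

# Toy 1D model of a particle "surfing" a plasma wakefield. The wake travels
# just below the speed of light, like the laser driver that creates it, and
# a particle injected in the accelerating phase steadily gains energy.

c = 1.0               # speed of light (normalized units)
E0 = 0.5              # wakefield amplitude (arbitrary)
k = 2 * math.pi       # wave number of the plasma wave
v_phase = 0.999 * c   # phase velocity of the wake

z, pz = 0.0, 5.0      # particle position and normalized momentum
dt = 1e-3

print(f"Initial gamma: {math.sqrt(1 + pz**2):.1f}")
for step in range(20000):
    gamma = math.sqrt(1 + pz**2)
    v = pz / gamma * c
    t = step * dt
    # Field at the particle's position within the co-moving wave
    E = E0 * math.sin(k * (z - v_phase * t))
    pz += -E * dt     # force on a negative charge (q = -1)
    z += v * dt

print(f"Final gamma (energy gain): {math.sqrt(1 + pz**2):.1f}")
```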

A plasma wakefield simulation. (Image courtesy of Jean-Luc Vay/Berkeley Lab.)

Plasma wakefield acceleration could potentially lead to more compact and less expensive particle accelerators. However, designing and understanding plasma accelerators requires extensive computer simulation.

“Simulations of a single plasma stage, which do not even take into account all complexity, take days to weeks on today’s supercomputers,” explained physicist Jean-Luc Vay. “If we ever want to be able to run simulations for future particle colliders, which will require up to 100 stages chained together, exascale computing is absolutely necessary.”
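The scaling argument is easy to check. The one-week figure per stage below is an assumption drawn from the “days to weeks” range Vay quotes, and the 100x speedup is the exascale factor cited earlier.

```python
# Rough scaling arithmetic behind the quote above (assumed round numbers).
days_per_stage = 7       # ASSUMED: one simplified stage ~ a week today
n_stages = 100           # future colliders may chain up to 100 stages
exascale_speedup = 100   # exascale ~ 100x today's top machines

today = days_per_stage * n_stages
future = today / exascale_speedup
print(f"100 stages today: ~{today} days; at exascale: ~{future:.0f} days")
```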

The Challenges of Exascale Engineering

Both projects have been fully funded and will receive $10 million over four years. However, it may yet be a bumpy road, as exascale computing requires exascale ingenuity.

“Supercomputers are very complicated, with millions of processors running in parallel,” said SLAC’s Alex Aiken. “It’s a real computer science challenge to figure out how to use these new architectures most efficiently.”

You can find a full list of the funded exascale projects on the DOE website.

For more supercomputing news, check out Cray Lands $26M Supercomputer Deal with DoD.

Written by

Michael Alba

Michael is a senior editor at engineering.com. He covers computer hardware, design software, electronics, and more. Michael holds a degree in Engineering Physics from the University of Alberta.