Scientific Computing
Located in the heart of Silicon Valley, SLAC is home to researchers who are contributing significantly to advancing scientific computing. SLAC's scientific computing innovations fall into two areas: “big data” and “scientific simulation.”
Today’s scientific experiments often collect huge amounts of data. SLAC plays a leading role in managing and analyzing these torrents of data, as well as developing special tools and software for handling data and presenting them to the public:
SLAC has a long history of creating advanced data acquisition systems for collecting, rapidly processing and storing enormous bursts of data from physics experiments. Most recently, SLAC designers drew on their experiences with the BaBar and ATLAS particle physics experiments to create data acquisition and storage systems for the Linac Coherent Light Source (LCLS), which can handle data streams exceeding 2 gigabytes per second and has a capacity of 4 petabytes.
The digital camera for the Large Synoptic Survey Telescope (LSST), now being designed by a SLAC-led team, will take a 3-billion-pixel image of a portion of the heavens every 15 seconds. The project will accumulate 6 million gigabytes of astronomical data every year. SLAC is building the software that will make this data available to scientists and the public. The lab’s Scientific Computing Applications division is also contributing to the LSST’s camera control system.
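A quick back-of-the-envelope calculation shows how image size and cadence translate into the data volumes quoted above. The per-pixel depth, exposures per night, and observing nights per year used here are illustrative assumptions, not figures from the source; raw images alone land within a factor of two of the quoted annual total, with processed data products accounting for the rest.

```python
# Rough data-rate estimate for a 3-billion-pixel camera imaging every 15 s.
# Assumed (not from the source): 2 bytes/pixel, ~2,000 exposures per night,
# ~300 observing nights per year.
PIXELS = 3e9
BYTES_PER_PIXEL = 2          # assumed 16-bit pixel depth
GB = 1e9

image_gb = PIXELS * BYTES_PER_PIXEL / GB      # gigabytes per exposure
peak_rate = image_gb / 15                     # GB/s while observing
yearly_gb = image_gb * 2000 * 300             # raw GB per year

print(f"{image_gb:.0f} GB/image, {peak_rate:.2f} GB/s peak, "
      f"{yearly_gb/1e6:.1f} million GB/year raw")
```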
SLAC runs the Instrument Science Operations Center for processing data from the Large Area Telescope, the main instrument on the orbiting Fermi Gamma-ray Space Telescope. SLAC staff also contributes to the software that makes Fermi LAT data available to scientists.
Although the BaBar experiment’s nine-year run ended in 2008, scientists are still gleaning discoveries and insights from its 22 billion electron-positron collisions. SLAC recently created a custom cloud computing system on a dedicated cluster of computers that provides long-term access to BaBar data, independent of future software or hardware upgrades.
A recent overhaul of the SPIRES database – the first database available on the Web and the go-to source for information on particle physics literature for more than 40 years – offers even more functionality and interactivity. Now known as INSPIRE, it is a collaboration among SLAC, CERN, DESY and Fermilab.
Since 2007, SLAC has hosted annual international conferences on extremely large databases that bring together a growing community of scientific and industrial big-data users with vendors and academic researchers who are working on ways to handle huge data sets. The first of these conferences led to the creation of the SciDB open source database, which is designed to satisfy the demands of data-intensive scientific problems.
Computer simulations are powerful, cost-effective tools for designing and previewing experiments, as well as investigating and displaying spaces and conditions that can’t be achieved in a lab. SLAC simulations range from the subatomic to the cosmic.
Several SLAC/Stanford research groups use and develop innovative computer simulation techniques to better understand the fundamentals of important energy-related materials:
- One research group uses an in-house cluster of conventional and graphic processing unit computers, along with supercomputers at the National Energy Research Scientific Computing Center, to simulate the behavior of complex "strongly correlated" materials, such as high-temperature superconductors and topological insulators.
- Another research team develops simulations of molecular systems that bridge the gap between traditional molecular dynamics (what are the atoms doing?) and quantum chemistry (what are the electrons doing?), with an ultimate goal of designing molecular devices powered by either light or force.
- Researchers at SLAC and Stanford’s SUNCAT Center for Interface Science and Catalysis use simulations to improve their understanding of chemical processes occurring at solid-gas and solid-liquid interfaces. The goal is to discover new catalysts for energy conversion and storage.
SLAC's Computational Astrophysics group seeks to bridge theoretical and experimental physics communities to bring their combined strength to bear on some of the most challenging and fascinating problems in particle astrophysics and cosmology. Stunningly beautiful and rich in scientific insights, their simulations span the history of the universe and billions of light years. Visualizations of their data enable scientists to clearly understand the implications of their calculations and help to educate public audiences in planetarium shows worldwide.
SLAC researchers are key contributors to an internationally developed software toolkit, Geant4, for simulating particles passing through and interacting with matter. This versatile toolkit is used in high-energy, nuclear and accelerator physics, as well as medical and space science research. A SLAC group is leading the Geant4 work on hadronic physics, visualization and overall software architecture.
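The core idea behind particle-transport toolkits like Geant4 is Monte Carlo sampling: each particle’s free path through a material is drawn from an exponential distribution set by the material’s attenuation coefficient. The toy sketch below illustrates that principle only; it is not Geant4 code, and the material values are hypothetical.

```python
import math
import random

def transmitted_fraction(mu, thickness, n=100_000, seed=42):
    """Toy Monte Carlo transport: fraction of particles crossing a slab
    without interacting. Each particle's free path is sampled from an
    exponential distribution with attenuation coefficient mu (1/cm)."""
    rng = random.Random(seed)
    survived = sum(1 for _ in range(n) if rng.expovariate(mu) > thickness)
    return survived / n

mu, slab = 0.2, 5.0              # hypothetical material: 0.2 /cm, 5 cm slab
mc = transmitted_fraction(mu, slab)
analytic = math.exp(-mu * slab)  # Beer-Lambert attenuation law
print(f"Monte Carlo: {mc:.3f}   analytic: {analytic:.3f}")
```

The Monte Carlo estimate converges to the analytic Beer-Lambert result as the number of sampled particles grows; real toolkits layer detailed physics models and geometry on top of this same sampling loop.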
Simulations run on SLAC’s 300-CPU cluster are central to the lab’s ability to understand, improve and design particle accelerators and free-electron lasers. They have been essential in designing and operating the LCLS, as well as designing LCLS-II and SLAC’s PEP-X proposal for an “ultimate storage ring” – a high-brightness, ring-based light source.