New energy-saving facility boosts Stanford's computing prowess
Green technology meets high-powered computing at the new Stanford Research Computing Center, which supports the growing computational needs of researchers on campus while saving energy.
There was a time when scientists answered questions with slide rules and microscopes. Then, understanding our world was as simple as making observations.
Not anymore. Today, expanding the frontiers of knowledge requires serious computing power. For many Stanford researchers, that computing power now resides in a state-of-the-art Stanford facility – the Stanford Research Computing Center (SRCC).
Located at SLAC National Accelerator Laboratory, set on a grassy hillside amid buildings housing high-energy beams and particle-smashing equipment, the SRCC could eventually hold as many as 180 refrigerator-sized racks of servers in its cavernous interior. That capacity enables Stanford faculty to store the massive amounts of data generated by their research and to carry out the complex computations needed to understand the world and our future.
Examples of research that requires this level of computing power include understanding the origins of stars, studying how human populations evolve, modeling climate change, efficiently delivering energy, making jet travel more efficient, solving the mysteries of the brain, constructing models of the molecules that make up our bodies and mining the secrets contained in our DNA.
At an opening celebration for the building, Provost John Etchemendy said that computation is playing a growing role in faculty research. "Everyone we're hiring is computational, and not at a trivial level. It is time that we have this facility to support those faculty."
Etchemendy purchased 125 servers for the SRCC to serve as a shared computational resource. Beyond that communal pool, simply locating servers side by side yields additional computing power, because one group's idle machines can take on another group's work.
"By putting equipment together you can leverage computing that isn't being used," said Ruth Marinshaw, chief technology officer for research computing, who oversees SRCC.
Stanford Vice Provost and Dean of Research Ann Arvin supported SRCC as a resource for faculty across Stanford schools.
"There's really very little research that isn't dependent on computing," said Arvin, adding that the facility's distance from campus will not slow computation, thanks to the high-speed network cable running around campus and up to SLAC.
A campus-wide resource
Computing space at the SRCC is allocated in terms of power. The facility can support 3 megawatts: roughly one-third will go to the School of Medicine, one-sixth to SLAC, and the remaining half will be divided among other interested Stanford faculty. This allocation reflects both the breadth of research requiring high-speed computation and the growing need for supercomputing in medical research.
Case in point: the School of Medicine has a joint big data initiative with Oxford University and is hosting an international Big Data in Biomedicine meeting May 21-23. The meeting's organizer, Euan Ashley, an associate professor of cardiovascular medicine who directs Stanford's arm of the collaboration, said the initiative to improve health care worldwide benefits from the university's computational strengths.
"We're extremely lucky at Stanford to have expertise in data science in every school in the university," Ashley said. Additional big data and computation initiatives are underway to support engineering, computer science and the social sciences.
In addition to expanding Stanford's computational resources, the SRCC is as green as the nearby hillside. It uses an air-driven cooling system to keep the servers in their 60-80 degree Fahrenheit comfort zone. According to Arvin, the system could save as much as $1 million a year in energy now spent cooling server rooms across campus.
The cooling system relies on nothing more than outside air, fans and cold water pipes to cool the room of sensitive heat-producing servers. Air comes in through the roof, then passes through towering, industrial-sized fans and into the server room. Back-to-back rows of servers optimized for efficient air flow take the cool air in through their front, then send heated air out into a sealed alleyway between rows. That space opens to an outlet in the building's roof.
On the rare day when local temperatures dip below 60 degrees, some of that heated air can be mixed back into the intake to warm the air blowing into the server room. On hot days, cold water chills the incoming air before it flows over the equipment.
Space for research
Currently, faculty on campus with high computation needs have servers squirreled away in closets, in clusters located within buildings or departments, or in dedicated space in Forsythe Hall.
One of those centers, the Center for Computational Earth & Environmental Science, served as a model for the SRCC. "The computing model employed by CEES gives researchers the opportunity to scale problems and models beyond what could have been done with silos of individually purchased and managed servers," Marinshaw said.
Despite the value of having some computing on campus, most campus buildings aren't designed to keep servers cool, and the equipment takes up room that could otherwise be devoted to research.
"We free up very valuable space for research by moving computers out of campus buildings," Arvin said, pointing to the Clark Center, which houses Stanford Bio-X, as an example. The center regained enough space for a new lab by moving its servers to the shared cluster in Forsythe Hall.
With the SRCC available, new buildings on campus will likely not need to carve out space for computing. The Stanford Neurosciences Institute and Stanford ChEM-H are designing adjacent buildings that will house faculty, many of whom have high computational needs; even so, the designs include no dedicated server rooms.
William Newsome, director of the Stanford Neurosciences Institute and Harman Family Provostial Professor, said doing computing in the centralized facility will benefit neuroscientists.
"Centralization of powerful computing resources will be a boon to neuroscientists, reducing the need to keep solving the same computing problems 'locally' time-after-time at numerous places on campus," Newsome said.
Marinshaw said that during the building's first test of 90-plus degree temperatures, the servers kept cool on air alone while busily computing big questions across disciplines.
Amy Adams, University Communications: (650) 796-3695, email@example.com