Purdue moves supercomputer to cutting-edge portable data center -- with the computer running
WEST LAFAYETTE, Ind. - What may look like a shipping container dropped off a freight train onto a lot near the Purdue power plant is really a new home for the university's Steele supercomputing cluster.
Purdue is moving the Steele cluster to a portable, self-contained, modular computer center. Industry giants such as Amazon, Google and Microsoft make significant use of quickly deployed, energy-efficient containerized data centers, but Purdue is among the first universities to do so, deploying HP's "POD" (Performance-optimized Data Center).
"Other universities have expressed a lot of interest in what we're doing," says Lon Ahlen, facilities manager for Information Technology at Purdue (ITaP), Purdue's central information technology organization.
The containerized data center offers advantages in cost, energy consumption and flexibility. The facility is energy efficient, even making use of Indiana's winter weather for air conditioning. The move, which should be complete in September, also will reduce the load on the campus chilled water system.
"We're using natural cooling much of the year," Ahlen says. "This should be the most efficient data center we've put together."
Rather than undertaking new construction or extensively reconfiguring existing buildings, both expensive options, the project required little more than a concrete pad to serve as a foundation for the unit and a connection to nearby campus power and data lines. Inside, the POD comes fully equipped and ready for computers to be installed.
The site is designed for two of the containers, and there's room for up to six at the location with further site improvements.
"This gives us an expansion capability we have not had," Ahlen says.
In addition, moving Steele to the containerized facility makes room for two to three new supercomputers in the building where ITaP's Rosen Center for Advanced Computing currently houses Purdue's campus-wide supercomputing clusters.
"With the POD, we'll have deployed an entire new data center in a matter of months at a fraction of the cost of a traditional data center while being able to support all our current, as well as anticipated near-future faculty demand," says John Campbell, associate vice president for academic technologies, who's in charge of research computing for ITaP.
Purdue's centrally managed supercomputers support faculty from aeronautics, agronomy, climate science, communications, medicinal chemistry, molecular pharmacology, biology, engineering, physics, statistics and more. The high-performance systems are used in diverse ways -- from modeling climate change and developing new medicines to engineering more efficient rocket engines and designing next-generation nanoscale electronics. Purdue has added systems every summer for the last three years to meet growing faculty demand.
"As one of the nation's leading research universities, Purdue University has set a precedent for innovation in high-performance computing," says Madhu Matta, vice president and general manager, Industry Standard Servers, HP. "The POD allows research institutions like Purdue to increase their computer capacity in a way that is cost-effective and enables productivity and scientific breakthroughs without the need to expand their existing data centers."
Steele -- made up of more than 900 individual computers, or nodes, containing more than 7,200 processing cores -- has remained at work during its move. ITaP is moving the supercomputer in phases, with parts of it still running in the main research data center on campus while other parts are moved to the container and then brought back online shortly after being installed there.
"We're making every effort to minimize the impact to Steele's hundreds of users," says Bill Whitson, ITaP and Rosen Center director of research support.
Writer: Greg Kline, science and technology writer, Information Technology at Purdue (ITaP), 765-494-8167, firstname.lastname@example.org
Sources: Gerry McCartney, Purdue vice president for information technology and chief information officer, 765-496-2270, email@example.com
John Campbell, 765-494-1289, firstname.lastname@example.org
Lon Ahlen, 765-496-8230, email@example.com
Bill Whitson, 765-496-8227, firstname.lastname@example.org