
NICS Receives $10M from NSF for Remote Visualization

September 28, 2009

Hot on the heels of the TACC announcement, the University of Tennessee's National Institute for Computational Sciences (NICS) has announced that it will receive $10M from the NSF over the next 4 years to build a new "Center for Remote Data Analysis & Visualization (RDAV)". Just like TACC, the first order of business is a new machine:

Much of RDAV will rely on a new machine named Nautilus that employs the SGI shared-memory processing architecture. The machine will feature 1,024 cores, 4,096 gigabytes of memory, and 16 graphics processing units. The new SGI system can independently scale processor count, memory, and I/O to very large levels in a single system running standard Linux. This flexibility will allow the RDAV team to configure a system uniquely capable of analyzing and visualizing petascale data sets, promising TeraGrid users new levels of scientific understanding.

And this impressive quote from Sean Ahern, research associate professor at the University of Tennessee and visualization task lead at ORNL, where the machine will sit:

“I believe this will be the largest shared-memory machine for analysis on the planet,” said the project’s Principal Investigator (PI) Sean Ahern, who is currently the visualization task lead at ORNL and will serve as director of RDAV. “No one has ever done this before. The new system will handle data analysis algorithms that can’t be deployed on more traditional distributed memory systems.”

Of course, hardware isn't everything. The center will also feature a full staff of visualization and analysis experts whose services will be available to TeraGrid researchers.

Read the full press release from the University of Tennessee, including information about the organization of the RDAV center and some of the other individuals involved in this project, after the break, or read the announcement from the National Institute for Computational Sciences.


[Figure: RDAV organizational chart]

Some of the people involved:

  • Miron Livny (University of Wisconsin) and Scott Klasky (ORNL) will be providing workflow services.
  • George Ostrouchov (ORNL/UT) will be providing data analysis and statistical analysis.
  • Wes Bethel (LBNL) will be providing the remote vis services.
  • Sean Ahern (ORNL) will serve as director of the RDAV center.

University of Tennessee’s Press Release:

UT to develop $10 million system analyzing supercomputer output

KNOXVILLE, Tenn. — The National Science Foundation has awarded the University of Tennessee $10 million to develop a computer system that will interpret the massive amounts of data created by the current generation of high-performance computers in the agency’s national computer grid.

Sean Ahern, a computer scientist with Oak Ridge National Laboratory and UT’s College of Engineering, will create and manage the Center for Remote Data Analysis and Visualization, which will store and examine data generated by computer simulations, large experimental facilities like the Spallation Neutron Source, and widely distributed arrays of sensors.

“Next-generation computing is now this-generation computing,” Ahern said. “What’s lacking are the tools capable of turning supercomputer data into scientific understanding. This project should provide those critical capabilities.”

Ahern and colleagues at UT’s National Institute for Computational Sciences will develop Nautilus, a shared-memory computer system that will have the capability to store vast amounts of data, all of which can be accessed by each of its 1,024 processor cores. Nautilus will be one of the largest shared-memory computers in the world, Ahern said. It will be located at the NICS facility at Oak Ridge National Laboratory.

Nautilus will be used for three major chores: visualizing data results from computer simulations with many complex variables, such as weather or climate modeling; analyzing large amounts of data coming from experimental facilities like the SNS; and aggregating and interpreting input from a large number of sensors distributed over a wide geographic region. The computer will also have the capability to study large bodies of text and aggregations of documents.


A visualization of the turbulent core of a simulated supernova. The high-resolution astrophysics simulation code, Chimera, was run on Kraken to explore stellar disruption. UT's new Data Analysis and Visualization Center will enable creation of such visualizations. Credit: Oak Ridge National Laboratory

“Large supercomputers like Kraken working on climate simulation will run for a week and dump 100 terabytes of data into thousands of files. You can’t immediately tell what’s in there,” Ahern said. “This computer will help scientists turn that data into knowledge.”

Nautilus will be part of the TeraGrid XD, the next phase of the NSF’s high-performance network that provides American researchers and educators with the ability to work with extremely large amounts of data.

Kraken, the world’s largest academic computer, is part of the TeraGrid and is operated by the University of Tennessee. Like Kraken, Nautilus will be part of UT’s Joint Institute for Computational Sciences on the ORNL campus.

The new machine, manufactured by high-performance computing specialist SGI, will employ the company’s new shared-memory processing architecture. It will have four terabytes of shared memory and 16 graphics processing units. The system will be complemented with a one-petabyte file system.

Through Ahern and co-principal investigator Jian Huang, the University of Tennessee is the lead institution on the project. Oak Ridge National Laboratory will provide statistical analysis support, Lawrence Berkeley National Laboratory will provide remote visualization expertise, the National Center for Supercomputing Applications at the University of Illinois will deploy portal and dashboard systems, and the University of Wisconsin will provide automation and workflow services. Huang is on the faculty of UT’s Department of Electrical Engineering and Computer Science.

Nautilus will be joined by another NSF facility at the University of Texas that will use a complementary data-access technique for analysis. The NSF funded both projects under the American Recovery and Reinvestment Act of 2009.

“For many types of research, visualization provides the only means of extracting the information to understand complex scientific data,” said Barry Schneider, NSF program manager for the project. “The two awards, one to the Texas Advanced Computing Center at the University of Texas at Austin and the other to NICS at the University of Tennessee, will be deploying new and complementary computational platforms to address these challenges.”

FOR MORE INFORMATION, CONTACT
Sean Ahern, principal investigator and RDAV director
865-241-3748, [email protected]
Jian Huang, co-PI and UT associate professor in electrical engineering and computer science
865-974-4398, [email protected]
Bill Dockery, research communications, UT Office of Research
865-974-2187, [email protected]