[Banner image: volume rendering from the test runs]

HPCwire is carrying a story that a group of researchers has pushed VisIt into record-breaking territory by running visualizations of datasets ranging from 500 billion to 2 trillion grid points (that’s up to 2 terapoints).

The team ran VisIt using 8,000 to 32,000 processing cores to tackle datasets ranging from 500 billion to 2 trillion zones, or grid points. The project was a collaboration among leading visualization researchers from Lawrence Berkeley National Laboratory (Berkeley Lab), Lawrence Livermore National Laboratory (LLNL) and Oak Ridge National Laboratory (ORNL).

Seems a good portion of the Department of Energy’s visualization staff was involved: Wes Bethel, Sean Ahern, Mark Howison, Dave Pugmire, and others.

The test runs created three-dimensional grids ranging from 512 x 512 x 512 “zones,” or sample points, up to approximately 10,000 x 10,000 x 10,000 samples for 1 trillion zones, and approximately 12,500 x 12,500 x 12,500 to reach 2 trillion grid points.
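The grid dimensions above can be sanity-checked with a bit of arithmetic; here's a minimal sketch (the dimensions come from the article, but the function name and script are just illustrative):

```python
# Quick arithmetic check on the grid sizes quoted above.
# Dimensions are from the article; this script is only illustrative.

def zones(nx, ny, nz):
    """Total number of zones (sample points) in an nx x ny x nz grid."""
    return nx * ny * nz

print(f"{zones(512, 512, 512):,}")           # 134,217,728 (~134 million)
print(f"{zones(10_000, 10_000, 10_000):,}")  # 1,000,000,000,000 (1 trillion)
print(f"{zones(12_500, 12_500, 12_500):,}")  # 1,953,125,000,000 (~2 trillion)
```

So the 12,500-cubed run lands at about 1.95 trillion zones, which the article rounds to 2 trillion.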

“This level of grid resolution, while uncommon today, is anticipated to be commonplace in the near future,” said Ahern. “A primary objective for our SciDAC Center is to be well prepared to tackle tomorrow’s scientific data understanding challenges.”

The tests consisted of isosurfaces and volume renderings. While the story doesn’t report the rendering results themselves, the team carefully monitored the runs and collected benchmarking data that can be used in further development of VisIt.

Click the banner image above for a larger volume rendering from the test.

via HPCwire: DOE Researchers Test Limits of Visualization Tool.