Over at the HPC-CH blog, they’ve got an interview with Argonne’s Venkatram Vishwanath on how he’s dealing with some extremely large simulation datasets. His group has found that visualization isn’t just good for finding insight in their data; it’s also a great workaround for the I/O bottlenecks that arise from the huge file sizes.

Venkatram agrees that one challenge of next-generation simulations is that I/O will not keep up with the growth of computing capability. His group at Argonne is now working on efficient infrastructure and software to reduce the amount of data written to storage for analysis, and to perform in-situ visualization while the simulation is still running. This makes it easier to turn the data into insight.
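
For readers unfamiliar with the idea, here’s a minimal sketch (ours, not from the interview) of what in-situ reduction looks like in practice: instead of dumping the full simulation state every timestep, the analysis runs alongside the solver and only a handful of summary values per step ever reach the filesystem. The solver, array sizes, and statistics below are purely illustrative.

```python
# Minimal sketch of in-situ data reduction (illustrative, not Argonne's code):
# compute summary statistics each timestep instead of writing the full field
# to disk, shrinking I/O by orders of magnitude.
import numpy as np

def simulate_step(field):
    """Hypothetical stand-in for one timestep of a real solver."""
    return field + 0.01 * np.random.standard_normal(field.shape)

field = np.zeros((512, 512))   # full-resolution simulation state (~2 MB per snapshot)
summaries = []                 # small in-memory record kept instead of raw dumps

for step in range(1000):
    field = simulate_step(field)
    # In-situ reduction: a few floats per step instead of a full snapshot.
    summaries.append((step, field.min(), field.max(), field.mean()))

# Only the reduced data ever touches storage.
np.savetxt("summaries.csv", np.array(summaries),
           header="step min max mean", comments="")
```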

via hpc-ch: Blog.