TACC’s Kelly Gaither gave a nice presentation in the Dell booth at SC on the trials and tribulations of performing data analysis and visualization “at scale”.  In her context, “at scale” means on large HPC-scale datasets.

Visualization is one of the most important and commonly used methods of analyzing and interpreting digital assets. For many types of computational research, it is the only viable means of extracting information and developing understanding from data. However, non-visual data analysis techniques—statistical analysis, data mining, data reduction, etc.—also play integral roles in many areas of knowledge discovery.

TACC is using an approach that I’ve begun deploying at my employer: combining dedicated visualization resources with large shared filesystems (eliminating file transfers) and client-server tools.  Her talk focuses on their software (Longhorn Portal) & hardware (Longhorn & Stallion) deployments, though it unfortunately lacks much detail on the system’s impact beyond fuzzy “works great” remarks.  It’s a good talk if you’re unfamiliar with the problems of interactive visualization at the tera/petascale, and Kelly is always fun to listen to.

via Video: Interactively Visualizing Science at Scale at SC11 | insideHPC.com.