Many people don’t realize just how much data goes into finding the next big oil field.  Oil companies spend millions scouring the globe and running seismic surveys to map what’s under our feet, then spend days, weeks, even months analyzing those surveys for something useful.  Take this example:

The average ship running seismic gear carries between 20,000 and 25,000 sensors, and you typically use several ships in concert to survey an area. This yields anywhere from 50 to 200TB of data per run and takes five to seven days of solid processing on a large number of systems to get results. Ramp up the resolution, and it can take 15,000-20,000 compute nodes running for days or weeks to complete the job.

Here at GTC, the “Oil & Gas” track featured presentations on how these companies have successfully integrated GPUs into their workflow. They’ve seen a 5-fold increase in performance, resulting in a 6-fold decrease in overall cost, just by porting their already embarrassingly parallel codes to CUDA.
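
To give a feel for why that port is so natural, here’s a minimal sketch of an embarrassingly parallel CUDA kernel: each thread touches one sample of a (hypothetical) seismic trace independently, with no communication between threads. This is not the presenters’ actual code; the kernel name, data, and gain operation are made up purely for illustration.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Hypothetical example: apply a gain to every sample of a seismic trace.
// Each thread handles exactly one sample, so there is no inter-thread
// communication -- the "embarrassingly parallel" pattern described above.
__global__ void scale_samples(float *samples, int n, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        samples[i] *= gain;
}

int main()
{
    const int n = 1 << 20;                 // one million samples (illustrative)
    const size_t bytes = n * sizeof(float);

    float *h = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i)
        h[i] = 1.0f;

    float *d;
    cudaMalloc(&d, bytes);
    cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);

    // Launch one thread per sample, 256 threads per block.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale_samples<<<blocks, threads>>>(d, n, 2.0f);

    cudaMemcpy(h, d, bytes, cudaMemcpyDeviceToHost);
    printf("first sample after scaling: %f\n", h[0]);

    cudaFree(d);
    free(h);
    return 0;
}
```

Because every sample is independent, the work spreads almost linearly across thousands of GPU cores, which is exactly why codes like these map so well onto CUDA.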

via GPUs slick up with oil sleuths • The Register.