If you’ve ever wondered how these giant world-wide high-resolution maps come together, Wired has a great new article with the creators of “MapBox”. MapBox is taking continuous streams of satellite data from the likes of NASA to construct giant near-realtime images of the entire globe at staggering resolutions.
“For the new release we’re processing two years of imagery, captured from January 1, 2011 through December 31, 2012,” says Loyd, “this amounts to over 339,000 16-megapixel+ satellite images, totaling more than 5,687,476,224,000 pixels. We boil these down to a mere 5 billion or so.”
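Those figures are internally consistent, as a quick back-of-envelope check shows (an illustrative sketch using only the numbers quoted above, not anything from MapBox's actual pipeline):

```python
# Back-of-envelope check of the figures quoted above; the constants come
# straight from the quote, nothing here is MapBox's actual code.
images = 339_000                    # "over 339,000" satellite images
total_pixels = 5_687_476_224_000    # total pixels quoted by Loyd
output_pixels = 5_000_000_000       # "a mere 5 billion or so"

per_image = total_pixels / images
print(f"average pixels per image: {per_image:,.0f}")
# 16,777,216 pixels, i.e. exactly 2**24 -- consistent with the
# "16-megapixel+" frames mentioned in the quote.

print(f"reduction factor: ~{total_pixels / output_pixels:,.0f}x")
```

In other words, the final mosaic discards more than a thousand input pixels for every output pixel it keeps.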
The first problem is even getting the data. It’s all available in the public domain, but just transferring it over to MapBox’s servers was a major task because of the volume. To do this render, they needed to download two thirds of a terabyte of compressed data. “We’ve got 30 to 40 servers pulling down data from NASA,” says Herwig. “We called them up and said, ‘hey we’re going to hit you hard, what’s the best way we can do it for you?’”
Over at the Visual.ly blog, they have a public plea to NASA (and anyone else reading) to finally give up on rainbow colormaps in favor of more perceptually appropriate alternatives.
Dear NASA,

The visualization community has noticed your insistence on using rainbow color scales for representing continuous data. This is a plea to you and anyone else doing the same thing to stop.

On the surface, the logic behind using a rainbow color scale makes sense: the more colors there are, the easier you would expect it to be to see detail in a huge range of data. However, when perceptual issues are taken into account, rainbow color schemes are one of the worst ways to represent continuous data.
More than just latching onto the visualization community's dislike of rainbow colormaps, they do a nice job describing and showcasing the many problems, both perceptual and scientific, with rainbow colormaps. Definitely worth a read.
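The core perceptual problem is easy to demonstrate even without plotting anything: a rainbow map's brightness is not monotonic in the data value, so equal data steps don't look like equal steps. Below is a minimal, self-contained sketch using the common piecewise-linear approximation of the classic "jet" rainbow colormap (an illustration of the general issue, not NASA's exact palette):

```python
# Demonstrates why rainbow colormaps mislead: perceived brightness
# (approximated here by Rec. 601 luma) rises and then falls as the data
# value increases, so the map is not even consistently ordered.

def clamp(v):
    return max(0.0, min(1.0, v))

def jet_rgb(x):
    """Common piecewise-linear approximation of the 'jet' colormap, x in [0, 1]."""
    r = clamp(1.5 - abs(4.0 * x - 3.0))
    g = clamp(1.5 - abs(4.0 * x - 2.0))
    b = clamp(1.5 - abs(4.0 * x - 1.0))
    return r, g, b

def luminance(rgb):
    """Rec. 601 luma: a rough proxy for perceived brightness."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

xs = [i / 100.0 for i in range(101)]          # a smooth data ramp
lums = [luminance(jet_rgb(x)) for x in xs]

# Count brightness direction reversals along the ramp; a well-behaved
# sequential colormap (or a plain gray ramp) would have zero.
reversals = sum(
    1 for a, b, c in zip(lums, lums[1:], lums[2:])
    if (b - a) * (c - b) < 0
)
print(f"jet luminance direction reversals: {reversals}")
```

Brightness climbs from dark blue up to yellow and then drops back down toward red, which is exactly why viewers see spurious "bands" in smoothly varying data.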
NASA’s Scientific Visualization Studio has released some beautiful visualizations of ocean flow, collectively called “Perpetual Ocean”. The work was submitted to the SIGGRAPH 2011 computer animation festival, but wasn’t accepted.
This visualization was produced using NASA/JPL’s computational model called Estimating the Circulation and Climate of the Ocean, Phase II (ECCO2). ECCO2 is a high-resolution model of the global ocean and sea ice. ECCO2 attempts to model the oceans and sea ice at increasingly accurate resolutions that begin to resolve ocean eddies and other narrow-current systems which transport heat and carbon in the oceans.

The ECCO2 model simulates ocean flows at all depths, but only surface flows are used in this visualization. The dark patterns under the ocean represent the undersea bathymetry. Topographic land exaggeration is 20x and bathymetric exaggeration is 40x.
I think the fact that this wasn’t accepted is further evidence that, for the last several years, SIGGRAPH has been steadily moving away from scientific visualization and computer graphics research and toward the visual effects and computer animation industries. The videos are beautiful and mesmerizing, as well as fairly computationally complex.
NASA has made lots of fans in recent years thanks to their beautiful panoramic and extreme-resolution images of the little ball we all live on. Stitching together thousands of images sounds like a fully automated process, but the reality is a mind-blowing combination of automated processing and hand-implemented artistry.
The photos follow in the footsteps of NASA’s other great Earth images. The original Blue Marble — one of the most famous pictures of all time — was captured by the crew of Apollo 17 from a distance of 28,000 miles. Since 2002, the agency has stitched together up to 10,000 satellite images to produce other incredibly detailed images. One of the most recent, from 2007, had a mind-boggling resolution of 86,400 pixels by 43,200 pixels.
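To put that resolution in perspective, here is a quick illustrative calculation using the dimensions quoted above (my arithmetic, not from the article):

```python
# Scale of the 2007 mosaic mentioned above (illustrative math only).
width, height = 86_400, 43_200          # pixels, as quoted
total = width * height
print(f"{total / 1e9:.2f} gigapixels")  # about 3.73 gigapixels

# The 2:1 aspect ratio matches an equirectangular whole-Earth projection:
# 360 degrees of longitude by 180 degrees of latitude.
print(f"{width / 360:.0f} pixels per degree of longitude")  # 240
```

That works out to roughly 3.7 billion pixels, several hundred times the output of a typical consumer camera of the era.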
NASA has taken the extensive data from their MODIS instruments and created an impressive visualization of large fires across the world from 2002 to 2011, combining it with snowfall and seasonal changes.
The tour begins by showing extensive grassland fires spreading across interior Australia and the eucalyptus forests in the northwestern and eastern part of the continent. The tour then shifts to Asia where large numbers of agricultural fires are visible first in China in June 2004, then across a huge swath of Europe and western Russia in August. It then moves across India and Southeast Asia, through the early part of 2005. The tour continues across Africa, South America, and concludes in North America.
Surprisingly, even with all the recent fires in the US Midwest, only 2% of the world’s fires occur in the US. Most fires occur in the African savanna, caused by agricultural activity and lightning strikes.
NASA is fairly well known for the many conceptual animations they produce to showcase upcoming launches and projects, and the latest Mars Rover “Curiosity” is no exception. This time, the animation was produced by California’s Bohemian Grey, and is available on the JPL website.
“We are tremendously excited to be a part of this historic event,” said Kevin K. Lane, President of Bohemian Grey Inc. “Helping NASA and the public to visualize ‘Curiosity’s’ trek was the type of animation project our company truly excels in both conceptualizing and producing.”
The latest issue of Scientific Computing has a great article from some NASA researchers on analyzing and visualizing airflow around landing gear, in hopes of redesigning it to reduce vibration and “aeroacoustic” effects (e.g., loud rumbling). If you view their digital magazine version, you can see some movies of their visualizations.
To generate the flow animations presented here required saving a small portion (12,000 snapshots or time steps) of the flow simulation record. With each snapshot resulting in a file size on the order of 4 to 5 gigabytes, the total time record saved is in excess of 50 to 70 terabytes of data. Although such an aggregated file size is not excessively large by today’s standards, it is still too large for routine visualization of the results. The push toward much larger simulations (a nose gear computation on a grid twice as large as the current grid is ongoing) precludes relying on traditional methods for post-processing of CFD data; that is, saving the volumetric information at each time step for analysis at a later time, as these are highly inefficient and no longer practical. Such large datasets demand concurrent real-time simulation, analysis and visualization of the flow field without the need to save countless terabytes of information that would soon tax the storage capacity of even the largest supercomputers.
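The dataset-size arithmetic in that excerpt is easy to reproduce (a rough sketch using only the quoted numbers; decimal units assumed):

```python
# Rough reconstruction of the storage arithmetic quoted above.
snapshots = 12_000        # time steps saved from the simulation
gb_low, gb_high = 4, 5    # "4 to 5 gigabytes" per snapshot file

tb_low = snapshots * gb_low / 1000    # decimal terabytes
tb_high = snapshots * gb_high / 1000
print(f"total record: {tb_low:.0f} to {tb_high:.0f} TB")
# 48 to 60 TB -- the same order of magnitude as the "50 to 70
# terabytes" quoted in the article.
```

And remember, that is only the small saved portion of the run; the full simulation generates far more data than this, which is why the authors argue for concurrent in-situ analysis rather than post-processing.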
Scientific visualization of high-fidelity, large-scale flow simulations such as these has become an indispensable tool for providing global insights and knowledge that enable the development of viable engineering solutions to pressing environmental issues affecting the public good. The landing gear simulations, for example, together with those from other disciplines relevant to aircraft design, will soon be used to help develop a new breed of subsonic aircraft that will not only reduce noise pollution, but will burn less fuel and produce fewer harmful emissions — all to improve life on our planet.
Earlier today (2:41 AM EDT), the sun kicked up a massive solar flare, and luckily NASA’s Solar Dynamics Observatory caught the whole thing on film.
NASA’s Solar Dynamics Observatory spacecraft caught high-definition video of the flare in different wavelengths. The event registered as a Class M-2 solar flare, which is a medium-class sun storm that should not pose a danger to satellites or infrastructure on Earth.
The images are beautiful, but tomorrow this could wreak a little havoc when the resulting magnetic wave hits Earth.
I think everyone knows by now that the amazing images NASA shows from the Hubble Telescope are actually composites made from dozens, sometimes hundreds, of images. In a rare behind-the-scenes look, NASA has released a timelapse of someone doing the work in the greatest of all image editors, Photoshop.
Hubble images are made, not born. Images must be woven together from the incoming data from the cameras, cleaned up and given colors that bring out features that eyes would otherwise miss. In this video from HubbleSite.org, online home of the Hubble Space Telescope, a Hubble-imaged galaxy comes together on the screen at super-fast speed.