NVIDIA Announces iray Real-Time Raytracer for 3ds Max
On stage at GTC 2010, Ken Pimentel (Autodesk), Michael Kaplan (mental images), and Jen-Hsun Huang (NVIDIA) stood together to announce the next-generation rendering technology they’ve integrated. They showed a simple scene of a few chairs around a table that had been rendering for over an hour and was only about 10% complete by the time they took the stage. The slow progress was mainly due to the fact that the scene has no direct lighting: all light enters indirectly through a set of double-glazed windows in the back and then bounces off the various elements in the scene. They then paused the render to walk through the various Render panels, showing the details of the Final Gather and illumination settings, and how many options they have had to include to control the various shortcuts.
Starting next week, Subscription owners will be able to download iray, mental images’ real-time GPU-accelerated ray-tracing solution, for use directly within 3ds Max as a new renderer. The result is a far simpler control panel, far faster renders, and vastly more accurate final images. On the right you can see the control panel (click for a larger size), and you can see how trivial the new setup is.
After that, they clicked the “Render” button. It took approximately 10 seconds to translate and prepare the scene in the proper format, and then within seconds a near-complete render was visible. iray is an iterative renderer, so the longer you let it run, the better the result will be; however, the initial result is usually good enough (to tell that something is wrong, needs to be corrected, etc.), making the iterative workflow MUCH faster.
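For readers unfamiliar with progressive renderers, the “keeps improving the longer it runs” behavior comes from accumulating more and more light samples per pixel and displaying the running average, which converges toward the true value as noise averages out. Here is a toy sketch of that idea; the `sample_radiance` function is a made-up stand-in for tracing one light path through a scene, not iray’s actual API:

```python
import random

def sample_radiance(rng):
    # Hypothetical stand-in for tracing one light path: a noisy
    # estimate scattered around a "true" pixel brightness of 0.5.
    return 0.5 + rng.uniform(-0.4, 0.4)

def progressive_render(iterations, seed=1):
    """Accumulate independent samples and keep the running mean,
    which is what a progressive renderer displays at each step."""
    rng = random.Random(seed)
    total = 0.0
    estimates = []
    for i in range(1, iterations + 1):
        total += sample_radiance(rng)
        estimates.append(total / i)  # current displayed pixel value
    return estimates

est = progressive_render(10000)
# Early entries in `est` are noisy; later ones settle near 0.5.
```

This is why the first seconds of an iray render are already usable for judging a scene: the estimate is unbiased from the start, and further iterations only reduce the remaining noise.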
This will be available to 3ds Max Subscription members next week.
Next, they showed the “cloud rendering” capabilities on stage, hosted by Peer 1 Hosting, demonstrating how iray running on 32 GPUs delivers interactive real-time rendering. Beyond simply producing the image faster, this lets the user actually “walk” around the scene and see it rendered interactively as they move. Via a custom web interface, they could then place furniture and art within the room, all rendered in real time. And since all the lighting in the scene is indirect daylight, they could change the time of day to see completely different lighting environments.
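Part of why this scales so well across GPUs is that independent light samples can be computed on separate devices and merged afterwards: 32 GPUs each tracing N paths give the same image quality as one GPU tracing 32×N paths. A toy sketch of that merge step, where the device count, per-device sample count, and the noisy `worker_average` are all invented for illustration:

```python
import random

def worker_average(num_samples, seed):
    """One hypothetical GPU: average its own batch of independent
    light samples (faked here as noise around a true value of 0.5)."""
    rng = random.Random(seed)
    return sum(0.5 + rng.uniform(-0.4, 0.4)
               for _ in range(num_samples)) / num_samples

def combine(partials):
    """Merge per-GPU results with a sample-count-weighted average."""
    total = sum(n for n, _ in partials)
    return sum(n * mean for n, mean in partials) / total

num_gpus, per_gpu = 32, 250
partials = [(per_gpu, worker_average(per_gpu, seed=g))
            for g in range(num_gpus)]
estimate = combine(partials)  # same quality as 8000 samples on one device
```

Because the merge is just a weighted average, adding GPUs shortens the time to reach a given noise level almost linearly, which is what makes the walk-around interactivity feasible.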
The web interface is still a research project, and I’ve seen it several times before, but the “official” presentation is new.