We’ve talked about ARhrrr from Graz University of Technology and Georgia Tech before, but the team recently gave another live demonstration at the NVIDIA GTC and revealed some details of how it works on NVIDIA’s Tegra hardware.

The ARhrrr pipeline is straightforward. First, the camera uses OpenMAX (a multimedia acceleration API) to capture frames containing the AR tracking target. Second, the GPU wraps each frame as a 640×480 EGLImage texture (via the OpenGL|ES 2.0 API), downsizes it on the GPU to 320×240 at 8 bpp, and hands the result to the CPU, where the AR system (an NFT, natural-feature-tracking, tracker) computes the camera-to-world transformation.
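To make the downsizing step concrete, here is a minimal CPU-side sketch of what the GPU pass conceptually does: average each 2×2 block of the 640×480 RGB frame and convert it to an 8 bpp grayscale image for the tracker. This is an illustration only; the function name, luma weights, and plain-Python data layout are my assumptions, and the real pipeline performs this in a shader on the Tegra GPU.

```python
# Illustrative CPU reimplementation of the GPU downsampling step
# (assumption: the real version is an OpenGL|ES 2.0 shader pass).

def downscale_to_gray(frame):
    """Average each 2x2 RGB block and convert to 8-bit grayscale.

    `frame` is a list of rows, each a list of (r, g, b) byte tuples.
    Returns a half-resolution image as a list of rows of 0-255 ints.
    """
    height, width = len(frame), len(frame[0])
    out = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            # Average the four pixels of the 2x2 block, per channel.
            r = sum(frame[y + dy][x + dx][0] for dy in (0, 1) for dx in (0, 1)) / 4
            g = sum(frame[y + dy][x + dx][1] for dy in (0, 1) for dx in (0, 1)) / 4
            b = sum(frame[y + dy][x + dx][2] for dy in (0, 1) for dx in (0, 1)) / 4
            # Rec. 601 luma weights, clamped to a single byte (8 bpp).
            row.append(min(255, round(0.299 * r + 0.587 * g + 0.114 * b)))
        out.append(row)
    return out

# A 640x480 camera frame downsizes to the 320x240 image fed to the tracker.
frame = [[(128, 128, 128)] * 640 for _ in range(480)]
gray = downscale_to_gray(frame)
```

Doing this on the GPU keeps the full-rate camera stream off the CPU: the tracker only ever touches a quarter-size, single-channel image.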

Third, the game engine computes the 3D game objects for the frame and sends them back to the GPU, which renders the final image by compositing the camera stream with those objects. As you can imagine, the amount of video processing involved is significant.
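The compositing step amounts to drawing the rendered 3D layer over the live camera frame. A minimal sketch, assuming a straightforward per-pixel "over" alpha blend (the function name and plain-Python data layout are mine; on the device this happens in the GPU's render pass):

```python
# Illustrative "over" composite: rendered RGBA 3D layer over the RGB camera frame.

def composite_over(camera, overlay):
    """Alpha-blend `overlay` (rows of (r, g, b, a) tuples, the rendered
    3D objects) over `camera` (rows of (r, g, b) tuples). Returns RGB rows."""
    out = []
    for cam_row, ov_row in zip(camera, overlay):
        row = []
        for (cr, cg, cb), (orr, og, ob, oa) in zip(cam_row, ov_row):
            a = oa / 255.0  # overlay opacity: 0 = show camera, 1 = show object
            row.append((round(orr * a + cr * (1 - a)),
                        round(og * a + cg * (1 - a)),
                        round(ob * a + cb * (1 - a))))
        out.append(row)
    return out

# Fully opaque overlay pixels replace the camera; transparent ones pass it through.
camera = [[(10, 20, 30), (10, 20, 30)]]
overlay = [[(255, 0, 0, 255), (255, 0, 0, 0)]]
final = composite_over(camera, overlay)
```

Note that only this final blend needs the full-resolution camera image, so it never has to leave the GPU.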

Still no word on when, or if, it will ever come to a playable device. I can hardly wait to start blasting some zombies.

via Zombies hit nVidia’s Tegra thanks to Augmented Reality – Bright Side Of News*.