About two weeks ago, news made a splash all over the internet that NVidia wasn’t going to implement DirectX11, shunning and infuriating video gamers around the world.  This came from the following quote made by Mike Hara at an analysts’ conference:

“DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons.”

I’m sure that NVidia will implement DirectX11, probably as soon as it’s available.  The focus of his comment was that DirectX and video games are not the powerful driving force behind video cards that they once were.  In this feature I take an in-depth look at what NVidia is up to and what recent changes could mean for the gaming industry, and come up with three main reasons why DirectX is not the industry driver it once was.

Embedded vs Standalone

For years, NVidia made its money by licensing its GPUs to other companies like BFG and PNY, who integrated them into standalone video cards.  A rapidly growing market has emerged, though, for embedded systems, driven by the Atom, the Ion, and the expanding smartphone market.  These systems don’t have the space, power, or cooling budget for a classic PCI-Express graphics card, and that has led to a boom in embedded graphics.

NVidia’s newest products specifically target this space: ION and Tegra.  If you haven’t heard, the NVidia ION (ION Wikipedia Page) is an integrated motherboard/graphics chipset solution that pairs the GeForce 9400M with the Intel Atom processor in an extremely tiny form factor, making it perfect for laptops and set-top boxes.  It’s been certified by Microsoft as Vista Capable, and is slowly cropping up in more and more netbooks.  The Tegra (Tegra Wikipedia Page) is a unique processor design from NVidia that combines an ARM processor with extra modules for graphics and video decoding, perfectly suited for tiny embedded systems.  The widest deployment of the Tegra to date is in Microsoft’s Zune HD.

As netbooks and mobile devices grow in popularity, these embedded systems will become a more significant part of NVidia’s bottom line.

DirectX vs OpenGL?

Over the last several years, OpenGL has been in slow decline.  It’s still the foundation of graphics technology on *nix and Macs, and it’s still used by plenty of scientific visualization programs.  But if you’re on Windows (like most computer users in the world) playing games (like so many people do), then you’re using DirectX.  DirectX has grown into a powerful collection of libraries combining input (joysticks, mice), network connectivity, graphics, and sound in one nice, easy package.

OpenGL hasn’t been sitting still, though.  While DirectX remains solid in the Windows world, a whole new market has appeared on the horizon.  Mobile devices like the iPhone have begun to redefine the video gaming market with their powerful feature sets and slick interfaces.  After all, why buy a portable game player for $150 (Nintendo DSi) when you could buy a combination phone, game player, music player, and internet browser for $99 (iPhone 3G 8GB)?  No doubt the embedded device market will continue to grow, and these devices don’t run DirectX; they run OpenGL derivatives like MiniGL and OpenGL|ES.

And another reason to think that DirectX11 won’t be the “big splash” some expect is to simply look at DirectX10.  DirectX10 was released exclusively for Vista, much like DirectX11 will be released exclusively for Windows7.  Vista was a flop for many reasons, most notably the included DRM, but also because users simply had little reason to upgrade.  The big game provoking people to upgrade to Vista was Crysis.  Crysis, the “sequel” to FarCry, came in both a DirectX10 (Vista) version and a DirectX9 (XP) version, but the best graphics were only available on DirectX10.  Sites like GameSpot published detailed comparisons of the graphics quality between the two versions.

But what many people found was that the graphics didn’t really improve the game any.  Yes, sand and rocks looked more realistic, but you know what that meant?  Nothing.  Enemies reacted the same, stuff blew up the same, it was the same game.  So you had two options:

  • Drop a few grand on a whole new system with a DX10-compatible video card, plus a copy of Vista.
  • Keep all your existing equipment, and go buy the $50 game.

Most people opted for the cheaper alternative, and just bought the $50 game and played it on XP.

A similar situation is brewing for Windows7.  The reviews so far are very positive, but only time will tell if customers are willing to fork out the money to actually upgrade.  Surfing the web and checking email works the same either way, and given the state of the world economy right now, how many people will actually commit?

As more and more people drop hardcore PC gaming for casual (web-based, Flash, etc.) games and pick up consoles or mobile devices, DirectX11 is losing ground to a resurgence in OpenGL.

Gaming vs HPC?

Also, games alone don’t define the GPU market anymore.  Increasingly, users (and software developers) are taking advantage of GPU performance for non-graphics tasks like video encoding and physics simulations.  The massively parallel nature of the GPU fits nicely with a trend that has been growing in HPC for several years: parallelizing codes for cluster operation.  Instead of buying a 200-core supercomputer, researchers can simply buy a single GPU and see almost identical performance boosts (in some codes).
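
To make the “massively parallel” point concrete, here’s a minimal CUDA sketch of the classic SAXPY operation (y = a*x + y).  It’s only an illustration of the idea, not code from any particular HPC project: the serial CPU version walks a million elements in a loop, while the GPU version hands each element to its own thread.

    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    // Serial version: one CPU core walks the entire array.
    // (Shown only for comparison; not called below.)
    void saxpy_cpu(int n, float a, const float* x, float* y) {
        for (int i = 0; i < n; ++i)
            y[i] = a * x[i] + y[i];
    }

    // CUDA version: the loop disappears; each GPU thread owns one element.
    __global__ void saxpy_gpu(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;                      // one million elements
        std::vector<float> x(n, 1.0f), y(n, 2.0f);

        // Allocate device memory and copy the inputs over.
        float *dx, *dy;
        cudaMalloc(&dx, n * sizeof(float));
        cudaMalloc(&dy, n * sizeof(float));
        cudaMemcpy(dx, x.data(), n * sizeof(float), cudaMemcpyHostToDevice);
        cudaMemcpy(dy, y.data(), n * sizeof(float), cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover every element.
        int threads = 256;
        int blocks  = (n + threads - 1) / threads;
        saxpy_gpu<<<blocks, threads>>>(n, 2.0f, dx, dy);
        cudaDeviceSynchronize();

        // Copy the result back and spot-check it (2*1 + 2 = 4).
        cudaMemcpy(y.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);
        printf("y[0] = %f\n", y[0]);

        cudaFree(dx);
        cudaFree(dy);
        return 0;
    }

The interesting part is that the GPU version scales across thousands of threads without any extra code, which is exactly why the same hardware that renders games maps so naturally onto data-parallel HPC workloads.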

This is opening up a whole new market for AMD and NVidia, one NVidia is taking huge advantage of with its CUDA toolkit.  CUDA has been around for a few years now and heavily influenced the now open-standard OpenCL, whose name bears a striking (and intentional) resemblance to OpenGL.  Research institutes, laboratories, and supercomputing centers are buying massive quantities of graphics cards for use in massively parallel codes, making them more than a minor percentage of NVidia’s sales.  As the PC gaming market shrinks, both from a shift in focus and from the recession, NVidia is looking for new markets to make up the difference, and the HPC market practically fell into its lap.
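
As a small illustration of how those cards look from the software side, here’s another hedged sketch using nothing but standard CUDA runtime calls (not any site’s actual tooling): it enumerates the GPUs in a machine roughly the way a cluster operator might inventory the compute resources on a node.

    #include <cstdio>
    #include <cuda_runtime.h>

    // List every CUDA-capable GPU in this machine and its key compute specs.
    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        printf("Found %d CUDA device(s)\n", count);

        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            printf("Device %d: %s, %d multiprocessors, %zu MB memory, "
                   "compute capability %d.%d\n",
                   i, prop.name, prop.multiProcessorCount,
                   prop.totalGlobalMem / (1024 * 1024), prop.major, prop.minor);
        }
        return 0;
    }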

Now Fermi breaks onto the scene, and it looks even more like HPC will become a major focus.  I think it’s a safe bet that Fermi will render some awesome graphics, but no doubt it’s targeted specifically at people using the GPU for computational work.  For those users, Fermi could become as attractive a resource as the Cell processor was a few years ago, which would make it even more profitable for NVidia.