About two weeks ago, news made a splash all over the internet that NVidia wasn’t going to implement DirectX11, shunning and infuriating video gamers around the world. This came from the following quote by Mike Hara at an analysts’ conference:
“DirectX 11 by itself is not going be the defining reason to buy a new GPU. It will be one of the reasons.”
I’m sure that NVidia will implement DirectX11, probably as soon as it’s available. The focus of his comment was that DirectX and video games are not the powerful driving force behind video cards that they once were. In this feature I take an in-depth look at what NVidia is up to and what recent changes could mean for the gaming industry, and come up with three main reasons why DirectX is not the industry driver it once was.
Embedded vs Standalone
For years, NVidia made its money by licensing its GPUs to companies like BFG and PNY, which integrated them into video cards. A new market is growing rapidly, though, for embedded systems, powered by the Atom, the Ion, and the growing smartphone market. These systems don’t have the space, power, or cooling budget for a classic PCI-Express graphics card, and that has led to a boom in embedded graphics.
NVidia’s newest products specifically target this space: ION and Tegra. If you haven’t heard, the NVidia ION (ION Wikipedia Page) is an integrated motherboard/graphics chipset solution from NVidia that pairs the Geforce 9400M with the Intel Atom processor in an extremely tiny form factor, making it perfect for laptops and set-top boxes. It’s been certified by Microsoft as Vista Capable, and is slowly cropping up in more and more netbooks. The Tegra (Tegra Wikipedia Page) is a unique processor design from NVidia that combines an ARM processor with extra modules for graphics and video decoding, making it perfectly suited to tiny embedded systems. The widest deployment of the Tegra to date is in Microsoft’s Zune HD.
As netbooks and mobile devices grow, the need for these embedded systems will become a more significant part of NVidia’s bottom line.
DirectX vs OpenGL?
Over the last several years, OpenGL has been in slow decline. It’s still the foundation of graphics technology on *nix and Macs, and still used by several scientific visualization programs. But if you’re on Windows (like most computer users in the world) playing games (like so many people), then you’re using DirectX. DirectX has really grown into a powerful collection of libraries combining input (joysticks, mice), network connectivity, graphics, and sound all in one nice, easy package.
OpenGL hasn’t been sitting still, though. While DirectX remains solid in the Windows world, a whole new market has arisen on the horizon. Mobile devices like the iPhone have begun to redefine the video gaming market through their powerful feature sets and slick interfaces. After all, why buy a portable game player for $150 (Nintendo DSi) when you could buy a combination phone, game player, music player, and internet browser for $99 (iPhone 3G 8GB)? No doubt the embedded device market will continue to grow, and these devices don’t run DirectX; they run OpenGL derivatives like MiniGL and OpenGL ES.
Another reason to think DirectX11 won’t make the “big splash” some expect is simply to look at DirectX10. DirectX10 was released exclusively for Vista, much as DirectX11 will be released exclusively for Windows 7. Microsoft Vista was a flop for many reasons, most notably the included DRM, but also because users simply had little reason to upgrade. The big game provoking people to upgrade to Vista was Crysis. Crysis, the “sequel” to Far Cry, came in both a DirectX10 (Vista) version and a DirectX9 (XP) version, but the best graphics were only available on DirectX10. Sites like GameSpot ran detailed comparisons of the graphics quality between the two versions.
But what many people found was that the better graphics didn’t really improve the game. Yes, sand and rocks looked more realistic, but you know what that meant? Nothing. Enemies reacted the same, stuff blew up the same; it was the same game. So you had two options:
- Go drop a few grand on a new computer with a DX10-compatible video card, buy Vista, and build a whole new system.
- Keep all your existing equipment, and go buy the $50 game.
Most people opted for the cheaper alternative, and just bought the $50 game and played it on XP.
A similar situation is brewing for Windows 7. The reviews so far are very positive, but only time will tell whether customers are willing to fork out the money to actually upgrade. Surfing the web and checking email is the same either way, so, especially given the world’s current economic situation, how many people will actually commit?
As more and more people drop hardcore PC gaming for more leisurely (web-based, Flash, etc.) games and pick up consoles or mobile devices, DirectX11 is losing ground to a resurgence in OpenGL.
Gaming vs HPC?
Also, games alone don’t define the GPU market anymore. Increasingly, users (and software developers) are taking advantage of GPU performance for non-graphics tasks like video encoding and physics simulations. The massively parallel nature of the GPU fits nicely with a trend that has been growing in HPC for several years: parallelizing codes for cluster operation. Instead of buying a 200-core supercomputer, they can simply buy a single GPU and see almost identical performance boosts (in some codes).
This is opening a whole new market for AMD and NVidia, one NVidia is taking huge advantage of with its CUDA toolkit. CUDA has been around for years and helped lay the groundwork for the now open-standard OpenCL, whose name bears a striking (and intentional) resemblance to OpenGL. Research institutes, laboratories, and supercomputing centers are buying massive quantities of graphics cards for use in massively parallel codes, making them more than a minor percentage of NVidia’s sales. As the PC gaming market shrinks, both from a shift in focus and from the recession, NVidia is looking for new markets to branch into to make up the difference, and the HPC market practically fell into its lap.
Now Fermi breaks onto the scene, and it looks even more like HPC will become a major focus. I think it’s a safe bet that Fermi will render some awesome graphics, but there’s no doubt it’s targeted specifically at people using the GPU for computational work. Those people will make Fermi as attractive a resource as the Cell was a few years ago, which will make it even more profitable for NVidia.
I wonder how well Wolfenstein, UT3, FEAR 2, and other console-led games have done.
Here are a few links about who is really trying their best to destroy PC gaming. Here is the real truth:
http://www.shacknews.com/laryn.x?story=60207
http://news.bigdownload.com/2009/07/12/has-microsoft-killed-pc-gaming-in-retail-stores-one-site-says/
http://www.shacknews.com/onearticle.x/53403
http://downloadablesuicide.com/2009/07/16/pc-gaming-its-problems-stem-from-mistreatment/
http://www.edge-online.com/news/microsoft-pc-gaming-still-a-priority
http://www.edge-online.com/features/is-microsoft-really-committed-pc
Microsoft’s game devs that spread lies about PC gaming:
http://kotaku.com/5058402/molyneux-says-pc-gaming-market-in-tatters
http://www.tgdaily.com/content/view/36390/118/
http://www.wired.com/gamelife/2008/02/gears-of-war-cr/
http://www.digitalbattle.com/2008/09/30/why-gears-of-war-2-isnt-on-pc-piracy/
Yeah, NVidia’s stock went crashing down after Crysis. Meanwhile AMD, who has 80% of their hardware in consoles, was going bankrupt! Yeah, it’s pretty sad; AMD had to go steal money from Intel to stay afloat!
Yeah, those consoles sure make a lot of money. Hahaha, what a joke!
I hear Microsoft and Sony like big bright red numbers. The only company that knows its place in the food chain is Nintendo.
Be careful where you get your info; this site is full of it.
High production costs are killing game devs on consoles.
Yeah, Sony and Microsoft have also lost billions on consoles.
Then you have the gaming press paid by Microsoft to trash-talk PC gamers and PC gaming.
Let’s take a look at Crysis, shall we?
http://www.youtube.com/watch?v=gyQTCeobZlg
Yeah, let’s not forget the consoles’ failure rates; both Sony’s and Microsoft’s failure rates are at 54%.
Here is the state of console gaming:
http://bits.blogs.nytimes.com/2009/03/31/video-game-makers-seeing-red/
http://www.istockanalyst.com/article/viewarticle/articleid/3277020
http://www.edge-online.com/news/japanese-ps3-sales-tumble-ahead-of-slim-launch
http://www.wired.com/gamelife/2009/06/pachter/
http://www.wired.com/gamelife/2009/05/e3-predictions/
http://business.timesonline.co.uk/tol/business/industry_sectors/retailing/article6628517.ece
http://www.gamesindustry.biz/articles/consoles-could-soon-become-niche-products-playfish
Yeah, thanks for yet another biased, untruthful article.
We all know how much the media hates PC gamers and PC gaming.
The focus of this article wasn’t the demise of PC gaming. It was that PC gaming isn’t the focus of NVidia’s business like it used to be. They’ve begun to branch out into HPC (GPU) computing, consoles, and embedded systems.
I don’t think PC gaming will ever “go away”. In fact, PC gaming usually sets the mark by which the next generation of consoles is measured. But consoles are huge business because they are easier on consumers and developers (hardware uniformity), provided that developers don’t spend all their time cranking out crap sequels to crap games.