Of course, GTC is going on right now, and there's plenty of discussion from vendors about the future of GPU computing and technology. Today at 10:30 PT (12:30 Central), NVidia CEO Jen-Hsun Huang will present the keynote and, from what I hear, make some major announcements.
This week is the NVidia GPU Technology Conference, and NVidia is kicking it off with a huge announcement of interest to anyone doing GPU development: the world’s first Eclipse-based Integrated Development Environment that supports Linux and MacOS development. For the last several years, Linux GPU programmers have had to make do with basic command-line tools while the Windows world delighted in tools like Visual Studio and NSight. With this new development, CUDA and NVidia GPU programming is now on even (or at least closer to even) footing.
“Previously, debugging required dedicated systems that were often expensive and time consuming to configure,” said Tony Tamasi, senior vice president of content and technology at NVIDIA. “Now, any system with an NVIDIA GPU that supports debugging can be used without any additional cost or system upgrades, resulting in significant cost and time savings.”
Get a free demo on the show floor or at www.nvidia.com/paralleldeveloper. Get the full press release after the break.
It’s only 3 months until the next GTC2012 in San Jose, and the website is now online with registration details, an agenda, and much more.
GTC 2012 will feature hundreds of hours of technical sessions, tutorials, panels, and moderated roundtables, presented by senior programmers, researchers and thought leaders from across a broad range of fields. Explore the growing list of confirmed sessions on the sessions page.
Be sure to check out their “Convince your Manager Toolkit” as well, if you think you’ll need some help getting work to expense your trip.
via Agenda – NVIDIA.
NVidia’s getting a nice 12-month head start on GTC2012 with an announcement that the show will be colocated with LANL’s Accelerated High Performance Computing Symposium next May in the San Jose McEnery Convention Center. Of course, if you can’t wait, then you can always review the GTC2010 information on their brand new year-round site.
A new online resource for attendees of GTC 2012, and the entire GPU computing community, also went live today at www.gputechconf.com. The site is a year-round resource, featuring details of all keynotes, technical sessions and events, conference scheduling tools, social media resources and much more.
Get all the details on their site, and the details on some upcoming webinars.
NVidia has just announced that GTC2011 will be this October in the San Jose McEnery Convention Center, with several of last year’s sponsors and exhibitors already lining up for a shot. And to make this year’s event even bigger, it will be colocated with the Accelerated High Performance Computing Symposium sponsored by Los Alamos National Laboratory.
A leading U.S. national security research institution, Los Alamos National Laboratory has been hosting the Accelerated HPC Symposium as a stand-alone event with the goal of bringing together world leaders in supercomputing to share knowledge and help solve the world’s most crucial technology challenges. This event will now take place during GTC 2011, and will be co-hosted by Los Alamos National Lab and NVIDIA.
Ben Bergen, research scientist at Los Alamos National Laboratory, said, “The growing success of GTC makes it a natural venue for co-hosting the Accelerated HPC Symposium. This event draws senior scientists from national research labs across the globe, and their interests in hardware and software development make for a perfect match with GTC.”
via NVIDIA Newsroom.
Wanted to go to GTC but couldn’t? Never fret, NVidia comes to your rescue with just about every single presentation, keynote, and breakout available online for download or streaming. Get the PDFs, the videos, and more at this massive page on NVidia’s site.
Just hit up www.nvidia.com/gtc2010-content and let the knowledge wash over you, as the 300 hours of content fries your brain.
Last month we published our wrap-up of NVIDIA’s GPU Technology Conference. VizWorld covered the opening keynote speech from Jen-Hsun Huang, the announcement of the iray Realtime Raytracer for 3dsMax, the announcement of CUDA-x86, and especially the new NVIDIA Roadmap for Kepler and Maxwell. If you want to see our full list of articles, you can click on the gtc tag below.
Similarly, AnandTech has posted their NVIDIA GPU Technology Conference wrap-up on their site, and it is worth reading as well.
Today we’re wrapping up our coverage of last month’s NVIDIA GPU Technology Conference, including the show’s exhibit hall. We came to GTC to get a better grasp on just where things are for NVIDIA’s still-fledgling GPU compute efforts along with the wider industry as a whole, and we didn’t leave disappointed. Besides seeing some interesting demos – including the closest thing you’ll see to a holodeck in 2010 – we had a chance to talk to Adobe, Microsoft, Cyberlink, and others about where they see GPU computing going in the next couple of years. The GPU-centric future as NVIDIA envisioned it may be taking a bit longer than we hoped, but it looks like we may finally be turning the corner on when GPU computing breaks into more than just the High Performance Computing space.
If you’ve ever seen or worked much in video production, you’ll quickly find yourself overwhelmed by the sheer volume of equipment involved. What you’d think should be a simple process of translating a video from one physical format to another turns into a lengthy, drawn-out process involving hundreds of thousands (sometimes millions) of dollars of hardware dedicated to reading formats, scaling, transcoding, and writing back. It’s not uncommon for a studio preparing its analog reels for digital distribution to find itself overwhelmed by the process. Imagine this:
- Digitize the film into the computer
- Clean it up (removing dust, scratches, and various other side-effects of the analog space)
- Pipe it back out onto an Analog line to some unknown device
- Unknown device scales it to the desired format (DVD, BluRay, iPad, iPod, NetFlix, Hulu, etc) which means physically scaling it to the right resolution and possibly retiming it for the proper framerate
- Unknown device pumps it back into the computer
- Computer transcodes it into the proper container format
And ta-da, you’re done! The entire process took three analog transfers (from film, to device, and back from device), and has to be repeated (probably using a different specialized piece of conversion equipment) for each of the many digital distribution formats in use. Just off the top of my head there’s DVD, BluRay, SD television, HD television, NetFlix, Hulu, and the various digital cinema distribution channels in use.
It really is a nightmare of a system. But a solution is coming, courtesy of a small company presenting at GTC called ‘Cinnafilm’. They’ve created a pure digital system capable of everything the analog system did, and doing it faster.
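To make the contrast concrete, here’s a rough sketch of what an all-digital pass looks like. This is not Cinnafilm’s actual pipeline: the file names, resolution, and codec settings below are invented, and the widely available ffmpeg tool stands in for their system. The point is that the scale, retime, and transcode steps above collapse into a single digital hop per target format:

```shell
# Hypothetical all-digital pipeline: once the footage is digitized and
# cleaned up, there is no analog round trip. Scaling, retiming, and
# transcoding happen in one pass. All names/settings are illustrative.
SRC=cleaned_scan.mov   # the digitized, dust-busted master (made up)

# Scale to 1080p, retime to NTSC film rate, transcode to H.264 --
# one command instead of three analog transfers.
CMD="ffmpeg -i $SRC -vf scale=1920:1080,fps=24000/1001 -c:v libx264 -crf 18 bluray_master.mp4"
echo "$CMD"
```

Each additional distribution format (DVD, iPad, streaming, etc.) would just be another invocation with different scale, framerate, and codec arguments, rather than another pass through dedicated conversion hardware.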
During Nvidia’s GPU Technology Conference last week, Randall took some photos of the conference posters and placed them on VizWorld for you to view. Now Nvidia has taken all the conference posters and made them available online as PDFs. I could not find my favorite one online, so it is the feature image to the right. Can you imagine? A whole room full of monitors on which to visualize your data. Or maybe it’s nothing more than a tour stop. Either way, it is cool.
Now that GTC has come to a close, I wanted to look back and report on how the event went. Overall, it was probably one of the best technical conferences I’ve ever been to. Several things contributed to this, so let me break it down.
The exhibition hall
SuperComputing, SIGGRAPH, VisWeek, and pretty much every other conference I’ve ever been to that contained a vendor display area or exhibition hall suffers from the same problem: maintaining a useful concentration of people in both the exhibits and the technical tracks. At every show it’s the same story: the one interesting talk or keynote begins, and the exhibition hall is deserted save for the vendors who must man their booths while waiting for the one latecomer to arrive and troll the floor; then the talk ends, the exhibition floor floods with people, and presenters find themselves talking to empty rooms. It devalues the entire event and frustrates vendors and presenters alike.
GTC solves this problem in an unusual way. I’ve seen other smaller conferences tackle this problem similarly, but never as successfully as NVidia did. The exhibition hall was only open from noon to 2pm and again from 6 to 8pm, and during both times food and drink were served. The rest of the time it was locked. Having the food there saves attendees from wandering out in search of meals and gets everyone to at least look around at the vendors. Vendors have the opportunity to venture out and check out some talks, and attendees don’t feel any pressure to choose between food, vendors, or technical papers.
It did have a downside, though. Since the hall was only open for two hours at a time, some of the more popular booths were swarmed with people, all hoping to get some hands-on time with the latest gadget. I’m sure vendors loved all the attention, but it made some booths a bit crowded. Also, the few technical talks on the last day fell into the trap of “If you want to know more, feel free to stop by our booth and…oh wait” as they realized the exhibition hall was closed and wouldn’t be opening again.
The technical talks

At most conferences, the technical talks fall into one of four types:

- Type 1: presenting some proprietary technology that, while really kewl, you’ll never see. This frequently comes from the ILMs and Pixars of the world.
- Type 2: presenting some piece of code or algorithm that, while fascinating, isn’t publicly available in any form a normal person could use. It exists only as an obscure library or a pile of MATLAB scripts, completely lacking any documentation.
- Type 3: presentations so confoundingly technical that you need a PhD even to stand in the room.
- Type 4: presentations of readily available and useful technology.
While some of the first three are to be expected, even desirable, all too often the number of “Type 4” talks is so small they can be counted on one hand.
GTC pleasantly flips the usual ratio, and is almost entirely these real-world uses and products. The few algorithmic and pie-in-the-sky talks were easy to find, and in just the right balance to show you what’s possible today and what’s on the horizon.
The big difference: the focus
Perhaps the biggest difference between GTC and a conference like VisWeek is that GTC is not a profit machine, and NVidia shows no signs of making it one. You’re not hounded with coupons, discounts, and sales pitches. The registration packet had a coffee mug, the proceedings, and a few pages on after-events like the charity dinner Thursday night. A sharp contrast to the 20-pound bags you get from other conferences, stuffed with free magazines and postcards.
The conference is not run to push sales or increase enrollment, but rather to push CUDA and GPGPU technology. The fact that NVidia is such an engineering-heavy company means they know exactly what’s important in a conference and stop there. They know the big three of every geek: free food, cool gadgets, and minimal effort. The RFID badges made sure every technical talk had plenty of room (no standing in the back, juggling your laptop and proceedings), and NVidia brought some serious projection and display technology to the event, making sure every single room had a massive screen that was crystal clear from anywhere in the room.
NVidia knows that the value of the conference is the information, not the sales. Pushing the edge of GPU technology isn’t just good for the industry, it’s good for NVidia too, so they don’t need to gouge you for wifi (free), food (free), drinks (free), or registration (not free, but have you compared it to anything else? Cheap).
This was my first GTC, but only the most recent in a long series of technical conferences I’ve attended. I have to admit it was the best organized, most entertaining, and most educational conference I’ve attended in almost a decade. With over 250 hours of technical talks, and roughly 2,000 attendees (my own informal estimate), it had the kind of small, informal atmosphere that makes these conferences great.
GTC2011 can’t get here soon enough.
Note: Some of these photos come from NVidia’s Flickr Photostream.