TechCrunch brings us news of a new startup called “Lucky Sort,” based out of Portland, Oregon, that aims to bring live data import and analysis to the iPad.
That’s where TopicWatch comes in. With the new service, Lucky Sort’s first product, the company wants to enable users to sift through social media, government filings, news and commentary in real time, in order to find, summarize and analyze any text-based content. To be clear, TopicWatch is not yet another “sentiment analysis” or “social listening” platform – those are just subsets of what can be done on top of its platform.
While their first app, TopicWatch, pulls in social media inputs like Twitter for live analysis, the underlying engine can be mapped to any live input stream, making it an interesting option for those looking to build data analysis tools. I can imagine they’ll be getting a close look from the government intelligence community.
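Lucky Sort hasn’t published implementation details, but the general shape of “live topic analysis over an arbitrary text stream” is easy to picture. Here’s a deliberately simple, hypothetical sketch – rolling keyword counts over a sliding time window, nothing like the company’s actual summarization pipeline:

```python
import re
from collections import Counter, deque
from time import time

STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "is", "it", "for", "on"}

class RollingTopicCounter:
    """Keep keyword counts over a sliding time window of a live text stream.
    (Hypothetical illustration, not Lucky Sort's implementation.)"""

    def __init__(self, window_seconds=300):
        self.window = window_seconds
        self.events = deque()          # (timestamp, [tokens]) pairs
        self.counts = Counter()

    def ingest(self, text, now=None):
        now = now if now is not None else time()
        tokens = [t for t in re.findall(r"[a-z']+", text.lower())
                  if t not in STOPWORDS]
        self.events.append((now, tokens))
        self.counts.update(tokens)
        self._expire(now)

    def _expire(self, now):
        # Drop documents that have fallen out of the time window.
        while self.events and now - self.events[0][0] > self.window:
            _, old_tokens = self.events.popleft()
            self.counts.subtract(old_tokens)

    def top_topics(self, n=10):
        return [(word, count) for word, count in self.counts.most_common(n) if count > 0]
```

A real system would layer topic modeling, entity extraction and summarization on top, but the streaming-and-windowing core looks broadly like this.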
For folks who run large datacenters or server farms, systems monitoring is routine, boring, day-to-day work. Yet there are mountains of data available that are typically forgotten or lost for lack of good analysis and visualization tools. Brendan Gregg has a great writeup of alternative ways to visualize the real-time performance of multiple parameters across multiple systems.
For any given device type (CPUs, disks, network interfaces), and any number of devices (from a single device to a cloud of servers), we’d like to identify the following:
single or multiple devices at 100% utilization
average, minimum and maximum device utilization
device utilization balance (tight or loose distribution)
By including the time domain, we can identify whether utilization is steady or changing, and various finer details. These may include short bursts of high utilization, where it is useful to know the length of the bursts and the interval between them.
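None of those questions requires exotic tooling to answer; the hard part Gregg tackles is visualizing the answers at scale. As a minimal sketch of the underlying summaries (assuming utilization samples have already been collected into a NumPy array, one row per sampling interval and one column per device; all names here are hypothetical):

```python
import numpy as np

def summarize_utilization(util, interval_s=1.0, burst_threshold=0.9):
    """Summarize per-device utilization.

    util: array of shape (num_samples, num_devices), values in [0, 1],
          one row per sampling interval.
    """
    per_device_mean = util.mean(axis=0)

    summary = {
        # Devices pegged at 100% in any interval
        "saturated_devices": np.where((util >= 1.0).any(axis=0))[0],
        # Average, minimum and maximum device utilization
        "mean": per_device_mean.mean(),
        "min": per_device_mean.min(),
        "max": per_device_mean.max(),
        # Balance: small stddev means a tight distribution across devices,
        # large stddev means load is unevenly spread
        "balance_stddev": per_device_mean.std(),
    }

    # Time domain: find bursts of high utilization per device and report
    # how long they last and how far apart they are.
    bursts = {}
    for dev in range(util.shape[1]):
        hot = util[:, dev] >= burst_threshold
        edges = np.diff(hot.astype(int))
        starts = np.where(edges == 1)[0] + 1
        ends = np.where(edges == -1)[0] + 1
        if hot[0]:
            starts = np.insert(starts, 0, 0)
        if hot[-1]:
            ends = np.append(ends, len(hot))
        lengths = (ends - starts) * interval_s
        gaps = (starts[1:] - ends[:-1]) * interval_s if len(starts) > 1 else np.array([])
        bursts[dev] = {"burst_lengths_s": lengths, "burst_gaps_s": gaps}
    summary["bursts"] = bursts
    return summary
```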
Square Enix, the video game publisher behind titles like Final Fantasy and Kingdom Hearts, has been demonstrating a new real-time rendering engine that they claim will deliver what they call “quality live action” graphics. The photos and video they’ve shown are truly impressive.
As seen in the screenshots below, Square Enix is trying to get as close to absolute true fidelity as possible with its DirectX 11-supported platform, known as Luminous.
At the press conference the company displayed source-material photos alongside their corresponding game maps. The engine allowed engineers to build navigable spaces rendered in real time, as shown in the video below.
While the environments look amazing, there’s little to no information on what they’ve done for rendering people. Hopefully the Uncanny Valley problems won’t derail their entire project. Look below for a sadly uninspiring piece of video. Uninspiring until you realize it’s actually rendering in-game in real-time.
The ACM has just put out the call for entries for this year’s SIGGRAPH 2011 Real-Time Live! event, where people submit examples of real-time rendering work and get a chance to demonstrate it live on stage.
Building on its debut at the SIGGRAPH 2009 Computer Animation Festival, Real-Time Live! expects to continue its trend of growth and excellence at SIGGRAPH 2011. We are seeking cutting-edge examples of real-time graphics and simulations, including:
Interactive data visualization and information graphics
As long as the submission is interactively controlled, rendered in real time, and repeatable for a live audience, it will be considered. Accepted work will be demonstrated live on a PC or game console.
Even if you don’t plan to submit anything, make sure to check out last year’s demos in this short roundup video.
A group of researchers demonstrated a powerful stereoscopic 3D photo editor at SIGGRAPH Asia 2010, and the demonstration video is now available online. In the video they first show a nice object classification system that works on stereo pairs, then some amazing cut-and-paste tools they’ve created. The tools can properly correct perspective as you move objects around, properly handle depth and occlusion, and even add shadows via a real-time ambient occlusion algorithm.
The results are impressive, and really have to be seen to be believed. (PS: You’ll want some anaglyph glasses handy to really appreciate the latter half of the video.) Read their paper here.
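The paper has the real algorithms; as a toy illustration of why a disparity map makes “proper depth and occlusion” tractable, here is a hypothetical depth-aware paste for a single view. The actual system also corrects perspective, edits both views consistently and re-renders shadows, none of which this sketch attempts:

```python
import numpy as np

def paste_with_occlusion(target, target_disparity, patch, patch_mask,
                         patch_disparity, top_left):
    """Paste a cut-out patch into one view of a stereo pair, keeping
    occlusion consistent: a pasted pixel only wins where its disparity
    (i.e. nearness to the camera) exceeds what is already in the scene.

    target:            H x W x 3 image (one eye of the pair)
    target_disparity:  H x W disparity map for the target image
    patch:             h x w x 3 cut-out pixels
    patch_mask:        h x w boolean mask of the object
    patch_disparity:   h x w disparity assigned to the pasted object
    top_left:          (row, col) where the patch is dropped
    """
    r, c = top_left
    h, w = patch_mask.shape
    out = target.copy()
    out_disp = target_disparity.copy()

    region_disp = target_disparity[r:r + h, c:c + w]
    # Larger disparity = closer to the viewer, so the patch is visible
    # only where it is nearer than the existing scene content.
    visible = patch_mask & (patch_disparity > region_disp)

    out[r:r + h, c:c + w][visible] = patch[visible]
    out_disp[r:r + h, c:c + w][visible] = patch_disparity[visible]
    return out, out_disp
```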
Scientific publisher Springer has released a collection of free analytics and visualization tools for traffic on its website, showing which publications are the most popular and letting you track how often they are downloaded.
The interactive visualizations include a map showing where the downloads are coming from, a constantly updated keyword tag cloud, and a graphical and textual display of real-time downloads. There’s also a search feature that shows you a chart of the downloads, as well as a “Top Five Most Downloaded” list for every journal and book.
Real Time Tomography has a new product called ‘AdaraGPU’ that uses the power of NVIDIA Quadro GPUs to process and enhance digital mammography images 10x faster than CPU-based processing, improving image quality and reducing time to diagnosis.
“Reducing the time to display an image on the screen will improve throughput in the clinic,” says Dr. Catherine Piccolo, Director of Breast Imaging at South Jersey Radiology Associates in New Jersey. “It will also help improve biopsy procedures and reduce patient discomfort by minimizing the time that they are in compression.”
If you’re at RSNA this week, you can swing by their booth for a live demo of this and their new ‘Briona’ product, real-time 3D image reconstruction software for Digital Breast Tomosynthesis.
With Briona, 3D tomographic images can be reconstructed in real-time from 2D projections at any depth, magnification and angle to the detector. Advanced features include the ability to change parameters dynamically to clinically target the reconstruction, and fast volume rendering. Briona is currently in beta development and for investigational use only.
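Real Time Tomography hasn’t published Briona’s reconstruction method, and doing this at interactive rates is exactly the hard part. For orientation only, the textbook starting point for tomosynthesis is shift-and-add backprojection; a rough, hypothetical sketch of reconstructing a single slice under a parallel-shift approximation:

```python
import numpy as np
from scipy.ndimage import shift

def shift_and_add(projections, angles_deg, depth_mm, detector_pitch_mm):
    """Reconstruct one slice at a given depth above the detector by
    shifting each projection according to its acquisition angle and
    averaging (classic shift-and-add tomosynthesis, not Briona's method).

    projections: sequence of 2D projection images, one per tube angle
    angles_deg:  tube angle for each projection, in degrees
    depth_mm:    height of the reconstructed plane above the detector
    """
    slice_sum = np.zeros_like(projections[0], dtype=np.float64)
    for proj, angle in zip(projections, angles_deg):
        # Lateral displacement of structures at this depth as seen on the
        # detector, converted from millimetres to pixels.
        dx_mm = depth_mm * np.tan(np.radians(angle))
        dx_px = dx_mm / detector_pitch_mm
        slice_sum += shift(proj, (0.0, dx_px), order=1, mode="nearest")
    return slice_sum / len(projections)
```

Structures lying in the chosen plane line up and reinforce each other, while everything above and below is blurred out; changing `depth_mm` re-runs the sum for a different slice, which is why a fast GPU implementation can offer reconstruction “at any depth” interactively.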
An article over on GameDev.net from Dzmitry Malyshau starts off light with a discussion of particle systems and pitches a way to build particle systems that run entirely on the GPU for blinding speed and a huge jump in the number of concurrent particles. He then adapts this to do fur simulation, and finally adds lighting effects. In the end, he has a simulation of 50,000 connected hair strands running in real time at 24fps.
Due to the fact that each fur strand is rendered independently, the vertex processing load may be higher than that of shells for some scenes. However, fur rendering operation is generally fill-rate limited (not counting the artificial test-case). This, combined with Unified Shading Architecture of the current-generation GPUs, balances increased vertex load with an absence of the pixel shader overhead on empty areas.
Get all the gory details and pseudocode at the link below.
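The article’s implementation lives in shaders; as a plain CPU reference of the per-particle work it describes (not Malyshau’s actual code), here is a hedged sketch of one simulation step for a batch of strands, treating each strand as a chain of particles held together by distance constraints:

```python
import numpy as np

def step_strands(pos, prev_pos, roots, segment_len, dt,
                 gravity=(0.0, -9.8, 0.0), damping=0.02, iterations=4):
    """One Verlet step for a batch of fur/hair strands (illustrative sketch).

    pos, prev_pos: (num_strands, points_per_strand, 3) current and previous
                   particle positions (Verlet state).
    roots:         (num_strands, 3) anchor positions on the skin mesh.
    """
    g = np.asarray(gravity)

    # Verlet integration: identical, independent work per particle, which is
    # exactly what maps so well onto a vertex or compute shader.
    new_pos = pos + (1.0 - damping) * (pos - prev_pos) + g * dt * dt

    # Pin strand roots to the mesh.
    new_pos[:, 0, :] = roots

    # Jacobi-style relaxation of distance constraints so neighbouring
    # particles stay segment_len apart along each strand.
    for _ in range(iterations):
        delta = new_pos[:, 1:, :] - new_pos[:, :-1, :]
        dist = np.maximum(np.linalg.norm(delta, axis=-1, keepdims=True), 1e-6)
        correction = delta * (1.0 - segment_len / dist)
        new_pos[:, 1:, :] -= 0.5 * correction
        new_pos[:, :-1, :] += 0.5 * correction
        new_pos[:, 0, :] = roots   # keep roots pinned after each pass

    return new_pos, pos   # caller does: pos, prev_pos = step_strands(...)
```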
Several years in development and first demoed at GDC, the NeutronE gaming engine brings real-time rendering via DirectX 11 and GPU/CPU cooperation. The technology is still fairly young, but their website has several demonstration videos showing what it can do. A short list of features:
Real-time Global Illumination
Real-time detail tessellation
Real-time time-of-day change
Compute shader for real-time ocean rendering
Underwater rendering with real-time caustics and fog
Soft Shadows with shadow blending
Real-time GPU/CPU Particle Effects
A pretty impressive list of features, no doubt. Hit their website for all the demos and details.
Illuminate Labs has followed up their recent release of the “Turtle” lighting tool for Maya with a new tool for gaming: Beast. Fully integrated with Unreal Engine 3, Gamebryo LightSpeed and Toolbench, and several other engines, it brings baked and dynamic lighting to real-time systems.
Beast is used to bake light maps, shadow maps and point clouds with advanced global illumination.
Color bounces and colored shadows from transparent objects.
Natural looking shadows affected by light bounces.
Soft shadows from point lights, directional lights and area lights.
Create light from an HDR environment, giving natural lighting in the entire scene without adding a single light source.
Dynamically relight objects and characters using light probes precalculated by Beast. Lighting of characters and other dynamic objects is affected by emissive objects and light bouncing off other objects.
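Illuminate Labs doesn’t spell out the runtime math behind that last point, but light probes are conventionally stored as precomputed spherical-harmonic irradiance, blended per object and evaluated against the surface normal. A rough, hypothetical sketch of that consumer side (the probe data itself would come from Beast’s bake, or any other GI baker):

```python
import numpy as np

def blend_probes(probe_positions, probe_sh, query_pos, k=4):
    """Blend the k nearest precomputed light probes by inverse distance.

    probe_positions: (P, 3) probe locations
    probe_sh:        (P, 9, 3) order-2 spherical-harmonic RGB coefficients
    """
    d = np.linalg.norm(probe_positions - query_pos, axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-6)
    w /= w.sum()
    return np.tensordot(w, probe_sh[nearest], axes=1)   # (9, 3)

def sh_irradiance(sh, normal):
    """Diffuse irradiance for a unit surface normal from order-2 SH radiance
    coefficients (standard Ramamoorthi/Hanrahan-style evaluation)."""
    x, y, z = normal
    basis = np.array([
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z,
        0.546274 * (x * x - y * y),
    ])
    # Per-band weights from convolving with the clamped-cosine kernel.
    a = np.array([3.141593, 2.094395, 2.094395, 2.094395,
                  0.785398, 0.785398, 0.785398, 0.785398, 0.785398])
    return np.maximum(0.0, (basis * a) @ sh)   # RGB irradiance
```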
While they don’t come out and say it, it seems Beast may have been behind the incredible outdoor lighting in the critically acclaimed “Mirror’s Edge”. Full details on their site.