Realistic Cloth Simulation based on Live Video
First we showed you how MovieReshape could modify actors in live video; now their clothing can be dynamically regenerated as well, thanks to work by Carsten Stoll of the Max Planck Institute in Germany. Stoll's team generated 3D laser scans of actors in costume and motion-tracked their silhouettes and skeletons. They then analyzed the cloth and can map it onto newly detected skeletons in new video, allowing them to change an actor's clothes.
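The article doesn't detail how the captured cloth gets mapped onto a new skeleton, but a standard way to do that kind of retargeting is linear blend skinning, where each cloth vertex follows a weighted blend of bone transforms. The sketch below is illustrative only (the function and parameter names are my own, not from the paper):

```python
# Hypothetical sketch of the retargeting step: once cloth geometry is
# captured and bound to a tracked skeleton, linear blend skinning (a
# standard technique, not necessarily the authors' exact method) can
# carry the cloth over to a newly detected pose.
import numpy as np

def skin_vertices(rest_vertices, weights, bone_transforms):
    """Deform rest-pose cloth vertices with per-bone 4x4 transforms.

    rest_vertices  : (V, 3) cloth vertex positions in the rest pose
    weights        : (V, B) skinning weights, rows sum to 1
    bone_transforms: (B, 4, 4) rigid transform of each bone
    """
    V = rest_vertices.shape[0]
    homo = np.hstack([rest_vertices, np.ones((V, 1))])          # (V, 4)
    # Transform every vertex by every bone, then blend by the weights.
    per_bone = np.einsum('bij,vj->vbi', bone_transforms, homo)  # (V, B, 4)
    blended = np.einsum('vb,vbi->vi', weights, per_bone)        # (V, 4)
    return blended[:, :3]

# Two-bone toy example: bone 0 stays put, bone 1 translates by +1 on x.
rest = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
w = np.array([[1.0, 0.0], [0.0, 1.0]])
shift = np.eye(4)
shift[0, 3] = 1.0
posed = skin_vertices(rest, w, np.stack([np.eye(4), shift]))
print(posed)  # vertex 1 is bound to bone 1, so it moves to x = 2
```

In the real system the weights and rest geometry would come from the laser scans, and the bone transforms from the skeleton tracked in the new video.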
According to Stoll, the results are extremely realistic. When he and his team showed 52 people a video of a woman dancing in a skirt alongside a reconstruction that his software had produced, the majority of viewers said that the reconstruction was “almost the same” as the original.
They talk a lot about using it in video games, but I believe the technology would probably first be used in VFX. In fact, they've got Andy Lomas at The Foundry looking at it right now.
“This is exactly what people like me want,” says Andy Lomas, a software developer who produced digital effects for the film The Matrix and is based at computer graphics firm The Foundry in London. “I want to be able to capture the fundamental nature of an actor’s clothing, but also have the freedom to change the way he or she moves.”
This technology will be unveiled at SIGGRAPH Asia.