The Intelligent Systems Lab at the University of Central Florida (ISL at UCF) and the Electronic Visualization Lab at the University of Illinois at Chicago (EVL at UIC) are working on Project Lifelike, an effort to create realistic human simulations (avatars). One thing they've found, building on an earlier discovery, is that rendering and display are only part of the problem; the bulk of the "realism" comes from behaviors, which they've implemented with an AI system.

The Project Lifelike team demonstrated the technology this past winter at NSF's headquarters in Arlington, Va. The team gathered motion and visual data on an NSF staff member and gave the avatar system information about an upcoming NSF proposal solicitation. Visitors could sit and talk with the avatar, which conversed with them and answered questions about the solicitation. Colleagues of the NSF staffer instantly recognized who the avatar represented and commented that it captured some of the person's mannerisms.

via Epoch Times – Engineering Your Double.