Behavior Generation for Interactive Virtual Humans

David Vogt

Saturday 24 October, Theater 30

In our research we address the problem of creating believable animations for virtual humans that must react to the body movements of a human interaction partner in real time. Our data-driven approach uses prerecorded motion capture data of two interacting persons and adapts the motions during the live human-agent interaction. Using different inverse kinematics solvers, we control virtual characters in Blender and visualize their postures in a fully immersive virtual environment, a 50-megapixel CAVE. The talk focuses on the extensions we made to Blender, including motion capture add-ons for optical A.R.T. tracking systems and Kinect sensors, as well as MATLAB bindings to compute character responses.
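
As a rough illustration of this kind of pipeline, the sketch below shows how streamed tracker samples could drive a character in Blender: a small modal operator reads (x, y, z) positions from a UDP socket and moves an IK target object, letting a Blender IK constraint solve the arm chain on each update. The port, the packet layout, and the object name "IK_Target_Hand" are placeholders for illustration, not the interface of the actual add-ons presented in the talk.

import socket
import struct

import bpy
from mathutils import Vector


class StreamPoseOperator(bpy.types.Operator):
    """Move an IK target from live tracker samples; the IK constraint does the rest."""
    bl_idname = "object.stream_pose"
    bl_label = "Stream Tracked Pose"

    _timer = None
    _sock = None

    def modal(self, context, event):
        if event.type == 'ESC':
            context.window_manager.event_timer_remove(self._timer)
            self._sock.close()
            return {'CANCELLED'}
        if event.type == 'TIMER':
            try:
                # Hypothetical packet format: three little-endian floats (x, y, z) in metres.
                data, _ = self._sock.recvfrom(12)
                x, y, z = struct.unpack('<3f', data)
                # Moving the target makes the IK chain re-solve on the next scene update.
                bpy.data.objects["IK_Target_Hand"].location = Vector((x, y, z))
            except BlockingIOError:
                pass  # no new tracker sample this tick
        return {'PASS_THROUGH'}

    def execute(self, context):
        self._sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self._sock.bind(("127.0.0.1", 9000))  # hypothetical port
        self._sock.setblocking(False)
        wm = context.window_manager
        self._timer = wm.event_timer_add(1 / 60, window=context.window)
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}


bpy.utils.register_class(StreamPoseOperator)

Running bpy.ops.object.stream_pose() from Blender's Python console would start polling the socket at roughly 60 Hz; pressing Esc stops the stream and closes the socket.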

https://youtu.be/TD-F60XxRZ4
https://youtu.be/2fsWeUyFMWM