Learning Neural Character Controllers from Motion Capture Data

Speaker:        Professor Taku Komura
                The Institute of Perception, Action and Behaviour
                School of Informatics
                University of Edinburgh

Title:          "Learning Neural Character Controllers from Motion
                 Capture Data"

Date:           Thursday, 29 August 2019

Time:           3:00pm - 4:00pm

Venue:          Room 4472 (via lift no. 25/26), HKUST

Abstract:

In this talk, I will cover our recent development of neural network-based
character controllers.  Using neural networks for character controllers
significantly increases the scalability of the system - the controller can
be trained with a large amount of motion capture data while the run-time
memory can be kept low.  As a result, such controllers are suitable for
real-time applications such as computer games and virtual reality systems.
The main challenge lies in designing an architecture that can produce
production-quality movements while also handling a wide variety of motion
classes. Our development covers low-level locomotion controllers for
bipeds and quadrupeds, which allow the characters to walk, run, side-step
and climb over uneven terrain, as well as a high-level controller for
humanoid characters to interact with objects and the environment, which
allows the character to sit on chairs, open doors and carry objects. At
the end of the talk, I will discuss open problems and future directions
in character animation.


***********************
Biography:

Taku Komura is a Professor at the Institute of Perception, Action and
Behaviour, School of Informatics, University of Edinburgh. As the leader
of the Computer Graphics and Visualization Unit, he has focused his
research on data-driven character animation, physically-based character
animation, crowd simulation, cloth animation, anatomy-based modelling,
and robotics. Recently, his main research interest has been the
application of machine learning techniques to animation synthesis. He
received the Royal Society Industry Fellowship (2014) and the Google
AR/VR Research Award (2017).