AI4Animation

Copyright Information

This code is provided for research and education purposes only; it, and especially the learned data, is not available for commercial use, redistribution, or sublicensing. The intellectual property and code implementation belong to the University of Edinburgh. For scientific use, please reference this repository together with the PFNN paper below. In any case, please contact me if you intend to seriously use, redistribute, or publish anything related to this code or repository.

Description

This project explores the opportunities of deep learning and evolutionary computation for character animation as part of my Ph.D. research at the University of Edinburgh in the School of Informatics, supervised by Taku Komura.

It extends recent work on PFNN (Phase-Functioned Neural Networks) for character control (Video: https://www.youtube.com/watch?v=Ul0Gilv5wvY, Paper: http://theorangeduck.com/media/uploads/other_stuff/phasefunction.pdf), and aims at learning task-specific motion manifolds as well as at creating representations for different geometries. Development is done in Unity3D, and the implementation is made available for character animation research and games development as my Ph.D. progresses.

The algorithmic framework is shown below. In addition to the extended PFNN version, which utilises multiple phase modules, a memetic evolutionary algorithm for generic inverse kinematics (BioIK; GitHub: https://github.com/sebastianstarke/BioIK, Asset Store: https://www.assetstore.unity3d.com/en/#!/content/67819, Video: https://www.youtube.com/watch?v=ik45v4WRZKI) is used for animation post-processing and motion editing.
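To give a rough feel for the memetic idea behind BioIK (an evolutionary population combined with per-individual local refinement), the following Python sketch solves inverse kinematics for a hypothetical two-link planar arm. The arm, fitness function, and all parameters are illustrative assumptions only; BioIK itself handles arbitrary kinematic chains and objectives.

```python
import random
import math

# Hypothetical 2-link planar arm, for illustration only.
L1, L2 = 1.0, 1.0

def forward(angles):
    """End-effector position of the 2-link planar arm."""
    a, b = angles
    return (L1 * math.cos(a) + L2 * math.cos(a + b),
            L1 * math.sin(a) + L2 * math.sin(a + b))

def fitness(angles, target):
    """Distance of the end effector from the target (lower is better)."""
    x, y = forward(angles)
    return math.hypot(x - target[0], y - target[1])

def local_search(angles, target, step=0.05, iters=25):
    """Greedy hill climbing: the 'memetic' refinement step."""
    best = list(angles)
    for _ in range(iters):
        improved = False
        for i in range(len(best)):
            for d in (step, -step):
                cand = list(best)
                cand[i] += d
                if fitness(cand, target) < fitness(best, target):
                    best, improved = cand, True
        if not improved:
            step *= 0.5  # shrink the step once no neighbour improves
    return best

def memetic_ik(target, pop_size=30, generations=40, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-math.pi, math.pi) for _ in range(2)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Evolutionary step: keep the elite, recombine and mutate.
        pop.sort(key=lambda ind: fitness(ind, target))
        elite = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            children.append([(x + y) / 2 + rng.gauss(0, 0.1)
                             for x, y in zip(a, b)])
        # Memetic step: refine each offspring by local search.
        pop = elite + [local_search(c, target) for c in children]
    return min(pop, key=lambda ind: fitness(ind, target))

solution = memetic_ik((1.2, 0.8))
print(fitness(solution, (1.2, 0.8)))  # small residual error
```

The combination matters: the evolutionary step keeps the search global and avoids the local minima that pure gradient or hill-climbing solvers fall into, while the local refinement makes each candidate precise.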

Development Status

The only script component required for the animation is called 'BioAnimation'. The PFNN is implemented using MathNet.Numerics and uses externally trained weights from Theano/TensorFlow to generate the character motion. The weights are imported at edit time and then serialised inside Unity. The trajectory estimation module handles the user input to control the character's movements and rejects paths that would collide with obstacles. The network output (joint positions, velocities, corrections, etc.) is then fed back to the character to update its posture.
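The core of a PFNN forward pass is that the network weights themselves are a function of the gait phase: four control weight sets are blended with a cyclic cubic Catmull-Rom spline, as described in the PFNN paper. The NumPy sketch below illustrates that blending; the layer sizes, parameter layout, and function names are assumptions and do not reflect this repository's actual MathNet.Numerics implementation or its serialised weight format.

```python
import numpy as np

def cubic(y0, y1, y2, y3, mu):
    """Catmull-Rom spline interpolation between y1 and y2, mu in [0, 1]."""
    return ((-0.5 * y0 + 1.5 * y1 - 1.5 * y2 + 0.5 * y3) * mu ** 3
            + (y0 - 2.5 * y1 + 2.0 * y2 - 0.5 * y3) * mu ** 2
            + (-0.5 * y0 + 0.5 * y2) * mu
            + y1)

def blend_weights(control, phase):
    """Blend the 4 control weight sets at the given phase in [0, 2*pi)."""
    p = 4.0 * phase / (2.0 * np.pi)  # position along the cyclic spline
    k = int(p) % 4
    mu = p - int(p)
    return cubic(control[(k - 1) % 4], control[k],
                 control[(k + 1) % 4], control[(k + 2) % 4], mu)

def elu(x):
    return np.where(x > 0, x, np.exp(x) - 1.0)

def pfnn_forward(x, phase, params):
    """Three-layer forward pass with phase-blended weights and biases."""
    h = x
    for W_ctrl, b_ctrl in params[:-1]:
        h = elu(blend_weights(W_ctrl, phase) @ h + blend_weights(b_ctrl, phase))
    W_ctrl, b_ctrl = params[-1]
    return blend_weights(W_ctrl, phase) @ h + blend_weights(b_ctrl, phase)

# Tiny random example: input 6 -> hidden 8 -> hidden 8 -> output 4,
# each parameter carrying a leading axis of 4 control points.
rng = np.random.default_rng(0)
params = [
    (rng.normal(size=(4, 8, 6)), rng.normal(size=(4, 8))),
    (rng.normal(size=(4, 8, 8)), rng.normal(size=(4, 8))),
    (rng.normal(size=(4, 4, 8)), rng.normal(size=(4, 4))),
]
y = pfnn_forward(rng.normal(size=6), 1.3, params)
print(y.shape)  # (4,)
```

At a control point (mu = 0) the blend returns that control weight set exactly, so the spline passes through all four stored networks as the phase cycles.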

Demo

To run the demo, simply start Unity and open the provided scene in the Assets folder. Then press the "Load Parameters" button in the attached 'BioAnimation' component, which can take a few seconds. When entering play mode, the character can be controlled via W-A-S-D (move), Q-E (turn), LeftShift (run) and LeftCtrl (crouch). Jumping is handled automatically based on the rise of the terrain.

Data Preprocessing

A 'BVHViewer' editor window has been developed to preprocess BVH motion capture files into training data for the deep learning. It enables defining the phase function and the style function, generating trajectories, and producing all other relevant data directly inside Unity.
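One preprocessing step such a tool needs is labelling every frame with a phase value derived from foot contacts. The Python sketch below is a hypothetical illustration of that idea, assuming contact detection by a simple height threshold and linear phase interpolation between contacts; it is not the BVHViewer's actual code.

```python
import numpy as np

def detect_contacts(foot_heights, threshold=0.05):
    """Frames where the foot is close enough to the ground to count as contact."""
    return np.flatnonzero(np.asarray(foot_heights) < threshold)

def label_phase(num_frames, left_contacts, right_contacts):
    """Assign phase 0 at left-foot contacts and pi at right-foot contacts,
    interpolating linearly in between and wrapping at 2*pi."""
    events = sorted([(f, 0.0) for f in left_contacts] +
                    [(f, np.pi) for f in right_contacts])
    phase = np.zeros(num_frames)
    for (f0, p0), (f1, p1) in zip(events, events[1:]):
        if p1 <= p0:
            p1 += 2.0 * np.pi  # phase increases monotonically over the cycle
        t = np.linspace(0.0, 1.0, f1 - f0 + 1)
        phase[f0:f1 + 1] = (p0 + t * (p1 - p0)) % (2.0 * np.pi)
    return phase

# Example: left-foot contacts at frames 0 and 60, right-foot contact at 30.
phase = label_phase(61, [0, 60], [30])
print(phase[0], phase[30], phase[60])  # 0.0, ~pi, ~0.0
```

The resulting per-frame phase is exactly the scalar the phase function of the PFNN is conditioned on during training, which is why labelling it consistently across clips matters.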

About

Character Animation in Unity3D using Deep Learning and Biologically-Inspired Optimisation
