pfnn's Introduction

Phase-Functioned Neural Networks for Character Control

This project contains the code and data for Phase-Functioned Neural Networks for Character Control as presented at SIGGRAPH 2017, with the project page found below:

http://theorangeduck.com/page/phase-functioned-neural-networks-character-control

This paper presents a character controller driven by a special kind of network called a Phase-Functioned Neural Network which learns a mapping from the user input parameters, the character's previous state, and the surrounding environment to the pose of the character at a given frame. The rest of this README will assume you have read the paper and understand the basic workings and processes involved in this technique.

The code is essentially separated into two sub-projects.

The first project is a set of Python scripts written in NumPy/SciPy/Theano which preprocess the motion data, generate a database, and train the phase-functioned neural network to perform the regression. The output of this project is the network weights, saved as simple binary files.

The second project (contained in the subfolder "demo") is a C++ project which contains a basic runtime that loads the network weights and runs an interactive demo which shows the results of the trained network when controlled via a game-pad.

Below are details of the steps required to reproduce the results from the paper, from preprocessing and training all the way to running the demo.

Installation

Before you do anything else you will first need to install the following Python packages: numpy, scipy, Pillow, and theano, as well as CUDA, cuDNN, etc. This project was built using Python 3 but may work with Python 2 given a few minor tweaks.
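As a quick sanity check before preprocessing, you can verify that the required packages import correctly. This helper snippet is not part of the repository, just a convenience:

```python
import importlib.util

# Packages required by the preprocessing and training scripts.
# Note: Pillow is imported under the name "PIL".
required = ["numpy", "scipy", "PIL", "theano"]
missing = [p for p in required if importlib.util.find_spec(p) is None]
if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("All Python dependencies found.")
```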

Preprocessing

The next step is to build the database of terrain patches which are later fitted to the motion. For this simply run the following:

python generate_patches.py

This will sample thousands of patches from the heightmaps found in data/heightmaps and store them in a database called patches.npz. This should take a good few minutes so be patient.
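For intuition, the core idea of patch sampling can be sketched as below. This is a simplified illustration only - the real generate_patches.py also shifts and filters patches (using scipy.ndimage), so treat the details here as assumptions:

```python
import numpy as np

def sample_patches(heightmap, num=10, size=32, rng=None):
    # Simplified sketch: randomly crop square patches from a 2D heightmap
    # and centre their heights around zero.
    rng = rng or np.random.default_rng(0)
    h, w = heightmap.shape
    patches = []
    for _ in range(num):
        x = rng.integers(0, h - size)
        y = rng.integers(0, w - size)
        patch = heightmap[x:x + size, y:y + size].astype(float)
        patch -= patch.mean()   # remove the mean height of the patch
        patches.append(patch)
    return np.stack(patches)

# Synthetic stand-in for a heightmap loaded from data/heightmaps.
hmap = np.random.default_rng(1).standard_normal((128, 128)).cumsum(axis=0)
P = sample_patches(hmap, num=5, size=32)
print(P.shape)  # (5, 32, 32)
```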

Now you can begin the process of preprocessing the animation data - converting it into the correct format and fitting the terrain to each walk cycle. For this you run the following:

python generate_database.py

This process can take some time - at least a couple of hours. It uses all the data found in data/animations and once complete will output a database called database.npz. If you want to change the parameterisation used by the network this is probably the place to look - but note that the preprocessing for this work is quite fiddly and complicated, so you must be careful when you edit this script not to introduce any bugs. You will also have to remember to update the runtime to match.
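The general shape of the output is a set of named numpy arrays in an .npz archive. The key names and dimensions below are assumptions for illustration - check the np.savez call in generate_database.py for the real ones:

```python
import io
import numpy as np

# Illustrative shapes only: per-frame network input, output, and phase.
X = np.zeros((100, 342), dtype=np.float32)
Y = np.zeros((100, 311), dtype=np.float32)
P = np.linspace(0, 2 * np.pi, 100, dtype=np.float32)

buf = io.BytesIO()                  # stands in for database.npz on disk
np.savez(buf, Xun=X, Yun=Y, Pun=P)  # hypothetical key names
buf.seek(0)
data = np.load(buf)
print(sorted(data.files))  # ['Pun', 'Xun', 'Yun']
```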

Training

Once you've generated database.npz you can begin training. For this simply run the following:

python train_pfnn.py

Assuming you've installed theano, CUDA, and everything else successfully this should start training the neural network. This requires quite a lot of RAM as well as VRAM. If you get any kind of memory error you can perhaps try using less data by subsampling the database or even taking some of the clips out of the preprocessing stage by removing them from generate_database.py.
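Subsampling can be as simple as slicing the arrays before training. A sketch on synthetic stand-in arrays (shapes are illustrative, not the database's actual dimensions):

```python
import numpy as np

# Synthetic stand-ins for the arrays stored in database.npz.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 342)).astype(np.float32)
Y = rng.standard_normal((1000, 311)).astype(np.float32)

# Keep every 4th frame to cut memory use roughly 4x.
X_sub, Y_sub = X[::4], Y[::4]
print(X_sub.shape, Y_sub.shape)  # (250, 342) (250, 311)
```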

During the training process the weights of the network will be saved at each epoch to the location demo/network/pfnn so don't worry about stopping the training early. It is possible to achieve decent results in just an hour or so of training, but for the very best results you may need to wait overnight. For this reason it might be worth making a backup of the pre-trained demo weights in demo/network/pfnn before beginning training.
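Backing up the pre-trained weights is a one-liner with shutil. Demonstrated here on a temporary directory with a dummy weight file; in practice the source would be demo/network/pfnn:

```python
import os
import shutil
import tempfile

with tempfile.TemporaryDirectory() as root:
    src = os.path.join(root, "pfnn")            # stands in for demo/network/pfnn
    os.makedirs(src)
    open(os.path.join(src, "W0_000.bin"), "wb").close()  # dummy weight file
    dst = os.path.join(root, "pfnn_backup")
    shutil.copytree(src, dst)                   # copy before training overwrites
    copied = os.listdir(dst)

print(copied)  # ['W0_000.bin']
```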

Runtime

With the network weights generated you're now ready to try the runtime. For instructions, please look inside the demo folder.

PFNN

pfnn's People

Contributors

crazii · sreyafrancis

pfnn's Issues

Got different results for .phase and _footstep.txt

Hi,
Thanks for your code. I'm trying to use it in my own project. I found that when I run generate_footsteps_phase.py, the output is different from your original .phase and _footstep.txt files.
Could you please tell me how to explain this? Thanks a lot.

[screenshot of the differing output]

values of p for equation 7 in paper

Hi,

Thanks for the code. I am trying to understand equation 7 in the paper http://theorangeduck.com/media/uploads/other_stuff/phasefunction.pdf
For that I need to know the four values of control points for the phase p that you used for deriving w and k_n.

I was thinking they should be p -> [0, pi, 2pi, 3pi] but then: (mod 1) in the equation for w would be confusing as mod 1 for any integer would be 0. Also, if you can point me to the code for equation 7 in your repo, it would be helpful.

Thanks
Rohit
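For what it's worth, one reading of equation 7 (matching the cubic Catmull-Rom interpolation in demo/pfnn.cpp) is that the four control points sit at p = 0, π/2, π, 3π/2, and the phase is first rescaled by 4p/2π, so that (mod 1) yields the fractional position between neighbouring control points rather than being applied to p directly. A hedged sketch of this interpretation:

```python
import numpy as np

def cubic(y0, y1, y2, y3, mu):
    # Catmull-Rom cubic interpolation (same form as `cubic` in demo/pfnn.cpp).
    return ((-0.5 * y0 + 1.5 * y1 - 1.5 * y2 + 0.5 * y3) * mu ** 3
            + (y0 - 2.5 * y1 + 2.0 * y2 - 0.5 * y3) * mu ** 2
            + (-0.5 * y0 + 0.5 * y2) * mu
            + y1)

def phase_function(p, alpha):
    # alpha holds the 4 control points, placed cyclically at p = 0, pi/2, pi, 3pi/2.
    pscaled = 4 * p / (2 * np.pi)
    w = pscaled % 1.0                        # the (mod 1) term of eq. 7
    k = int(pscaled) % 4
    k0, k1, k2, k3 = ((k + n - 1) % 4 for n in range(4))
    return cubic(alpha[k0], alpha[k1], alpha[k2], alpha[k3], w)

alpha = np.array([0.0, 1.0, 0.0, -1.0])
print(phase_function(0.0, alpha))        # passes through alpha[0]
print(phase_function(np.pi / 2, alpha))  # passes through alpha[1]
```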

[RESOLVED] memory error when running generate_patches.py

July 17 2018 Update: The problem was that the 32-bit version of Python was installed on my machine. The 32-bit version is well known for its memory limitations.
Upgrading to the 64-bit version of python solved the issue.

Hi,

I'm trying to work through the preprocessing stage of the PFNN weights and biases generation process.

I'm consistently hitting an error when I run generate_patches.py:

D:\Projects\HiFi\PFNN\PFNN-master>python generate_patches.py
Processing ./data/heightmaps/hmap_001_smooth.txt (504 x 302) 0 [676]
Processing ./data/heightmaps/hmap_002_smooth.txt (360 x 784) 424 [1254]
Processing ./data/heightmaps/hmap_003_smooth.txt (554 x 384) 1657 [945]
Processing ./data/heightmaps/hmap_004_smooth.txt (523 x 532) 2315 [1236]
Traceback (most recent call last):
  File "generate_patches.py", line 52, in <module>
    S = ndimage.interpolation.shift(H, (xi, yi), mode='reflect')[:size*2,:size*2]
  File "C:\Users\*****\AppData\Roaming\Python\Python37\site-packages\scipy\ndimage\interpolation.py", line 504, in shift
    output = _ni_support._get_output(output, input)
  File "C:\Users\*****\AppData\Roaming\Python\Python37\site-packages\scipy\ndimage\_ni_support.py", line 75, in _get_output
    output = numpy.zeros(shape, dtype=input.dtype.name)
MemoryError

D:\Projects\HiFi\PFNN\PFNN-master>

I have increased my paging file size to up to 100GB. From task manager, the script appears to bomb out at less than 1.6GB.

Please feel free to contact me should you require further information.

~ DaveDub

I'm trying to run the demo on Windows. How do I make the character move?

When I run pfnn.exe on Windows, I'm able to visualize the character and load different height maps, but I'm unable to make the character move. It's written that we need to plug in an Xbox gamepad to control the character, and that we might have to play with the gamepad enums in the pfnn.cpp file (I'm not familiar with C++).
I want to make the character move using the Windows keyboard. Please help!

How to generate the gait file

Hey, we're trying to use our own motion capture database, but we're not very sure how to generate the phase and gait txt files.

We visualized the bvh file using Blender and can see that the phase is decided by finding the lowest foot height and then calculating steps, probably as described in the paper.
What about the gait file? We checked it a little bit but aren't quite sure. We'll go through the paper and check the gait file again. Meanwhile, I think creating an issue here may also help.

Thanks.
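Not an authoritative answer, but one plausible way to produce per-frame gait labels is thresholding root speed to get a one-hot label row per frame. The label set and thresholds below are purely illustrative assumptions, not the values used to build the original gait files:

```python
import numpy as np

def gait_label(speed, stand_thresh=0.2, walk_thresh=1.8):
    # Hypothetical thresholds (m/s). Returns a one-hot [stand, walk, jog] row.
    if speed < stand_thresh:
        return [1.0, 0.0, 0.0]   # stand
    if speed < walk_thresh:
        return [0.0, 1.0, 0.0]   # walk
    return [0.0, 0.0, 1.0]       # jog

speeds = np.array([0.0, 0.1, 1.2, 2.5, 3.8])   # root speed per frame
labels = np.array([gait_label(s) for s in speeds])
print(labels.shape)  # (5, 3)
```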

Invalid syntax in skeletondef.py

Hi, I'm trying to run generate_database.py, but I get a SyntaxError after import skeleton as skd, in skeletondef.py, line 1: ROOT Hips.
Did I forget to install anything? I can't figure out what I did wrong.
I hope you can help me.

Help with computation of NN inputs relating to future trajectories from joystick

Hi, congrats for the great work and thanks for sharing!

I'm trying to replicate the results for research purposes, and need to compute the NN input at runtime. In particular, the computation of the future parts of the trajectory, as described in Section 6 of your paper, seems the most critical.

I'm wondering if you could kindly provide a pointer to existing code implementing this key input processing, and possibly some context/documentation on how to use it. (e.g., may it be somewhere in this file? https://github.com/sreyafrancis/PFNN/blob/master/demo/pfnn.cpp)

I'm also available by email, just in case (see profile).

Thank you.

cc @crazii @sreyafrancis
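The trajectory blending the questioner suspects does appear to live in demo/pfnn.cpp, in the game loop that updates the future trajectory toward the stick input before filling the network input. As a rough, hedged sketch of the idea only (the blend factor and point count here are illustrative, not the demo's constants):

```python
import numpy as np

def predict_trajectory(cur_pos, target_vel, n_future=6, blend=0.5):
    # Each future point eases toward the desired (gamepad) velocity,
    # producing a smooth predicted path ahead of the character.
    positions = [np.asarray(cur_pos, dtype=float)]
    vel = np.zeros(2)
    for _ in range(n_future):
        vel = (1.0 - blend) * vel + blend * np.asarray(target_vel, dtype=float)
        positions.append(positions[-1] + vel)
    return np.stack(positions)

traj = predict_trajectory([0.0, 0.0], [1.0, 0.0])
print(traj.shape)  # (7, 2)
```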

How can I use a new character?

Hello!

I'm trying to change the character to one that has different joint positions from the original character.
However, it seems to change the joint positions of the new character to be similar to the original one, and distort the character to fit this change. As a result, the character looks terrible.

I wonder how I can use a new character with different joint positions. Do I need to retrain the model with the new character?
Another question is: can I reduce the number of joints? Would that also require retraining the model?

Looking forward to your reply. Thank you very much.

Where is the JOINT_WEIGHTS from?

Such as the file PFNN/skeletondef.py:

JOINT_WEIGHTS = [
1,
1e-10, 1, 1, 1, 1,
1e-10, 1, 1, 1, 1,
1e-10, 1, 1,
1e-10, 1, 1,
1e-10, 1, 1, 1, 1e-10, 1e-10, 1e-10,
1e-10, 1, 1, 1, 1e-10, 1e-10, 1e-10
]

Given a 3D mesh along with skeleton weights, are these the corresponding weights? And how can I extract those weight values?

rewrite the python code for training

Exciting project!
I also noticed that your training code uses the theano package with a self-defined NN layer. Do you think it would be a good idea to rewrite it with a currently popular package, e.g. TensorFlow or PyTorch?
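A port should be straightforward once the custom layer is understood: the only unusual part is that each layer's weights are interpolated from four phase-indexed control points before the usual dense transform. A hedged numpy sketch of that forward pass (linear blend and ReLU for brevity; the actual network uses cubic Catmull-Rom interpolation and ELU activations, and the shapes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
W_ctrl = rng.standard_normal((4, 16, 8))   # 4 control points of one layer's weights
b_ctrl = rng.standard_normal((4, 16))      # 4 control points of its bias

def blended(ctrl, phase):
    # Simplified linear blend between neighbouring control points;
    # the real PFNN uses a cubic Catmull-Rom spline over all four.
    pscaled = 4 * phase / (2 * np.pi)
    k, w = int(pscaled) % 4, pscaled % 1.0
    return (1.0 - w) * ctrl[k] + w * ctrl[(k + 1) % 4]

x = rng.standard_normal(8)                 # network input (shape illustrative)
h = np.maximum(0.0, blended(W_ctrl, 1.3) @ x + blended(b_ctrl, 1.3))
print(h.shape)  # (16,)
```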
