
This project forked from hassony2/manopth


MANO layer for PyTorch, generating hand meshes as a differentiable layer

License: GNU General Public License v3.0



Manopth

MANO layer for PyTorch (tested with v0.4 and v1.x)

ManoLayer is a differentiable PyTorch layer that deterministically maps from pose and shape parameters to hand joints and vertices. It can be integrated into any architecture as a differentiable layer to predict hand meshes.


ManoLayer takes batched hand pose and shape vectors and outputs corresponding hand joints and vertices.
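"Differentiable" means that gradients of the output joints and vertices can flow back to the pose and shape inputs, which is what lets the layer sit inside a trained network. A toy illustration of that property, using a made-up one-dimensional stand-in function (not the actual MANO math) and central finite differences as the numerical analogue of autograd:

```python
# Toy illustration of what a "differentiable layer" buys you: gradients of
# the output (a fake 1-D "vertex") with respect to the input parameters.
# fake_layer is NOT the MANO math, just a smooth stand-in f(pose).

def fake_layer(pose):
    # pretend a vertex coordinate depends smoothly on two pose parameters
    return pose[0] ** 2 + 3.0 * pose[1]

def finite_diff_grad(f, x, eps=1e-6):
    # central finite differences, the numerical analogue of backprop
    grad = []
    for i in range(len(x)):
        hi = list(x); hi[i] += eps
        lo = list(x); lo[i] -= eps
        grad.append((f(hi) - f(lo)) / (2 * eps))
    return grad

g = finite_diff_grad(fake_layer, [2.0, 1.0])
print(g)  # roughly [4.0, 3.0]
```

In manopth the same gradients are provided analytically by PyTorch's autograd, so a loss on the predicted mesh can train whatever network produces the pose and shape vectors.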

The code is mostly a port of the original MANO model from chumpy to PyTorch. It therefore builds directly upon the work of Javier Romero, Dimitrios Tzionas and Michael J. Black.

This layer was developed for and used in the CVPR 2019 paper Learning joint reconstruction of hands and manipulated objects. See the project page and the demo + training code.

It reuses part of the great code from the PyTorch layer for the SMPL body model by Zhang Xiong (MandyMo) to compute the rotation utilities!

It also includes in mano/webuser partial content of files from the original MANO code (posemapper.py, serialization.py, lbs.py, verts.py, smpl_handpca_wrapper_HAND_only.py).

If you find this code useful for your research, consider citing:

  • the original MANO publication:
@article{MANO:SIGGRAPHASIA:2017,
  title = {Embodied Hands: Modeling and Capturing Hands and Bodies Together},
  author = {Romero, Javier and Tzionas, Dimitrios and Black, Michael J.},
  journal = {ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)},
  publisher = {ACM},
  month = nov,
  year = {2017},
  url = {http://doi.acm.org/10.1145/3130800.3130883},
  month_numeric = {11}
}
  • the publication this PyTorch port was developed for:
@INPROCEEDINGS{hasson19_obman,
  title     = {Learning joint reconstruction of hands and manipulated objects},
  author    = {Hasson, Yana and Varol, G{\"u}l and Tzionas, Dimitris and Kalevatykh, Igor and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},
  booktitle = {CVPR},
  year      = {2019}
}

The training code associated with this paper, compatible with manopth, can be found here. The release includes a model trained on a variety of hand datasets.

Installation

Get code and dependencies

  • git clone https://github.com/hassony2/manopth
  • cd manopth
  • Install the dependencies listed in environment.yml
    • In an existing conda environment, conda env update -f environment.yml
    • In a new environment, conda env create -f environment.yml, which will create a conda environment named manopth

Download MANO pickle data-structures

  • Go to MANO website
  • Create an account by clicking Sign Up and provide your information
  • Download Models and Code (the downloaded file should have the format mano_v*_*.zip). Note that all code and data from this download falls under the MANO license.
  • Unzip it and copy the models folder into the manopth/mano folder
  • Your folder structure should look like this:
manopth/
  mano/
    models/
      MANO_LEFT.pkl
      MANO_RIGHT.pkl
      ...
  manopth/
    __init__.py
    ...
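A small sanity check (a hypothetical helper, not part of manopth) can confirm that the pickle files landed where ManoLayer will look for them:

```python
import os

# Hypothetical helper (not part of manopth): check that the MANO pickle
# files ended up in the expected models folder.
def mano_models_present(root="mano/models"):
    expected = ["MANO_LEFT.pkl", "MANO_RIGHT.pkl"]
    return all(os.path.isfile(os.path.join(root, name)) for name in expected)

print(mano_models_present())  # True once the models folder is in place
```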

To check that everything is set up correctly, run python examples/manopth_mindemo.py, which should generate a random hand using the MANO layer!

Install manopth package

To be able to import and use ManoLayer in another project, go to your manopth folder and run pip install .

cd /path/to/other/project

You can now use from manopth import ManoLayer in this other project!

Usage

Minimal usage script

See examples/manopth_mindemo.py

Simple forward pass through the MANO layer with random pose and shape parameters:

import torch
from manopth.manolayer import ManoLayer
from manopth import demo

batch_size = 10
# Select number of principal components for pose space
ncomps = 6

# Initialize MANO layer
mano_layer = ManoLayer(mano_root='mano/models', use_pca=True, ncomps=ncomps)

# Generate random shape parameters
random_shape = torch.rand(batch_size, 10)
# Generate random pose parameters, including 3 values for global axis-angle rotation
random_pose = torch.rand(batch_size, ncomps + 3)

# Forward pass through MANO layer
hand_verts, hand_joints = mano_layer(random_pose, random_shape)
demo.display_hand({'verts': hand_verts, 'joints': hand_joints}, mano_faces=mano_layer.th_faces)
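The first three entries of random_pose encode the global rotation in axis-angle form (the vector's direction is the rotation axis, its norm the angle); the remaining ncomps entries are PCA coefficients of the pose space. The rotation utilities mentioned above convert such axis-angle vectors to rotation matrices; a minimal pure-Python Rodrigues formula, shown as an illustration rather than manopth's batched implementation:

```python
import math

def rodrigues(axis_angle):
    # axis_angle: length-3 vector whose norm is the angle in radians and
    # whose direction is the rotation axis. Returns a 3x3 rotation matrix.
    x, y, z = axis_angle
    theta = math.sqrt(x * x + y * y + z * z)
    if theta < 1e-8:
        return [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # identity for tiny angles
    kx, ky, kz = x / theta, y / theta, z / theta
    c, s = math.cos(theta), math.sin(theta)
    t = 1 - c
    return [
        [c + kx * kx * t,      kx * ky * t - kz * s, kx * kz * t + ky * s],
        [ky * kx * t + kz * s, c + ky * ky * t,      ky * kz * t - kx * s],
        [kz * kx * t - ky * s, kz * ky * t + kx * s, c + kz * kz * t],
    ]

# 90 degrees about the z axis maps the x axis onto the y axis
R = rodrigues([0.0, 0.0, math.pi / 2])
```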

Result:

random hand

Demo

For more options, a forward and backward pass, and a loop for quick profiling, look at examples/manopth_demo.py.

You can run it locally with:

python examples/manopth_demo.py
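The profiling loop pattern used by such demos can be sketched in isolation; the workload below is a hypothetical stand-in for the MANO forward and backward pass on a batch of random poses:

```python
import time

def profile(fn, n_iters=100):
    # Time n_iters calls of fn and report the mean iteration time in ms.
    start = time.perf_counter()
    for _ in range(n_iters):
        fn()
    elapsed = time.perf_counter() - start
    return 1000.0 * elapsed / n_iters

# Stand-in workload; in the real demo this would be the MANO forward
# and backward pass on a batch of random pose and shape vectors.
mean_ms = profile(lambda: sum(i * i for i in range(1000)))
print(f"{mean_ms:.3f} ms / iteration")
```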

Contributors

hassony2, lesteve
