warp3d_reposing's Introduction

Reposing Humans by Warping 3D Features

Markus Knoche, István Sárándi, Bastian Leibe (RWTH Aachen University)

Project | Paper | Video


This repository contains code for our paper “Reposing Humans by Warping 3D Features”. We warp implicitly learned volumetric features with explicit 3D transformations to change the pose of a given person into any desired target pose.

Setup

Using conda:

conda create -n warp3d_reposing
conda activate warp3d_reposing
conda install tensorflow-gpu=1.12
conda install scikit-image
conda install -c menpo opencv3
conda install numpy=1.16
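
To sanity-check the environment afterwards (a quick verification snippet, not part of the repository):

import tensorflow as tf
import skimage
import cv2

print(tf.__version__)              # should print 1.12.x
print(tf.test.is_gpu_available())  # True if the GPU build found a device
print(skimage.__version__, cv2.__version__)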

Code Overview

  • parameters.py contains the global parameters for training, the model, and the datasets
  • dataset.py loads pairs of images and their respective poses, estimates the transformations (one plausible approach is sketched after this list), and creates the body-part masks
  • model.py contains the generator and discriminator as well as the warping block
  • train.py glues everything together
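
As an aside, one plausible way to estimate such a per-body-part transformation from corresponding source and target joints is a least-squares 3D affine fit. The sketch below only illustrates that idea (fit_affine_3d is a name introduced here; dataset.py may proceed differently):

import numpy as np

def fit_affine_3d(src_joints, dst_joints):
    # src_joints, dst_joints: (J, 3) arrays of corresponding 3D joint positions.
    # Returns a (4, 4) homogeneous matrix A with dst ≈ (A @ [src, 1])[:3].
    n = src_joints.shape[0]
    src_h = np.concatenate([src_joints, np.ones((n, 1))], axis=1)  # (J, 4)
    # Least-squares solve of src_h @ M = dst_joints, with M of shape (4, 3).
    M, _, _, _ = np.linalg.lstsq(src_h, dst_joints, rcond=None)
    A = np.eye(4)
    A[:3, :] = M.T  # linear part plus translation in the top three rows
    return A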

Datasets

The parameter params['data_dir'] points to the root directory that contains all dataset directories. Each dataset directory holds a directory images with all images organized in subfolders. It must also include a pickle file poses.pkl containing nested dictionaries with the same hierarchy as the images directory. Each image img.png is represented by a dictionary key img that points to a numpy array with one row per joint and three columns for the coordinates in pixel space; height and width lie between 0 and 256, and the depth is centered around 0.
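
For illustration, a minimal sketch of writing such a poses.pkl (estimate_pose_3d is a placeholder for a real 3D pose estimator such as MeTRAbs; the joint count of 19 matches the iPER pickle inspected in the issues below):

import os
import pickle
import numpy as np

NUM_JOINTS = 19  # the released iPER poses.pkl stores 19 joints per frame

def estimate_pose_3d(image_path):
    # Placeholder: run a real 3D pose estimator here. It should return one
    # row per joint, with height/width in [0, 256] and depth centered at 0.
    return np.zeros((NUM_JOINTS, 3), dtype=np.float32)

def build_poses(images_dir):
    # Mirror the images/ hierarchy as nested dictionaries of (J, 3) arrays.
    poses = {}
    for entry in sorted(os.listdir(images_dir)):
        path = os.path.join(images_dir, entry)
        if os.path.isdir(path):
            poses[entry] = build_poses(path)
        elif entry.endswith('.png'):
            poses[os.path.splitext(entry)[0]] = estimate_pose_3d(path)
    return poses

poses = build_poses('images')
with open('poses.pkl', 'wb') as f:
    pickle.dump(poses, f)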

Estimated Poses

poses.pkl for deepfashion, save as <data_dir>/fashion3d/poses.pkl

poses.pkl for iPER, save as <data_dir>/iPER/poses.pkl

Training

First, adapt the paths for the data root directory, the checkpoint directory, and the TensorBoard directory in parameters.py to match your system.

Second, download the pose estimator checkpoint which is used for validation. Extract the three files into pose3d_minimal/checkpoint/.

You now have two options for starting a training run. You can run train.py with command-line parameters, for example

python3 train.py name feat_weight_5 feature_loss_weight 5.

Make sure to always include a unique name, because a new directory with this name will be created in the checkpoint directory and the tensorboard directory.

To keep track of different parameter combinations, you can also define a JOB_ID in parameters.py and then for example use

python3 train.py JOB_ID 5.

For each ablation model of our paper, a JOB_ID is already defined.
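
The JOB_ID mechanism can be pictured roughly as follows (a hypothetical sketch for orientation only; see parameters.py for the actual definitions):

# Hypothetical illustration of a JOB_ID table; parameters.py defines the
# real entries, one per ablation model from the paper.
JOBS = {
    '5': {'name': 'feat_weight_5', 'feature_loss_weight': 5.0},
}

def apply_job(params, job_id):
    params.update(JOBS[job_id])  # a job overrides the global defaults
    return params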

Pretrained Models

Extract the checkpoints into <check_dir>/<model_name>/.

iPER

name            3D warping   3D target pose   checkpoint
iPER-3d_w-3d_p  ✔️           ✔️               iPER-3d_w-3d_p.zip
iPER-3d_w-2d_p  ✔️           —                iPER-3d_w-2d_p.zip
iPER-2d_w-3d_p  —            ✔️               iPER-2d_w-3d_p.zip
iPER-2d_w-2d_p  —            —                iPER-2d_w-2d_p.zip

deepfashion

name            3D warping   3D target pose   checkpoint
fash-3d_w-3d_p  ✔️           ✔️               fash-3d_w-3d_p.zip
fash-3d_w-2d_p  ✔️           —                fash-3d_w-2d_p.zip
fash-2d_w-3d_p  —            ✔️               fash-2d_w-3d_p.zip
fash-2d_w-2d_p  —            —                fash-2d_w-2d_p.zip

BibTeX

@inproceedings{Knoche20CVPRW,
 author = {Markus Knoche and Istv\'an S\'ar\'andi and Bastian Leibe},
 title = {Reposing Humans by Warping 3{D} Features},
 booktitle = {CVPR Workshop on Towards Human-Centric Image/Video Synthesis},
 year = {2020}
}


warp3d_reposing's Issues

3D Affine Parameters

Thanks for sharing your nice work!
I have a question about the scaling (affine_mul) operation applied to the 3D affine parameters (transform_params):

affine_mul = [[1, 1, 1 / z_scale, v_scale],
              [1, 1, 1 / z_scale, v_scale],
              [z_scale, z_scale, 1, v_scale * z_scale],
              [1, 1, 1 / z_scale, 1]]
affine_mul = np.array(affine_mul).reshape((1, 4, 4))
affine_transforms = transform_params * affine_mul
affine_transforms = affine_transforms.reshape((-1, 4, 4))

I tried to use PyTorch to warp the 3D features, but the affine parameters in PyTorch should be 3×4 instead of 4×4.

  1. If transform_params is a 3×4 matrix, how do I modify affine_mul? Directly delete the last row?
  2. Why did you use an elementwise multiplication instead of a dot product when scaling transform_params?

Thanks in advance! @MKnoche @isarandi
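
As an illustration only (not an answer from the authors): a homogeneous 3D affine transform has bottom row [0, 0, 0, 1], so the elementwise multiplication leaves that row unchanged, and a 3×4 port can keep just the first three rows of both matrices. Note that torch.nn.functional.affine_grid additionally expects coordinates normalized to [-1, 1], which may require a further coordinate change not shown here.

import numpy as np

z_scale, v_scale = 2.0, 128.0  # example values, not the repository's settings

affine_mul = np.array([[1, 1, 1 / z_scale, v_scale],
                       [1, 1, 1 / z_scale, v_scale],
                       [z_scale, z_scale, 1, v_scale * z_scale],
                       [1, 1, 1 / z_scale, 1]])

transform = np.array([[1.0, 0.0, 0.0,  0.2],   # some homogeneous 3D affine
                      [0.0, 1.0, 0.0, -0.1],
                      [0.0, 0.0, 1.0,  0.3],
                      [0.0, 0.0, 0.0,  1.0]])

scaled = transform * affine_mul
print(np.allclose(scaled[3], [0, 0, 0, 1]))  # True: bottom row is preserved
theta_3x4 = scaled[:3, :]                    # 3x4 form keeps the top rows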

How to generate poses.pkl for other datasets?

Hi, thank you for releasing the code.

I have a few questions about the 3D pose data preparation. According to the paper, the 3D poses of the images in the DeepFashion and iPER datasets are predicted with “MeTRAbs: Metric-Scale Truncation-Robust Heatmaps for Absolute 3D Human Pose Estimation”, and I wonder how to generate the same data structure that you used for DeepFashion and iPER. For example, when I load the poses.pkl of the iPER dataset with the following script,

import pickle

with open('poses.pkl', 'rb') as fb:
    pose_data = pickle.load(fb)

print(pose_data.keys())
print(pose_data['001'].keys())
print(pose_data['001']['24'].keys())
print(pose_data['001']['24']['1'].keys())
print(len(pose_data['001']['24']['1']['0001']))

Output:
dict_keys(['001', '002', '003', '004', '005', '006', '007', '008', '009', '010', '011', '012', '013', '014', '015', '016', '017', '018', '019', '020', '021', '022', '023', '024', '025', '026', '027', '028', '029', '030'])
dict_keys(['10', '11', '12', '13', '14', '15', '16', '17', '18', '19', '1', '20', '21', '22', '23', '24', '25', '26', '27', '28', '29', '2', '30', '31', '32', '33', '3', '4', '5', '6', '7', '8', '9'])
dict_keys(['1', '2'])
dict_keys(['0001', '0002', '0003', '0004', '0005', ..., '0524', '0525', '0526', '0527'])
19

So the iPER dataset contains videos of 30 people, with 2 videos collected per person; the first video contains 527 frames, and 19 3D keypoints are stored for each frame. What is the meaning of the second output? And how can I generate the poses.pkl file for the DeepFashion dataset or another custom dataset?

Sorry for the simple questions.

Thank you!
