
windingwind / seal-3d


License: MIT

Languages: Python 77.09%, C++ 4.81%, Cuda 16.57%, C 0.68%, Shell 0.85%
Topics: 3d editing, iccv2023, instant-ngp, interactive, nerf, tensorf

seal-3d's Introduction

Seal-3D: Interactive Pixel-Level Editing for Neural Radiance Fields

teaser

The official implementation of the paper Seal-3D: Interactive Pixel-Level Editing for Neural Radiance Fields, the first interactive pixel-level NeRF editing tool.

Accepted by ICCV 2023.

Project Page | Paper | ArXiv | Code

This project is built on ashawkey/torch-ngp's NGP and TensoRF implementations.

Installation

For more details about setting up the development environment, please refer to torch-ngp#install.

git clone --recursive https://github.com/windingwind/seal-3d.git
cd seal-3d

Install with pip

pip install -r requirements.txt

# (optional) install the tcnn backbone
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
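
If you install the tcnn backbone, a quick way to verify that the bindings were built correctly is to import them (a minimal check, assuming the package installed under its usual module name tinycudann):

# Minimal sanity check that the tiny-cuda-nn Torch bindings import correctly.
import tinycudann as tcnn
print('tinycudann imported:', tcnn)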

Install with conda

conda env create -f environment.yml
conda activate torch-ngp

Build extension (optional)

By default, we use load to build the extensions at runtime. However, this can be inconvenient at times, so we also provide a setup.py to build each extension:

# install all extension modules
bash scripts/install_ext.sh

# if you want to install manually, here is an example:
cd raymarching
python setup.py build_ext --inplace # build the extension only, without installing (can only be run from the parent directory)
pip install . # install to the Python path (you still need the raymarching/ folder, since this only installs the built extension)
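
For reference, the runtime build path relies on PyTorch's JIT extension loader (torch.utils.cpp_extension.load); the sketch below shows roughly what that looks like. The real source lists and compiler flags live inside each extension's own build code, so the file names and flags here are illustrative only.

import os
from torch.utils.cpp_extension import load  # JIT-compiles C++/CUDA sources on first import

# Illustrative sketch only: the actual sources and flags are defined per
# extension (e.g. inside raymarching/), not here.
_src_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'src')

_backend = load(
    name='_raymarching',                            # name of the compiled module
    sources=[os.path.join(_src_dir, f)
             for f in ('raymarching.cu', 'bindings.cpp')],  # hypothetical file names
    extra_cuda_cflags=['-O3'],
    verbose=True,                                   # print compiler output on first build
)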

Dataset

We use the same data format as instant-ngp. Please download the datasets and put them under ./data.

For more details about the supported datasets, please refer to torch-ngp#usage.
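
For reference, the instant-ngp / nerf_synthetic format stores camera intrinsics and poses in a transforms_*.json file per split. Below is a minimal sketch of reading one; the field names follow the standard Blender-style layout, so check them against your own files.

import json
import numpy as np

# Minimal sketch: read camera poses from an instant-ngp style transforms file.
with open('data/nerf_synthetic/lego/transforms_train.json') as f:
    meta = json.load(f)

camera_angle_x = meta['camera_angle_x']          # horizontal field of view (radians)
for frame in meta['frames']:
    image_path = frame['file_path']              # e.g. './train/r_0' (extension added by the loader)
    pose = np.array(frame['transform_matrix'])   # 4x4 camera-to-world matrix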

Usage

Code Structure

Based on the original repo's implementation, we slightly modified the files in nerf (the NGP implementation) and tensoRF (the TensoRF implementation) to fit our needs.

The main entry points are main_SealNeRF.py (NGP backbone) and main_SealTensoRF.py (TensoRF backbone).

In SealNeRF:

  • trainer.py defines the trainer class dynamically, depending on the backbone and character (student/teacher); a conceptual sketch follows this list.

  • network.py defines the network class dynamically depending on the backbone and character (student/teacher).

  • provider.py defines the dataset update strategy under our two-stage local-global teacher-student framework.

  • seal_utils.py defines the proxy functions we proposed in the paper.

  • renderer.py defines how the proxy functions are applied to our pipeline.
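
The exact factory code lives in SealNeRF/trainer.py and SealNeRF/network.py; as a conceptual sketch of the "dynamic class depending on backbone and character" idea (all names below are hypothetical stand-ins, not the repo's API):

# Conceptual sketch only -- not the repo's actual API.
def make_trainer_class(backbone: str, character: str):
    """Assemble a trainer class for a backbone ('ngp'/'tensorf')
    and a character ('teacher'/'student')."""
    if backbone == 'ngp':
        from nerf.utils import Trainer as BaseTrainer       # illustrative import
    else:
        from tensoRF.utils import Trainer as BaseTrainer    # illustrative import

    class SealTrainer(BaseTrainer):
        is_teacher = (character == 'teacher')

        def train_step(self, data):
            if self.is_teacher:
                # The teacher stays frozen; it only renders proxy-edited
                # supervision for the student.
                raise RuntimeError('the teacher model is not trained')
            # Student: local pretraining stage first, then global
            # fine-tuning against the teacher's outputs.
            return super().train_step(data)

    return SealTrainer

# Illustrative usage: StudentTrainer = make_trainer_class('ngp', 'student')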

Train

Follow the steps below to apply an editing operation to an existing NeRF model:

  1. Train an NGP/TensoRF model following the instructions of torch-ngp#usage. For example:
# NGP backbone, Lego
python main_nerf.py data/nerf_synthetic/lego/ --workspace exps/lego_ngp -O --bound 1.0 --scale 0.8 --dt_gamma 0
  2. Train Seal-3D on the model obtained in the previous step (headless mode).
# Headless mode, bounding shape editing, NGP backbone, Lego
# pretraining_epochs: pretraining stage epochs
# extra_epochs: total epochs (pretraining + finetuning)
# pretraining_*_point_step: pretraining sample step
# ckpt: the input student model checkpoint
# teacher_workspace: teacher model workspace
# teacher_ckpt: teacher model checkpoint
# seal_config: the editing config directory used in headless mode. the config file is $seal_config/seal.json.
# eval_interval & eval_count: control eval behavior
python main_SealNeRF.py data/nerf_synthetic/lego/\
    --workspace exps/lego_ngp_bbox -O --bound 1.0 --scale 0.8 --dt_gamma 0\
    --pretraining_epochs 100 --extra_epochs 150\
    --pretraining_local_point_step 0.005 --pretraining_surrounding_point_step -1\
    --pretraining_lr 0.05 --ckpt exps/lego_ngp/checkpoints/ngp_ep0300.pth\
    --teacher_workspace exps/lego_ngp --teacher_ckpt exps/lego_ngp/checkpoints/ngp_ep0300.pth\
    --seal_config data/seal/lego_bbox/\
    --eval_interval 100 --eval_count 10

The seal_config files used by the examples in the paper can be downloaded from the Google Drive link. The parameters of each seal_config are explained in the corresponding proxy-function class in SealNeRF/seal_utils.py.

The full argument list and descriptions can be found in the corresponding entry file (main_*.py).
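
The exact schema of seal.json depends on which proxy-function class it targets; a quick way to inspect a downloaded config before training is simply to load it (a minimal sketch; the example directory matches the command above):

import json
import os

# Inspect an editing config before passing its directory via --seal_config.
seal_config_dir = 'data/seal/lego_bbox'                  # directory, not the json file itself
with open(os.path.join(seal_config_dir, 'seal.json')) as f:
    cfg = json.load(f)

# Print the top-level structure; the meaning of each key is documented in the
# corresponding proxy-function class in SealNeRF/seal_utils.py.
print(type(cfg).__name__, list(cfg) if isinstance(cfg, dict) else len(cfg))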

To start in GUI mode, use --gui.

Currently, GUI mode supports Color, Anchor, Brush, and Texture editing.

BibTeX

@misc{wang2023seal3d,
      title={Seal-3D: Interactive Pixel-Level Editing for Neural Radiance Fields}, 
      author={Xiangyu Wang and Jingsen Zhu and Qi Ye and Yuchi Huo and Yunlong Ran and Zhihua Zhong and Jiming Chen},
      year={2023},
      eprint={2307.15131},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Acknowledgement

Use this code under the MIT License. No warranties are provided. Keep the laws of your locality in mind!

Please refer to torch-ngp#acknowledgement for the acknowledgment of the original repo.

seal-3d's People

Contributors

ashawkey, jingsenzhu, w-m, windingwind


seal-3d's Issues

CUDA out of memory

If I want to deploy the model on a low-end graphics card, what parameters do I need to change to reduce the memory required?

Instant NGP false

I trained the Instant-NGP model with the 360_v2 dataset, but the PSNR is as low as 10. Could you provide one or two pretrained models for your dataset? Thank you very much! Email: [email protected]

Headless mode fails, but GUI mode works fine

When I don't use GUI mode, it seems to fail during the test phase.
Environment: CUDA 11.6, gcc 10.3, torch 1.12.1, Ubuntu 18.04

Loading model from: /home/user/.conda/envs/torch-ngp/lib/python3.8/site-packages/lpips/weights/v0.1/alex.pth
[INFO] Trainer: ngp | 2023-08-29_11-03-50 | cuda | fp16 | exps/lego_ngp
[INFO] #parameters: 24490848
[INFO] Loading exps/lego_ngp/checkpoints/ngp_ep0150.pth ...
[INFO] loaded model.
[INFO] load at epoch 150, global step 30000
[INFO] loaded optimizer.
[INFO] loaded scheduler.
[INFO] loaded scaler.
[INFO] Trainer: ngp | 2023-08-29_11-03-50 | cuda | fp16 | exps/lego_brush_compare
[INFO] #parameters: 24490848
[INFO] Loading exps/lego_ngp/checkpoints/ngp_ep0150.pth ...
[INFO] loaded model.
[INFO] load at epoch 150, global step 30000
[INFO] loaded optimizer.
[INFO] loaded scheduler.
[INFO] loaded scaler.
Local x generation: 0.873894214630127
Surrounding x generation: 1.1920928955078125e-06
Global x generation: 4.76837158203125e-07
Loading train data: 100%|█████████████████████████████████████████| 100/100 [00:01<00:00, 59.49it/s]
Loading val data: 100%|███████████████████████████████████████████| 100/100 [00:01<00:00, 59.85it/s]
[INFO] Proxy train/eval/test: True/True/False
Proxying train data: 0%| | 0/100 [00:00<?, ?it/s]Segmentation fault (core dumped)

Areas not selected by the brush are edited

Brush editing commands 1: (screenshot: lego_0_instruction)

Edit results (views 1-3): lego_0, lego_1, lego_15

Brush editing commands 2: (screenshot: lego_instruction_new)

Edit results (views 1-3): lego_0_new, lego_1_new, lego_15_new

I'm wondering why the large area in the middle of the right wheel is also edited. Hope you can give some advice.

non-rigid blending support

Looking at Section 4.2 of your excellent paper, I noticed that you mention a way of doing non-rigid blending, which I find very interesting. Are you planning to release support for this feature as well?

seal_config for custom data

Hi,

Thanks for the great work!
I'm trying to use this for my own custom capture. May I ask how I should obtain the seal_config file for my custom data? Thanks!
