hzykent / vmnet

Implementation of ICCV2021(Oral) paper - VMNet: Voxel-Mesh Network for Geodesic-aware 3D Semantic Segmentation

License: MIT License

Python 100.00%

vmnet's Issues

About Matterport3D results

Hi~

Thanks for the great work. I see that your work uses the Matterport3D dataset and achieves quite good results, as shown in the paper.

However, I cannot find any code related to Matterport3D in this repository. Could you share the Matterport3D-related code and give a brief description of how to reproduce the results on Matterport3D?

Looking forward to your reply; it would help a lot.

Quadric Error Metrics: contraction of non-connected vertices possible?

Hi there,

Thanks for your amazing work!

I would be glad if you could answer the following question about the use of QEM in VMNet.
The publication seems to indicate that only vertices connected by edges are considered for vertex contraction. However, in the original QEM publication, the authors also propose selecting vertex pairs for contraction based on their Euclidean distance, using a threshold value t.

Using only vertices connected by edges would imply a threshold of t = 0. In prepare_data.py, Tridecimator from VCGlib is called. If I understand the call correctly, the optional argument -e, which specifies this threshold, is not passed. In the Tridecimator application, the threshold t then defaults to inf, meaning all pairs of vertices would be eligible for contraction (see the sketch below).
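
For concreteness, a minimal sketch of what passing the threshold explicitly might look like, e.g. from Python. The file names, the target face count and the exact argument order are placeholders of mine; only the -e option itself is taken from the Tridecimator behaviour described above:

import subprocess

# Hypothetical sketch: set the pair-selection threshold to 0 so that only
# edge-connected vertex pairs are eligible for contraction. File names, target
# face count and argument order are placeholders, not the repository's actual call.
subprocess.run(
    ["tridecimator", "curr_mesh.ply", "simplified.ply", "30000", "-e0"],
    check=True,
)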

Therefore I would like to know: Is contraction of non-connected vertices possible in VMNet?

Thanks a lot for your time, Benjamin

Issue with 'input voxels are not valid'

Hi, authors, thanks for sharing your work.

When I tried to train VMNet with my own training data, valid_idxs did not meet the assert condition:

assert sum(valid_idxs) == len(valid_idxs), 'input voxels are not valid'

I think the problem is related to my own data, but the specific cause of the issue is unclear. Do you have any suggestions?

PS: I used the prepare_data.py script to preprocess my own data, following the same procedure as for the ScanNet dataset.
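
In case it helps to narrow this down, here is a small diagnostic sketch; it assumes valid_idxs is a boolean mask over the voxelized points and that coords holds the matching voxel coordinates, which are assumptions about the pipeline rather than something confirmed by the code:

import numpy as np

# Diagnostic sketch (assumes valid_idxs is a boolean mask over voxels and coords
# are the matching voxel coordinates): report how many entries fail the check
# and print a few offending coordinates before the assert aborts training.
def report_invalid_voxels(valid_idxs, coords):
    valid_idxs = np.asarray(valid_idxs, dtype=bool)
    bad = ~valid_idxs
    if bad.any():
        print(f"{bad.sum()} / {bad.size} voxels are not valid")
        print("first offending coordinates:\n", np.asarray(coords)[bad][:5])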

The pretrained model has wrong state_dict keys

Hi, I have run into two issues:

  1. I tested the script with the pretrained model:
python run.py --test --exp_name test_split --data_path path/to/processed_data

The output logs are:

use_cuda: True
exp_name: test_split
#parameters 17463870
Traceback (most recent call last):
  File "run.py", line 279, in <module>
    test(exp_name, test_files)
  File "run.py", line 139, in test
    model.load_state_dict(checkpoint['model_state_dict'])
  File "/home/keroro/Program_Files/miniconda3/envs/tt/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1052, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for VMNet:
	Unexpected key(s) in state_dict: "Geo_branch.mid_geo.geo_0.lin_edge.weight", "Geo_branch.mid_geo.geo_0.lin_edge.bias", "Geo_branch.mid_geo.geo_1.lin_edge.weight", "Geo_branch.mid_geo.geo_1.lin_edge.bias", "Geo_branch.cd_5.geo_0.lin_edge.weight", "Geo_branch.cd_5.geo_0.lin_edge.bias", "Geo_branch.de5_geo.geo_0.lin_edge.weight", "Geo_branch.de5_geo.geo_0.lin_edge.bias", "Geo_branch.de5_geo.geo_1.lin_edge.weight", "Geo_branch.de5_geo.geo_1.lin_edge.bias", "Geo_branch.cd_4.geo_0.lin_edge.weight", "Geo_branch.cd_4.geo_0.lin_edge.bias", "Geo_branch.de4_geo.geo_0.lin_edge.weight", "Geo_branch.de4_geo.geo_0.lin_edge.bias", "Geo_branch.de4_geo.geo_1.lin_edge.weight", "Geo_branch.de4_geo.geo_1.lin_edge.bias", "Geo_branch.cd_3.geo_0.lin_edge.weight", "Geo_branch.cd_3.geo_0.lin_edge.bias", "Geo_branch.de3_geo.geo_0.lin_edge.weight", "Geo_branch.de3_geo.geo_0.lin_edge.bias", "Geo_branch.de3_geo.geo_1.lin_edge.weight", "Geo_branch.de3_geo.geo_1.lin_edge.bias", "Geo_branch.cd_2.geo_0.lin_edge.weight", "Geo_branch.cd_2.geo_0.lin_edge.bias", "Geo_branch.de2_geo.geo_0.lin_edge.weight", "Geo_branch.de2_geo.geo_0.lin_edge.bias", "Geo_branch.de2_geo.geo_1.lin_edge.weight", "Geo_branch.de2_geo.geo_1.lin_edge.bias", "Geo_branch.cd_1.geo_0.lin_edge.weight", "Geo_branch.cd_1.geo_0.lin_edge.bias", "Geo_branch.de1_geo.geo_0.lin_edge.weight", "Geo_branch.de1_geo.geo_0.lin_edge.bias", "Geo_branch.de1_geo.geo_1.lin_edge.weight", "Geo_branch.de1_geo.geo_1.lin_edge.bias", "Geo_branch.cd_0.geo_0.lin_edge.weight", "Geo_branch.cd_0.geo_0.lin_edge.bias", "Geo_branch.de0_geo.geo_0.lin_edge.weight", "Geo_branch.de0_geo.geo_0.lin_edge.bias", "Geo_branch.de0_geo.geo_1.lin_edge.weight", "Geo_branch.de0_geo.geo_1.lin_edge.bias".

What are the unexpected key(s) in the state_dict? Is the VMNet model not defined to match the checkpoint?
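
For what it is worth, here is a generic PyTorch way to inspect the mismatch; this is not the repository's intended fix, and it assumes an already-constructed VMNet instance called model plus a placeholder checkpoint path:

import torch

# Generic inspection of a checkpoint/model mismatch; "path/to/checkpoint.pth"
# and the instantiated `model` (a VMNet) are assumptions, not the repo's code.
checkpoint = torch.load("path/to/checkpoint.pth", map_location="cpu")
ckpt_keys = set(checkpoint["model_state_dict"].keys())
model_keys = set(model.state_dict().keys())
print("only in checkpoint:", sorted(ckpt_keys - model_keys)[:5])
print("only in model:", sorted(model_keys - ckpt_keys)[:5])

# strict=False loads the overlapping keys and skips the rest, but any skipped
# lin_edge.* weights would then stay at their random initialization.
missing, unexpected = model.load_state_dict(checkpoint["model_state_dict"], strict=False)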

  2. When preprocessing the data, I built VCGlib (vcglib/apps/tridecimator and vcglib/apps/sample/trimesh_clustering) and added them to the environment path with:
export PATH=$PATH:/path/to/vcglib/apps/tridecimator:/path/to/vcglib/apps/sample/trimesh_clustering
# create links
sudo ln -s /path/to/vcglib/apps/tridecimator/tridecimator /usr/local/bin
sudo ln -s /path/to/vcglib/apps/sample/trimesh_clustering/trimesh_clustering /usr/local/bin

But when running the preprocessing, it crashes with a core dump:

in_path:../scannet/VMtest
out_path:../scannet/VMNet_data/train/
[0.02, 0.04, 30, 30, 30, 30, 30]
Processing ../scannet/VMtest/scene0000_00/scene0000_00_vh_clean_2.ply
curr_dir: ../scannet/VMNet_data/train/scene0000_00
trimesh_clustering: ../../../vcg/simplex/vertex/component.h:75: vcg::vertex::EmptyCore<TT>::ColorType& vcg::vertex::EmptyCore<TT>::C() [with TT = MyUsedTypes; vcg::vertex::EmptyCore<TT>::ColorType = vcg::Color4<unsigned char>]: Assertion `0' failed.
Aborted (core dumped)
trimesh_clustering: ../../../vcg/simplex/vertex/component.h:75: vcg::vertex::EmptyCore<TT>::ColorType& vcg::vertex::EmptyCore<TT>::C() [with TT = MyUsedTypes; vcg::vertex::EmptyCore<TT>::ColorType = vcg::Color4<unsigned char>]: Assertion `0' failed.
Aborted (core dumped)
multiprocessing.pool.RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/home/keroro/Program_Files/miniconda3/envs/tt/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/home/keroro/Program_Files/miniconda3/envs/tt/lib/python3.7/multiprocessing/pool.py", line 44, in mapstar
    return list(map(*args))
  File "prepare_data.py", line 226, in process_frame
    old_vertices=vertices[-1])
  File "prepare_data.py", line 173, in quadric_error_metric
    '.ply', '.csv'), old_vertices=old_vertices, new_vertices=vertices_l)
  File "prepare_data.py", line 78, in csv2npy
    with open(in_file_path, 'r') as csvfile:
FileNotFoundError: [Errno 2] No such file or directory: '../scannet/VMNet_data/train/scene0000_00/curr_mesh.csv'
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "prepare_data.py", line 295, in <module>
    pf_pool.map(process_frame_p, file_paths)
  File "/home/keroro/Program_Files/miniconda3/envs/tt/lib/python3.7/multiprocessing/pool.py", line 268, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/home/keroro/Program_Files/miniconda3/envs/tt/lib/python3.7/multiprocessing/pool.py", line 657, in get
    raise self._value
FileNotFoundError: [Errno 2] No such file or directory: '../scannet/VMNet_data/train/scene0000_00/curr_mesh.csv'

Where should I set the correct environment path for vcglib/apps/tridecimator and vcglib/apps/sample/trimesh_clustering?
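
As a quick sanity check (a generic suggestion, not the repository's documented setup, and assuming prepare_data.py invokes both tools by name), you can verify from Python that the binaries are resolvable on PATH:

import shutil

# Sanity-check sketch: confirm that both VCGlib tools can be found by name from
# the Python process that runs prepare_data.py.
for tool in ("tridecimator", "trimesh_clustering"):
    path = shutil.which(tool)
    print(f"{tool}: {path or 'NOT FOUND on PATH'}")

Judging from the log above, trimesh_clustering is already found and executed and then aborts on a VCG assertion, so the missing curr_mesh.csv may be a consequence of that crash rather than of the PATH setup.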

About downloading ScanNet

Hi,
I haven't used ScanNet before, so I'm not familiar with it and need your help. Due to my limited storage space, I only want to download the parts of ScanNet that are needed. Since you say:

Our method relies on the .ply as well as the .labels.ply files.

does this mean I can download ScanNet using only these two commands?

download-scannet.py -o [directory in which to download] --type _vh_clean_2.ply
download-scannet.py -o [directory in which to download] --type _vh_clean_2.labels.ply

In addition, how much storage space is needed to store the processed data?
