Comments (14)
Also, I am curious why we need to scale the mesh into [-0.5, 0.5], since I noticed that when you sample the point cloud from the mesh, the points are scaled back into [-1, 1]?
from dualoctreegnn.
In my initial experiments, I scaled the mesh into [-1, 1], and the SDF is computed at that scale.
Since I reuse the evaluation code and results of ConvONet, which scales the mesh into [-0.5, 0.5], I also scale the input point cloud and output mesh into [-0.5, 0.5] while keeping the SDF fixed.
When training the network, the point cloud in [-0.5, 0.5] is rescaled to [-1, 1] (ref link).
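Because both cubes are centered at the origin, the conversion between the two conventions is a pure scaling. A minimal sketch (the helper names are mine, not the repository's):

```python
import numpy as np

# Hypothetical helpers illustrating the two coordinate conventions above:
# ConvONet works in [-0.5, 0.5]^3, while the SDF/network side works in [-1, 1]^3.

def to_network_coords(points):
    """Rescale points from ConvONet's [-0.5, 0.5]^3 cube to [-1, 1]^3."""
    return 2.0 * np.asarray(points)

def to_convonet_coords(points):
    """Rescale points from [-1, 1]^3 back to [-0.5, 0.5]^3."""
    return 0.5 * np.asarray(points)
```

Since the mapping is a uniform scale, a distance value d in the [-1, 1] convention corresponds to d/2 in the [-0.5, 0.5] convention, which is why the SDF values can be "kept fixed" as long as one convention is used consistently.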
Thanks for your reply!
Hi,
Sorry to bother you, but I have another question.
I am wondering whether I only need to change "size=128" to "size=256" in "tools/shapenet.py" to generate the SDF with voxel size 256 and the corresponding "point cloud", "points", and "occupancy".
Is there any other code I need to modify?
Thanks!
Because I find it does not work when I train an implicit autoencoder model with SDF resolution 256, while it works well with SDF resolution 128.
I suspect there are some inconsistencies in the function "def sample_sdf()".
The code can work with depth=8, since I have tried it in my personal experiments before.
I suggest understanding the code and checking all the places related to depth, including the code and the *.yaml configuration files.
Thanks for your reply.
In fact, I am not concerned about depth, but about size (i.e., the resolution of the SDF).
I checked the code after opening this issue, and it seems some code needs to be modified, e.g., this line and this line.
The (xyz < 127) should be changed to (xyz < size - 1), and the (xyzs / 64 - 1) should be changed to (2 * xyzs / size - 1).
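The fix generalizes to any resolution. A minimal sketch of the idea (variable names are illustrative, not the repository's exact code):

```python
import numpy as np

# With size = 128, the hard-coded constants 127 and 64 happen to equal
# size - 1 and size / 2, which is why the original code silently breaks
# when size is raised to 256. Expressing them in terms of size fixes this.

def clamp_indices(xyz, size):
    # was effectively limited by the constant 127 (valid only for a 128^3 grid)
    return np.clip(xyz, 0, size - 1)

def to_unit_cube(xyzs, size):
    # was: xyzs / 64 - 1, i.e. 2 * xyzs / 128 - 1; generalize the 128 to size
    return 2.0 * xyzs / size - 1.0
```

At size = 128 the generalized expressions reduce exactly to the originals; at size = 256 they map grid coordinates [0, 256] onto [-1, 1] as intended.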
Good job! Yes, you are right!
Thanks for your patience. I am new to this field, and I have another trivial question :{
Is there a way to directly compute the SDF value near the surface? I understand that in your codebase, you first compute an SDF voxel grid, then interpolate to get SDF values for query points near the surface.
This may pose several issues:
- The SDF might not be accurate.
- Pre-computing the SDF voxel grid is costly, and partly wasted since many of the values will not be used.
I am looking for a way to compute the SDF for irregular points efficiently.
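For reference, the grid-then-interpolate pipeline described above can be sketched in a few lines. This is my own minimal trilinear interpolation, not the repository's implementation:

```python
import numpy as np

def trilinear_sdf(grid, queries):
    """Interpolate a size^3 SDF voxel grid at query points in [-1, 1]^3."""
    size = grid.shape[0]
    # map [-1, 1] to continuous voxel coordinates in [0, size - 1]
    xyz = (queries + 1.0) * 0.5 * (size - 1)
    lo = np.clip(np.floor(xyz).astype(int), 0, size - 2)
    frac = xyz - lo
    out = np.zeros(len(queries))
    # accumulate the 8 corner contributions of each query's voxel cell
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((frac[:, 0] if dx else 1 - frac[:, 0]) *
                     (frac[:, 1] if dy else 1 - frac[:, 1]) *
                     (frac[:, 2] if dz else 1 - frac[:, 2]))
                out += w * grid[lo[:, 0] + dx, lo[:, 1] + dy, lo[:, 2] + dz]
    return out
```

The interpolation error scales with the grid spacing, which is exactly the inaccuracy raised above: at 128^3 the cell size is 2/128 ≈ 0.016 of the cube edge.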
Also, why do we set the mesh scale to 0.8? I think that if we set it closer to 1 (e.g., 0.9), the produced SDF could reconstruct the mesh better, because the sampled points form a uniform grid in [-1, 1]^3. What are the considerations when setting this value (i.e., mesh_scale)?
Yes, it is OK to set it to 0.9.
The value 0.8 is relatively conservative. Since the dataset construction process is time-consuming, I did not try other values after I first constructed the dataset with a mesh scale factor of 0.8.
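A hedged sketch of what such a normalization looks like (my own illustration; the actual dataset script may differ in details):

```python
import numpy as np

# Center a mesh's vertices and fit its longest axis into [-mesh_scale, mesh_scale].
# A mesh_scale < 1 leaves a safety margin so the surface stays away from the
# boundary of the [-1, 1]^3 sampling cube.

def normalize_vertices(verts, mesh_scale=0.8):
    vmin, vmax = verts.min(0), verts.max(0)
    center = 0.5 * (vmin + vmax)
    radius = 0.5 * (vmax - vmin).max()
    return (verts - center) / radius * mesh_scale
```

Raising mesh_scale to 0.9 shrinks the unused border and spends more of the fixed grid resolution on the shape itself, at the cost of a thinner band of exterior SDF samples around the surface.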
> The SDF might not be accurate.
Yes, there do exist approximation errors at a resolution of 128^3 (less than 1% of the bounding box).
> Pre-computing the SDF voxel grid is costly or wasteful since many of the values will not be used. I am looking for a way to compute the SDF for irregular points efficiently.
The OpenVDB library contains such functions. However, I am not familiar with OpenVDB either, so I cannot provide more information about it.
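If you want to avoid OpenVDB, one crude alternative (my own sketch with its own accuracy limits, not the repository's method) is to sample the surface densely with normals and approximate the signed distance at irregular queries by nearest-neighbour search:

```python
import numpy as np

def approx_sdf(surface_pts, surface_normals, queries):
    """Approximate the SDF at irregular query points from a dense, oriented
    surface sampling. Accuracy is limited by the sampling density, and the
    sign test assumes reasonably clean outward normals."""
    # brute-force nearest neighbour; swap in a KD-tree for large point sets
    d2 = ((queries[:, None, :] - surface_pts[None, :, :]) ** 2).sum(-1)
    idx = d2.argmin(1)
    offset = queries - surface_pts[idx]
    # positive outside: the offset from the surface agrees with the normal
    sign = np.sign((offset * surface_normals[idx]).sum(1))
    return sign * np.sqrt(d2[np.arange(len(queries)), idx])
```

This only touches the query points you actually need, which addresses the wasted pre-computation, though mesh-exact signed distances (as OpenVDB or mesh-processing libraries provide) are more robust near thin structures.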
Thank you very much!