- Python >= 3.7
The project uses the Mitsuba 2 renderer (v2.1.0) to generate all the images. For a quick start, a Python package for the renderer is available here. You can also use the Docker images, which are available here.
- To install the Python dependencies:

```shell
pip3 install -r requirements.txt
```

- To generate images:

```shell
python3 generate_image.py
```
- To choose the configuration file, see the files inside `file_list_cfgs`
- To choose the render configuration, see the files inside `render_cfgs`
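The exact schema of the files in `file_list_cfgs` and `render_cfgs` is defined by the project itself; purely as a hypothetical sketch, a JSON-style render config could be loaded like this (the keys `spp` and `resolution` are illustrative, not the project's actual schema):

```python
import json
from pathlib import Path

def load_render_cfg(path):
    """Load a render configuration file (hypothetical JSON schema)."""
    with open(path) as f:
        cfg = json.load(f)
    # Fall back to illustrative defaults for any missing keys.
    cfg.setdefault("spp", 64)                 # samples per pixel
    cfg.setdefault("resolution", [640, 480])  # output image size
    return cfg

# Example: write a tiny config and read it back.
Path("demo_cfg.json").write_text(json.dumps({"spp": 128}))
cfg = load_render_cfg("demo_cfg.json")
```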
*(Figure: side-by-side comparison of a Real Sensor Image and a Rendered Image.)*
Users can render a new object by providing its mesh in Wavefront `.obj` format and passing its name as a parameter to the render function. `new_mesh_render.py` provides an example; please refer to the script for more details.
- Execute `python new_mesh_render.py` to render a sensor view without any object pressed against it.
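Wavefront `.obj` is a plain-text format: `v x y z` lines list vertices and `f i j k` lines list faces using 1-based vertex indices. As a minimal sketch of producing such a mesh for the render function (the filename and the single-triangle geometry are just examples, not part of the project):

```python
# Write a minimal Wavefront .obj containing a single triangle.
vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
faces = [(1, 2, 3)]  # .obj face indices are 1-based

lines = [f"v {x} {y} {z}" for x, y, z in vertices]
lines += ["f " + " ".join(str(i) for i in face) for face in faces]

with open("triangle.obj", "w") as f:
    f.write("\n".join(lines) + "\n")
```

A file written this way can then be passed by name to the render function as described above.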
- tev is an EXR viewer and comparison tool. You can download the latest executables from here.
- qt4Image is another viewer for EXR images across different platforms. It can also generate low-dynamic-range images with gamma encoding. You can download the utility for Linux from this link.
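Gamma encoding a linear HDR value down to an 8-bit LDR value is conceptually simple: clamp to [0, 1], raise to 1/gamma, and scale to 255. A minimal pure-Python sketch of the idea (a real viewer like qt4Image of course operates on whole images and channels):

```python
def to_ldr(linear, gamma=2.2):
    """Gamma-encode a linear HDR intensity into an 8-bit LDR value."""
    clamped = max(0.0, min(1.0, linear))  # clamp to the displayable range
    return round(255 * clamped ** (1.0 / gamma))
```

Note how gamma encoding brightens mid-tones: a linear value of 0.5 maps well above the midpoint 127, which is why linear EXR data looks too dark when displayed without it.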
- Building from source makes it possible to run faster simulations using the GPU and to set up optimization for different simulation models. The instructions for building from source are in the official documentation.
- The model files for GelSight are in Mitsuba XML format. The Mitsuba 2 documentation has details on the parameters and how to set them.
- The important file used in rendering for the flat gel is `models/flatgel_with_mesh.xml`
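A Mitsuba 2 scene file is XML with a `<scene>` root whose children declare the integrator, sensor, and shapes via typed parameter tags such as `<string name="..." value="..."/>`. As a rough illustration of that structure only (not a reproduction of the project's `models/flatgel_with_mesh.xml`; the mesh path is made up), such a file can be assembled with the standard library:

```python
import xml.etree.ElementTree as ET

# Skeletal Mitsuba-2-style scene: a path integrator plus one OBJ shape.
# Parameter tags follow Mitsuba's <string name="..." value="..."/> convention.
scene = ET.Element("scene", version="2.1.0")
ET.SubElement(scene, "integrator", type="path")
shape = ET.SubElement(scene, "shape", type="obj")
ET.SubElement(shape, "string", name="filename", value="meshes/example.obj")

xml_text = ET.tostring(scene, encoding="unicode")
```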
- Bidirectional path tracing
- Heightfields
- Currently, Mitsuba 2 outputs artifacts in the rendered image on macOS:
*(Figure: rendered output compared on Linux vs. macOS.)*
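On the heightfields mentioned above: a heightfield stores one elevation per grid cell, and a renderer triangulates an H×W grid into (H−1)×(W−1) cells of two triangles each. A minimal sketch of that triangulation, independent of Mitsuba's own heightfield support:

```python
def heightfield_triangles(heights):
    """Triangulate an H x W grid of heights into vertex-index triangles.

    Vertices are numbered row-major; each grid cell yields two triangles,
    so an H x W grid produces 2 * (H - 1) * (W - 1) triangles.
    """
    h, w = len(heights), len(heights[0])
    tris = []
    for r in range(h - 1):
        for c in range(w - 1):
            i = r * w + c  # top-left vertex of this cell
            tris.append((i, i + 1, i + w))          # upper-left triangle
            tris.append((i + 1, i + w + 1, i + w))  # lower-right triangle
    return tris
```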
```bibtex
@article{agarwal2020simulation,
  title={Simulation of Vision-based Tactile Sensors using Physics based Rendering},
  author={Agarwal, Arpit and Man, Tim and Yuan, Wenzhen},
  journal={arXiv preprint arXiv:2012.13184},
  year={2020}
}
```