
Taxim: An Example-based Simulation Model for GelSight Tactile Sensors

Taxim is an example-based simulator for GelSight tactile sensors and their variations. For more information about Taxim, see the paper or the project webpage.

Installation and Prerequisites

Basic dependencies: numpy, scipy, matplotlib, opencv-python (cv2)

To install dependencies: pip install -r requirements.txt

Optional dependencies: ROS with the usb-cam driver (to collect tactile images from a physical sensor) and nanogui (to annotate the raw data).

To install ros usb-cam driver, please check out here.

To install nanogui, please check out here.

Usage

To adapt Taxim to your own sensor, follow the Data Collection and Calibration sections to calibrate Taxim and generate calibration files, then modify the parameters under Basic.params and Basic.sensorParams accordingly.
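To give a feel for the kind of values such a configuration module holds, here is a hypothetical sketch. All names and values below (w, h, pixmm, gel_thickness) are illustrative assumptions, not the actual attributes of Basic.sensorParams; consult that module for the real parameter names.

```python
# Hypothetical sketch of a sensor-parameter container in the spirit of
# Basic.sensorParams. Names and values are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class SensorParams:
    w: int = 640                # tactile image width in pixels
    h: int = 480                # tactile image height in pixels
    pixmm: float = 0.05         # millimeters per pixel (sensor-specific)
    gel_thickness: float = 2.0  # gel pad thickness in mm

# Adjust these to match your sensor's camera resolution and optics.
params = SensorParams()
```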

We also provide a set of calibration files that you can use directly. Follow the instructions in Optical Simulation and Marker Motion Field Simulation to run the provided examples, and feel free to change the parameters under Basic.params.

Data Collection (optional)

  1. Connect a GelSight sensor to your PC and launch the camera driver.
  2. Change the self.gel_sub in gelsight.py to your sensor camera's topic.
  3. Run python record_Gel.py and input the file name and number of frames to collect the data.
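The recording step above can be sketched as follows. Since record_Gel.py reads frames from a ROS camera topic, this stand-alone sketch substitutes synthetic frames for the live camera stream; the frame size, function name, and .npz file layout are all assumptions for illustration.

```python
# Minimal sketch of a frame-recording loop, with synthetic frames standing in
# for the ROS camera stream that record_Gel.py actually subscribes to.
import numpy as np

def record_frames(name, num_frames, h=480, w=640):
    """Collect num_frames images and save them together as <name>.npz."""
    frames = []
    for i in range(num_frames):
        # In the real script this would be the latest image from the camera topic.
        frame = np.zeros((h, w, 3), dtype=np.uint8)
        frames.append(frame)
    np.savez(f"{name}.npz", frames=np.stack(frames))
    return len(frames)

record_frames("ball_press", 5)
```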

Calibration (optional)

  1. Generate data pack: Run python generateDataPack.py -data_path DATA_PATH, where DATA_PATH is the path to the collected raw tactile data. Hand-annotate the contact center and radius for each tactile image. dataPack.npz will be saved under DATA_PATH.
  2. Generate polynomial table: Run python polyTableCalib.py -data_path DATA_PATH, where DATA_PATH is the path to the data pack. polycalib.npz will be saved under DATA_PATH.
  3. Generate shadow table: Run python generateShadowMasks.py -data_path DATA_PATH, where DATA_PATH is the path to the collected shadow calibration images. shadowTable.npz will be saved under DATA_PATH.
  4. Generate FEM tensor maps: Export the ANSYS FEM displacement txt files and set their path in the main function. Run python generateTensorMap.py; femCalib.npz will be saved under the calibs folder.
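The idea behind the polynomial table in step 2 (as described in the Taxim paper) is to fit a polynomial that maps surface gradients to pixel intensity. A toy version of such a fit is sketched below with synthetic data; the polynomial degree and terms chosen here are assumptions, and the real script fits per-channel tables from the annotated ball-press images.

```python
# Toy illustration of the polynomial-lookup idea behind polyTableCalib.py:
# fit a low-order polynomial mapping surface gradients (gx, gy) to intensity.
# The data here is synthetic; the real calibration uses annotated tactile images.
import numpy as np

rng = np.random.default_rng(0)
gx = rng.uniform(-1, 1, 500)  # surface gradient in x
gy = rng.uniform(-1, 1, 500)  # surface gradient in y
intensity = 0.3 * gx - 0.5 * gy + 0.2 * gx * gy + 1.0  # synthetic response

# Design matrix with polynomial terms up to degree 2 (term choice is an assumption).
A = np.stack([np.ones_like(gx), gx, gy, gx * gy, gx**2, gy**2], axis=1)
coeffs, *_ = np.linalg.lstsq(A, intensity, rcond=None)

# Predicted intensities from the fitted polynomial.
pred = A @ coeffs
```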

All the calibration files for a GelSight sensor are provided under the calibs folder.

Optical Simulation

You can either input a point cloud of an object model and define the pressing depth, or directly input a depth map. All the parameters in Basic.params are adjustable. depth is specified in millimeters.

Run python simOptical.py -obj square -depth 1.0 to visualize the examples. Results are saved under the results folder.
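If you prefer the depth-map input, the expected content is simply a per-pixel indentation depth in millimeters. A synthetic spherical press can be generated as below; the image resolution and contact radius here are illustrative assumptions and should be matched to your sensor's parameters.

```python
# Generate a synthetic depth map (in millimeters) for a spherical indenter,
# as an alternative to supplying an object point cloud. The resolution and
# radius values are illustrative assumptions.
import numpy as np

h, w = 480, 640       # tactile image resolution (assumed)
press_depth = 1.0     # maximum indentation depth in mm
radius_px = 100       # contact radius in pixels

yy, xx = np.mgrid[0:h, 0:w]
r2 = (xx - w // 2) ** 2 + (yy - h // 2) ** 2
# Spherical cap: depth falls off from the center, zero outside the contact area.
depth_map = np.where(
    r2 <= radius_px**2,
    press_depth * np.sqrt(np.clip(1.0 - r2 / radius_px**2, 0.0, 1.0)),
    0.0,
)
```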

Marker Motion Field Simulation

You can input a point cloud of an object model and define the loads along the x, y, and z directions: dx and dy are shear loads and dz is the normal load, all specified in millimeters. Run python simMarkMotionField.py -obj square -dx 0.3 -dy 0.4 -dz 0.5 to visualize the resultant displacements. Results are saved under the results folder.
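Conceptually, the simulator composes the marker displacement field by scaling precomputed unit-load response maps by the requested loads. The toy superposition below illustrates this; the unit response maps here are synthetic stand-ins for the FEM-derived tensor maps in femCalib.npz, and all shapes and values are assumptions.

```python
# Toy superposition of unit-load displacement maps, mimicking how shear (dx, dy)
# and normal (dz) loads combine. The unit responses are synthetic stand-ins for
# the FEM tensor maps produced by the ANSYS calibration step.
import numpy as np

h, w = 48, 64
# Synthetic unit responses: marker displacement (u, v) for a 1 mm load along
# each axis. Real maps come from femCalib.npz.
unit_x = np.stack([np.ones((h, w)), np.zeros((h, w))])           # pure x shear
unit_y = np.stack([np.zeros((h, w)), np.ones((h, w))])           # pure y shear
unit_z = np.stack([np.full((h, w), 0.1), np.full((h, w), 0.1)])  # normal press

def marker_field(dx, dy, dz):
    """Linearly superpose unit responses scaled by the applied loads (mm)."""
    return dx * unit_x + dy * unit_y + dz * unit_z

field = marker_field(0.3, 0.4, 0.5)
```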

Operating System

Taxim has been tested on macOS Catalina (10.15.7) and Ubuntu (18.04.1) with anaconda3.

Configuration for macOS: python 3.8.5, numpy 1.20.1, scipy 1.6.1, opencv-python 4.5.3.56

Configuration for Ubuntu: python 3.6.13, numpy 1.19.5, scipy 1.5.4, opencv-python 4.5.2.54

License

Taxim is licensed under the MIT license.

Citing Taxim

If you use Taxim in your research, please cite:

@article{si2021taxim,
  title={Taxim: An Example-based Simulation Model for GelSight Tactile Sensors},
  author={Si, Zilin and Yuan, Wenzhen},
  journal={arXiv preprint arXiv:2109.04027},
  year={2021}
}


taxim's Issues

Why doesn't the GUI display when I run the demo

Hi, why doesn't the GUI display when I run demo.py? The output is:

tool0TomatoSoupCan height: 0.101815
height_list: [0.2205]
gripForce_list: [8]
dx_list: [0.0]

sample 1 is collected. h0.220-rot-1.57-f8.0-l1. 0:0:28 is taken in total. 1 positive samples.
Finished!

Potential index bug in shadow rendering

Hi,
I believe there is an indexing error in the shadow generation. The for loop in simOptical.py line 206 iterates variable s from 1 to num_step and retrieves values from the shadow_table at position s:

for s in range(1,num_step):
    cur_x = int(cx_origin + pr.shadow_step * s * ct)
    cur_y = int(cy_origin + pr.shadow_step * s * st)
    # check boundary of the image and height's difference
    if cur_x >= 0 and cur_x < psp.w and cur_y >= 0 and cur_y < psp.h and heightMap[cy_origin,cx_origin] > heightMap[cur_y,cur_x]:
        frame[cur_y,cur_x] = np.minimum(frame[cur_y,cur_x],v[s])

However, this means that v[0] is never used. Looking at generateShadowMasks.py, it seems to me as if for s = 1, v[0] should be retrieved instead of v[1]. Hence, I think this code should be:

for s in range(num_step):
    cur_x = int(cx_origin + pr.shadow_step * (s + 1) * ct)
    cur_y = int(cy_origin + pr.shadow_step * (s + 1) * st)
    # check boundary of the image and height's difference
    if cur_x >= 0 and cur_x < psp.w and cur_y >= 0 and cur_y < psp.h and heightMap[cy_origin,cx_origin] > heightMap[cur_y,cur_x]:
        frame[cur_y,cur_x] = np.minimum(frame[cur_y,cur_x],v[s])

Best,
Tim

Possible fix for OpenGL on macOS for taxim-robot

Hi!

I found that the 'Unable to load OpenGL library' error when running taxim-robot was fixed for me by editing the PyOpenGL file OpenGL/platform/ctypesloader.py, and changing the line

fullName = util.find_library( name )

to

fullName = '/System/Library/Frameworks/OpenGL.framework/OpenGL'

Hope it helps!

problem about ANSYS

Hi! I have read your paper and understand that you first calibrate the deformation of the gel pad under a unit pin with a 0.5 mm diameter indenting in ANSYS for the marker motion field simulation. When I tried to get dense nodal displacement results with my own model in ANSYS, the results were not satisfactory. What material settings or other necessary configuration did you use in ANSYS? The answer would help me a lot!

Marker motion sim

Hi,

I tried to run
python simMarkMotionField.py -obj square -dx 0.3 -dy 0.4 -dz 0.5

however, the output image does not contain any markers.

Am I missing something?

Issue while trying to calibrate Taxim running generateDataPack.py

Hi all,

Thanks for sharing the simulator.
I followed your usage tutorial video in order to calibrate the simulator with a different sensor. When I run the generateDataPack.py script, giving the folder with the images as data_path, the GUI opens correctly (I think).

(nanogui screenshot)

But when I push the "open" button, it raises an exception in this line of code, giving this output:
Caught exception in event handler: AttributeError: module 'nanogui' has no attribute 'directory_dialog'.
This method does not exist in nanogui; when I run help(nanogui), the method is not shown.
What can I do to run the calibration?
Thanks

How to get training and testing parts in taxim-robot branch?

@Si-Lynnn
Hi Zilin,

Thanks for sharing such nice work, "Grasp Stability Prediction with Sim-to-Real Transfer from Tactile Sensing", and for releasing the simulation code.

However, I cannot find the training and testing parts (e.g. the networks presented in Fig. 6: Grasp stability prediction networks). Would you mind offering more information on how to implement this part?

gelmap.npz for DIGIT Sensor

Is the gelmap5.npz used for the GelSight sensor the same as the one used for the DIGIT sensor?
If not, is it possible to provide us with the gelmap belonging to the DIGIT?

Don't know how to generate gelmap.npz file

Hi all,
Thanks for sharing the simulator.

I am trying to calibrate the simulator with another sensor (the DIGIT), but it seems a gelmap.npz file is required to reproduce the shape of the gel. This repo contains the file for the GelSight, but I was wondering how I can create this kind of file for my sensor.
Thanks

Calibration with my own sensor

Hello! I've collected about 100 ball tactile images with my own DIGIT and then followed the steps in the README to calibrate. However, there are some problems, as the following pictures show.
The first is the simulated tactile image with background, and the second is the one without background. We can see that the light rendering of the non-contact area in the tactile image is not correct. I have tried changing some parameters in Basic.params and sensorParams, but it still didn't work. I also found that my DIGIT's brightness is set higher than in the DIGIT file you provided; could this be the key to my problem?

(images: square_shadow, square_sim)
