
ros-drivers / leap_motion


This project is a fork of warp1337/rosleapmotion.

39 stars · 11 watchers · 44 forks · 6.77 MB

Leap Motion ROS integration

Home Page: http://wiki.ros.org/leap_motion

Languages: C++ 72.07%, Python 19.29%, SWIG 7.76%, CMake 0.88%

leap_motion's Introduction

ROS LEAP MOTION

ROS driver for the Leap Motion Controller


REQUIREMENTS

You need ROS Kinetic or a newer version installed on your machine, together with the Leap Motion SDK for Linux.

FEATURES

Currently, this ROS package supports tracking one person (left and right arms), publishing the raw camera images from the controller, basic visualization in RViz, and a PointCloud2 generated by the stereo_image_proc node.

There is also a filter node implementing a 2nd-order Butterworth lowpass filter, used to filter the hand's x, y, z coordinates coming from the Leap controller via Human.msg. For more information, refer to Julius O. Smith III, Introduction to Digital Filters with Audio Applications.
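The filter node itself is written in C++; the Python sketch below only illustrates the same kind of filter (a 2nd-order Butterworth lowpass applied to a stream of palm coordinates) using SciPy, with the cutoff and sample rate chosen as arbitrary example values rather than the driver's defaults:

    # Illustration only: a 2nd-order Butterworth lowpass of the kind applied
    # to the palm x, y, z coordinates. Cutoff and sample rate are example
    # values, not the driver's defaults.
    import numpy as np
    from scipy.signal import butter, lfilter

    fs = 100.0    # sample rate of the coordinate stream, Hz (assumed)
    cutoff = 5.0  # lowpass cutoff frequency, Hz (assumed)

    b, a = butter(2, cutoff / (0.5 * fs), btype='low')

    x = np.random.randn(500)       # stand-in for one noisy coordinate stream
    x_filtered = lfilter(b, a, x)  # causal filtering, as a streaming node would do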

INSTALLATION

Python API installation

1. If you wish to use the old, deprecated Python API, you need to append the location of your LeapSDK to your environment variables. This step differs depending on where you saved the SDK. The LeapSDK folder should contain the following files: lib/Leap.py, lib/x86/LeapPython.so, lib/x86/libLeap.so, lib/x64/LeapPython.so, lib/x64/libLeap.so, lib/LeapPython.so, lib/libLeap.dylib.

Example:

# 64-bit operating system
export PYTHONPATH=$PYTHONPATH:$HOME/LeapSDK/lib:$HOME/LeapSDK/lib/x64

# 32-bit operating system
export PYTHONPATH=$PYTHONPATH:$HOME/LeapSDK/lib:$HOME/LeapSDK/lib/x86

2. (OPTIONAL) You can edit your ~/.bashrc file so that you do not have to export the location of your LeapSDK every time you open a new shell. Just append the LeapSDK location to the end of your PYTHONPATH (note the single quotes, so that $PYTHONPATH is expanded when the shell starts rather than when the line is written).

# 64-bit operating system
echo "export PYTHONPATH=$PYTHONPATH:$HOME/LeapSDK/lib:$HOME/LeapSDK/lib/x64" >> ~/.bashrc
source ~/.bashrc

# 32-bit operating system
echo "export PYTHONPATH=$PYTHONPATH:$HOME/LeapSDK/lib:$HOME/LeapSDK/lib/x86" >> ~/.bashrc
source ~/.bashrc
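To check that the interpreter can actually find the bindings afterwards (assuming the SDK layout listed in step 1), a quick test from a new shell is to import the module:

    # Sanity check that Leap.py and LeapPython.so are on PYTHONPATH.
    # Run it with: python -c "import Leap; print(Leap.Controller)"
    import Leap
    print(Leap.Controller)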

Usage

1. Clone this repository into the src folder of your catkin workspace and build it.

    cd ~/catkin_ws/src
    git clone https://github.com/ros-drivers/leap_motion.git
    cd ~/catkin_ws
    catkin_make

2. Start the Leap control panel in another terminal.

LeapControlPanel

3. (OPTIONAL) If it gives you an error about the leap daemon not running, stop LeapControlPanel and restart the daemon with the following command:

sudo service leapd restart

4. Source your current catkin workspace.

source ~/catkin_ws/devel/setup.bash

5. Launch the demo.launch file to see if you have set everything up correctly. If you wish to enable the lowpass filter, change "enable_filter" to true in the filter_params.yaml file.

roslaunch leap_motion demo.launch

6. You are done! You should see an RViz window open up, displaying the hands detected by the controller.
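Once the demo is running, the published data can be consumed from your own node. Below is a minimal rospy sketch; the topic name /leap_motion/leap_device is an assumption here, so check rostopic list for the actual topic names in your setup:

    #!/usr/bin/env python
    # Minimal subscriber sketch. The topic name below is an assumption;
    # verify it with `rostopic list` after launching demo.launch.
    import rospy
    from leap_motion.msg import Human

    def callback(msg):
        # Print the whole message; inspect the available fields with
        # `rosmsg show leap_motion/Human`.
        rospy.loginfo_throttle(1.0, str(msg))

    if __name__ == '__main__':
        rospy.init_node('leap_listener')
        rospy.Subscriber('/leap_motion/leap_device', Human, callback)
        rospy.spin()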

leap_motion's People

Contributors

130s, 4ndr3ar, adohaha, axkoenig, ethanfowler, fhmlier, k-okada, mirzashah, nowittyusername, warp1337


leap_motion's Issues

A mistake in leap_interface.py about roll, pitch, yaw

Hi all,

I've noticed the following in def on_frame in leap_interface.py:

            self.hand_pitch        = direction.pitch * Leap.RAD_TO_DEG
            self.hand_yaw          = normal.yaw * Leap.RAD_TO_DEG
            self.hand_roll         = direction.roll * Leap.RAD_TO_DEG

However, the Leap Motion documentation page (https://developer-archive.leapmotion.com/documentation/python/devguide/Leap_Hand.html) states that the Vector class defines functions for getting the pitch (angle around the x-axis), yaw (angle around the y-axis), and roll (angle around the z-axis):

pitch = hand.direction.pitch
yaw = hand.direction.yaw
roll = hand.palm_normal.roll

(figure from the Leap documentation illustrating the hand's pitch, yaw, and roll axes omitted)

The problem is that yaw and roll are swapped between leap_interface.py and the official Leap docs. Is this a mistake, or was the order changed on purpose? Cheers
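For reference, if the official documentation quoted above is taken as the ground truth, the assignments would presumably read as follows (a sketch of the docs' convention only, not a confirmed fix):

    # Following the convention in the official Leap docs quoted above:
    # pitch and yaw come from hand.direction, roll from hand.palm_normal.
    self.hand_pitch = direction.pitch * Leap.RAD_TO_DEG
    self.hand_yaw   = direction.yaw * Leap.RAD_TO_DEG
    self.hand_roll  = normal.roll * Leap.RAD_TO_DEG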

Transform this into an extendable template.

From my experience, extending this driver is not difficult. If I need more data, all I need to do is define more fields in the message and add some very trivial code, and it works. So reading more data is not a problem.

But what if someone wants less data? Transferring more data clearly makes it less efficient in cases where only very simple data is needed.

There are perhaps two ways:

  1. Make the module configurable; that is, the user can configure what input they need.
  2. Or, more simply, provide the driver as a template/skeleton that only provides the simplest data. Meanwhile, the actual user of the driver can easily add new data inputs by making very few modifications.

I actually recommend the latter approach. The basic idea is like https://github.com/angular/angular-seed

Let me know your comments on this idea :)

Improved ROS driver for the Leap Motion gesture sensor

Hello!

After seeing that the ROS driver for the Leap Motion controller was somewhat lacking I wrote one myself using C++. I would like to know if there's any interest in it here, maybe it could replace the current one.

Currently my ROS package supports one person (left and right arm), publishing raw camera images from the controller, gesture detection, basic visualization using RViz, and a pointcloud generated from a stereo_image_proc node. The gesture functionality is somewhat untested at the moment, but I am in the process of improving it over the coming weeks.

I have also made a filter node implementing a 2nd-order Butterworth lowpass filter that is used to filter the hand's x, y, z coordinates coming from the Leap controller via a custom Human.msg.

I pieced it together from multiple repositories found on GitHub, with the main sources of inspiration being the UTNuclearRobotics and juancamilog repositories. All other functionality that I deemed missing I implemented myself.

You can find my current work here for browsing and/or testing the package. If anyone with the right permissions is interested in the merge, let me know.

How to convert leapmotion/data to twist like we convert joy to twist?

Hi Sir,
I've been using a joystick to control a robot arm (left screenshot below) very well. However, I'd love to use a Leap Motion to control the arm, so I modified the code as shown in the right-hand screenshot, but it doesn't work. In rqt_graph the nodes are connected, yet the robot arm doesn't move as it did when controlled with the joystick. Can you please show me how I can figure it out?

(screenshots of the code and the rqt_graph output omitted)
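For anyone with the same question: one possible approach is a small relay node that maps the palm position from /leapmotion/data (a leap_motion/leapros message; see the palmpos field in the rostopic echo samples elsewhere on this page) to a geometry_msgs/Twist on the arm's command topic. The output topic name, axis mapping, and scale factor below are assumptions to adapt to the robot:

    #!/usr/bin/env python
    # Sketch of a leapros -> Twist relay. Output topic name, axis mapping and
    # scale factor are assumptions; adjust them to the robot being driven.
    import rospy
    from geometry_msgs.msg import Twist
    from leap_motion.msg import leapros

    SCALE = 0.01  # palm position (mm) -> velocity command (example value)

    def callback(msg):
        twist = Twist()
        # Map palm position relative to the controller to linear velocity.
        twist.linear.x = SCALE * msg.palmpos.x
        twist.linear.y = SCALE * msg.palmpos.y
        twist.linear.z = SCALE * msg.palmpos.z
        pub.publish(twist)

    if __name__ == '__main__':
        rospy.init_node('leap_to_twist')
        pub = rospy.Publisher('cmd_vel', Twist, queue_size=10)
        rospy.Subscriber('/leapmotion/data', leapros, callback)
        rospy.spin()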

ERROR: Cannot load message class for [leap_motion/leapros]. Are your messages built?

Hi Sir,

I followed the README instructions, and after I run source ~/catkin_ws/devel/setup.bash && rosrun leap_motion sender.py I do receive data in the terminal.
However, when I try rostopic echo /leapmotion/data, it throws an error:

ysu66@mech1331:~$ rostopic echo /leapmotion/data
ERROR: Cannot load message class for [leap_motion/leapros]. Are your messages built?

Can you please show me how to fix it?
Steven

0.0.6 fails to build locally and on buildfarm

http://jenkins.ros.org/job/devel-hydro-leap_motion/ARCH_PARAM=amd64,UBUNTU_PARAM=precise,label=devel/1/console

Scanning dependencies of target leap_camera
[ 11%] Building CXX object leap_motion/CMakeFiles/leap_camera.dir/src/leap_camera.cpp.o
/tmp/test_repositories/src_repository/leap_motion/src/leap_camera.cpp:3:18: fatal error: Leap.h: No such file or directory
compilation terminated.
make[2]: *** [leap_motion/CMakeFiles/leap_camera.dir/src/leap_camera.cpp.o] Error 1
make[1]: *** [leap_motion/CMakeFiles/leap_camera.dir/all] Error 2
make: *** [all] Error 2

I didn't realize upon #3 that it depends on the Leap Motion SDK and the environment variable.

roslaunch problem

While running the roslaunch command we encounter the following errors:

ERROR: cannot launch node of type [leap_motion/leap_camera]: leap_motion
ROS path [0]=/opt/ros/indigo/share/ros
ROS path [1]=/opt/ros/indigo/share
ROS path [2]=/opt/ros/indigo/stacks
ERROR: cannot launch node of type [leap_motion/leap_hands]: leap_motion
ROS path [0]=/opt/ros/indigo/share/ros
ROS path [1]=/opt/ros/indigo/share
ROS path [2]=/opt/ros/indigo/stacks

Do you have any suggestions for these problems?

How to use gesture methods defined in leap_interface.py to improve my sender.py script

Hi,
Merry Christmas.

I am using the sender.py script to read the hand_palm_pos_ data in order to control my robot.
I launch the sender.py script with source ~/catkin_ws/devel/setup.bash && rosrun leap_motion sender.py and it works very well.

I am also aware that it is possible to use gestures to control things with the Leap Motion sensor. I've seen a commented-out block of code in the leap_interface.py script, shown below:

 # Gestures
            for gesture in frame.gestures():
                if gesture.type == Leap.Gesture.TYPE_CIRCLE:
                    circle = CircleGesture(gesture)
                    # Determine clock direction using the angle between the pointable and the circle normal
                    if circle.pointable.direction.angle_to(circle.normal) <= Leap.PI/4:
                        clockwiseness = "clockwise"
                    else:
                        clockwiseness = "counterclockwise"
                    # Calculate the angle swept since the last frame
                    swept_angle = 0
                    if circle.state != Leap.Gesture.STATE_START:
                        previous_update = CircleGesture(controller.frame(1).gesture(circle.id))
                        swept_angle =  (circle.progress - previous_update.progress) * 2 * Leap.PI
                    print "Circle id: %d, %s, progress: %f, radius: %f, angle: %f degrees, %s" % (
                            gesture.id, self.state_string(gesture.state),
                            circle.progress, circle.radius, swept_angle * Leap.RAD_TO_DEG, clockwiseness)

I am a new programmer. Can you please tell me why this part is commented out? How can I use this gesture code in leap_interface.py to improve the sender.py script so that it sends not only hand positions but also gestures, to control my robot better?
Thanks a million for all the help that you've given me.

BR,
Steven
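Not an official answer, but one thing worth noting from the SDK side: frame.gestures() only returns data for gesture types that were enabled on the controller beforehand. Below is a hedged sketch of the extra call that would be needed in the listener class (method and constant names from the Leap v2 Python SDK; the class name here is illustrative):

    import Leap

    class LeapInterfaceListener(Leap.Listener):  # illustrative name
        def on_connect(self, controller):
            # Gestures must be enabled explicitly, otherwise the commented-out
            # frame.gestures() loop in on_frame() will never see any gestures.
            controller.enable_gesture(Leap.Gesture.TYPE_CIRCLE)
            controller.enable_gesture(Leap.Gesture.TYPE_SWIPE)
            controller.enable_gesture(Leap.Gesture.TYPE_KEY_TAP)
            controller.enable_gesture(Leap.Gesture.TYPE_SCREEN_TAP)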

Error with sender.py on Melodic

Is there anyone who has checked this source code on Melodic?
I have an error with sender.py:

c@C:~$ export PYTHONPATH=$PYTHONPATH:$HOME/LeapSDK/lib:$HOME/LeapSDK/lib/x64
c@C:~$ rosrun leap_motion sender.py 
Traceback (most recent call last):
  File "/home/camellia/Testing_melodic/src/leap_motion/scripts/sender.py", line 12, in <module>
    from leap_motion.msg import leap
ImportError: No module named msg

Installing it from source on Kinetic is OK.

Please tell me if you have tips.
Thanks in advance for your help.

Frame of the Cameras

Hello, I am using your driver for an object-tracking project.

I need to project a 3D point into the left and right camera plane. In the CameraInfo message there is the matrix P and in the message description it says:

"By convention, this matrix specifies the intrinsic (camera) matrix
of the processed (rectified) image. That is, the left 3x3 portion
is the normal camera intrinsic matrix for the rectified image.
It projects 3D points in the camera coordinate frame to 2D pixel
coordinates using the focal lengths (fx', fy') and principal point
(cx', cy') - these may differ from the values in K."

"It projects 3D points in the camera coordinate frame to 2D pixel coordinates"...but then I realize that both left and right cameras have the same frame "leap_optical_frame". I would like to know where is this frame placed...in the middle of the two cameras?

Thank you very much!

Call for maintainer(s)

I think it's fair to say that I've been the only active maintainer at least since I joined the team a few years ago. Since I've moved on to something else recently, I'd like to call for someone (or several people) who can take over maintenance.

See ROS wiki for the role of maintainers. Obviously we can split chores if there are multiple maintainers. I can also help if needed.

@nowittyusername Is this something you're interested in?

Recognized value not updated after a hand disappears

While sender.py no longer reports any hand detected, rostopic still returns hand values. Assuming sender.py works closer to the SDK, is this likely a bug in the ROS package?

rosrun leap_motion sender.py
:
Frame id: 60144, timestamp: 4703437196, hands: 0, fingers: 0, tools: 0, gestures: 0
Frame id: 60145, timestamp: 4703471920, hands: 0, fingers: 0, tools: 0, gestures: 0
rostopic echo /leapmotion/data 
:
header: 
  seq: 70228
  stamp: 
    secs: 0
    nsecs: 0
  frame_id: ''
direction: 
  x: 0.920395553112
  y: 0.1912419945
  z: -0.341025710106
normal: 
  x: 0.266604751348
  y: -0.944969654083
  z: 0.189616009593
palmpos: 
  x: 47.4868392944
  y: 46.736366272
  z: 18.5409545898
ypr: 
  x: 125.421399155
  y: 29.2830430497
  z: 101.73803212

---

Leap SDK on armhf and arm64

No data on image_raw topics

The /left/image_raw or /right/image_raw topics have no data getting published. When launching the RViz window, the image topic shows an error. When running rostopic echo /left/image_raw or /right/image_raw, the terminal is blank i.e. there is no data on the topics.

How to generate the Quaternion of the hand orientation?

Hi, I have been stuck on this problem for a while, without being able to solve it.

I need to get the quaternion representation of the Yaw, Pitch, Roll Angles contained in the Hand Message.

To obtain the quaternion I am using the tf.transformations.quaternion_from_euler() function.

However, the only way I was able to get something making sense is by doing the following:

hand_rpy = [leap_hand.roll, leap_hand.pitch, -leap_hand.yaw]
hand_quat = tf.transformations.quaternion_from_euler(*hand_rpy, axes='szxy').tolist()

Using this configuration I get an orientation that follows the one of the hand. However, if I rotate my hand around my wrist on the horizontal plane more than 90 degrees, the orientation flips over and I get messed up values.

So, is there an appropriate way to extract the quaternion of the hand orientation?

Thanks
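Not an authoritative answer, but for comparison, the conventional call passes roll, pitch, yaw in that order with the default 'sxyz' (static x-y-z) axes. Whether that convention matches the Leap hand frame, and whether the angles are already in radians at that point, are exactly the assumptions that need checking against leap_interface.py:

    import tf.transformations as tft

    def hand_orientation_to_quat(roll, pitch, yaw):
        """Sketch only: expects roll/pitch/yaw in radians and assumes the
        default 'sxyz' static-axes convention matches the Leap hand frame."""
        return tft.quaternion_from_euler(roll, pitch, yaw, axes='sxyz').tolist()

    # e.g. hand_quat = hand_orientation_to_quat(leap_hand.roll,
    #                                           leap_hand.pitch,
    #                                           leap_hand.yaw)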

A question about $ roslaunch leap_motion sensor_sender.launch

Hi all,

I am new to ROS, and I am trying to use a Leap Motion to teleoperate a TurtleBot. I think my questions will probably be helpful to other users of your package, and thank you all for your great efforts. I am confronting the following questions:

Firstly, in the INSTALLATION section of this package's README.md, it says git clone https://github.com/warp1337/rosleapmotion.git (which is a different package, created by warp1337).
Shouldn't it be git clone https://github.com/ros-drivers/leap_motion?

Secondly, I am following a chapter of the book ROS Robotics Projects [https://www.safaribooksonline.com/library/view/ros-robotics-projects/9781783554713/] in which your package is used, but its steps and commands differ from the instructions in your README.md. In particular, it uses the command $ roslaunch leap_motion sensor_sender.launch to receive the data from the Leap Motion, which is not mentioned in your README.md. I did exactly what the book says, but this command doesn't work. Can you please tell me where the mistake is if I post the commands from the book? The author does not seem very responsive about his publications. Thank you so much, all of you!

The following is part of the book I followed:
1. Installing the ROS driver for the Leap Motion controller

To interface the Leap Motion with ROS, we will need the ROS driver for it. Here is the link
to get the ROS driver for Leap Motion; you can clone it using the command:
$ git clone https://github.com/ros-drivers/leap_motion

Before installing the leap_motion driver package, we have to do a few things to have it
properly compiled. The first step is to set the path of the Leap Motion SDK in the .bashrc file. Assuming thatthe Leap SDK is in the user's home folder with the name LeapSDK , we have to set the path
variable in .bashrc as follows.
$ export LEAP_SDK=$LEAP_SDK:$HOME/LeapSDK

This environment variable is needed for compiling the code of the ROS driver, which has
Leap SDK APIs. We also have to add the path of the Python extension of the Leap Motion SDK to .bashrc .
Here is the command used to do it:
export PYTHONPATH=$PYTHONPATH:$HOME/LeapSDK/lib:$HOME/LeapSDK/lib/x64

This will enable the Leap Motion SDK APIs in Python. After going through the preceding steps, save .bashrc and open a new Terminal, so that the preceding variables are available there.
The final step is to copy the libLeap.so file to /usr/local/lib. Here is how we do it:
$ sudo cp $LEAP_SDK/lib/x64/libLeap.so /usr/local/lib

After copying, execute ldconfig :
$ sudo ldconfig

Okay, you are finished with setting the environment variables. Now you can compile the
leap_motion ROS driver package. You can create a ROS workspace or copy the
leap_motion package to an existing ROS workspace and use catkin_make .
You can use the following command to install the leap_motion package:

$ catkin_make install --pkg leap_motion
This will install the leap_motion driver; check whether the ROS workspace path is
properly set.

2.Testing the Leap Motion ROS driver
If everything has been installed properly, we can test it using a few commands.
First, launch the Leap Motion driver or control panel using the following command:
$ sudo LeapControlPanel
After launching the command, you can verify that the device is working by opening the
Visualizer application. If it's working well, you can launch the ROS driver using the
following command:
$ roslaunch leap_motion sensor_sender.launch
If it's working properly, you will get topics with this:
$ rostopic list
If you can see /leapmotion/data in the rostopic list, you can confirm that the driver is working. You can just echo the topic and see the hand and finger values coming in, as shown in the following screenshot:
(That's the end of that chapter)

Segmentation faults in C++ code when compiled/run on ROS Noetic

Steps to reproduce:

> Pull this repository into a catkin workspace in a Noetic installation
> Compile with catkin_make
> Run any of the launch files:
    > roslaunch leap_motion vizualization.launch

You will see errors (seg faults) for all the C++ nodes being started, such as "leap_motion_driver_node".

[Note: I understand that the Leap Motion SDK was only built for Python 2.7, but that is not what this issue is about. I also understand that Noetic is not presently supported by this driver... but if the issue above is a simple fix, perhaps it could help users who just want to use the C++ portions of the driver.]

sudo apt-get install ros-kinetic-leap-motion does not include config folder

I just installed this package from the command line and encountered the following error:

error loading <rosparam> tag: 
	file does not exist [/opt/ros/kinetic/share/leap_motion/config/listener_params.yaml]
XML is <rosparam command="load" file="$(find leap_motion)/config/listener_params.yaml"/>
The traceback for the exception was written to the log file

I checked my /opt/ros/kinetic/share/leap_motion path, and saw there was no config folder there, but there is one if you install from source.

Just a heads up

image api

Any plans to use the image API and therefore publish images?
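(The README at the top of this page indicates that the current C++ driver does publish the raw camera images.) On the Python side, here is a hedged sketch of how the camera images are reached through the Leap v2 API, assuming the images policy is available in the installed SDK; converting the raw buffers into sensor_msgs/Image messages is left out:

    import Leap

    # Sketch only: request camera images through the Leap v2 API.
    controller = Leap.Controller()
    controller.set_policy(Leap.Controller.POLICY_IMAGES)

    frame = controller.frame()
    if not frame.images.is_empty:
        left_image = frame.images[0]
        print(left_image.width, left_image.height)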

ERRORS after running rosrun leap_motion sender.py

terminal 1

mario@mario:~$ sudo leapd
[sudo] password for mario:
[Info] WebSocket server started
[Info] Secure WebSocket server started
[Info] Leap Motion Controller detected: LP59556685952
[Info] Firmware is up to date.

terminal 2

mario@mario:~$ source ~/catkin_ws/devel/setup.bash && rosrun leap_motion sender.py

Traceback (most recent call last):
  File "/home/mario/catkin_ws/src/leap_motion/scripts/sender.py", line 81, in <module>
    sender()
  File "/home/mario/catkin_ws/src/leap_motion/scripts/sender.py", line 25, in sender
    rospy.loginfo("Parameter set on server: PARAMNAME_FREQ={}".format(rospy.get_param(PARAMNAME_FREQ_ENTIRE, FREQUENCY_ROSTOPIC_DEFAULT)))
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/client.py", line 465, in get_param
    return _param_server[param_name] #MasterProxy does all the magic for us
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/msproxy.py", line 121, in __getitem__
    code, msg, value = self.target.getParam(rospy.names.get_caller_id(), resolved_key)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1243, in __call__
    return self.__send(self.__name, args)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1602, in __request
    verbose=self.__verbose
  File "/usr/lib/python2.7/xmlrpclib.py", line 1283, in request
    return self.single_request(host, handler, request_body, verbose)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1311, in single_request
    self.send_content(h, request_body)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1459, in send_content
    connection.endheaders(request_body)
  File "/usr/lib/python2.7/httplib.py", line 1053, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python2.7/httplib.py", line 897, in _send_output
    self.send(msg)
  File "/usr/lib/python2.7/httplib.py", line 859, in send
    self.connect()
  File "/usr/lib/python2.7/httplib.py", line 836, in connect
    self.timeout, self.source_address)
  File "/usr/lib/python2.7/socket.py", line 575, in create_connection
    raise err
socket.error: [Errno 111] Connection refused
mario@mario:~$ 

Thanks a million

Escalate leapros.msg to a ROS standard message

As suggested (I'm not sure by whom) on the ROS wiki page for leap_motion:

    This information is encompassed in the custom message leap_motion/leapros.msg. In the future, standard ROS messages should be used.

I've never proposed ROS message types, but if someone is interested, please go ahead. Would it become something like sensor_msgs/Hand?
