rst-tu-dortmund / pxpincher_ros
Hardware and support package for the PhantomX Pincher robot arm
Hi,
I want to run the arm in RViz. This is my launch file:
<launch>
  <node name="arbotix" pkg="arbotix_python" type="arbotix_driver" output="screen">
    <rosparam file="$(find pxpincher_config)/config/pincher_arm.yaml" command="load" />
  </node>

  <!-- Now load the robot model -->
  <param name="robot_description" command="$(find xacro)/xacro.py '$(find pxpincher_description)/urdf/pincher_arm.urdf.xacro'"/>
  <node name="robot_state_pub" pkg="robot_state_publisher" type="robot_state_publisher"/>

  <node name="gripper_controller" pkg="arbotix_controllers" type="gripper_controller">
    <param name="model" value="singlesided"/>
    <param name="invert" value="true"/>
    <param name="pad_width" value="0.0254"/>
  </node>
</launch>
But I get a robot model error in RViz, as follows:
gripper_link
No transform from [gripper_link] to [base_link]
What if my PhantomX Pincher cannot reach exactly the joint angles I set?
Ubuntu 14.04
ROS Indigo
I cloned the package from Git and ran catkin_make to compile. Then I got two errors:
The transition of the servos is jerky when used in combination with the joint trajectory following controller.
Since the joint trajectory following controller is configured to use the position interface of the robot, no PID gains can be specified.
Changing the rate of the pxpincher node did not improve it significantly.
But when used with the joint trajectory interface GUI:
rosrun rqt_joint_trajectory_controller rqt_joint_trajectory_controller
the movement is slow. The GUI sends its messages via the command interface instead of the action controller and updates the values frequently. Therefore the jerkiness could depend on the number of interpolated samples of the resulting trajectory.
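To make the sampling point above concrete, here is a minimal sketch (illustrative names, not the pxpincher code) of how the number of interpolated samples in a trajectory segment follows from the control rate and the segment duration; too few samples per segment means large position jumps per cycle, which shows up as jerky motion:

```python
# Hypothetical sketch: relate the number of interpolated samples of a
# joint-trajectory segment to the control rate and segment duration.
# All names are illustrative, not taken from pxpincher.

def interpolate_segment(q_start, q_goal, duration, rate_hz):
    """Linearly interpolate between two joint configurations.

    Returns one sample per control cycle; a higher rate or a longer
    duration yields more, finer-grained samples and thus smaller
    per-cycle position jumps (smoother motion).
    """
    n_samples = max(2, int(duration * rate_hz))
    samples = []
    for i in range(n_samples):
        t = i / (n_samples - 1)  # normalized time in [0, 1]
        samples.append([qs + t * (qg - qs) for qs, qg in zip(q_start, q_goal)])
    return samples

# Example: one joint moving from 0.0 to 1.0 rad over 2 s at 25 Hz.
samples = interpolate_segment([0.0], [1.0], duration=2.0, rate_hz=25)
print(len(samples))  # 50 samples, i.e. steps of ~0.02 rad per cycle
```

Doubling the rate doubles the sample count and halves the per-cycle step, which is one way the GUI's frequent command-interface updates can look smoother than a sparsely sampled action-goal trajectory.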
I want to run the TurtleBot in RViz, but I get a robot model error. I am using Indigo.
Hi
I am using ROS Kinetic on Ubuntu 16.04.
I don't know how to control the actual Pincher arm using the interactive markers in RViz.
Would you help me, please?
I ran the following commands and RViz came up, but when I move any joint the real arm does not move. I don't know whether the communication works or not.
roslaunch pxpincher_launch pxp.launch
roslaunch pxpincher_launch pxp_sim.launch
roslaunch pxpincher_launch pxp_rviz.launch
rosrun pxpincher_lib pxpincher_test
rosrun pxpincher_lib teach
Hi,
I tested the arm with rviz and everything works fine.
Now I want the arm to track an object using a normal camera or even a Kinect v1. How can the arm track an object based on color, for example?
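One common starting point (not part of pxpincher; in practice you would use OpenCV's `cv2.inRange` and image moments on the camera topic) is to threshold the image on the target color and take the centroid of the matching pixels as the target position in the image. A minimal stdlib-only sketch of that idea:

```python
# Minimal sketch of color-based target localization, assuming the camera
# frame is already available as rows of (r, g, b) pixel tuples. This is
# an illustration of the thresholding + centroid idea, not a replacement
# for an OpenCV pipeline.

def color_centroid(image, lower, upper):
    """Return the (x, y) centroid of pixels whose RGB lies in [lower, upper]."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if (lower[0] <= r <= upper[0] and
                    lower[1] <= g <= upper[1] and
                    lower[2] <= b <= upper[2]):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # target color not visible in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Tiny synthetic frame: a red 2x2 blob in an otherwise black 4x4 image.
black, red = (0, 0, 0), (255, 0, 0)
frame = [[black] * 4 for _ in range(4)]
for y in (1, 2):
    for x in (1, 2):
        frame[y][x] = red
print(color_centroid(frame, lower=(200, 0, 0), upper=(255, 60, 60)))
# -> (1.5, 1.5)
```

The image-space centroid would then still have to be converted into a pose for the arm (e.g. via the camera calibration and a depth estimate from the Kinect) before it can be sent as a goal.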
I set the speed to 0.5, but it always stays bounded.
How do I remove the limitation?
Why does it ask me for a username and password when I want to clone it?
grasp001@grasp001:~$ cd ~/catkin_ws/src
grasp001@grasp001:~/catkin_ws/src$ git clone https://git.rst.e-technik.tu-dortmund.de/robotics-rigid-arms/pxpincher.git
Cloning into 'pxpincher'...
Username for 'https://git.rst.e-technik.tu-dortmund.de':
Password for 'https://git.rst.e-technik.tu-dortmund.de':
I tried to run the test by executing
rosrun pxpincher_lib pxpincher_test
It worked fine initially; however, after it reached a position (TCP pose: [x: 0.125, y: -0.080, z: -0.005]; TCP rpy: [roll: -0.000, pitch: 1.509, yaw: -0.569]; gripper opened: 20%), it stopped moving and an error message appeared:
rosrun pxpincher_lib pxpincher_test
[ INFO] [1550614680.723574814]: Waiting for arm action server to start.
[ INFO] [1550614681.098810545]: Arm action server found. Waiting for gripper action server to start.
[ INFO] [1550614681.393284466]: All action servers started.
[ INFO] [1550614681.460526786]: Waiting for joint_states message...
[ INFO] [1550614681.616289894]: Driving into default position in order to setup kinematic model...
[ INFO] [1550614681.617034467]: Initialization completed.
[ INFO] [1550614689.633672318]: Current joint configuration q=[0.8 0.6 0.9 1.6]
[ INFO] [1550614694.045851361]: TCP rotation matrix w.r.t. base:
0.043 -0.72 0.7
0.044 0.7 0.72
-1 -3.7e-06 0.062
[ INFO] [1550614694.046022662]: TCP translation vector w.r.t. base: [ 0.12 0.12 -0.0036]
[ WARN] [1550614694.049992051]: Inverse Kinematics: solution found, but mismatch in the translational part detected.
[ WARN] [1550614694.050625792]: Inverse Kinematics: solution found, but mismatch in the orientation part detected.
I searched online but found that no one has solved this problem. Does anyone have any ideas?
What does the warning message mean, and how can I resolve it?
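A warning of this shape typically means the solver produced *a* joint configuration, but the forward kinematics of that configuration does not match the requested pose within tolerance, e.g. because the target is outside the workspace or violates a joint limit. Here is a hedged, self-contained illustration of the mechanism (a generic 2-link planar arm, not the pxpincher kinematics):

```python
# Illustration (not the pxpincher code) of why an IK solver can report
# "solution found, but mismatch ... detected": for an unreachable target
# the best it can do is a clamped solution, and verifying the solution's
# forward kinematics against the target exposes the residual error.
import math

def ik_2link(x, y, l1, l2):
    """Closed-form IK for a 2-link planar arm; clamps unreachable targets."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))  # clamp: out-of-reach targets land here
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def fk_2link(q1, q2, l1, l2):
    """Forward kinematics: end-effector position of the 2-link arm."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))

l1 = l2 = 0.1
target = (0.3, 0.0)            # beyond the maximum reach of 0.2 m
q1, q2 = ik_2link(*target, l1, l2)
reached = fk_2link(q1, q2, l1, l2)
error = math.hypot(reached[0] - target[0], reached[1] - target[1])
if error > 1e-3:
    print("solution found, but mismatch in the translational part: %.3f m" % error)
```

If the pxpincher warning fires for poses that should be reachable, the commanded pose is worth checking against the arm's workspace and joint limits first.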
If the pxpincher_hardware node is started in simulation mode, all actions are performed using a simple integration model. The integration model is executed much faster than the rate of the actual hardware node (e.g., 3x), but there are still some discrepancies.
If the action server is used with a specified trajectory, the execution is cancelled once the desired trajectory duration has elapsed, regardless of whether the actual goal was reached.
This can also be identified using the testKinematicModel method of pxpincher_lib (see the pxpincher test node): the desired positions are not reached.
But if the simulator is modified to execute the trajectories at a much higher velocity than specified, the goals are reached perfectly (using testKinematicModel).
On the other hand, this might not be a valid solution, since the simulator would no longer mimic the actual transition. Obviously, it is not possible to mimic the real-time behavior exactly, but it may be possible to improve the timing, since the simulator can be executed much faster.
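The interaction described above can be reproduced with a toy model (illustrative names, not the pxpincher implementation): a rate-limited integrator stepping each servo toward its setpoint, and an execution loop that is cancelled after the nominal trajectory duration. When the simulated velocity is too low for the commanded motion, the goal is simply not reached when time runs out, and raising the simulated velocity closes the gap, just as reported:

```python
# Toy model of the issue: velocity-limited position integration that is
# cancelled after the trajectory duration, like the action server does.
# Names and numbers are illustrative, not from pxpincher.

def simulate(q0, q_goal, max_vel, duration, dt):
    """Integrate the position toward the goal with a velocity limit,
    but stop when the trajectory duration expires."""
    q, t = q0, 0.0
    while t < duration:
        # largest step allowed in one cycle, clipped toward the goal
        step = max(-max_vel * dt, min(max_vel * dt, q_goal - q))
        q += step
        t += dt
    return q

# Goal 1.0 rad away, servo limited to 0.3 rad/s, duration 2 s:
q_end = simulate(0.0, 1.0, max_vel=0.3, duration=2.0, dt=0.01)
print(q_end)        # ~0.6 rad -- cancelled well before the goal

# Executing the integrator 3x faster than specified reaches the goal:
q_end_fast = simulate(0.0, 1.0, max_vel=3 * 0.3, duration=2.0, dt=0.01)
print(q_end_fast)   # ~1.0 rad
```

This also suggests why speeding up the simulator fixes the test but is not a faithful fix: it changes the modeled dynamics rather than the premature cancellation.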