personalrobotics / ada_meal_scenario
A set of scripts for a meal serving scenario using Ada.
It might make sense now to tag the current working demo, emphasis on working, and switch to Jen's branch.
You're killing me with this, man. Stick to one and be consistent.
https://github.com/personalrobotics/ada_meal_scenario/blob/master/src/bite_serving_FSM.py#L209
We should really stick to the same incantation that we use in herbpy
and make the call to ActiveManipulator
and make sure that we set the current arm to be active. I know this is overkill with just one manipulator but I want to really think forward to the right thing.
https://github.com/personalrobotics/ada_meal_scenario/blob/master/src/bite_serving_FSM.py#L95
@mkoval, comments?
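The herbpy-style "incantation" amounts to two calls: set the manipulator active, then restrict the active DOFs to that arm. Here is a minimal runnable sketch of the pattern; SetActiveManipulator, SetActiveDOFs, and GetArmIndices are the real OpenRAVE API names, while the Dummy* classes are only scaffolding so the sketch runs without openravepy:

```python
# Stubs standing in for OpenRAVE's robot and manipulator objects.
class DummyManipulator:
    def __init__(self, name, indices):
        self._name, self._indices = name, indices
    def GetName(self): return self._name
    def GetArmIndices(self): return self._indices

class DummyRobot:
    def __init__(self, manips):
        self._manips = {m.GetName(): m for m in manips}
        self.active_manip = None
        self.active_dofs = None
    def SetActiveManipulator(self, name):
        self.active_manip = self._manips[name]
        return self.active_manip
    def SetActiveDOFs(self, indices):
        self.active_dofs = list(indices)

def activate_arm(robot, manip_name):
    """Make the named manipulator active and limit active DOFs to its arm."""
    manip = robot.SetActiveManipulator(manip_name)
    robot.SetActiveDOFs(manip.GetArmIndices())
    return manip

robot = DummyRobot([DummyManipulator('Mico', [0, 1, 2, 3, 4, 5])])
activate_arm(robot, 'Mico')
print(robot.active_dofs)  # [0, 1, 2, 3, 4, 5]
```

Even with a single arm, doing this up front means any later planner call that consults the active manipulator does the right thing.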
Would you please order some shorter forks, using the fastest shipping method possible? The ones we have are too long to be stable. It would be great if the forks were as short as possible.
Switch to using Kinova's fork holder instead of our foam noodle for the demo.
It doesn't look like this file is used anymore.
There are a bunch of configurations floating around in the demo file, like this one:
https://github.com/personalrobotics/ada_meal_scenario/blob/master/src/bite_serving_FSM.py#L40
I suggest you convert them to named configurations like this:
https://github.com/personalrobotics/herbpy/blob/master/config/configurations.yaml
You can then use the NamedPlanner
and plan to these configurations easily. Please talk to @mkoval and @jeking04 if you have any questions about this.
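For reference, a named-configuration entry modeled on herbpy's configurations.yaml might look like the sketch below. The DOF indices and values here are placeholders, not Ada's real configurations:

```yaml
# Placeholder values; format modeled on herbpy's configurations.yaml.
configurations:
  - name: looking_at_plate
    dof_indices: [0, 1, 2, 3, 4, 5]
    dof_values: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
  - name: serving
    dof_indices: [0, 1, 2, 3, 4, 5]
    dof_values: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```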
The camera mount seems to be loose at the base joint. It looks like one of the screws has sheared.
With the latest master versions of PrPy and AdaPy (as well as with the tagged versions prpy 0.5.0 and ada 0.1.2), manip.PlanToConfiguration causes the program to freeze. Here is the output:
/homes/snikolai/catkin_demo_jen_branch/src/ada_meal_scenario/src/ada_meal_scenario/actions/trajectory_actions.py:36: FutureWarning: comparison to `None` will result in an elementwise object comparison in the future.
first_config = cspec.ExtractJointValues(first_wpt, robot, manip.GetArmIndices())
/homes/snikolai/catkin_demo_jen_branch/src/prpy/src/prpy/base/manipulator.py:99: FutureWarning: comparison to `None` will result in an elementwise object comparison in the future.
return self.GetRobot().GetDOFValues(self.GetIndices())
[INFO] [ada_meal_scenario:trajectory_actions.py:41]:_run: Planning to start of trajectory for action LOOKING_AT_PLATE
[environment-core.h:116 Environment] setting openrave home directory to /homes/snikolai/.openrave
[odecollision.h:124 ODECollisionChecker] ode will be slow in multi-threaded environments
[environment-core.h:193 Init] using ode collision checker
[environment-core.h:848 SetCollisionChecker] setting ode collision checker
[qtcoinviewer.cpp:2928 UpdateFromModel] timeout of GetPublishedBodies
[basemanipulation.cpp:117 main] BaseManipulation: using BiRRT planner
[taskmanipulation.cpp:179 main] using BiRRT planner
Make a pretty visualization in or_rviz.
Make sure the demo is portable and runs on the Thinkpad.
The correct way to get the camera transform is not to hardcode it but to obtain it programmatically from the OpenRAVE robot by attaching an OpenRAVE sensor to the robot model.
Sometimes (not very often) the morsel detector will mistake the plate edges for bites and go for those.
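One cheap guard against rim false positives would be a geometric post-filter that rejects detections too close to the plate edge. This is a hypothetical sketch, not the detector's actual code; it assumes detections and the plate center are expressed in the same plane, in meters:

```python
import math

def filter_rim_detections(detections, plate_center, plate_radius, margin=0.015):
    """Keep only detections at least `margin` inside the plate rim.

    detections:   list of (x, y) points in the plate's plane
    plate_center: (x, y) of the plate center in the same frame
    plate_radius: plate radius in meters (margin defaults to 1.5 cm)
    """
    cx, cy = plate_center
    return [(x, y) for (x, y) in detections
            if math.hypot(x - cx, y - cy) <= plate_radius - margin]

# The second point is 9 cm from the center of a 10 cm plate, inside the
# 1.5 cm rim margin, so it is rejected as a likely plate-edge detection.
kept = filter_rim_detections([(0.0, 0.0), (0.09, 0.0)], (0.0, 0.0), 0.10)
print(kept)  # [(0.0, 0.0)]
```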
It'd be nice if the robot performed some keep-alive motion, like a scan around the room [something that's even precomputed] if it doesn't see a morsel for some timeout. Otherwise the demo will look very static.
In general, all custom classes in Python should extend object. Failing to do so can lead to some unexpected behavior, especially with __special__ methods.
What is the best way to do the bite-detection-serving action cycle in a loop?
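One option is to make the cycle an explicit loop over the actions rather than transitions buried in the FSM. A minimal sketch, assuming each action is a callable returning True on success (the action names are placeholders, not the demo's actual functions):

```python
def run_demo_loop(detect_morsel, grab_bite, serve_bite, max_cycles=3):
    """Run detect -> grab -> serve repeatedly, skipping cycles with no morsel."""
    for _ in range(max_cycles):
        if not detect_morsel():
            continue  # nothing seen; a timeout/keep-alive motion would go here
        if grab_bite():
            serve_bite()

calls = []
run_demo_loop(lambda: True,
              lambda: (calls.append('grab') or True),
              lambda: calls.append('serve'),
              max_cycles=2)
print(calls)  # ['grab', 'serve', 'grab', 'serve']
```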
It's already publishing a ROS message containing a std_msgs/String, so it seems silly to just fill it with JSON. It looks like the JSON contains a point cloud. Why not use a standard ROS message type for this (e.g. PointCloud or an array of Points)?
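Converting the JSON payload into typed points is straightforward. A sketch, assuming the JSON is a list of [x, y, z] triples (the actual format would need checking); Point32 here is a namedtuple stand-in for geometry_msgs.msg.Point32 so the sketch runs without ROS:

```python
import json
from collections import namedtuple

# Stand-in for geometry_msgs.msg.Point32.
Point32 = namedtuple('Point32', ['x', 'y', 'z'])

def json_cloud_to_points(payload):
    """Parse an assumed JSON format [[x, y, z], ...] into typed points."""
    return [Point32(*xyz) for xyz in json.loads(payload)]

points = json_cloud_to_points('[[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]')
print(points[0].x)  # 0.1
```

With a real sensor_msgs/PointCloud the subscriber gets typed fields for free, and tools like rviz can display the data directly.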
Update recognition and registration of peeps in realtime.
Tag before making any more changes.
There are only two configurations in the demo, feed and perceive. You can pre-compute the trajectory from perceive to feed, then reverse it to get a trajectory from feed back to perceive. This trajectory only needs to be computed once, when you load the demo script.
It's not possible to pre-compute the trajectory from perceive to stab the bite, since it depends on perception. But, once planned, you can just reverse this trajectory to get back to perceive.
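The reverse step is just the same path traversed backwards. A sketch with placeholder joint-space waypoints (a real implementation would also re-time the reversed trajectory before execution, e.g. with PrPy's retimer):

```python
def reverse_trajectory(waypoints):
    """Return the same path traversed in the opposite direction."""
    return list(reversed(waypoints))

# Placeholder 6-DOF waypoints standing in for the perceive -> feed path.
perceive_to_feed = [(0.0,) * 6, (0.3,) * 6, (0.7,) * 6]
feed_to_perceive = reverse_trajectory(perceive_to_feed)
print(feed_to_perceive[0])  # (0.7, 0.7, 0.7, 0.7, 0.7, 0.7)
```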
The cable appears to have been damaged, due to strain from the robot motion. @siddhss5
https://github.com/personalrobotics/ada_meal_scenario/blob/master/src/bite_serving_FSM.py#L32
https://github.com/personalrobotics/ada_meal_scenario/blob/master/src/bite_serving_FSM.py#L12
You should really use a lint checker. I like Anaconda, a plugin for SublimeText.
Make a full video of the entire demo with good camera angles and captured visualization and annotations.
I'd like Rachel or anyone else to be able to run the demo. This requires good documentation. I suggest putting that in the README.
The two videos below run identical code, apart from the first using prpy 0.3.1 and the second 0.4.0. For the first, the duration of grasp_bite - serve - look - ready-to-grasp is 25 sec; for the second, 49 sec. https://www.youtube.com/watch?v=YkwINudZnOw https://www.youtube.com/watch?v=LbgmtMS4kpo
I took time stamps before and after each PlanToConfiguration and PlanToEndEffectorOffset call:
| 0.4.0 | 0.3.1 | Call |
|---|---|---|
| 935486 | 11590 | PlanToLookingAtFace |
| 905313 | 330854 | PlanToLookingAtPlate |
| 362034 | 82758 | PlanToEndEffectorPose |
| 937087 | 702920 | PlanToEndEffectorOffset |
| 263716 | 783048 | PlanToEndEffectorOffset |
| 166682 | 586631 | PlanToServingConfiguration |
| 3570318 | 2497801 | Total |
It appears, though, that the actual delays in the video are much larger than those in the time stamps.
@siddhss5 @mkoval @jeking04
Make sure the utensils are usable by ADA and are matte for perception. Update this issue with possible options.
Sometimes, even when there is a bite on the plate, the morsel detector will ignore it.
Work with everyone else and make a nice poster for the demo.
The demo currently calls SetDOFValues soon after loading the real robot. This can cause a race condition if you query the position of the robot before the controller plugin updates the joint values.
Must do this soon.
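One way to close the race is a handshake: don't read or set DOF values until the controller has published at least one real joint-state update. A sketch of the idea using a threading.Event; this is not PrPy API, just the pattern:

```python
import threading

class ControllerSync:
    """Blocks readers until the first joint-state update has arrived."""

    def __init__(self):
        self._ready = threading.Event()
        self.joint_values = None

    def on_joint_state(self, values):
        # Called (e.g. from the controller plugin's thread) when real
        # joint values arrive; records them and releases any waiters.
        self.joint_values = list(values)
        self._ready.set()

    def wait_for_update(self, timeout=5.0):
        """Return True once an update has been seen, False on timeout."""
        return self._ready.wait(timeout)

sync = ControllerSync()
sync.on_joint_state([0.0] * 6)      # first update from the controller
print(sync.wait_for_update(0.1))    # True
```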
rospy.Subscriber callbacks run in a separate thread. This means that you have to use locks when accessing or modifying data in the callback. In this script, this includes:
self.ROBOT_STATE
self.bite_world_pose
self.bite_detected
You should also lock the OpenRAVE environment when reading or writing anything. This includes:
world_camera = self.robot.GetLinks()[7].GetTransform()
self.robot.SetDOFValues(self.robot.GetDOFValues())
self.robot.SetTransform(robot_pose)
self.robot.SetActiveDOFs(activedofs)
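The callback-side locking can look like the sketch below. MorselTracker and its fields are illustrative stand-ins for the script's state, not its actual class; the point is that every read or write of shared state goes through the same lock:

```python
import threading

class MorselTracker:
    """Shares detection state between a rospy callback thread and the main loop."""

    def __init__(self):
        self._lock = threading.Lock()
        self.bite_world_pose = None
        self.bite_detected = False

    def callback(self, pose):
        # rospy runs this in its own thread, so guard the shared fields.
        with self._lock:
            self.bite_world_pose = pose
            self.bite_detected = True

    def get_bite(self):
        # Take the same lock when reading, so we never see a half-update.
        with self._lock:
            return self.bite_world_pose, self.bite_detected

tracker = MorselTracker()
tracker.callback((0.5, 0.1, 0.2))
pose, detected = tracker.get_bite()
print(detected)  # True
```

For the OpenRAVE calls, the analogous pattern is `with env:` around each block of reads/writes, which takes the environment lock.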
I don't know where to exactly raise this issue but I'll do it here for now.
Relevant also to @psigen @mkoval @jeking04 @Stefanos19.
I see several ActionExceptions being raised by our actions, spread over the several action.py files. The main demo file has a try-except that catches these exceptions and might deal with them. However, a big issue is that
Any thoughts on fixing this?
It appears that the pose of the morsel continues to change after the robot starts moving to grab it. That's because the callback continues to get called after the bite-grabbing action starts:
https://github.com/personalrobotics/ada_meal_scenario/blob/feature/action_framework/src/actions/detect_morsal.py#L77
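One fix is to "latch" the morsel pose when the grab action starts, so later detector callbacks can no longer move the target (in rospy you can also stop the callbacks entirely with Subscriber.unregister()). A sketch of the latching idea; the class and field names are illustrative, not the demo's actual code:

```python
import threading

class MorselTarget:
    """Accepts pose updates until latched, then freezes the target."""

    def __init__(self):
        self._lock = threading.Lock()
        self._pose = None
        self._latched = False

    def on_detection(self, pose):
        with self._lock:
            if not self._latched:   # ignore updates once grabbing started
                self._pose = pose

    def latch(self):
        """Freeze the current pose and return it; call when the grab begins."""
        with self._lock:
            self._latched = True
            return self._pose

target_state = MorselTarget()
target_state.on_detection((0.1, 0.2, 0.3))
target = target_state.latch()
target_state.on_detection((9.9, 9.9, 9.9))  # arrives mid-grab; ignored
print(target)  # (0.1, 0.2, 0.3)
```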
One way to speed things up is to break down the PlanToEndEffectorOffset into a PlanToTSR (with free roll) to as close to the food as possible, followed by a shorter PlanToEndEffectorOffset. I think this will succeed more often. @mkoval and @jeking04, does this make sense? I think we can encode this as an action, much like the push-grasp.
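Structurally, the proposed action is just two planner calls in sequence: a long, collision-aware TSR leg to a standoff pose near the food, then a short straight-line offset. A sketch of the orchestration only, with the planner calls passed in as callables since this is not actual PrPy API:

```python
def plan_stab(plan_to_tsr, plan_to_offset, standoff=0.02):
    """Plan the stab as two legs: TSR approach to `standoff` meters above
    the food (free roll), then a short straight-line descent."""
    traj_approach = plan_to_tsr(standoff)   # long, collision-aware leg
    traj_stab = plan_to_offset(standoff)    # short, straight final leg
    return [traj_approach, traj_stab]

# Toy planners that just record what they were asked for.
legs = plan_stab(lambda d: 'tsr<%0.2f>' % d,
                 lambda d: 'offset<%0.2f>' % d)
print(legs)  # ['tsr<0.02>', 'offset<0.02>']
```

Keeping the straight-line segment short should make the offset planner fail less often, since it only has to maintain the constraint over the last couple of centimeters.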