marinakollmitz / human_aware_navigation

The human_aware_navigation repository contains ROS packages for planning navigation paths that take human comfort into account.

License: BSD 2-Clause "Simplified" License

Languages: C++ 91.90%, Python 3.93%, CMake 4.17%

human_aware_navigation's Issues

people_msgs/PeoplePrediction.h

Hello, when compiling the code, the following error appears:

human_aware_navigation/people_prediction/include/people_prediction/constant_velocity_prediction.h:42:10: fatal error: people_msgs/PeoplePrediction.h: No such file or directory
#include <people_msgs/PeoplePrediction.h>

Would you mind providing this message file?

Simulation problems

Hi!

I was trying to run the simulation in Gazebo as suggested and realized that the robot would not listen to navigation goals and instead moved randomly. I later figured out that this is because the turtlebot3_drive node also publishes to the cmd_vel topic, and this node is started by sim_world.launch in human_aware_nav_launch (which starts turtlebot3_simulation.launch from turtlebot3_gazebo, which finally launches the turtlebot3_drive node). Removing the

<node name="$(arg name)_drive" pkg="turtlebot3_gazebo" type="turtlebot3_drive" required="true" output="screen"/>

line from turtlebot3_simulation.launch seems to have solved the issue, and the simulation now works as expected.

I'm very much a beginner with ROS, so this might not be the most elegant way to solve it, but I wanted to let you know that this problem exists!

Also, the first page of the wiki says indigo twice where it should say melodic.

Thank you for your time.

Gazebo world

The melodic version of playground.world cannot be loaded.

about some graphs in experiments

I want to do some experiments, but I don't know how to draw the following diagram:
(diagram screenshot attached to the issue)
Could you tell me how to draw such a plot? Is there a package in ROS for this?
Thank you very much!

Conversion of the dynamic social layered costmap to another mobile platform

Hi Marina,

First of all, thank you for the human_aware_navigation package; it helped me a lot in understanding how a social costmap works together with the Navigation Stack. However, if I want to adapt your dynamic social costmap plugin to another platform, what do I need to do? I wrote my common_params.yaml for move_base as below:

robot_base_frame: base_link

# obstacle_range: 6.0
# raytrace_range: 6.5

max_obstacle_height: 2.0
min_obstacle_height: -1.0

use_dijkstra: false
# visualize_potential: false



footprint: [[0.50, 0.50], [0.50, -0.50], [-0.50, -0.50], [-0.50, 0.50] ]  

# NOTE: when obstacles are included, echoes from the laser scanner may appear as obstacles in narrow corridors!
# Not needed anyway because the static map layer is used
obstacles:
  observation_sources: scan
  scan:
    data_type: LaserScan
    topic: /beebot/laser/scan
    sensor_frame: /hokuyo_link
    marking: true
    clearing: true
  obstacle_range: 30.0
  raytrace_range: 30.0
  enabled: true

inflater:
  observation_sources: scan
  scan:
    data_type: LaserScan
    topic: /beebot/laser/scan
    sensor_frame: /hokuyo_link
    marking: true
    clearing: true
    map_type: costmap
  inflation_radius: 1.2
  enabled: true

dynamic_social_costmap:
  observation_sources: people
  people:
    data_type: people_msgs/People
    topic: /people
    marking: true
    clearing: true
    map_type: costmap
  inflation_radius: 1.2
  enabled: true

For dynamic_social_costmap, I deduced from the source code that the local costmap node should subscribe to the /people topic, so I set it up this way. After declaring dynamic_social_costmap, I also declared this layered costmap in global_costmap.yaml and local_costmap.yaml. However, I must have missed something, because move_base does not initialize it as it does the other layered costmaps.
I saw that you load the costmap plugin as
<param name="TBPlanner/dynamic_layers_plugin" value="dynamic_social_costmap::SocialLayers" />
into move_base. So what is the correct way of loading the social costmap in another yaml file? Thank you so much!
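
For context, the parameter namespace (TBPlanner) and the plugin base class mentioned in a later issue (lattice_planner::DynamicLayers) suggest that the social layer is loaded by the lattice planner itself rather than through the usual costmap_2d plugins list. Below is a minimal sketch of how the parameter could be set in a move_base launch file; the package name my_robot_nav and the yaml file names are placeholders, and only the TBPlanner/dynamic_layers_plugin line is taken from the quote above.

<node pkg="move_base" type="move_base" name="move_base" output="screen">
  <!-- usual costmap and planner configuration (package and file names are placeholders) -->
  <rosparam file="$(find my_robot_nav)/config/common_params.yaml" command="load" ns="global_costmap" />
  <rosparam file="$(find my_robot_nav)/config/common_params.yaml" command="load" ns="local_costmap" />
  <rosparam file="$(find my_robot_nav)/config/global_planner_params.yaml" command="load" />
  <rosparam file="$(find my_robot_nav)/config/local_planner_params.yaml" command="load" />
  <!-- base_global_planner must point to the repository's lattice planner;
       see the repository's launch files for the exact class name -->
  <!-- the social layer is then loaded by the planner via this parameter (quoted above) -->
  <param name="TBPlanner/dynamic_layers_plugin" value="dynamic_social_costmap::SocialLayers" />
</node>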

Assistance needed: "easy_markers"

Thank you for sharing, @marinaKollmitz. I have cloned all four packages into my workspace, but when compiling the packages with catkin_make I get the error below, and I could not find the "easy_markers" package using Google. Where can I find the "easy_markers" package? Thank you!

Could not find a package configuration file provided by "easy_markers" with
any of the following names:

  easy_markersConfig.cmake
  easy_markers-config.cmake

questions about ‘social_nav_simulation’

Hi!

I have a question about how to control the movement of people in the Gazebo simulation, which would make it easier to test the planner in dynamic situations.

P.S.: the people model is set like this in the launch files:

questions about the planning process in each period

Hi, I would like to list my understanding and check it with you.

  1. The start state is expanded at most max_timesteps times (max_timesteps is also the maximum number of prediction steps; the default value should be 15) during the planning process in each period (the default period should be about 0.5 seconds).

  2. If the current planning cycle cannot reach the goal state and publishes "current_plan", the next cycle will plan from the state in "current_plan" whose timestamp is closest to the time at which the next plan is published (a compact formalization follows below).
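
A compact way to state the second point, with symbols introduced only for this note (they do not come from the code): write $P_k$ for the set of timestamped states in the current plan, $t(s)$ for the timestamp of a state $s$, and $t_{k+1}$ for the time at which the next plan is published. The start state of the next cycle, as described above, would then be

$$ s_{\text{start}}^{(k+1)} = \arg\min_{s \in P_k} \left| t(s) - t_{k+1} \right| . $$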

lattice planner problems

Hello, when reproducing the code, I don't understand how some of the formulas used in state_discretizer.cpp are derived, for example line 83 of state_discretizer:
int max_angle = (int) ((4 * M_PI) / (acc_w * time_delta) + 1);
This probably involves lattice planner theory. Do you have any paper recommendations for the formulas used here?
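
One possible reading of this line, offered only as an interpretation of the arithmetic and not of the author's design: if the heading is discretized with a resolution of half the smallest angular-velocity increment reachable in one time step, i.e. $\Delta\theta = \tfrac{1}{2}\, a_\omega \Delta t$ with $a_\omega$ = acc_w and $\Delta t$ = time_delta, then the number of heading bins over a full circle is

$$ N_\theta = \frac{2\pi}{\Delta\theta} + 1 = \frac{4\pi}{a_\omega \Delta t} + 1 , $$

which matches the quoted expression; the extra $+1$ would account for including both endpoints of the interval.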

questions about experiments on a real robot

I want to run experiments on a real robot. When I launch "human_aware_nav_launch navigation.launch" and "people_velocity_tracker/filtered_tracked_detector.launch", pedestrian detection and prediction work normally. But once I set a goal point in rviz and try to make the robot move, the following problems appear and navigation cannot be performed:
(error screenshot attached to the issue)

How should I set things up to get human detection, prediction, and navigation working on a real robot?

Thank you very much!

Unable to navigate the robot in rviz - is the problem related to the move_base, Lattice Planner, and Timed Path Follower params?

Hello @marinaKollmitz,
Firstly, thanks for this project.

Concerning the issue: I have set up this package for a robot I created (a simple robot with a chassis, two wheels, and a Hokuyo link). I have also created a map and world with some static and dynamic obstacles according to my requirements.

Here is the problem:

  1. When I launch rviz with the map you provided (playground_new.yaml), the robot I created navigates fine.
  2. But when I launch rviz with the map I created, the same robot does not start navigating and produces the following errors.

(error screenshot attached to the issue)

I assume the error is related to the Lattice Planner (global_planner_params.yaml) and the Timed Path Follower (local_planner_params.yaml).

  • If so, could you please let me know where exactly the issue arises? I tried to sort out the error but did not find a solution; maybe I am missing something!
  • If not, could you please let me know what the exact issue is and help me with it?

Your answer would help me a lot. Thank you! @marinaKollmitz

Error with dynamic layers plugin

Whenever I try to launch the lattice planner, I get the following error:
dynamic_costmap: failed to load dynamic layers plugin: According to the loaded plugin descriptions the class undefined with base class type lattice_planner::DynamicLayers does not exist. Declared types are dynamic_social_costmap::SocialLayers
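
For reference, the error reports the requested class as "undefined", which suggests that the dynamic_layers_plugin parameter was never set for the planner; the plugin itself is clearly found, since dynamic_social_costmap::SocialLayers is listed among the declared types. A minimal sketch of the missing parameter, reusing the line quoted in the layered-costmap issue above; the TBPlanner namespace and the class name come from that quote, while the surrounding move_base node is an assumption:

<!-- set inside the move_base <node> element (node name assumed) -->
<param name="TBPlanner/dynamic_layers_plugin" value="dynamic_social_costmap::SocialLayers" />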
