
Comments (12)

JeremyLinky commented on August 15, 2024

Thanks for the kind reply!

from occupancyanticipation.

srama2512 commented on August 15, 2024

Hi. Thank you for your question. The color scheme of the map (see Fig. 5 here) is:

  • Dark and light green are regions classified as occupied by the model.
  • Gray and purple are regions classified as free by the model.
  • When compared to the ground-truth map, the light green and gray predictions are incorrect. Dark green and purple are correct predictions.

Here, the model thinks that the gray area is free (even though it isn't), so it plans through it.


JeremyLinky commented on August 15, 2024


Thanks for your reply! But it seems that you haven't figured out my question. My question is: why can the planned path pass through the obstacle after the obstacle is set in front of the agent on the collision map? The obstacles are drawn in orange and the planned path in dark purple, as shown below.
[image]


srama2512 commented on August 15, 2024

Oh, sorry about that. I misunderstood the question. This is interesting. Could you provide more details about how the orange lines were generated, and confirm whether the updated map is sent to the planner or not?


JeremyLinky commented on August 15, 2024


  1. The orange lines represent the obstacle drawn in front of the agent when a collision happens. This is from your original code:
    [image]

And I just plot the points set to 1 in the collision map, as below.
[image]

  2. I think the updated map is sent to the planner, since the planner runs after the collision map is updated:
    [image]

Also, I plotted the global map after self._process_maps() and found that the obstacles are indeed updated in the map:
[image]
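For reference, the collision-map lookup described above can be sketched as follows (the toy map and values here are illustrative, not the repo's actual data structures):

```python
import numpy as np

# Toy collision map: cells set to 1 mark obstacles added after a collision.
collision_map = np.zeros((10, 10), dtype=np.uint8)
collision_map[4, 2:7] = 1  # obstacle band drawn in front of the agent

# Coordinates of all cells equal to 1, as (x, y) = (col, row) pairs for plotting.
ys, xs = np.nonzero(collision_map)
points = list(zip(xs.tolist(), ys.tolist()))
print(points)
```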


srama2512 commented on August 15, 2024

Thank you for providing the details. This sounds about right. Can you also share how you are plotting the planner outputs?


JeremyLinky commented on August 15, 2024


Thanks for your reply. I save the path in the function _compute_plans_and_local_goals() as below.
[image]

And I plot it like:
[image]
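For reference, one way to overlay a saved (path_x, path_y) path on a 2-D map with NumPy alone (names and sizes here are illustrative, not the repo's):

```python
import numpy as np

occ_map = np.zeros((50, 50), dtype=np.uint8)   # toy occupancy map, 0 = free
path_x = np.arange(5, 45)                      # hypothetical planned path
path_y = np.full_like(path_x, 25)

rgb = np.stack([occ_map * 255] * 3, axis=-1)   # grayscale -> RGB image
rgb[path_y, path_x] = (255, 0, 0)              # stamp the path pixels in red
```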


srama2512 commented on August 15, 2024

It looks like the issue might be happening within the planner rather than the ActiveNeuralSlamExplorer. Is it possible for you to share the planner map (the purple and yellow map), start and goal positions with me? I can run the planner locally and check what the issue is.


JeremyLinky commented on August 15, 2024


Thanks for your kind reply! Here I provide global_map, global_map_proc, goal_map_xy, and agent_map_xy in a ".npz" file as NumPy arrays. These variables are saved at the following locations:
[image]

You can first unzip the following file and then use x = np.load('./result.npz', allow_pickle=True) to load it as x. To read global_map, use the key "global_map", i.e. x['global_map'], and likewise for the other variables.
result.zip
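A short sketch of the loading step (it creates a small stand-in result.npz on the spot, since the real file is the attachment above; the actual array shapes will differ):

```python
import numpy as np

# Stand-in for result.npz; the real attachment holds the planner maps.
np.savez("result.npz",
         global_map=np.zeros((1, 961, 961), dtype=np.float32),
         agent_map_xy=np.array([[480, 480]], dtype=np.float32))

x = np.load("./result.npz", allow_pickle=True)
print(sorted(x.files))            # the keys stored in the archive
global_map = x["global_map"]      # access each array by its key
print(global_map.shape)
```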

Here I also provide the map without the planned path and the map with the planned path, as follows. Green represents occupied space or obstacles, black represents free or unknown space, and red represents the planned path. The planned path may be hard to see clearly; you can enlarge the second picture slightly to see it.
[image02]
[image03]

Looking forward to your reply! Much appreciated!


srama2512 commented on August 15, 2024

Hi @JeremyLinky , sorry about the latency on this issue. I tried running pyastar.astar_planner on the map you sent. The result I got makes sense (see below). I'm not sure where exactly your issue is coming from. My intuition is that the (x, y) conventions may have been incorrect somewhere along the way. Could you please check this once and let me know?

[image: topdown_map]

Here is the code to reproduce this:

import cv2
import numpy as np
import pyastar

x = np.load("./result.npz", allow_pickle=True)

# Convert the processed global map to a 3-channel uint8 image.
global_map = (x["global_map_proc"][0] * 255.0).astype(np.uint8)
global_map = np.repeat(global_map[..., np.newaxis], 3, axis=2)

# Mark the agent (red) and goal (green) positions.
global_map = cv2.circle(global_map, tuple(x["agent_map_xy"][0].astype(np.int32).tolist()), 4, (255, 0, 0), -1)
global_map = cv2.circle(global_map, tuple(x["goal_map_xy"][0].astype(np.int32).tolist()), 4, (0, 255, 0), -1)

# Perform planning
agent_xy = x["agent_map_xy"][0].astype(np.int32)
goal_xy = x["goal_map_xy"][0].astype(np.int32)
path_x, path_y = pyastar.astar_planner(x["global_map_proc"][0], agent_xy, goal_xy, allow_diagonal=True)

# Draw the planned path (loop variables renamed to avoid shadowing the npz handle `x`).
for px, py in zip(path_x, path_y):
    global_map = cv2.circle(global_map, (int(px), int(py)), 2, (127, 127, 127), -1)

cv2.imshow("Global map", global_map[..., ::-1])  # RGB -> BGR for OpenCV display
cv2.waitKey(0)


JeremyLinky commented on August 15, 2024


Thank you for your patience! I took a closer look at the project's code and found that, before sending the map to the path planner, it also crops the processed map. I think this is the problem.

The A* algorithm itself is completely fine. However, before handing the map to A*, the code crops it around the midpoint of the line connecting the agent position and the goal position, and reserves a 3 m wide band at the boundary that is set as free space, as shown by the blue circle in the figure below (green marks the crop boundary without the reserved band; red marks the boundary after the 3 m band is reserved). As a result, obstacles at the boundary are overwritten as free space, the A* algorithm plans a path through them, and when displayed on the uncropped map this becomes a planned path passing through an obstacle.

The curve in the figure below is roughly drawn by me with a brush, but that does not affect what I want to express. The orange line shows the approximate direction of the path I found passing through the obstacle; you can refer to my previous answer.

If you have time, you can run the source code of your project to verify that my explanation is correct. Much appreciated!

[image]
The source code is as follows.
[image]
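The failure mode described above can be sketched in a few lines (all sizes, coordinates, and the band width are illustrative; the repo's actual cropping code may differ):

```python
import numpy as np

full_map = np.zeros((100, 100), dtype=np.float32)   # 0 = free, 1 = obstacle
full_map[:, 60] = 1.0                               # a wall crossing the map

agent_xy = np.array([40, 50])
goal_xy = np.array([80, 50])
center = (agent_xy + goal_xy) // 2                  # crop around the agent-goal midpoint

half, pad = 25, 6                                   # crop half-size and "free" border width
x0, y0 = center[0] - half, center[1] - half
crop = full_map[y0:y0 + 2 * half, x0:x0 + 2 * half].copy()

# The border band is overwritten as free space. Any obstacle touching it,
# like the wall at crop column 60 - x0 = 25, is erased inside the band, so a
# planner can route "through" it near the crop boundary.
crop[:pad, :] = 0.0
crop[-pad:, :] = 0.0
crop[:, :pad] = 0.0
crop[:, -pad:] = 0.0

# crop[:pad, 25] is now all zeros (wall marked free in the band), while
# crop[half, 25], away from the border, is still an obstacle.
```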


srama2512 commented on August 15, 2024

This totally makes sense. Yes, I added the padding there to prevent the agent's plans from failing completely due to the cropping. However, this is an approximation that was needed to finish evaluation within the time limits of the Habitat Challenge. If computation time is not a concern, I would recommend planning on the full map. Please feel free to close the issue if this resolves it.

