Autonomous Drone for Indoor Navigation and Parcel Delivery
Especially when working with aerial robots, it is important to work in a simulated environment rather than going directly to on-field testing, to avoid unwanted damage to property. The challenges are:
To decide the appropriate simulation software based on the type of environment.
To simulate sensor data that is comparable to real data to a great extent, and hence the need to determine the algorithms of the Intel RealSense T265 and D435 cameras.
To prepare an appropriate 3D environment, or to import existing ones from online databases.
Add a companion computer (RasPi) to the Pixhawk and control the drone manually using commands sent from a ground-station joystick connected to a computer.
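The sensor-faking challenge above can be sketched as corrupting ground-truth poses from the simulator with noise and a slow drift, roughly mimicking the odometry error of a tracking camera. The noise scales below are illustrative assumptions, not measured T265 characteristics:

```python
import numpy as np

# Sketch: "fake" odometry by adding Gaussian noise and a random-walk
# bias to ground-truth positions from the simulator.
rng = np.random.default_rng(seed=0)

def fake_odometry(true_xyz, bias, noise_std=0.01):
    """Return a noisy pose plus the updated random-walk bias (assumed scales)."""
    bias = bias + rng.normal(0.0, 0.001, 3)            # slow drift
    noisy = np.asarray(true_xyz) + bias + rng.normal(0.0, noise_std, 3)
    return noisy, bias

bias = np.zeros(3)
for t in range(5):
    truth = np.array([0.1 * t, 0.0, 1.0])              # straight-line ground truth
    noisy, bias = fake_odometry(truth, bias)
    print(noisy)
```

A real implementation would also fake orientation, velocity, and covariance fields so downstream nodes see the same message layout as the real driver.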
The drone needs to do object detection to have any real utility. That means finding the 3D coordinates of as many nearby objects as possible. These detections will then be used to aid path planning, since the motion of the detected obstacles can be modelled to predict their future state, which will help plan safer trajectories.
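Going from a 2D detection to 3D coordinates boils down to back-projecting the detection's pixel through the pinhole camera model once a depth estimate is available. The intrinsics below are placeholder values, not our calibrated ones:

```python
# Sketch: recover a 3D point (camera frame) from a 2D detection centre
# and a depth estimate, using the pinhole camera model.
# fx, fy, cx, cy are placeholder intrinsics, not calibrated values.

def pixel_to_3d(u, v, depth, fx=615.0, fy=615.0, cx=320.0, cy=240.0):
    """Back-project pixel (u, v) at the given depth (metres) into
    camera-frame coordinates (X right, Y down, Z forward)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# e.g. an obstacle detected at the image centre, 2 m away:
print(pixel_to_3d(320.0, 240.0, 2.0))  # -> (0.0, 0.0, 2.0)
```

The resulting point would still need a transform from the camera frame into the drone's world frame before the planner can use it.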
Although the bot is equipped with both a lidar and a stereo camera, we decided to go ahead with just the stereo cameras for now, mainly to keep the pipeline as streamlined and fast as possible. The process of getting depth information from stereo image pairs is called stereopsis; we, and all other two-eyed animals, do it all the time. Check out OpenCV: Depth Map from Stereo Images.
The current plan is to directly apply the object detector on the stereo images, without calculating the disparity map or the point cloud. Again, this is done to speed things up; disparity calculation takes time (check out this Computerphile video, Stereo 3D Vision, to get a feel for it).
Still, there are existing ROS packages, like stereo_image_proc - ROS Wiki, that do it, if one wants to try.
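To make the stereopsis idea concrete, here is a toy sum-of-squared-differences block matcher on a single rectified scanline pair. Real packages (stereo_image_proc, OpenCV's StereoBM/StereoSGBM) do this densely and far more robustly; this only illustrates how disparity is found:

```python
import numpy as np

# Toy stereopsis sketch: SSD block matching over one rectified scanline.
# For a rectified pair, a feature at column x in the left image appears
# at column x - d in the right image; larger d means a closer object.

def disparity_ssd(left, right, x, block=3, max_disp=8):
    """Best disparity for pixel column x of a rectified scanline pair."""
    best_d, best_cost = 0, float("inf")
    patch_l = left[x:x + block]
    for d in range(max_disp + 1):
        if x - d < 0:
            break
        patch_r = right[x - d:x - d + block]
        cost = np.sum((patch_l - patch_r) ** 2)
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

# Synthetic pair: the right scanline is the left one shifted by 4 pixels.
left = np.array([0, 0, 0, 0, 0, 0, 9, 5, 1, 0, 0, 0], dtype=float)
right = np.roll(left, -4)
print(disparity_ssd(left, right, x=6))  # -> 4
```

Depth then follows from `depth = fx * baseline / disparity` for a calibrated, rectified rig.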
What's done and what needs to be done:
Controllers | RAM | Current | Voltage | Base Price |
---|---|---|---|---|
Jetson TX2 | 8 GB | 0.81-2.5 A | 5.5-19.6 V | $549 |
UDOO X86 | 8 GB | 1-3 A | 3.3 V | $174 |
ODROID H2 | up to 32 GB | 0.286-1.572 A | 14-20 V | $110 |
ODROID N2 | 4 GB | 2 A | 8-18 V | $79 |
PS: We went ahead with the Jetson Xavier NX after a detailed comparison.
Integration of Intel RealSense T265 and D435/D435i for position identification, position hold, and mapping
Hardware for the drone to achieve a flight time of at least 15 minutes with a ~600 g payload
We decided to use a combination of the Extended Kalman Filter (EKF) and the Particle Filter (PF) for localization of the quadcopter, as they complement each other well and have already been used in SLAM algorithms like FastSLAM.
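As a minimal illustration of the EKF half of that combination, here is a 1-D constant-velocity filter with position-only measurements. The matrices and measurement values are made-up for the sketch; the real filter would fuse IMU, camera odometry, and more, with tuned covariances:

```python
import numpy as np

# Minimal EKF sketch: 1-D constant-velocity model, position-only updates.
# (With a linear model this reduces to a plain Kalman filter; the EKF
# additionally linearises a nonlinear model at each step.)
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
H = np.array([[1.0, 0.0]])              # we only measure position
Q = np.eye(2) * 1e-3                    # process noise (assumed)
R = np.array([[0.05]])                  # measurement noise (assumed)

x = np.array([[0.0], [1.0]])            # start at 0 m, moving 1 m/s
P = np.eye(2)                           # initial covariance

for z in [0.11, 0.19, 0.32, 0.38, 0.51]:   # noisy position readings
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    y = np.array([[z]]) - H @ x         # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print(x[0, 0], x[1, 0])                 # filtered position and velocity
```

The PF would complement this by handling multi-modal distributions (e.g. ambiguous places in the map) that a single Gaussian cannot represent.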
This issue will help not only the State Estimation subsystem but also the Motion Planning subsystem. I will cover the State Estimation aspect of the issue here.
First, head over to this link and read the short article to understand what State Estimation means and why it is important.
You are encouraged to read this wonderful roadmap prepared by Mehul.
From the doc you just read, we might not need Gmapping, SLAM Toolbox, Google Cartographer, Navigation 2, or Hector SLAM, so there is no need to explore these immediately for the quadcopter project. One of the most important takeaways from it, though, is that you WILL HAVE to learn C++. All the SLAM packages are written in C++, and for understanding the code as well as modifying it later, we will need good C++ skills.
Now coming to the immediate subtask for the subsystem. This subtask will be of great help to not only the State Estimation Subsystem but also the Motion Planning Subsystem.
Gazebo can convert a URDF into an SDF with:

```shell
gz sdf -p MODEL_URDF.urdf > MODEL_SDF.sdf
```

Unfortunately, one cannot go the other way round, so you will have to convert each line of the SDF file into URDF-compatible syntax. Once you have converted one part of the SDF file to URDF (e.g., some link, or some joint), you can check whether the conversion is valid by running `gz sdf -p MODEL.urdf`. The output of this command should be the same as the corresponding code from your SDF file. After that, we will be able to use robot_localization and MoveIt! (a package for motion planning) on our quadcopter. More details on how that can be done will be provided after the completion of this subtask.
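For a feel of what the manual conversion involves, here is how a single link might map between the two formats. The names and numbers are placeholders, not our actual model; broadly, SDF expresses values as child elements while URDF uses attributes:

```xml
<!-- SDF (what Gazebo uses): pose, mass, and inertia are child elements -->
<link name="base_link">
  <pose>0 0 0.1 0 0 0</pose>
  <inertial>
    <mass>1.5</mass>
    <inertia>
      <ixx>0.01</ixx> <iyy>0.01</iyy> <izz>0.02</izz>
      <ixy>0</ixy> <ixz>0</ixz> <iyz>0</iyz>
    </inertia>
  </inertial>
</link>

<!-- URDF equivalent: the same data moves into attributes -->
<link name="base_link">
  <inertial>
    <origin xyz="0 0 0.1" rpy="0 0 0"/>
    <mass value="1.5"/>
    <inertia ixx="0.01" iyy="0.01" izz="0.02" ixy="0" ixz="0" iyz="0"/>
  </inertial>
</link>
```

Note that the pose mapping shown here is simplified: URDF has no free-standing link pose, so link placement is normally expressed through joint origins instead.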
This is not a task to be completed but an issue dedicated to discussing and recording all the relevant resources that one can find useful on the internet.
This issue has been opened so that the resources we come across don't get lost 😄
Feel free to post all links as comments below 😃
RTABMAP (primarily for mapping) + PCL (handling point-cloud data) + RViz (for visualisation) + OctoMap (for storing the occupancy grid).
The aim of the motion planning subsystem is to calculate the motion of the quadcopter in 3D.
This task of motion planning is subdivided into the following sub-parts:
Given an Octomap, start position, and goal position, we need to calculate the ideal trajectory that the drone should follow to reach the goal. This trajectory would be planned before the drone starts its mission. Note that this is an ideal trajectory that is most probably going to change due to unpredicted obstacles in the way, inaccuracies in the initial map, and drone slightly deviating from its path while maneuvering. Hence, the need for a Local Motion Planner arises.
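A toy version of that global-planning step: A* search over a 3D occupancy grid. An Octomap would normally supply the occupancy queries; the plain set of blocked voxels below is an illustrative stand-in, and a real planner would also smooth the resulting voxel path into a dynamically feasible trajectory:

```python
import heapq

# Global-planner sketch: A* over a 3-D occupancy grid with 6-connectivity.
def astar_3d(start, goal, occupied, bounds):
    def h(p):  # Manhattan-distance heuristic (admissible for unit moves)
        return sum(abs(a - b) for a, b in zip(p, goal))

    open_set = [(h(start), 0, start, None)]   # (f, g, node, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:                  # already expanded
            continue
        came_from[cur] = parent
        if cur == goal:                       # reconstruct the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y, z = cur
        for nxt in [(x+1, y, z), (x-1, y, z), (x, y+1, z),
                    (x, y-1, z), (x, y, z+1), (x, y, z-1)]:
            if any(c < 0 or c >= b for c, b in zip(nxt, bounds)):
                continue
            if nxt in occupied or g + 1 >= g_cost.get(nxt, float("inf")):
                continue
            g_cost[nxt] = g + 1
            heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None  # no path exists

# A wall at x == 1 with a gap only at z == 2 forces the planner to climb:
occupied = {(1, y, z) for y in range(3) for z in range(3) if z != 2}
path = astar_3d((0, 0, 0), (2, 0, 0), occupied, bounds=(3, 3, 3))
print(len(path) - 1)  # -> 6 moves: up, up, across, across, down, down
```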
The Local Motion Planner would receive the ideal trajectory along with data about the immediate surroundings, and then, based on those surroundings and the deviation from the ideal path, it should be able to recalculate its trajectory as well as its motion.
Since we also want to survey unknown environments to generate maps for the first time, we need to work on a surveying algorithm for unknown environments as well. There has been really good work by HKUST in this field; Fast Planner and FUEL are some examples that we were looking to test. Needless to say, these might also be useful in understanding how to go about the Local Planner.
The commands generated by the Local Motion Planner need to be converted into ones understandable by the PX4 and then sent over to the bot. MAVROS is to be used for this.
Since it's a bit difficult to code everything up by hand, and we don't want to reinvent the wheel, we are in the process of trying out MoveIt! for the Global Planner.
MoveIt! already has abstract 3D planners, collision-checking algorithms, etc. implemented in the context of robots (robotic arms, to be precise). We can just drop in the URDF of our bot, give it our world's Octomap, and specify the algorithm with which we want to calculate the trajectory/motion.
As we were trying out MoveIt!, we realized that we need the URDF (Unified Robot Description Format) of our bot to move any further, but for now we only have an SDF (Simulation Description Format), as that is what Gazebo uses.
And hence, the next immediate step is to convert the SDF into URDF (which mostly needs to be done manually).
For MoveIt!, the implementation we are referring to is Wil Selby's 3D Mapping and Navigation on a Drone using MoveIt!
Aim: to achieve around a 4:1 thrust-to-weight ratio.
Even if you only plan to fly a slow and stable aerial photography rig, you should aim for somewhere between a 3:1 and 4:1 ratio. This not only gives you better control, but also provides room for extra payload in the future.
Source: Oscar Liang
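The arithmetic behind the 4:1 target is simple. The 1.5 kg frame mass below is an assumed figure for illustration; the 600 g payload comes from the flight-time requirement above:

```python
# Worked thrust-to-weight example. frame_kg is an assumed all-up mass
# without payload, not our measured build weight.
frame_kg = 1.5
payload_kg = 0.6
ratio = 4.0

total_kg = frame_kg + payload_kg
required_thrust_kg = ratio * total_kg       # total thrust, in kg-force
per_motor_kg = required_thrust_kg / 4       # quadcopter: 4 motors

print(required_thrust_kg, per_motor_kg)     # -> 8.4 2.1
```

So each motor/prop combination would need to produce about 2.1 kg of static thrust, which is the figure to check against the motor manufacturer's thrust tables.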
What we know: