novog93 / sjtu_drone
ROS / ROS 2 Gazebo quadcopter simulator.
License: GNU General Public License v3.0
When I launch the program using:
ros2 launch sjtu_drone_bringup sjtu_drone_bringup.launch.py
everything initializes and I can move the drone using its teleop window.
The problem is that the simulation doesn't have any fixed reference frame; the highest frame in the tree is "base_link", defined in the urdf.xacro file.
In previous simulations I used, both in ROS and ROS 2, Gazebo provided a reference frame that the robot's position was described relative to. Creating a new "world" frame in the URDF and using a static_tf_broadcaster is not an option here, since I want the drone to move.
How can I solve this problem?
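The usual fix in ROS 2 is to publish a *dynamic* transform (e.g. world → base_link) from the drone's odometry or ground-truth pose, rather than a static one. The transform such a broadcaster would publish can be sketched in plain Python; the frame names and the use of numpy here are illustrative assumptions, not taken from the repo:

```python
import numpy as np

def quat_to_rot(x, y, z, w):
    """Rotation matrix from a unit quaternion in ROS (x, y, z, w) order."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

def world_T_base(position, orientation):
    """Homogeneous transform world -> base_link built from an odometry pose."""
    T = np.eye(4)
    T[:3, :3] = quat_to_rot(*orientation)
    T[:3, 3] = position
    return T

# Drone 2 m above the origin, yawed 90 degrees about Z:
T = world_T_base([0.0, 0.0, 2.0], [0.0, 0.0, 0.7071068, 0.7071068])
```

In practice you would wrap this in an rclpy node that subscribes to the drone's odometry topic and republishes the pose through a `tf2_ros.TransformBroadcaster`, which makes "world" appear as the root of the TF tree while still letting the drone move.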
Hello, a few days ago I started a project whose aim is to have the drone autonomously avoid obstacles in the simulation while traveling from point A to point B. I have now reached the stage where I want to project points onto objects, to check that everything is in order before creating bounding boxes and, later, a dataset from them.
For now, I want to project a point onto the middle of the "Construction Cone" object, with the drone in its initial position when the simulation opens. The problem I ran into is that, following the logic of my transformation chain, the point is projected incorrectly.
This is my first project using Gazebo and ROS (I'm still in college), and since I haven't found much guidance elsewhere, I thought I'd ask here. Besides defining the rotation needed to align the camera with the world's Z axis, and extracting the camera's intrinsic and extrinsic parameters and the object's position and orientation, I don't know what else I could be missing to make the projection correct. Thank you for any possible solutions.
Here's my code so far:
https://github.com/4kaws/object_detection_drone/blob/master/README.md?plain=1
And here's an image with the projected point (the cone on which the point must be projected is the first on the right):
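For reference, the projection pipeline can be checked in isolation: transform the object's world position into the camera's optical frame, then apply the intrinsic matrix. A minimal sketch of that pinhole model (all numbers are made up for illustration, not taken from the repo):

```python
import numpy as np

def project_point(p_world, T_world_cam, K):
    """Project a 3D world point into pixel coordinates.

    T_world_cam: 4x4 pose of the camera's *optical* frame in the world
                 (optical convention: z forward, x right, y down).
    K:           3x3 intrinsic matrix.
    Returns (u, v), or None if the point is behind the camera.
    """
    p_cam = np.linalg.inv(T_world_cam) @ np.append(p_world, 1.0)
    if p_cam[2] <= 0:                 # behind the image plane
        return None
    uvw = K @ p_cam[:3]
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Camera at the origin looking down +z, cone 5 m ahead and centered:
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 180.0],
              [  0.0,   0.0,   1.0]])
u, v = project_point(np.array([0.0, 0.0, 5.0]), np.eye(4), K)  # -> (320.0, 180.0)
```

One common pitfall worth ruling out: Gazebo reports the camera *link* pose in its own convention (x forward, z up), while the projection above assumes the *optical* frame (z forward, y down), so a fixed rotation between the two must be inserted into the chain exactly once, not zero or two times.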
When I take off the drone and wait for a moment, it looks like the drone yaws by itself. I tried deactivating the wind in Gazebo, but it still yaws. What might the problem be?
Hey, I am thinking of contributing a PR for joystick support, but I'd like to discuss how we should go about it first.
Would you prefer an entirely new launch file that launches the teleop joystick node, or should we stick with the current launch file and pass some sort of flag as an argument?
For example,
ros2 launch sjtu_drone_bringup sjtu_drone_bringup.launch.py joystick
I haven't checked whether the launch API allows passing arguments to the launch file, but I reckon it's likely possible.
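To answer that question directly: the ROS 2 launch API does support this via `DeclareLaunchArgument` and conditions, so a flag on the existing launch file is feasible. A sketch of what the flag could look like (the `use_joystick` argument name is an assumption, not from the repo; `joy_node` is the standard joystick driver from the `joy` package):

```python
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import IfCondition
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # Invoked as:
        #   ros2 launch sjtu_drone_bringup sjtu_drone_bringup.launch.py use_joystick:=true
        DeclareLaunchArgument('use_joystick', default_value='false',
                              description='Also start the joystick teleop node'),
        # Only launched when the flag is true:
        Node(package='joy', executable='joy_node',
             condition=IfCondition(LaunchConfiguration('use_joystick'))),
    ])
```

Note that launch arguments are passed on the command line as `name:=value`, so the syntax would be `use_joystick:=true` rather than a bare `joystick` token.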
Hello Community,
I want to add a depth camera to this drone. I changed the URDF file as shown below, but I don't get any depth camera data. How can I solve this?
<gazebo reference="front_cam_link">
  <sensor name="front_camera" type="camera">
    <always_on>1</always_on>
    <visualize>1</visualize>
    <update_rate>60</update_rate>
    <camera>
      <horizontal_fov>2.09</horizontal_fov>
      <image>
        <width>640</width>
        <height>360</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.1</near>
        <far>100</far>
      </clip>
      <noise>
        <type>gaussian</type>
        <mean>0.0</mean>
        <stddev>0.005</stddev>
      </noise>
    </camera>
    <plugin filename="libgazebo_ros_camera.so" name="camera_front">
      <ros>
        <namespace>/simple_drone</namespace>
      </ros>
      <frame_name>/simple_drone/front_cam_link</frame_name>
      <camera_name>front</camera_name>
      <hack_baseline>0.07</hack_baseline>
    </plugin>
  </sensor>
  <!-- Depth Cam -->
  <sensor name="front_depth_camera" type="depth">
    <always_on>1</always_on>
    <visualize>1</visualize>
    <update_rate>30</update_rate>
    <camera>
      <horizontal_fov>2.09</horizontal_fov>
      <image>
        <width>640</width>
        <height>360</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.1</near>
        <far>10</far>
      </clip>
    </camera>
    <plugin filename="libgazebo_ros_openni_kinect.so" name="camera_front_depth">
      <ros>
        <namespace>/simple_drone</namespace>
      </ros>
      <frame_name>/simple_drone/front_cam_link</frame_name>
      <camera_name>front_depth</camera_name>
    </plugin>
  </sensor>
</gazebo>
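One likely culprit: in the ROS 2 port of gazebo_ros_pkgs, the classic `libgazebo_ros_openni_kinect.so` plugin no longer exists; depth sensors are served by the same `libgazebo_ros_camera.so` plugin already used for the RGB camera above, which publishes depth image and point cloud topics when attached to a `type="depth"` sensor. A sketch of the depth block with that substitution (untested against this exact URDF; the leading slash is also dropped from the frame id, since tf2 rejects frame names starting with "/"):

```xml
<sensor name="front_depth_camera" type="depth">
  <always_on>1</always_on>
  <visualize>1</visualize>
  <update_rate>30</update_rate>
  <camera>
    <horizontal_fov>2.09</horizontal_fov>
    <image>
      <width>640</width>
      <height>360</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>10</far>
    </clip>
  </camera>
  <!-- ROS 2: depth cameras use libgazebo_ros_camera.so, not the old openni_kinect plugin -->
  <plugin filename="libgazebo_ros_camera.so" name="camera_front_depth">
    <ros>
      <namespace>/simple_drone</namespace>
    </ros>
    <!-- no leading slash: tf2 frame ids must not start with "/" -->
    <frame_name>simple_drone/front_cam_link</frame_name>
    <camera_name>front_depth</camera_name>
  </plugin>
</sensor>
```

After rebuilding, `ros2 topic list` should show the depth image and point cloud topics under the `/simple_drone/front_depth` namespace; if nothing appears, check the Gazebo console for plugin-loading errors.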