novog93 / sjtu_drone

ROS / ROS 2 Gazebo quadcopter simulator.

License: GNU General Public License v3.0

Languages: CMake 3.46%, C++ 42.95%, Python 50.85%, Dockerfile 1.61%, Shell 1.14%
Topics: drone, gazebo, gazebo-ros, ros2


sjtu_drone's People

Contributors

dependabot[bot], novog93, winstxnhdw


sjtu_drone's Issues

"world" frame is missing

When I launch the program, using:

ros2 launch sjtu_drone_bringup sjtu_drone_bringup.launch.py

everything initializes and I can move the drone from its teleop window.

The problem is that the simulation doesn't have any fixed reference frame; the highest frame in the tree is the one defined in the urdf.xacro file, which is "base_link".

In previous simulations I used, in both ROS and ROS 2, Gazebo provided a reference frame against which the robot's position was described. Creating a new "world" frame in the URDF and attaching it with a static_tf_broadcaster is not an option here, since I want the drone to move.

How can I solve this problem?
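One common workaround (a sketch, not something this repo confirms) is to republish the drone's odometry as a *dynamic* world→base_link transform rather than a static one: each odometry pose already is that transform, expressed as a homogeneous matrix. A minimal pure-Python illustration of the math a dynamic TF broadcaster would publish, assuming a pose given as position (x, y, z) and quaternion (qx, qy, qz, qw):

```python
import math

def quat_to_rot(qx, qy, qz, qw):
    # Normalized quaternion -> 3x3 rotation matrix (row-major lists).
    n = math.sqrt(qx * qx + qy * qy + qz * qz + qw * qw)
    qx, qy, qz, qw = qx / n, qy / n, qz / n, qw / n
    return [
        [1 - 2 * (qy * qy + qz * qz), 2 * (qx * qy - qz * qw),     2 * (qx * qz + qy * qw)],
        [2 * (qx * qy + qz * qw),     1 - 2 * (qx * qx + qz * qz), 2 * (qy * qz - qx * qw)],
        [2 * (qx * qz - qy * qw),     2 * (qy * qz + qx * qw),     1 - 2 * (qx * qx + qy * qy)],
    ]

def pose_to_transform(x, y, z, qx, qy, qz, qw):
    # world -> base_link homogeneous transform built from an odometry pose.
    R = quat_to_rot(qx, qy, qz, qw)
    return [R[0] + [x], R[1] + [y], R[2] + [z], [0.0, 0.0, 0.0, 1.0]]
```

In a ROS 2 node this matrix would instead be sent via `tf2_ros.TransformBroadcaster` (not `StaticTransformBroadcaster`) from a subscriber on the odometry topic, so "world" stays fixed while base_link moves.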

Creating a dataset

Hello, a few days ago I started working on a project whose aim is for the drone to autonomously avoid obstacles in the simulation and travel from point A to point B. I have now reached the stage where I want to project points onto objects, to check that everything is in order before creating bounding boxes and, later, a dataset from them.
For now, I want to project a point onto the middle of the "Construction Cone" object, with the drone in its initial position when the simulation opens. The problem I ran into is that, following the logic of my transformation chain, the point is projected incorrectly.
This is my first project using Gazebo and ROS (I am still in college), and since I haven't found much guidance elsewhere, I thought I could ask here. Besides defining the rotation needed to align the camera with the Z axis of the world, and extracting the camera's intrinsic and extrinsic parameters and the object's position and orientation, I don't know what else I could be missing to make the projection correct. Thank you for any possible solutions.

Here's my code so far:
https://github.com/4kaws/object_detection_drone/blob/master/README.md?plain=1

And here's an image with the projected point (the cone the point should land on is the first on the right):

[image attached]
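A frequent culprit in transformation chains like this is the fixed rotation between the camera *link* frame (x forward, as in the URDF) and the camera *optical* frame (z forward, x right, y down) that the intrinsics assume. Once the point is in the optical frame, the projection itself is just the pinhole model; a minimal pure-Python sketch (all values hypothetical, not taken from the linked code):

```python
def project_point(p_world, R, t, fx, fy, cx, cy):
    """Project a 3D world point to pixel coordinates.

    R, t: world -> camera-optical-frame extrinsics (ROS optical
    convention: z forward, x right, y down).
    fx, fy, cx, cy: pinhole intrinsics from the camera matrix K.
    """
    # p_cam = R @ p_world + t
    p_cam = [sum(R[i][j] * p_world[j] for j in range(3)) + t[i]
             for i in range(3)]
    if p_cam[2] <= 0.0:
        raise ValueError("point is behind the camera")
    # Pinhole projection: divide by depth, then scale and shift.
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return u, v
```

If the projected point lands off by roughly a 90° swap of axes, the missing link-to-optical rotation is the first thing to check.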

drone unintentionally yaw

I take off the drone and wait for a moment, and it looks like the drone yaws by itself. I tried deactivating the wind in Gazebo, but it still yaws. What might the problem be here?

[Enhancement]: Joystick Support

Hey, I am thinking of contributing a PR for joystick support, but I'd like to discuss how we should go about it first.

Would you prefer an entirely new launch file that launches the teleop joystick node, or should we stick with the current launch file and pass some sort of flag as an argument?

For example,

ros2 launch sjtu_drone_bringup sjtu_drone_bringup.launch.py joystick

I haven't checked if the launch API allows you to pass arguments to the launch file, but I reckon it's likely possible.
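For reference, the ROS 2 launch API does support this pattern: `DeclareLaunchArgument` plus `IfCondition` lets a single launch file optionally start the joystick nodes, toggled from the command line with `name:=value` syntax. A hypothetical sketch only (the argument name and the `joy`/`teleop_twist_joy` nodes are assumptions, not taken from this repo's bringup file):

```python
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import IfCondition
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node

def generate_launch_description():
    use_joystick = LaunchConfiguration('use_joystick')
    return LaunchDescription([
        DeclareLaunchArgument(
            'use_joystick', default_value='false',
            description='Also start the joystick teleop nodes'),
        # ... existing sjtu_drone bringup actions go here ...
        Node(package='joy', executable='joy_node',
             condition=IfCondition(use_joystick)),
        Node(package='teleop_twist_joy', executable='teleop_node',
             condition=IfCondition(use_joystick)),
    ])
```

which would then be invoked as `ros2 launch sjtu_drone_bringup sjtu_drone_bringup.launch.py use_joystick:=true`.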

Depth Camera

Hello Community,

I want to add a depth camera to this drone. I changed the URDF file as shown below, but I couldn't get any depth camera data. How can I solve this problem?

<gazebo reference="front_cam_link">
  <sensor name="front_camera" type="camera">
    <always_on>1</always_on>
    <visualize>1</visualize>
    <update_rate>60</update_rate>
    <camera>
      <horizontal_fov>2.09</horizontal_fov>
      <image>
        <width>640</width>
        <height>360</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.1</near>
        <far>100</far>
      </clip>
      <noise>
        <type>gaussian</type>
        <mean>0.0</mean>
        <stddev>0.005</stddev>
      </noise>
    </camera>
    <plugin filename="libgazebo_ros_camera.so" name="camera_front">
      <ros>
        <namespace>/simple_drone</namespace>
      </ros>
      <frame_name>/simple_drone/front_cam_link</frame_name>
      <camera_name>front</camera_name>
      <hack_baseline>0.07</hack_baseline>
    </plugin>
  </sensor>
  <!-- Depth Cam -->
  <sensor name="front_depth_camera" type="depth">
    <always_on>1</always_on>
    <visualize>1</visualize>
    <update_rate>30</update_rate>
    <camera>
      <horizontal_fov>2.09</horizontal_fov>
      <image>
        <width>640</width>
        <height>360</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.1</near>
        <far>10</far>
      </clip>
    </camera>
    <plugin filename="libgazebo_ros_openni_kinect.so" name="camera_front_depth">
      <ros>
        <namespace>/simple_drone</namespace>
      </ros>
      <frame_name>/simple_drone/front_cam_link</frame_name>
      <camera_name>front_depth</camera_name>
    </plugin>
  </sensor>
</gazebo>
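One likely cause (an assumption, not confirmed by this repo): `libgazebo_ros_openni_kinect.so` is a ROS 1 plugin and is not shipped in the ROS 2 version of gazebo_ros_pkgs; there, the generic `libgazebo_ros_camera.so` plugin also handles sensors of type `depth`. A hedged sketch of what the depth block might look like under that assumption:

```xml
<!-- Sketch: ROS 2 gazebo_ros_pkgs uses the generic camera plugin
     for depth sensors; openni_kinect exists only in ROS 1. -->
<sensor name="front_depth_camera" type="depth">
  <always_on>1</always_on>
  <update_rate>30</update_rate>
  <camera>
    <horizontal_fov>2.09</horizontal_fov>
    <image>
      <width>640</width>
      <height>360</height>
    </image>
    <clip>
      <near>0.1</near>
      <far>10</far>
    </clip>
  </camera>
  <plugin filename="libgazebo_ros_camera.so" name="camera_front_depth">
    <ros>
      <namespace>/simple_drone</namespace>
    </ros>
    <frame_name>/simple_drone/front_cam_link</frame_name>
    <camera_name>front_depth</camera_name>
  </plugin>
</sensor>
```

Checking `ros2 topic list` for the depth image and point cloud topics under `/simple_drone` would confirm whether the plugin loaded.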
