A mobile robot equipped with a 6-DoF manipulator to pick up different bricks in a partially known environment: kinematics, trajectory planning & control, object localization & classification.

Robotics project - A mobile robot to pick up LEGO bricks.

This is the project for the Fundamentals of Robotics course, by Pietro Fronza, Stefano Genetti, and Giovanni Valer.

ℹ️ The code is written mainly in C++ (for the kinematics and trajectory planning part) and Python (for object classification and localization). For details about the implementation, read the paper: Robotics_Project.pdf.

🎥 Videos

It might be useful to have a look at some simulations we have done. You can find them on YouTube at the following links:

📁 Repository content

Here you can find three different catkin workspaces, each one for a different assignment. All goals have been achieved.

⚠️ If you want to download and run the code, make sure you meet the requirements. You also need to give execution permission to three Python scripts: src/my_world/world/lego_spawner.py, src/robotic_vision/src/localize_listener.py, and src/robotic_vision/src/yolov5/my_detect.py. If you are not on a native Linux machine, you may need to recreate those files directly (copying their content from this repo).
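
For example, from the root of the workspace the permissions can be set as follows (paths as listed above):

chmod +x src/my_world/world/lego_spawner.py
chmod +x src/robotic_vision/src/localize_listener.py
chmod +x src/robotic_vision/src/yolov5/my_detect.py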

The whole environment has to be launched with:

roslaunch my_world startcomplete.launch

After un-pausing the simulation, you can run the controller:

rosrun mir_controller mir_controller
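
If you prefer not to un-pause from the Gazebo GUI, the simulation can usually be un-paused from another shell via the standard gazebo_ros service (generic Gazebo/ROS tooling, not something specific to this repository):

rosservice call /gazebo/unpause_physics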

The recognized bricks are printed to the shell and also written to OUTPUT.txt. Here is an example with 4 bricks:

LEGO   class: 7,   name: X1_Y4_Z1,         x: 0.121254,   y: -2.0017
LEGO   class: 5,   name: X1_Y3_Z2,         x: -0.120827,  y: -2.00102
LEGO   class: 6,   name: X1_Y3_Z2_FILLET,  x: 1.86269,    y: 1.9847
LEGO   class: 1,   name: X1_Y2_Z1,         x: 2.12052,    y: 1.9992
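
If you need to post-process these results, a minimal parsing sketch in Python could look like the following (it assumes exactly the line format shown above; the function name parse_output is ours, not part of the repository):

import re

# One line per brick: "LEGO   class: <id>,   name: <name>,   x: <x>,   y: <y>"
LINE_RE = re.compile(
    r"LEGO\s+class:\s*(\d+),\s+name:\s*(\S+),\s+x:\s*(-?[\d.]+),\s+y:\s*(-?[\d.]+)"
)

def parse_output(path="OUTPUT.txt"):
    """Return one dict per recognized brick found in the output file."""
    bricks = []
    with open(path) as f:
        for line in f:
            m = LINE_RE.search(line)
            if m:
                bricks.append({
                    "class": int(m.group(1)),
                    "name": m.group(2),
                    "x": float(m.group(3)),
                    "y": float(m.group(4)),
                })
    return bricks

if __name__ == "__main__":
    for brick in parse_output():
        print(brick)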

Everything works as in the previous assignment, except that the robot now carries the bricks to their basket. The commands and the output are the same.

For the last assignment we use a much more complicated environment, so the world is now stored as a map in the file src/my_world/src/map.txt.

Running this simulation works as previously described, but the output is slightly different, since there can be up to 3 different bricks in the same target area:

AREA: 3 - LEGO  class: 7,   name: X1_Y4_Z1,         x: 0.121254,   y: -2.0017
AREA: 3 - LEGO  class: 5,   name: X1_Y3_Z2,         x: -0.120827,  y: -2.00102

AREA: 1 - LEGO  class: 1,   name: X1_Y2_Z1,         x: 2.12052,    y: 1.9992
AREA: 1 - LEGO  class: 6,   name: X1_Y3_Z2_FILLET,  x: 1.86269,    y: 1.9847

AREA: 2 - LEGO  class: 8,   name: X1_Y4_Z2,         x: 3.10256,    y: 4.01493
AREA: 2 - LEGO  class: 8,   name: X1_Y4_Z2,         x: 2.91906,    y: 4.11383
AREA: 2 - LEGO  class: 8,   name: X1_Y4_Z2,         x: 2.92037,    y: 3.90152

AREA: 4 - LEGO  class: 6,   name: X1_Y3_Z2_FILLET,  x: -2.07731,   y: 1.06888
AREA: 4 - LEGO  class: 0,   name: X1_Y1_Z2,         x: -2.0487,    y: 0.887195
AREA: 4 - LEGO  class: 1,   name: X1_Y2_Z1,         x: -1.87581,   y: 1.04202

This way we can see how the robot explored the environment: in this example, it first visited area 3, then areas 1 and 2, and finally area 4.
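
To recover the visit order programmatically, a small sketch along the same lines (again assuming exactly the output format shown above; the helper name is ours) is:

def area_visit_order(path="OUTPUT.txt"):
    """Return area numbers in order of first appearance in the output."""
    order = []
    with open(path) as f:
        for line in f:
            if line.startswith("AREA:"):
                # "AREA: 3 - LEGO ..." -> take the number between "AREA:" and "-"
                area = int(line.split("-", 1)[0].split(":")[1])
                if area not in order:
                    order.append(area)
    return order

# For the example above this returns [3, 1, 2, 4].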

Requirements

  • Python 3
  • ROS Noetic
  • Ubuntu 20.04
  • After cloning the repository, install the YOLOv5 dependencies:
cd src/robotic_vision/src/yolov5/
pip install -qr requirements.txt
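
Each of the three workspaces is a catkin workspace, so a typical build-and-source sequence before launching (a sketch assuming the usual catkin_make workflow; adapt the path to the assignment you want to run) is:

cd <path-to-chosen-workspace>
catkin_make
source devel/setup.bash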
