
srobo / competition-simulator


A simulator for Student Robotics Virtual Competitions

Home Page: https://studentrobotics.org/docs/simulator/

License: MIT License

Languages: Python 99.03%, Shell 0.84%, Makefile 0.12%
Topics: simulator, webots

competition-simulator's People

Contributors

13ros27, adimote, antoinepetty, jennylynnfletcher, oliverwales, peterjclaw, prenouf, prophile, raccube, realorangeone, sedders123, willb97


competition-simulator's Issues

Fix Echo's weight distribution

Echo tips over quite easily. It's not clear how much instability we consider a feature rather than a bug, but at the moment it seems to be on the bug side.

Validate apparent size/distance of token markers

I have a feeling that the vision system may be slightly off in its representation of the size and/or distance of the token markers.

The discrepancy originates in the fact that the MarkerInfo.size value should be the size of the marker not the token, the former being smaller than the latter by enough to put tape around the marker for the construction of the token.

However the code currently uses only a single value for both concepts. This means that either:

  • the size of the marker is misreported
  • the size of the token is incorrect

Because the distance to the marker is calculated relative to the centre of the token, this may also mean (depending on other interactions) that the distance to the marker is misreported.
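The geometry can be sketched as follows. This is a minimal sketch with hypothetical dimensions; the real values live in the simulator's world model:

```python
import math

# Illustrative dimensions only (hypothetical). The marker is printed
# smaller than the token face so that tape can be wrapped around its
# edge when constructing the token.
TOKEN_SIZE = 0.110   # metres, cube edge length
MARKER_SIZE = 0.090  # metres, printed marker edge length

def marker_distance(token_centre_distance: float) -> float:
    """Distance to the marker face, given the distance to the token centre.

    For a face-on view, the marker sits half an edge length closer to
    the camera than the token's centre.
    """
    return token_centre_distance - TOKEN_SIZE / 2

def apparent_size(real_size: float, distance: float) -> float:
    """Angular size (radians) that a flat square of `real_size` subtends."""
    return 2 * math.atan2(real_size / 2, distance)
```

With a single value for both concepts, the code must either use `TOKEN_SIZE` for the marker's apparent size (too big) or report `MARKER_SIZE` as the token size (too small), and the centre-relative distance compounds the error.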

Document device connectivity

We need some sort of documentation (maybe diagrams) showing how we're modelling the motors and sensors against the SR kit.

Pull out mapping function

sensor_devices.py and motor.py both have a way to convert one range of values to another. This should be pulled out into a common file
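The shared helper might look something like this (the function name and module are placeholders, not the repo's actual code):

```python
def map_to_range(value: float,
                 old_min: float, old_max: float,
                 new_min: float, new_max: float) -> float:
    """Linearly rescale `value` from [old_min, old_max] to [new_min, new_max]."""
    if old_min == old_max:
        raise ValueError("Input range must be non-empty")
    fraction = (value - old_min) / (old_max - old_min)
    return new_min + fraction * (new_max - new_min)
```

Both `sensor_devices.py` and `motor.py` could then import this from one common module instead of each carrying its own copy.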

Arena markers

We need a way of detecting arena walls and positioning things in the arena.

We need things on the arena walls which look like tokens, but respond to Webots' existing vision (#2).

Arena corner colours

Colour (and number?) the corners of the arena as we would in reality, to show which corner belongs to each robot.

You can see the colours and numbers in this photo from last year's competition:
image

At the moment our arena is blank
image

Lighting effects

Lighting effects similar to the actual competition.

For example

  • dark before the match starts
  • brightly lit during match
  • dark after match

Rules update

Review and update rules where needed for the virtual competition

Live scoring

Implement a way to score each round automatically (and display this on screen?)

This will involve at least:

  • checking that we can detect cases where an object is "inside" or "touching" another object (at least in the 2D arena sense, possibly as well as implementing the three-corner rule or similar)
  • recording changes to that state (either as a log of change events, or as snapshots of the new state on each change) so that we can replay the scoring later; we'd likely want match timestamps here to help line up with the animation

I suggest that we probably want a separate Supervisor to handle the scoring, if Webots supports having more than one.
Note that this scoring Supervisor may need to communicate with the arena Supervisor (see #216), so it may make sense for them to be the same process.
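The two bullet points above could be sketched like this. All names here are hypothetical, and the containment check is the simple 2D axis-aligned case only (no three-corner rule):

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Zone:
    """Axis-aligned scoring zone in the 2D arena plane (hypothetical)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        """Is the point inside (or touching the edge of) the zone?"""
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

@dataclass
class ScoreLog:
    """Records only state *changes*, with match timestamps, so the
    scoring can be replayed and lined up with the animation later."""
    events: List[Tuple[float, str, bool]] = field(default_factory=list)
    _last: Dict[str, bool] = field(default_factory=dict)

    def observe(self, match_time: float, token_id: str, inside: bool) -> None:
        if self._last.get(token_id) != inside:
            self._last[token_id] = inside
            self.events.append((match_time, token_id, inside))
```

A scoring Supervisor would poll token positions each timestep and feed them through `observe`, leaving a compact change log rather than a per-timestep dump.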

Grabber

Investigate grabber functionality

Relative texture paths

Currently the project uses absolute paths to texture files. They should be relative to the project directory.

Make material paths relative

image
As highlighted by a team on the forum, the paths to certain textures are Windows-specific. They should work on any platform
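A one-off fixup could rewrite the absolute URLs in the world files. This is a hedged sketch: it assumes texture paths appear as quoted strings in the `.wbt`/`.proto` text and that the files actually live under the project directory:

```python
import re
from pathlib import Path

def relativise_texture_urls(world_text: str, project_dir: Path) -> str:
    """Rewrite absolute texture URLs (Windows drive paths or POSIX
    paths) to be relative to `project_dir`. Paths outside the project
    are left untouched."""
    def fix(match: re.Match) -> str:
        # Normalise backslashes so Windows paths parse on any platform.
        raw = match.group(1).replace("\\\\", "/").replace("\\", "/")
        try:
            rel = Path(raw).relative_to(project_dir)
        except ValueError:
            return match.group(0)  # not under the project directory
        return f'"{rel.as_posix()}"'

    # Match quoted strings that look like absolute paths.
    return re.sub(r'"([A-Za-z]:[^"]+|/[^"]+)"', fix, world_text)
```

Run over each world file, this leaves platform-independent relative paths behind.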

Decide on Ruggeduino initialisation

At the moment there is no initialisation of pins for the Ruggeduino in our API. On the real kit you need to set the pin mode before use. We need to decide which option to go with and either write code or document this change.
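If we mirror the real kit, the API could require a mode before any read or write. This is a hypothetical mock (not the simulator's actual Ruggeduino code) illustrating that option:

```python
from enum import Enum
from typing import Optional

class PinMode(Enum):
    INPUT = "INPUT"
    INPUT_PULLUP = "INPUT_PULLUP"
    OUTPUT = "OUTPUT"

class Pin:
    """Hypothetical sketch of a Ruggeduino pin that, like the real kit,
    errors if used before its mode has been set."""
    def __init__(self) -> None:
        self.mode: Optional[PinMode] = None
        self._state = False

    def digital_read(self) -> bool:
        if self.mode not in (PinMode.INPUT, PinMode.INPUT_PULLUP):
            raise RuntimeError("Set the pin mode before reading")
        return self._state

    def digital_write(self, value: bool) -> None:
        if self.mode is not PinMode.OUTPUT:
            raise RuntimeError("Set the pin mode before writing")
        self._state = value
```

The alternative is to skip initialisation in the simulator and document the divergence from the real kit.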

Development instructions

It would be great if we could have some instructions on how to set up a development environment :)

Arena markers

Add markers to the arena wall so that the robots can work out where they're pointing

Add all Echo bot sensors to API

Echo has a few different sensors. Ideally we will enable all distance/bump sensors through the Ruggeduino part of the API

Aesthetic customisation of robot

Teams could do with a way to customise the way their robot looks. This would need to be easily changeable by the blueshirts running the code though (so they don't need to manually load individual texture files each time that robot competes). Not sure how we could do this.

Fix clipping

Sometimes the robot loses parts of itself or manages to fall through things. We should fix this.

Fix gripper controls

Currently the gripper behaviour with the SR wrapper isn't very intuitive. Preferably there would be one motor to lift and one motor to open/close.

Add sounds to simulation

This should help the viewer understand what's happening but there's definitely potential for this to just be annoying

Raise arena wall

Currently the distance sensors seem to look over the wall into infinity

Automatic Match Stop

After the duration of a match, robots automatically stop. We should have this!

Whether this is implemented as a pause on the simulation, or by killing the robot code doesn't especially matter.
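One option can be sketched as a simple timer in the supervisor; the duration and class name here are assumptions, not the project's actual implementation:

```python
import threading

class MatchTimer:
    """Hypothetical sketch: after the match duration elapses, a flag
    flips, and the supervisor (or the robot loop) halts the robot."""
    def __init__(self, duration: float) -> None:
        self.stopped = threading.Event()
        self._timer = threading.Timer(duration, self.stopped.set)

    def start(self) -> None:
        self._timer.start()

    def robot_may_run(self) -> bool:
        return not self.stopped.is_set()
```

Whether `robot_may_run` gates the robot's control loop or the supervisor pauses the whole simulation is exactly the open question above.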

Distribution to teams

Figure out a way to distribute the simulator to teams
It needs to be super simple for them to create and test their code

Add IDE support

It would be great if we could support competitors continuing to use the IDE for their development if they want to.

For this, we need:

  • a script which can build a suitable archive on exporting from the IDE, which will need access to the content of this repo for inclusion (#87)
  • to configure the IDE (either in the IDE itself or our puppet config) to use that script
  • to configure the IDE to use this API as its reference (just using the code from this repo might work, though that should be tested)
  • to configure the IDE for Python 3 code being developed

The archive script needs to be executable, accept three positional command-line arguments, and live inside a git repo. These are the arguments it will be passed:

  • the TLA
  • the directory containing the user code
  • the destination archive to create (will be a .zip)

The user code directory will be a folder called user within a temporary directory that the script can use as it sees fit. All content of the folder (which may include "hidden", i.e. dot-prefixed, files) should be included in the archive.

For context, though this should not be relied upon: the script will also be checked out within the temporary directory, as a sibling of the user directory:

/tmp/tmp-ide-checkout  # will actually be a randomised `mktemp`-style name
├── ilbRobot
│   ├── ...
│   └── .git
└── user
    ├── robot.py
    ├── ...
    └── .user-rev

The current directory when the script is run is not defined.
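The contract above could be satisfied by a script along these lines. A hedged sketch, not the real export script; note that `pathlib` globbing, unlike shell globs, does match dot-prefixed files:

```python
#!/usr/bin/env python3
"""Sketch of an IDE export script: TLA, user code dir, dest .zip."""
import sys
import zipfile
from pathlib import Path

def build_archive(tla: str, user_dir: Path, dest_zip: Path) -> None:
    """Zip every file under `user_dir`, including hidden dotfiles,
    with paths stored relative to `user_dir`."""
    with zipfile.ZipFile(dest_zip, "w", zipfile.ZIP_DEFLATED) as archive:
        # Record the TLA in the zip comment (illustrative use of it).
        archive.comment = tla.encode()
        for path in sorted(user_dir.rglob("*")):
            if path.is_file():
                # rglob("*") includes dot-prefixed files by default.
                archive.write(path, path.relative_to(user_dir))

if __name__ == "__main__":
    team_tla, user_directory, destination = sys.argv[1:4]
    build_archive(team_tla, Path(user_directory), Path(destination))
```

Since the current working directory is undefined, the script avoids relying on it: all paths come from its arguments.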

Validate vision of arena wall markers

In the current model, there's a concern that arena wall markers likely behave weirdly due to the Token size calculation, which assumes they are cubes.

We should validate that, when looking sideways at a wall marker, the camera only reports a single face as visible (or none at all if it's only obliquely seen). It doesn't matter what "name" that face has.
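The expected behaviour can be sketched geometrically: a flat face is visible only when its outward normal has a component toward the camera, and grazing views should count as effectively invisible. This is an illustrative check (the angle threshold is an assumption), not the simulator's vision code:

```python
import math

def face_visible(face_normal, to_camera) -> bool:
    """True if the face's outward unit normal points at all toward the
    camera (dot product test)."""
    return sum(n * c for n, c in zip(face_normal, to_camera)) > 0

def seen_obliquely(face_normal, to_camera, limit_deg: float = 75.0) -> bool:
    """True if the viewing angle exceeds `limit_deg` (hypothetical
    threshold), i.e. the face is only seen at a grazing angle.
    `face_normal` is assumed to be unit length."""
    dot = sum(n * c for n, c in zip(face_normal, to_camera))
    norm = math.sqrt(sum(c * c for c in to_camera))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle > limit_deg
```

Looking sideways along a wall, at most one face normal can point toward the camera, so exactly one face (or none, if only seen obliquely) should be reported.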
