srobo / competition-simulator
A simulator for Student Robotics Virtual Competitions
Home Page: https://studentrobotics.org/docs/simulator/
License: MIT License
Echo tips over quite easily. It's unclear what level of instability we'd consider a feature rather than a bug, but at the moment it seems to be on the bug side.
I have a feeling that the vision system may be slightly off in its representation of the size and/or distance of the token markers.
The discrepancy originates in the fact that the MarkerInfo.size
value should be the size of the marker, not the token; the former is smaller than the latter by enough to put tape around the marker when constructing the token.
However, the code currently uses only a single value for both concepts, so one of the two must be misreported.
Depending on other interactions, because the distance to the marker is calculated relative to the centre of the token, this may also mean that the distance to the marker is misreported.
Wrap robot control with the SR API.
For reference:
Our current physical robot code: https://github.com/srobo/sr-robot
Our existing simulator: https://github.com/srobo/sr-turtle
We need some sort of documentation (maybe diagrams) showing how we're modelling the motors/sensors against the SR kit.
Investigate distance sensor / ultrasound functionality
sensor_devices.py and motor.py both have a way to convert one range of values to another. This should be pulled out into a common file
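The shared helper could look something like the sketch below. The name `map_to_range` and its signature are illustrative, not the project's actual API:

```python
def map_to_range(value, from_range, to_range):
    """Linearly map `value` from one (low, high) range to another.

    Hypothetical shared helper for the duplicated logic in
    sensor_devices.py and motor.py.
    """
    from_low, from_high = from_range
    to_low, to_high = to_range
    # Position of `value` within the source range, as a proportion 0..1
    proportion = (value - from_low) / (from_high - from_low)
    return to_low + proportion * (to_high - to_low)
```

Both modules could then import this from the common file instead of carrying their own copies.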
The current position of the camera on the arm means that your view is obscured when you are holding a token
Should throw NotImplementedError when trying to control the pin mode of the Ruggeduino, as explained in #55.
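A minimal sketch of that behaviour, assuming a `Ruggeduino` wrapper class with a hypothetical `pin_mode` method (names are illustrative, not the project's actual API):

```python
class Ruggeduino:
    """Simulated Ruggeduino: pin modes are fixed by the simulator."""

    def pin_mode(self, pin, mode):
        # Pin modes are preconfigured in the simulation, so setting them
        # from user code is deliberately unsupported (see #55).
        raise NotImplementedError(
            "pin_mode is not supported in the simulator; "
            "pins are preconfigured"
        )
```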
We need a way of detecting arena walls and positioning things in the arena.
We need things on the arena walls which look like tokens, but respond to Webots' existing vision (#2).
Confirm the game to be played in the simulated competition
Probably the same as, or very similar to, Two Colours: https://www.studentrobotics.org/docs/resources/2020/rulebook.pdf
Lighting effects similar to the actual competition.
The code I've been writing has barely any comments. It would be good to add more for future developers.
Create full set of tokens for use in the competition
https://github.com/srobo/game-markers
Review and update rules where needed for the virtual competition
Implement a way to score each round automatically (and display this on screen?)
This will involve at least:
I suggest that we probably want a separate Supervisor to handle the scoring, if Webots supports having more than one.
Note that this scoring Supervisor may need to communicate with the arena Supervisor (see #216), so it may make sense for them to be the same process.
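Once the rules are confirmed, the scoring Supervisor could start from a zone-counting sketch like the one below. The rule shown (one point per token of the zone owner's colour) is purely illustrative, and `score_zone`/`score_match` are hypothetical names; the real scoring comes from the rulebook:

```python
from collections import Counter


def score_zone(zone_tokens, owner_colour):
    """Score one zone: placeholder rule awarding a point per token of
    the owner's colour. Illustrative only, not the real game rules."""
    counts = Counter(zone_tokens)
    return counts[owner_colour]


def score_match(zones):
    """`zones` maps each team colour to the token colours in its zone."""
    return {colour: score_zone(tokens, colour)
            for colour, tokens in zones.items()}
```

Displaying the resulting dict on screen would then be a separate concern for the (possibly combined) Supervisor.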
Integrate Libkoki or similar computer vision token system
Our physical competition is based on https://github.com/srobo/libkoki
Investigate grabber functionality
Add artificial jitter/inaccuracy to control and sensor readings
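One simple approach is multiplicative Gaussian noise on each reading. The 2% standard deviation below is an arbitrary placeholder, not a calibrated value, and `with_jitter` is a hypothetical name:

```python
import random


def with_jitter(reading, relative_sd=0.02, rng=random):
    """Return `reading` perturbed by Gaussian noise proportional to its
    magnitude, roughly mimicking real-sensor inaccuracy.

    Passing an explicit `random.Random` instance keeps runs reproducible
    in tests while staying non-deterministic for teams by default.
    """
    return reading * (1 + rng.gauss(0, relative_sd))
```

Control outputs (e.g. motor power) could be wrapped the same way, possibly with a different noise scale per device.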
I put the MIT licence in when creating the repo, but do we need to match the Apache licence from the Webots repo? https://github.com/cyberbotics/webots
Currently the project uses absolute paths to its files; these should be relative to the project directory.
Related to #37.
Main goals
At the moment there is no initialisation of pins for the Ruggeduino in our API. On the real kit you need to set the pin mode before use. We need to decide which option to go with and either write the code or document the change.
It would be great if we could have some instructions on how to set up a development environment :)
Add markers to the arena wall so that the robots can work out where they're pointing
Allow teams to select from a range of robots or configure a robot
The initial world added to the repo is missing the texture images specified in the file.
Echo has a few different sensors. Ideally we will enable all distance/bump sensors through the Ruggeduino part of the API
Teams could do with a way to customise how their robot looks. This would need to be easily changeable by the blueshirts running the code, though (so they don't need to manually load individual texture files each time that robot competes). Not sure how we could do this.
Sometimes the robot loses parts of itself or manages to fall through things. We should fix this.
Currently the gripper behaviour with the SR wrapper isn't very logical. Preferably one motor would lift and another would open/close.
This should help the viewer understand what's happening, but there's definitely potential for it to just be annoying.
Echo has 2 sets of RGB LEDs. We could allow teams to control them through the Ruggeduino.
Currently the distance sensors seem to look over the wall into infinity
This would remove the warning you get when loading the arena world
Find a way to have multiple camera angles on the livestream
After the duration of a match, robots should automatically stop. We should have this!
Whether this is implemented as a pause of the simulation or by killing the robot code doesn't especially matter.
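The core of the match-end logic could be a timed loop like the sketch below. This is pure Python standing in for a Webots Supervisor loop; `run_match`, the 120-second duration, and the timestep are all placeholder assumptions:

```python
GAME_DURATION_SECONDS = 120  # placeholder; the real length comes from the rules


def run_match(step, duration=GAME_DURATION_SECONDS, timestep=0.125):
    """Drive the simulation until the match clock expires, then stop.

    `step` stands in for one tick of simulation/robot work; in Webots
    this loop would live in a Supervisor, which would then pause the
    simulation or kill the robot controllers on expiry.
    """
    elapsed = 0.0
    while elapsed < duration:
        step(elapsed)
        elapsed += timestep
    return elapsed  # robots are stopped once we fall out of the loop
```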
Figure out a way to distribute the simulator to teams
It needs to be super simple for them to create and test their code
Currently this is mountains; something more appropriate could be chosen.
Team CLY highlighted that the only cross-platform version of Python supported by Webots is 64-bit Python 3.7: https://cyberbotics.com/doc/guide/using-python#introduction
It would be great if we could support competitors continuing to use the IDE for their development if they want to.
For this, we need:
For the archive script: it needs to be executable with three positional command-line arguments and live inside a git repo.
The user code directory will be a folder called user
within a temporary directory that the script can use as it sees fit. All content of the folder (which may include "hidden", i.e. dot-prefixed, files) should be included in the archive (a .zip).
For context, though this should not be relied upon: the script will be checked out also within the temporary directory, as a sibling of the user
directory:
/tmp/tmp-ide-checkout # will actually be a randomised `mktmp` style name
├── ilbRobot
│ ├── ...
│ └── .git
└── user
├── robot.py
├── ...
└── .user-rev
The current directory when the script is run is not defined.
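The archiving step itself could look like the sketch below. This covers only zipping the `user` directory (including dot-prefixed files); `archive_user_code` is a hypothetical name, and the script's actual command-line interface is not assumed here:

```python
import zipfile
from pathlib import Path


def archive_user_code(user_dir, output_zip):
    """Zip everything under `user_dir` into `output_zip`.

    Paths are stored relative to `user_dir`. Note that pathlib's glob,
    unlike the glob module, does match dot-prefixed files, so hidden
    files such as .user-rev are included.
    """
    user_dir = Path(user_dir)
    with zipfile.ZipFile(output_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(user_dir.rglob("*")):
            if path.is_file():
                zf.write(path, path.relative_to(user_dir))
```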
At the moment many markers (tokens) can be seen even when they're actually well out of the camera's field of view off to one side, but still "square on".
In the current model, there's a concern that arena wall markers likely behave weirdly due to the size logic of a Token (which assumes they are cubes).
We should validate that, when looking sideways at a wall marker, the camera reports only a single face as visible (or none at all if it's seen only obliquely). It doesn't matter what "name" that face has.