Integrate control and guidance algorithms with the rest of the Autobot Racing system.
Pass the output of the control systems to the vehicle transmitters.
Test the complete integrated system with one vehicle.
Test the complete integrated system with multiple vehicles.
Acceptance Criteria:
The full Autobot Racing system and the ControlSim simulator should use the same control/guidance algorithm codebase.
Given a single vehicle test, the vehicle should successfully navigate the track.
Given a multiple vehicle test, the vehicles should all successfully navigate the track, barring any collisions between vehicles due to a lack of overtaking capabilities in the guidance system.
Given the complete integrated system, the delay from computer vision vehicle identification to control system output must be negligible (small enough that it does not degrade control performance).
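One way to verify the latency criterion above is to time each vision-to-control cycle directly. The sketch below is a minimal illustration; `detect` and `control` are hypothetical stand-ins for the real vision and control stages, not the system's actual functions.

```python
import time

def timed_pipeline(frame, detect, control):
    """Run one vision -> control cycle and report its end-to-end latency.

    `detect` and `control` are hypothetical stand-ins for the real
    computer vision and control stages.
    """
    start = time.perf_counter()
    state = detect(frame)       # computer vision: locate/identify vehicles
    command = control(state)    # control system: produce steering output
    latency = time.perf_counter() - start
    return command, latency

# Example with trivial stand-in stages:
command, latency = timed_pipeline(
    frame=None,
    detect=lambda f: {"x": 1.0, "y": 2.0},
    control=lambda s: {"steer": 0.0},
)
```

Logging these per-cycle latencies during the single-vehicle and multi-vehicle tests would give concrete numbers to judge "negligible" against.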
Use the vehicle target to determine the vehicle’s heading.
Pass vehicle headings through a Kalman filter. This filter will first be implemented in another user story.
Store a set number of previous vehicle headings, for use with later control systems.
Test the system using a variety of vehicles at different locations on the track and pointed at different headings.
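The heading-from-target task above could be sketched as follows, assuming the vehicle target provides two distinguishable points (rear and front); the two-point geometry is an assumption, and the real marker layout may differ.

```python
import math

def heading_from_target(rear, front):
    """Estimate vehicle heading (radians, counter-clockwise from the +x
    image axis) from the rear and front points of its target marker.

    `rear` and `front` are (x, y) tuples; a two-point target is an
    assumption here -- substitute the actual marker geometry.
    """
    dx = front[0] - rear[0]
    dy = front[1] - rear[1]
    return math.atan2(dy, dx)
```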
Acceptance Criteria:
Given a single vehicle in view of the camera, its heading is accurately recorded.
Given multiple vehicles in view of the camera, the heading of each is accurately recorded.
Given a moving vehicle, its heading is updated with a high degree of accuracy.
Given the computer vision system and a standard web-camera, a high image sampling rate can be achieved, allowing for a high-fidelity recording of the vehicle heading over time.
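The Kalman filter referenced in the tasks above belongs to another user story, but a minimal scalar version of it could look like the sketch below. The noise parameters are illustrative, not tuned, and the sketch ignores the fact that headings wrap at ±π.

```python
class ScalarKalman:
    """Minimal one-dimensional Kalman filter for smoothing noisy heading
    measurements. Parameter values are illustrative placeholders; the
    production filter is a separate user story. Heading wraparound at
    +/-pi is not handled here.
    """
    def __init__(self, q=1e-3, r=1e-1, x0=0.0, p0=1.0):
        self.q, self.r = q, r      # process / measurement noise variances
        self.x, self.p = x0, p0    # state estimate and its variance

    def update(self, z):
        self.p += self.q                  # predict (constant-heading model)
        k = self.p / (self.p + self.r)    # Kalman gain
        self.x += k * (z - self.x)        # correct with measurement z
        self.p *= (1.0 - k)
        return self.x
```

Feeding a stream of noisy heading measurements through `update` yields a smoothed estimate suitable for the stored-headings history.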
Perform testing in different lighting environments to determine computer vision failure modes.
Make adjustments to CV tuning parameters to compensate for different lighting conditions.
Research methods to automatically adjust CV parameter tuning based on environment.
Test adjustments to ensure that the computer vision system works reliably in environments commonly found in the Lawson Computer Science Building.
Acceptance Criteria:
Given multiple reasonable lighting environments (in LWSN), the computer vision system is robust enough to function in any of them.
Given a proper lighting environment, the computer vision system is able to capture and track moving vehicles in all locations on the track.
Given a proper lighting environment, the computer vision system tracks moving vehicles with minimal “drops”, or brief periods of time when vehicles cannot be detected.
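One simple form of automatic adjustment for lighting conditions is to normalize each frame's overall brightness before detection. The sketch below is a crude stand-in for the parameter-tuning research task above, operating on a grayscale image represented as nested lists; real tuning would likely also adapt colour thresholds.

```python
def normalize_brightness(gray, target_mean=128.0):
    """Rescale a grayscale image (list of rows of 0-255 ints) so its mean
    brightness hits `target_mean`. A crude illustration of automatic
    lighting compensation, not the project's actual CV tuning.
    """
    pixels = [p for row in gray for p in row]
    mean = sum(pixels) / len(pixels)
    gain = target_mean / mean if mean > 0 else 1.0
    return [[min(255, round(p * gain)) for p in row] for row in gray]
```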
Have the computer vision system poll the camera at a set frequency and determine the location of each vehicle.
Pass vehicle locations through a Kalman filter. This filter will first be implemented in another user story.
Implement a function to determine if a vehicle is within the boundaries of the track.
Store a set number of previous vehicle positions, for use with later control systems.
Determine a system response for when a vehicle cannot be located.
Test the system using a variety of vehicles at different locations on the track.
Test the system response when a vehicle cannot be found.
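Two of the tasks above can be sketched briefly: the track-boundary check as a standard ray-casting point-in-polygon test (treating the track outline as a polygon is an assumption; an annular track would need two such tests), and the fixed-length position history as a bounded deque.

```python
from collections import deque

def point_in_track(point, boundary):
    """Ray-casting test for whether `point` (x, y) lies inside the closed
    polygon `boundary` (a list of (x, y) vertices)."""
    x, y = point
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        if (y1 > y) != (y2 > y):           # edge straddles the scan line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Fixed-length history of previous vehicle positions, as in the task above;
# the oldest entries fall off automatically once maxlen is reached.
history = deque(maxlen=10)
```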
Acceptance Criteria:
Given a single vehicle in view of the camera, its position is accurately recorded.
Given multiple vehicles in view of the camera, the position of each is accurately recorded.
Given a moving vehicle, its position is updated with a high degree of accuracy.
Given the computer vision system and a standard web-camera, a high image sampling rate can be achieved, allowing for a high-fidelity recording of the vehicle position over time.
Create a new version of the wall-following guidance system that allows for variable distance from the wall.
Determine the distance along the track from one point to another (not the straight-line, or Euclidean, distance).
Increase or decrease the vehicle’s distance from the wall when it is within a set track distance of the next car ahead of it and is moving faster than that car.
Test the system using the ControlSim application developed in Sprint 1.
Test the system using the real vehicles and track, if possible.
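The track-distance task above (distance along the track, not straight-line) can be sketched as a cumulative arc length over centreline waypoints. Representing the track as a closed loop of (x, y) waypoints is an assumption about how the track is modelled.

```python
import math

def track_distance(waypoints, i, j):
    """Distance along the track from waypoint index i to waypoint index j,
    following the closed loop in the direction of travel. `waypoints` is
    an assumed (x, y) sampling of the track centreline; this is the
    along-track distance, not the Euclidean distance between the points.
    """
    n = len(waypoints)
    dist = 0.0
    k = i
    while k != j:
        x1, y1 = waypoints[k]
        x2, y2 = waypoints[(k + 1) % n]
        dist += math.hypot(x2 - x1, y2 - y1)
        k = (k + 1) % n
    return dist
```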
Acceptance Criteria:
Given an overtake situation where vehicle A is gaining on vehicle B, the distance from the wall of vehicle A is appropriately adjusted for passing.
Given an overtake situation when the slower vehicle is on the inside of the track, the overtaking vehicle’s distance from the wall will be increased.
Given an overtake situation when the slower vehicle is on the outside of the track, the overtaking vehicle’s distance from the wall will be decreased.
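The acceptance criteria above reduce to a small decision rule, sketched below. `trigger_gap` and `delta` are illustrative tuning constants, not values from the design; the real guidance system would derive them from track geometry and vehicle speed.

```python
def adjusted_wall_distance(base_offset, gap, closing_speed,
                           slower_car_inside, trigger_gap=0.5, delta=0.2):
    """Adjust the overtaking vehicle's wall-following offset.

    `gap` is the track distance to the car ahead and `closing_speed` is
    how much faster this vehicle is moving. Per the acceptance criteria:
    if the slower car holds the inside, pass outside (larger offset);
    if it holds the outside, pass inside (smaller offset).
    `trigger_gap` and `delta` are illustrative constants.
    """
    if gap < trigger_gap and closing_speed > 0:
        if slower_car_inside:
            return base_offset + delta   # move toward the outside
        return base_offset - delta       # move toward the inside
    return base_offset                   # no overtake in progress
```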
Write functions to connect the currently existing buttons.
Write a start function to start a new race.
Write a stop function to permanently stop a race.
Write a pause function to temporarily stop a race.
Write a resume function to resume a paused race.
Test the start/stop/pause/resume functionality and buttons under various race conditions.
Acceptance Criteria:
Clicking the “Start Race” button must start the race, causing cars to begin to move.
Clicking the “Pause Race” button should temporarily stop a race. This stops the cars, which should resume from their current positions once the resume button is clicked.
Clicking the “Resume Race” button should restart a previously paused race. Cars should continue from their paused positions, and the race continues as normal.
Clicking the “Stop Race” button must immediately halt the race, stopping the cars. This ends the race permanently, and cars must be reset before a new race can begin.
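The start/pause/resume/stop behaviour above is naturally a small state machine; the sketch below shows one way the button callbacks could enforce it. State and method names are illustrative, and wiring to the actual GUI buttons is left to the existing button code.

```python
class RaceController:
    """Minimal state machine behind the Start/Pause/Resume/Stop buttons.
    Names are illustrative; the real GUI callbacks would delegate here.
    """
    def __init__(self):
        self.state = "idle"

    def start(self):
        if self.state != "idle":
            raise RuntimeError("race already started")
        self.state = "running"          # cars begin to move

    def pause(self):
        if self.state == "running":
            self.state = "paused"       # cars halt, positions kept

    def resume(self):
        if self.state == "paused":
            self.state = "running"      # continue from paused positions

    def stop(self):
        # Permanent: a stopped race cannot be resumed, only reset.
        self.state = "stopped"
```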
Create APIs for use by researchers to allow the use of researcher-designed control systems.
Test the associated APIs.
Acceptance Criteria:
An API is created which allows for multiple control systems to be used simultaneously, each on a different vehicle.
The framework provides an interface to the camera to allow for computer vision.
Given a certain specification for the structure of a control system, a researcher can create a control system that will interact successfully with the framework.
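The researcher-facing API could take the shape sketched below: an abstract control-system interface plus per-vehicle registration, so multiple control systems run simultaneously on different vehicles. All names and the (steering, throttle) command format are assumptions, since the actual specification is still to be written.

```python
from abc import ABC, abstractmethod

class ControlSystem(ABC):
    """Hypothetical interface a researcher-designed control system would
    implement; the real API specification is still to be defined."""
    @abstractmethod
    def compute(self, position, heading):
        """Return a (steering, throttle) command for one vehicle."""

class Framework:
    """Sketch of per-vehicle controller registration, allowing a
    different control system on each vehicle simultaneously."""
    def __init__(self):
        self.controllers = {}

    def register(self, vehicle_id, controller):
        self.controllers[vehicle_id] = controller

    def step(self, states):
        # states: {vehicle_id: (position, heading)} from computer vision
        return {vid: self.controllers[vid].compute(pos, hdg)
                for vid, (pos, hdg) in states.items()}
```

A researcher would subclass `ControlSystem`, register it for a vehicle, and the framework would call `compute` once per vision update.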
Assigned To: Harold Smith + Ben Huemann
Workload: 16 hours