bazilinskyy / coupled-sim
Coupled simulator for research on driver-pedestrian interactions made in Unity3D.
Home Page: https://bazilinskyy.github.io/research/#multi-user-communication-in-traffic
At the moment, when an experiment is running, the buttons from the UI stay on the screen. It would be nice to make them optional.
Framework for adding various (real-life) signs around the environment. As a start, we should add some basic signs that are used in The Netherlands (https://en.wikipedia.org/wiki/Road_signs_in_the_Netherlands).
At the moment, cars enabled for driving are quite unrealistic. We need to make the dashboard more realistic.
Add the sound of the engine to both cars controlled by humans and computer-controlled cars. Preferably with a free asset.
Possibility to control a bicycle as a participant and have bots that are cyclists.
When using the steering wheel as an input.
Extend logged data with these parameters.
With the addition of bigger roads (#12), it will be beneficial to add support for controlling traffic.
One of the houses has a texture of some beer company (next to the spawn location). We need to replace this texture with a more neutral image.
To avoid shaking due to 2 sensors for head rotation.
At the moment, the Xsens avatar (or MVN puppet in Xsens terminology) is a default model offered by Xsens, which does not look realistic. We should investigate the possibility of adding textures and a somewhat realistic-looking model of a human on top.
Importing the OpenDRIVE format to build the road.
We fixed the rapid fluctuation of the speed value on the speedometer by reducing the update rate. Now, the displayed speed changes in big jumps. We should make it smoother.
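A minimal sketch of one way to smooth the readout, assuming a hypothetical `SmoothSpeedometer` component, a `Rigidbody` as the speed source, and a UI `Text` label (none of these names are taken from the project):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical component: smooths the displayed speed instead of letting
// it jump whenever the (reduced-rate) speedometer value is refreshed.
public class SmoothSpeedometer : MonoBehaviour
{
    public Rigidbody carBody;       // assumed source of the true speed
    public Text speedLabel;         // assumed dashboard UI label
    public float smoothTime = 0.3f; // seconds to converge to the target

    private float displayedKph;
    private float velocity;         // internal state for SmoothDamp

    void Update()
    {
        float targetKph = carBody.velocity.magnitude * 3.6f; // m/s -> km/h
        // Approach the target gradually so the readout does not jump.
        displayedKph = Mathf.SmoothDamp(displayedKph, targetKph, ref velocity, smoothTime);
        speedLabel.text = Mathf.RoundToInt(displayedKph).ToString();
    }
}
```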
The car model has some issues:
Finalise documentation as README.md. Possibly move to https://github.com/bazilinskyy/coupled-sim/wiki later.
Add a list of used assets to the README, including information on their copyrights.
At the moment, the world has only 2-lane roads (1 in each direction). It would be nice for future experiments to also have a bigger road, possibly including a highway as a ring road around the current city.
Add some free collection of traffic signs to be placed around the world as static objects.
Enabling cars to play sounds at certain timestamps/locations in the world or based on events in the world.
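A minimal sketch of the location-based variant, assuming a hypothetical `CarSoundTrigger` component and a `SoundZone` tag on trigger colliders placed in the world:

```csharp
using UnityEngine;

// Hypothetical component: plays a clip on the car when it enters a trigger
// zone placed in the world. A timestamp-based variant could instead compare
// Time.time against a configured schedule, and an event-based variant could
// subscribe to game events.
[RequireComponent(typeof(AudioSource))]
public class CarSoundTrigger : MonoBehaviour
{
    public string zoneTag = "SoundZone"; // assumed tag on zone colliders

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(zoneTag))
            GetComponent<AudioSource>().Play();
    }
}
```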
A vehicle model with an interior that is designed for left-sided traffic.
The manually driven car shakes when following waypoints (WPs). Fixed for now by elevating the car over the ground. A possible cause is inconsistencies in the road surface.
Brainstorm a catchy name.
Related to #12. If we implement the highway, we would need a more realistic physics engine for higher speeds.
When in manual driving mode, the hands of the person are not moving with the wheel. Need to bind the hands on the steering wheel to the steering wheel angle.
In the keyboard mode, the camera cannot be moved. Add movement of the camera with the mouse in addition to the WASD controls.
It would be beneficial to know at least the average latency for the duration of the experiment.
Ability to have multiple cameras looking from different angles for the screen output. For example, one screen could show the side mirror view. This should come with an interface to configure the views.
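A minimal sketch of one way to add a secondary view, assuming a hypothetical `MirrorViewSetup` component and a camera parented to the side mirror; a full solution would expose these values through a configuration interface:

```csharp
using UnityEngine;

// Hypothetical component: shows the output of a secondary "side mirror"
// camera in a corner of the main screen by adjusting its viewport rectangle.
public class MirrorViewSetup : MonoBehaviour
{
    public Camera mirrorCamera; // assumed camera parented to the side mirror

    void Start()
    {
        mirrorCamera.depth = 1; // draw on top of the main camera
        // Occupy the top-left corner: 25% of the width, 20% of the height.
        mirrorCamera.rect = new Rect(0.05f, 0.75f, 0.25f, 0.2f);
    }
}
```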
Add documentation to new files in https://github.com/bazilinskyy/coupled-sim/tree/master/Assets/Scripts
Add 2nd world from https://assetstore.unity.com/packages/3d/environments/roadways/windridge-city-132222. Check licensing of the world.
When we were working on the demo (https://www.youtube.com/watch?v=W2VWLYnTYrM), an experimenter had to run around quite a lot to launch everything. It will be more efficient to have a centralised host with a dashboard to launch an experiment and also a top-down view with locations of all moving objects to monitor the status of the experiment.
We can already run one client as the host. It should not be too much work to add a client role that shows a top-down view with bright rectangles indicating the current location of each agent.
Also add a "Start all" button that starts the experiment, a "Start data logging" button, and a "Visually sync" button to synchronise across all agents.
It would be important to be able to control the cars with a controller. Currently, the automated car seems to have a clunky controller (not a smooth PID controller) and sticks to 30 kph. It would be good to have the freedom to adjust, e.g., the speed of the car based on the speed of another car, or to implement an automation-to-manual takeover. Most of this is already possible, but it would be good to have an easy-to-use framework to control the behaviour of the automated car. Goes together with #30.
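A minimal sketch of a PID-based speed controller, assuming a hypothetical `PidSpeedController` component; the gains and the `ApplyThrottle` hook are placeholders, not existing project code:

```csharp
using UnityEngine;

// Hypothetical component: a simple PID controller mapping the error between
// a target speed and the car's current speed to a throttle command.
public class PidSpeedController : MonoBehaviour
{
    public Rigidbody carBody;
    public float targetSpeedKph = 30f; // could be set from another car's speed
    public float kp = 0.5f, ki = 0.05f, kd = 0.1f;

    private float integral;
    private float previousError;

    void FixedUpdate()
    {
        float currentKph = carBody.velocity.magnitude * 3.6f;
        float error = targetSpeedKph - currentKph;

        integral += error * Time.fixedDeltaTime;
        float derivative = (error - previousError) / Time.fixedDeltaTime;
        previousError = error;

        float throttle = Mathf.Clamp(kp * error + ki * integral + kd * derivative, -1f, 1f);
        ApplyThrottle(throttle);
    }

    // Placeholder: in the project this would feed the existing car controller.
    void ApplyThrottle(float throttle)
    {
        carBody.AddForce(transform.forward * throttle * 2000f);
    }
}
```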
May not be easy to implement something realistic in Unity. But should be beneficial for research on eHMI.
Related to #19. Reusing results of that issue can help here.
Add support for different weather conditions, such as rain, fog, snow.
Allow for a scripted path, e.g., uploading a CSV file with (x;y;speed) values that the car then follows.
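A minimal sketch of loading such a file, assuming a hypothetical `ScriptedPathLoader` component and a semicolon-separated file with one `x;y;speed` row per waypoint:

```csharp
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using UnityEngine;

// Hypothetical component: reads a semicolon-separated file with x;y;speed
// rows and exposes the resulting waypoint list to a path-following controller.
public class ScriptedPathLoader : MonoBehaviour
{
    public struct Waypoint
    {
        public Vector3 position;
        public float speedKph;
    }

    public string csvPath = "path.csv"; // assumed file name
    public List<Waypoint> Waypoints { get; private set; } = new List<Waypoint>();

    void Awake()
    {
        foreach (string line in File.ReadAllLines(csvPath))
        {
            string[] parts = line.Split(';');
            if (parts.Length < 3) continue; // skip headers and blank lines

            Waypoints.Add(new Waypoint
            {
                // Only x and y are given, so the height is kept at 0 here.
                position = new Vector3(
                    float.Parse(parts[0], CultureInfo.InvariantCulture),
                    0f,
                    float.Parse(parts[1], CultureInfo.InvariantCulture)),
                speedKph = float.Parse(parts[2], CultureInfo.InvariantCulture)
            });
        }
    }
}
```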
When pressing S on the keyboard or the assigned button on the steering wheel, the car does not reverse.
Likely device is https://vr.tobii.com/products/htc-vive-pro-eye/
The avatar (or puppet in Xsens terminology) inside an automated vehicle should be able to maintain eye contact with any (moving) point in the world (e.g., the camera/eyes of a pedestrian). Such functionality should be optional. Following the completion of #25, the head of the puppet with textures should also be able to maintain eye contact.
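A minimal sketch of the head-tracking part, assuming a hypothetical `EyeContactController` component with direct references to the head bone and the target; a full solution could instead use the Animator's look-at IK so the body animation is preserved:

```csharp
using UnityEngine;

// Hypothetical component: turns the puppet's head bone toward a moving
// target each frame. A fuller solution could use Animator.SetLookAtPosition
// inside OnAnimatorIK so the rest of the body animation is preserved.
public class EyeContactController : MonoBehaviour
{
    public Transform headBone;              // assumed reference to the puppet's head
    public Transform target;                // e.g. the pedestrian's camera/eyes
    public bool maintainEyeContact = true;  // the functionality stays optional
    public float turnSpeed = 5f;

    void LateUpdate()
    {
        if (!maintainEyeContact || target == null) return;
        // Smoothly rotate the head toward the target point.
        Quaternion desired = Quaternion.LookRotation(target.position - headBone.position);
        headBone.rotation = Quaternion.Slerp(headBone.rotation, desired, turnSpeed * Time.deltaTime);
    }
}
```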
Save file with config vars for each client before the experiment to be able to check the status of each client after the experiment.
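A minimal sketch of saving such a file, assuming a hypothetical `ClientConfig` class with illustrative fields; the actual set of configuration variables would come from the project:

```csharp
using System.IO;
using UnityEngine;

// Hypothetical sketch: serialises a small set of per-client settings to a
// JSON file before the experiment starts; the field names are illustrative.
[System.Serializable]
public class ClientConfig
{
    public string clientRole;   // e.g. "driver", "pedestrian", "host"
    public string inputDevice;  // e.g. "keyboard", "steering wheel"
    public int participantId;
}

public static class ClientConfigLogger
{
    public static void Save(ClientConfig config, string fileName = "client_config.json")
    {
        string path = Path.Combine(Application.persistentDataPath, fileName);
        File.WriteAllText(path, JsonUtility.ToJson(config, prettyPrint: true));
    }
}
```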
Add sufficiently realistic computer-controlled pedestrians.
To be able to export the geometry of the city's road network for post-processing.
Based on the results of #14. Once we have the eye tracker integrated, it would be nice to have the ability to render eye contact in some way. For example, by rendering a ray between the eyes of the pedestrian and the driver. Another option would be red rectangles at the level of the eyes, once the eye contact is established. The best option would be to have a realistic movement of the eyes of the puppets, but that would be rather tricky.
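A minimal sketch of the ray option, assuming a hypothetical `EyeContactRay` component driven by the eye-tracking logic:

```csharp
using UnityEngine;

// Hypothetical component: draws a line between the driver's and the
// pedestrian's eye positions while eye contact is flagged as established.
[RequireComponent(typeof(LineRenderer))]
public class EyeContactRay : MonoBehaviour
{
    public Transform driverEyes;
    public Transform pedestrianEyes;
    public bool eyeContactEstablished; // would be set by the eye-tracking logic

    void Update()
    {
        var line = GetComponent<LineRenderer>();
        line.enabled = eyeContactEstablished;
        if (!eyeContactEstablished) return;

        line.positionCount = 2;
        line.SetPosition(0, driverEyes.position);
        line.SetPosition(1, pedestrianEyes.position);
    }
}
```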
Currently, neither the speed of the car nor its lateral position within the lane is recorded.
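A minimal sketch of logging these two values, assuming a hypothetical `CarStateLogger` component and a `laneCentre` transform placed on the lane centre line and oriented along the lane:

```csharp
using System.IO;
using UnityEngine;

// Hypothetical component: appends the car's speed and its signed lateral
// offset from a lane-centre reference to a CSV log each physics step.
// A real implementation would buffer writes instead of appending every step.
public class CarStateLogger : MonoBehaviour
{
    public Rigidbody carBody;
    public Transform laneCentre; // assumed: on the centre line, facing along the lane
    public string logPath = "car_log.csv";

    void FixedUpdate()
    {
        float speedKph = carBody.velocity.magnitude * 3.6f;
        // Project the offset from the lane centre onto the lane's right axis.
        Vector3 offset = carBody.position - laneCentre.position;
        float lateralOffset = Vector3.Dot(offset, laneCentre.right);

        File.AppendAllText(logPath, $"{Time.time};{speedKph};{lateralOffset}\n");
    }
}
```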