
Code for the paper "Relax and Recover: Guaranteed Range-Only Continuous Localization"

License: MIT License


continuous-localization's Introduction

Relax and Recover: Guaranteed Range-Only Continuous Localization


Code for continuous localization based on sparse range measurements only, including recovery guarantees. If you use this code, please cite the paper:

@article{8978573,
  author={Pacholska, Michalina and D{\"u}mbgen, Frederike and Scholefield, Adam},
  journal={IEEE Robotics and Automation Letters}, 
  title={Relax and Recover: Guaranteed Range-Only Continuous Localization}, 
  year={2020},
  volume={5},
  number={2},
  pages={2248-2255},
  doi={10.1109/LRA.2020.2970952}
}

Authors

Created by (in alphabetic order):

  • Adam Scholefield
  • Frederike Duembgen
  • Michalina Pacholska

Installation

Get this repository using:

git clone https://github.com/LCAV/continuous-localization.git

You can install all standard Python requirements in (at least) two ways:

  1. using pip in your favourite Python 3 environment:
    pip install -r requirements.txt
    
  2. using conda, which will create a virtual environment for you:
    conda env create -f environment.yml
    

Note that for some plotting and saving functionalities, you need to have LaTeX installed.

Contents

Notebooks

  • GenerateAllFigures.ipynb: generates the figures used in the paper.
  • PublicDatasets.ipynb: evaluation of public datasets.

Scripts (scripts/)

  • generate_results: code to generate the results for different algorithms (see table in paper).

Other

  • bin/: scripts used for formatting and automatic testing of this repository.
  • datasets/: Lawnmower and WiFi datasets for range-only localization. See datasets/README.md for descriptions.

Contribute

If you want to contribute to this repository, please check CONTRIBUTE.md.

Documentation

To look at documentation locally, run

make html

and then open the file build/html/index.html in a browser.

License

Copyright (c) 2018 Frederike Duembgen, Michalina Pacholska, Adam Scholefield

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

continuous-localization's People

Contributors

adamscholefield, dependabot[bot], duembgen, micha7a


continuous-localization's Issues

Fix Travis tests

So, I think Travis tests are failing even though they shouldn't, but I don't know how to force run them, so I made a dummy pull request :P See #73 for the crash.

Investigate velocity sensitivity

In the trajectory reconstruction we assume constant velocity. The question is whether a small error in this constant velocity can have a high impact on the reconstruction accuracy.

NIT: mistake in test_exact_solution?

Not really important, to be checked later:
np.testing.assert_almost_equal should take an integer as its third parameter (decimal), but it is passed a small float eps, which means that the precision is 10**(-eps) ~= 1. Is it a bug or a feature?
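The effect can be checked directly; a minimal sketch (assuming only NumPy, the values are illustrative):

```python
import numpy as np

# With an integer `decimal`, the tolerance is 1.5 * 10**(-decimal),
# so this strict comparison passes only for a tiny difference:
np.testing.assert_almost_equal(1.0, 1.0 + 1e-8, decimal=7)  # passes

# Passing a small float eps as `decimal` gives 10**(-eps) ~= 1,
# i.e. a tolerance of roughly 1.5 -- values differing by 1 still "pass":
eps = 1e-6
np.testing.assert_almost_equal(1.0, 2.0, decimal=eps)  # passes, no error
```

So the float eps effectively disables the check, which suggests it is a bug rather than a feature.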

Simplify/refactor things

This is an issue for all things that we could do better, but we don't because it's research code ;)

  • simplify generating trajectory times from distances
  • get trajectory times from timestamps (instead of tango distances)
  • use DF more in reconstructing trajectory
  • get right trajectory type and model size without using specific trajectory
  • make pandas float32 or numpy float64
  • use DF for distance plots
  • clean up the mess I made in the reconstruction notebook
  • remove the .csv.npy files

Failing build: Missing datasets

It seems that the website with the datasets has been down for some time :/
I can take care of the build, but I need to know what options we have:

  • remove the tests that use the datasets
  • put datasets somewhere public (is it legal?)
  • put datasets somewhere non-public (is it possible to then give Travis access to the files? and does it even make sense from a reproducibility standpoint?)

Make simulation of one measurement per time instance

  • change sampling scheme
  • clean up the notebook (remove irrelevant plots)
  • make new comparisons with theory
  • see if there is a good way to calculate the limits
  • use sampling scheme in simulations that use solvers
  • rerun the simulations

Make more plots ;)

  • color distance plots according to the timestamp
  • check what the (original) sampling rate is
  • check how the reconstruction behaves in the presence of outliers
  • plot higher order polynomials and see if it seems natural

Identify key solvers

Find out which solvers we want to work with,
and whether they are in accordance with the paper.

Cut code that is not used anymore.

Change units to millimeters

Currently some functions take parameters in meters, and some in millimeters. This should be easy to fix; the only spot that requires thinking is the curvature discretization in get_left_and_right_arcs.

Code cleaning - TODO list

  • Approve pull request
  • Create release for paper
  • Put all plots into one notebook
  • Find out which functions are unused with some program
  • Clean documents / notebooks
  • Clear some unwanted folders etc.
  • Go through other issues and decide what to do with them
  • Go through "TODOs" in code and clear them / fix them.
  • Decide what to do with get_full_matrix and get_C_constraints
  • Delete algorithms which we do not use / need anymore

Can make the repo public from here on

  • Decide if we want to have a (basic) documentation
  • See if we want to add important images to GenerateAllFigures or README.
  • Good practice with imports of multiple modules at once
  • Add uniform headers for modules and notebooks

Investigate SRLS solution

  • See if there is a global solution for the trajectory coefficients if we formulate the problem as an SRLS problem.
  • If yes, implement it.

Submodules, PyCharm and __init__.py

So, I have this issue with PyCharm warning if files are imported locally. For example,

from common import test_prepare

is wrong, but

from test.common import test_prepare

is OK (for PyCharm).
But for Python to understand it, there has to be an __init__.py file in the test directory. I'd prefer to add the init file; it will also make it clear that the common file is specifically for tests. But I don't want to do this without a second opinion.

Generate robot "pre-trajectories"

The idea is to start the robot always at the same place, go to the starting point of the trajectory, then wait, and then do the trajectory, so that:

  • we can calibrate the distance using those "pre-trajectories"
  • we will then know the starting point and starting direction

Create trajectory generator

We want to be able to

  • generate some random trajectory (not in model)
  • get simulated measurements along the trajectory
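A minimal sketch of such a generator (the function name, the cubic-polynomial trajectory, and the noise model are illustrative assumptions, not the repository's API):

```python
import numpy as np

def simulate_measurements(anchors, n_times=50, sigma=0.01, seed=0):
    """Generate a random smooth 2D trajectory (a cubic polynomial in time,
    deliberately not restricted to any particular model) and noisy range
    measurements from each trajectory point to each anchor.

    anchors: (K, 2) anchor coordinates.
    Returns (n_times, 2) positions and (n_times, K) noisy distances.
    """
    rng = np.random.default_rng(seed)
    coeffs = rng.standard_normal((4, 2))                 # random cubic coefficients
    t = np.linspace(0.0, 1.0, n_times)
    basis = np.stack([t**k for k in range(4)], axis=1)   # (n_times, 4) monomial basis
    trajectory = basis @ coeffs                          # (n_times, 2)
    true_dist = np.linalg.norm(
        trajectory[:, None, :] - anchors[None, :, :], axis=2)
    measurements = true_dist + sigma * rng.standard_normal(true_dist.shape)
    return trajectory, measurements
```

The same sampling of ranges along the trajectory could then feed the solvers for comparison against ground truth.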

Fit translation/rotation of trajectory to distances

We forgot to measure (or did not consider important) the starting point and direction of the trajectory. Maybe we could reconstruct it by solving something like

$$R, T = \operatorname*{argmin}_{R, T} \sum_{n=1}^{N} \left\| D_n - \left\| (R P_n + T) - A \right\| \right\|_F$$

where A contains all anchor coordinates, R and T are the rotation matrix and translation vector, and D_n, P_n contain the distances and position at time n.

Probably this could be solved in a quick and dirty way with some scipy optimization function.
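A quick-and-dirty sketch along those lines, parametrizing R by a single 2D rotation angle and minimizing the squared distance residuals with scipy.optimize.minimize (the function name and setup are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

def fit_rotation_translation(P, A, D):
    """Fit a 2D rotation and translation so that distances from the
    transformed positions R @ P_n + T to the anchors A match D.

    P: (N, 2) estimated positions, A: (K, 2) anchors, D: (N, K) distances.
    Returns the rotation matrix R and translation vector T.
    """
    def rotation(theta):
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

    def cost(x):
        theta, tx, ty = x
        Q = P @ rotation(theta).T + np.array([tx, ty])   # transformed positions
        dists = np.linalg.norm(Q[:, None, :] - A[None, :, :], axis=2)
        return np.sum((D - dists) ** 2)                  # squared residuals

    res = minimize(cost, x0=np.zeros(3), method="Nelder-Mead")
    theta, tx, ty = res.x
    return rotation(theta), np.array([tx, ty])
```

Nelder-Mead avoids hand-coding gradients for this three-parameter problem, at the cost of only finding a local minimum; a reasonable initial guess for the angle may matter.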

Remove plotting_backup

The functions *_old from plotting_backup should be replaced with newer, cleaner functions. We need to make sure that we can still reproduce the plots from the paper afterwards.
