
This project forked from ethz-asl/hfnet

From Coarse to Fine: Robust Hierarchical Localization at Large Scale with HF-Net (https://arxiv.org/abs/1812.03506)

License: MIT License

HF-Net: Robust Hierarchical Localization at Large Scale

This repository accompanies our CVPR 2019 paper From Coarse to Fine: Robust Hierarchical Localization at Large Scale. We introduce a 6-DoF visual localization method that is accurate, scalable, and efficient, using HF-Net, a monolithic deep neural network for descriptor extraction. The proposed solution achieves state-of-the-art accuracy on several large-scale public benchmarks while running in real time.

Using this codebase, the proposed approach won the visual localization challenge of the CVPR 2019 workshop on Long-Term Visual Localization. We also provide trained weights for HF-Net and reconstructed SfM 3D models.


Our method is significantly more robust, accurate, and scalable than standard approaches based on direct matching.

This code allows you to:

  • Perform state-of-the-art 6-DoF hierarchical localization using a flexible Python pipeline
  • Train HF-Net with multi-task distillation in TensorFlow
  • Evaluate feature detectors and descriptors on standard benchmarks
  • Build Structure-from-Motion models based on state-of-the-art learned features
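
For intuition, the coarse step of hierarchical localization amounts to ranking database images by global-descriptor similarity. The following is a minimal numpy sketch on made-up data, not the pipeline's actual implementation:

```python
import numpy as np

def retrieve_candidates(query_desc, db_descs, num_candidates=10):
    # L2-normalize so the dot product equals cosine similarity.
    q = query_desc / np.linalg.norm(query_desc)
    db = db_descs / np.linalg.norm(db_descs, axis=1, keepdims=True)
    scores = db @ q
    # Indices of the most similar database images, best first.
    return np.argsort(-scores)[:num_candidates]

# Toy example: the query descriptor is identical to database entry 3.
rng = np.random.default_rng(0)
db = rng.standard_normal((5, 8))
candidates = retrieve_candidates(db[3], db, num_candidates=2)
print(candidates[0])  # -> 3
```

The top-ranked candidate images are then passed on to local feature matching and pose estimation.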

Setup

Python 3.6 is required, and it is advised to run the following command within a virtual environment. By default, the GPU build of TensorFlow 1.12 will be installed. You will be prompted for the path to a data folder (subsequently referred to as $DATA_PATH) containing the datasets and pre-trained models, and for the path to an experiment folder ($EXPER_PATH) containing the trained models, training and evaluation logs, and CNN predictions. Create both folders wherever you wish and make sure to provide absolute paths. PyTorch 0.4.1 is also required to run the original SuperPoint and to perform GPU-accelerated feature matching.

make install  # install Python requirements, setup paths

Refer to our dataset documentation for an overview of the supported datasets and their expected directory structure.

Demo

We provide a minimal example of inference and localization with HF-Net in demo.ipynb. Download the trained model here and unpack it into $EXPER_PATH/saved_models/.


HF-Net simultaneously computes global descriptors and local features with an efficient architecture.
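
The fine step then matches the query's local descriptors against those of the retrieved images. A common strategy, sketched here with numpy on synthetic data (the repository's matcher may differ, e.g. by adding a ratio test), is mutual nearest-neighbor matching:

```python
import numpy as np

def mutual_nn_match(desc_a, desc_b):
    # Similarity of L2-normalized descriptors is their dot product.
    sim = desc_a @ desc_b.T
    nn_ab = sim.argmax(axis=1)      # best match in b for each a
    nn_ba = sim.argmax(axis=0)      # best match in a for each b
    ids = np.arange(len(desc_a))
    mutual = nn_ba[nn_ab] == ids    # keep only mutual nearest neighbors
    return np.stack([ids[mutual], nn_ab[mutual]], axis=1)

# Toy example: desc_b holds the same descriptors as desc_a, shuffled.
rng = np.random.default_rng(0)
desc_a = rng.standard_normal((6, 32))
desc_a /= np.linalg.norm(desc_a, axis=1, keepdims=True)
perm = rng.permutation(6)
desc_b = desc_a[perm]
matches = mutual_nn_match(desc_a, desc_b)  # recovers the permutation
```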

6-DoF Localization

We provide code to perform and evaluate our hierarchical localization on the three challenging benchmark datasets of Sattler et al.: Aachen Day-Night, RobotCar Seasons, and CMU Seasons.

Required assets

Download the datasets as indicated in the dataset documentation. SfM models of Aachen, RobotCar, CMU, and Extended CMU, built with SuperPoint and usable with HF-Net, are provided here. Download and unpack the HF-Net weights in $EXPER_PATH/hfnet/. To localize with NV+SP, download the network weights of NetVLAD and SuperPoint and put them in $DATA_PATH/weights/.

Exporting the predictions

We first export the local features and global descriptors for all database and query images as .npz files. For the sake of flexibility, local descriptors are exported as dense maps for database images, but as sparse samples for query images.

For HF-Net or SuperPoint:

python3 hfnet/export_predictions.py \
	hfnet/configs/[hfnet|superpoint]_export_[aachen|cmu|robotcar]_db.yaml \
	[superpoint/][aachen|cmu|robotcar] \
	[--exper_name hfnet] \ # for HF-Net only
	--keys keypoints,scores,local_descriptor_map[,global_descriptor]
python3 hfnet/export_predictions.py \
	hfnet/configs/[hfnet|superpoint]_export_[aachen|cmu|robotcar]_queries.yaml \
	[superpoint/][aachen|cmu|robotcar] \
	[--exper_name hfnet] \ # for HF-Net only
	--keys keypoints,scores,local_descriptors[,global_descriptor]

For NetVLAD:

python3 hfnet/export_predictions.py \
	hfnet/configs/netvlad_export_[aachen|cmu|robotcar].yaml \
	netvlad/[aachen|cmu|robotcar] \
	--keys global_descriptor
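
Since database images store dense descriptor maps, a descriptor for any keypoint can be recovered by sampling the map at the downscaled keypoint location. A rough numpy sketch, assuming a stride-8 map and nearest-cell sampling (the actual code may instead use bilinear interpolation):

```python
import numpy as np

def sample_descriptors(keypoints, desc_map, stride=8):
    # keypoints are (x, y) pixel coordinates; the dense map has one cell
    # per `stride` pixels, so map each keypoint to its nearest cell.
    cells = np.round(np.asarray(keypoints) / stride).astype(int)
    x = np.clip(cells[:, 0], 0, desc_map.shape[1] - 1)
    y = np.clip(cells[:, 1], 0, desc_map.shape[0] - 1)
    descs = desc_map[y, x]
    # L2-normalize each sampled descriptor.
    return descs / np.linalg.norm(descs, axis=1, keepdims=True)

desc_map = np.random.rand(60, 80, 256)   # e.g. a 480x640 image at stride 8
keypoints = np.array([[12.0, 40.0], [630.0, 5.0]])
descs = sample_descriptors(keypoints, desc_map)
print(descs.shape)  # (2, 256)
```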

Localization

For Aachen:

python3 hfnet/evaluate_aachen.py \
	<sfm_model_name_or_path> \
	<eval_name>_[night|day] \
	--local_method [hfnet|superpoint|sift] \
	--global_method [hfnet|netvlad] \
	--build_db \
	--queries [night_time|day_time] \
	--export_poses

For RobotCar:

python3 hfnet/evaluate_robotcar.py \
	<sfm_model_name_or_path> \
	<eval_name> \
	--local_method [hfnet|superpoint|sift] \
	--global_method [hfnet|netvlad] \
	--build_db \
	--queries [dusk|sun|night|night-rain] \
	--export_poses

For CMU:

python3 hfnet/evaluate_cmu.py \
	<sfm_model_name_or_path> \
	<eval_name> \
	--local_method [hfnet|superpoint|sift] \
	--global_method [hfnet|netvlad] \
	--build_db \
	--slice [2|3|4|5|6|7|8|9|10|17] \
	--export_poses

The localization parameters can be adjusted in hfnet/evaluate_[aachen|robotcar|cmu].py. The evaluation logs and estimated poses are written to $EXPER_PATH/eval/[aachen|robotcar|cmu]/<eval_name>*. Of particular interest are the PnP+RANSAC success rate, the average number of inliers per query, and the average inlier ratio.
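
For intuition, the inlier statistics measure how many 2D-3D correspondences are consistent with the estimated pose under a reprojection-error threshold. A toy numpy sketch (the threshold, camera intrinsics, and data here are made up, not the evaluation scripts' actual values):

```python
import numpy as np

def inlier_stats(points_3d, points_2d, R, t, K, thresh=8.0):
    # Reproject the 3D points with the estimated pose (R, t) and
    # compare against the observed 2D keypoints.
    proj = (points_3d @ R.T + t) @ K.T
    proj = proj[:, :2] / proj[:, 2:]
    errors = np.linalg.norm(proj - points_2d, axis=1)
    inliers = errors < thresh
    return int(inliers.sum()), float(inliers.mean())

# Sanity check: exact projections under the identity pose are all inliers.
rng = np.random.default_rng(0)
K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
pts3d = rng.uniform(-1., 1., (50, 3)) + np.array([0., 0., 5.])
proj = pts3d @ K.T
pts2d = proj[:, :2] / proj[:, 2:]
num, ratio = inlier_stats(pts3d, pts2d, np.eye(3), np.zeros(3), K)
print(num, ratio)  # -> 50 1.0
```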

Visualization

Successful and failed queries can be visualized in notebooks/visualize_localization_[aachen|robotcar|cmu].ipynb.

Training with multi-task distillation

Instructions to train HF-Net are provided in the training documentation.

Evaluation of local features

Instructions to evaluate feature detectors and descriptors on the HPatches and SfM datasets are provided in the local evaluation documentation.

Building new SfM models

Instructions and scripts to build SfM models using COLMAP for any learned features are provided in colmap-helpers.

Citation

Please consider citing the corresponding publication if you use this work in an academic context:

@inproceedings{sarlin2019coarse,
  title={From Coarse to Fine: Robust Hierarchical Localization at Large Scale},
  author={Sarlin, P.-E. and Cadena, C. and Siegwart, R. and Dymczyk, M.},
  booktitle={CVPR},
  year={2019}
}

Contributors

dymczykm, sarlinpe, skydes
