
Project: germain-hug/S2DHM

Repository: https://github.com/germain-hug/S2DHM

Language: Python (99.9%)

Sparse-To-Dense Hypercolumn Matching for Long-Term Visual Localization

This is the official repository for the 3DV 2019 paper Sparse-To-Dense Hypercolumn Matching for Long-Term Visual Localization. We propose a novel approach to feature point matching, suitable for robust and accurate outdoor visual localization in long-term scenarios. The proposed solution achieves state-of-the-art accuracy on several outdoor datasets, in challenging categories such as day-to-night and cross-seasonal changes.

Using this codebase, the proposed approach ranked 2nd in the CVPR 2019 challenge on Long-Term Visual Localization, with state-of-the-art results on nighttime and rural environments.


Inlier visualization for (from left to right) SuperPoint sparse-to-sparse matching, SuperPoint detection with hypercolumn sparse-to-sparse matching, and sparse-to-dense hypercolumn matching.

Installation

Run the following commands to install this repository and the required dependencies:

git clone https://github.com/germain-hug/S2DHM.git
cd S2DHM/
git submodule update --init --recursive
pip3 install -r requirements.txt
mkdir -p data/triangulation

This code was run and tested with Python 3.7.3 and PyTorch 1.0.1.post2, although it should be compatible with some earlier versions. You can follow the official instructions to install PyTorch.
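To check which PyTorch version is actually installed in your environment, you can run:

python3 -c "import torch; print(torch.__version__)"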

Required assets

To run this code, you will first need to download either RobotCar-Seasons or Extended CMU-Seasons from the website of the CVPR 2019 challenge. Once the dataset is unpacked, update its root path accordingly in s2dm/configs/datasets/<your_dataset>.gin.
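For example, the dataset .gin file typically contains a binding of the following form; the class and parameter names below are purely illustrative, so check the actual file for the exact binding to edit:

# Hypothetical binding -- the real class/parameter names may differ:
RobotCarDataset.root = '/path/to/RobotCar-Seasons/'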

In addition, we provide pre-computed reconstructions of both datasets, obtained with SuperPoint. These triangulations were generated using scripts borrowed from HF-Net; please refer to their repository for more details. The triangulation .npz files can be downloaded from this link and should be placed under data/triangulation/.
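To quickly sanity-check a downloaded triangulation file, you can list the arrays it contains (the exact array names depend on the HF-Net export and are not documented here):

import numpy as np

# Inspect a pre-computed triangulation archive (replace the filename with the actual one).
reconstruction = np.load('data/triangulation/<your_triangulation>.npz', allow_pickle=True)
print(reconstruction.files)  # names of the stored arrays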

The pre-trained weights for the main image retrieval network can be found under checkpoints/.

Running


Overview of our sparse-to-dense hypercolumn matching pipeline.

You can run either of the following modes:

Nearest-neighbor pose approximation

This mode approximates the query pose as the pose of the top-ranked database image.

python3 run.py --dataset [robotcar|cmu] --mode nearest_neighbor
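Conceptually, this amounts to ranking the database images by global-descriptor similarity and copying the pose of the best match. A minimal sketch, independent of the retrieval network actually used in this repository:

import numpy as np

def nearest_neighbor_pose(query_descriptor, db_descriptors, db_poses):
    """Approximate the query pose with the pose of the most similar database image."""
    # Cosine similarity between the query and every database global descriptor.
    q = query_descriptor / np.linalg.norm(query_descriptor)
    db = db_descriptors / np.linalg.norm(db_descriptors, axis=1, keepdims=True)
    best_index = int(np.argmax(db @ q))
    return db_poses[best_index]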

Sparse-to-sparse SuperPoint matching

This mode predicts the query pose using sparse-to-sparse matching, with SuperPoint detections and descriptors extracted in the query image.

python3 run.py --dataset [robotcar|cmu] --mode superpoint
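In that mode, the matching step reduces to a standard mutual nearest-neighbor check between query and reference descriptors; a sketch of that step under the assumption of L2-normalized descriptors (not the repository's implementation):

import numpy as np

def mutual_nearest_neighbors(query_desc, ref_desc):
    """Return index pairs (i, j) where query descriptor i and reference descriptor j
    are each other's nearest neighbor (descriptors assumed L2-normalized)."""
    similarity = query_desc @ ref_desc.T      # (Nq, Nr) cosine similarities
    best_ref = similarity.argmax(axis=1)      # best reference match for each query point
    best_query = similarity.argmax(axis=0)    # best query match for each reference point
    return [(i, j) for i, j in enumerate(best_ref) if best_query[j] == i]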

Sparse-to-dense hypercolumn matching

This mode performs sparse-to-dense hypercolumn matching, as per the figure above.

python3 run.py --dataset [robotcar|cmu] --mode sparse_to_dense
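The core idea, sketched below in PyTorch, is to build a dense, per-pixel hypercolumn descriptor map for the query image by upsampling and stacking intermediate CNN feature maps, and to match each reference descriptor (attached to a triangulated 3D point) against every query pixel. This is an illustrative re-implementation, not the code used in this repository; the layer choices and similarity measure are assumptions.

import torch
import torch.nn.functional as F

def build_hypercolumns(feature_maps, image_size):
    """Upsample intermediate feature maps and stack them into dense hypercolumns."""
    upsampled = [F.interpolate(f, size=image_size, mode='bilinear', align_corners=False)
                 for f in feature_maps]                # each: (1, C_i, H, W)
    dense = torch.cat(upsampled, dim=1)                # (1, sum(C_i), H, W)
    return F.normalize(dense, dim=1)                   # L2-normalize every pixel descriptor

def sparse_to_dense_match(reference_descriptors, dense_query):
    """Match each (L2-normalized) reference descriptor to its best-correlating query pixel."""
    _, channels, height, width = dense_query.shape
    flat = dense_query.view(channels, height * width)  # (C, H*W)
    scores = reference_descriptors @ flat              # (N, H*W) correlation maps
    best = scores.argmax(dim=1)
    x = best % width
    y = torch.div(best, width, rounding_mode='floor')
    keypoints = torch.stack((x, y), dim=1)             # (N, 2) query pixel per 3D point
    return keypoints, scores.max(dim=1).values

The 2D keypoint found for each reference 3D point would then be fed, together with the camera intrinsics, to a PnP solver with RANSAC to estimate the query pose.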

Performance validation

After running, a .txt file is produced and saved under results/. This is the file that should be uploaded to the CVPR 2019 Visual Localization Challenge website to obtain the quantitative results.
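Each line of that file pairs an image name with the estimated rotation (as a quaternion) and translation. A small parser for inspection, assuming the challenge's standard name qw qx qy qz tx ty tz layout:

def read_submission(path):
    """Parse a results file with one line per query: name qw qx qy qz tx ty tz."""
    poses = {}
    with open(path) as f:
        for line in f:
            name, *values = line.split()
            qw, qx, qy, qz, tx, ty, tz = map(float, values)
            poses[name] = ((qw, qx, qy, qz), (tx, ty, tz))
    return poses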

Visualization

To export image logs to logs/ (for visual comparison of the different approaches), add the --log_images argument when running the pipeline.
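For example:

python3 run.py --dataset robotcar --mode sparse_to_dense --log_images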

Configuration files

This codebase uses the gin configuration file system to store all high-level parameters. All configuration files can be found under s2dm/configs/ and can be used to update most of the hyper-parameters and data paths. At runtime, the hyperparameters in use are printed.

Adding your own dataset

This repository provides code to run the pipeline on two datasets: RobotCar-Seasons and Extended CMU-Seasons. It can, however, be set up for your own dataset.

To do so, create a new class that inherits from BaseDataset and store it under s2dm/datasets/<your_dataset>.py. You can look at robotcar_dataset.py or cmu_dataset.py for inspiration.
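A rough outline of such a class is given below; the import path and method names are hypothetical, and the actual interface to implement is defined by BaseDataset:

from datasets.base_dataset import BaseDataset  # illustrative import path

class MyDataset(BaseDataset):
    """Skeleton for a custom dataset; override whatever BaseDataset requires."""

    def __init__(self, root, **kwargs):
        super().__init__(**kwargs)
        self._root = root

    def load_reference_images(self):
        # Hypothetical hook: return database image paths and their ground-truth poses.
        raise NotImplementedError

    def load_query_images(self):
        # Hypothetical hook: return query image paths.
        raise NotImplementedError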

Then, you will have to add a new gin config file for your dataset under s2dm/configs/datasets/<your_dataset>.gin and reference it in a new run config file under s2dm/configs/runs/run_<mode>_on_<your_dataset>.gin.

Lastly, make sure that you have a reconstruction .npz file stored under data/triangulation/. Please refer to the HF-Net repository for instructions on computing such files using COLMAP.

Citation

Please consider citing the corresponding publication if you use this work:

@inproceedings{germain2019sparsetodense,
  title={Sparse-To-Dense Hypercolumn Matching for Long-Term Visual Localization},
  author={Germain, H. and Bourmaud, G. and Lepetit, V.},
  booktitle={International Conference on 3D Vision (3DV)},
  year={2019}
}


