# A 2D Visual Localization Framework based on Essential Matrices

Repository: https://github.com/GrumpyZhou/visloc-relapose (Jupyter Notebook 96.5%)

This repository provides the implementation of our paper accepted at ICRA: *To Learn or Not to Learn: Visual Localization from Essential Matrices*. To use our code, first download the repository:
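```bash
git clone https://github.com/GrumpyZhou/visloc-relapose.git
```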
## Setup Running Environment

We have tested the code on Linux Ubuntu 16.04.6; see the repository for the exact tested package versions.
We recommend using Anaconda to manage packages. Run the following lines to automatically set up a ready environment for our code.
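A minimal sketch, assuming the repository ships a conda environment file at its root (the file and environment names here are assumptions):

```bash
# Assumed file/environment names -- check the repository root for the actual ones.
conda env create -f environment.yml
conda activate relapose
```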
Otherwise, one can download all required packages separately according to their official documentation.

## Prepare Datasets

Our code is flexible for evaluation on various localization datasets. We use the Cambridge Landmarks dataset as an example to show how to prepare a dataset:
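The following layout is a hypothetical sketch; the actual download scripts and pair files are provided by the repository:

```bash
# Hypothetical folder layout -- the repo supplies the actual scripts and pair files.
mkdir -p data/CambridgeLandmarks && cd data/CambridgeLandmarks
# 1. Download a scene (e.g., ShopFacade) from the official Cambridge Landmarks page.
# 2. Place the relative-pose pair files (e.g., train_pairs.txt / test_pairs.txt)
#    provided with this repo inside each scene folder.
```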
### 7Scenes Datasets

We follow the camera pose label convention of the Cambridge Landmarks dataset. Similarly, you can download our pairs for 7Scenes. For other datasets, contact me for information about preprocessing and pair generation.

## Feature-based: SIFT + 5-Point Solver

We use the SIFT feature extractor and feature matcher in COLMAP. One can follow the installation guide to install COLMAP. We save COLMAP outputs in database format (see the explanation).

### Preparing SIFT features

Execute the following commands to run SIFT extraction and matching on CambridgeLandmarks:
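Our scripts wrap COLMAP; as a plain-COLMAP sketch with placeholder paths, the two steps look like:

```bash
# Plain COLMAP commands (placeholder paths); the repo's own scripts manage the database files.
colmap feature_extractor \
    --database_path data/CambridgeLandmarks/ShopFacade/database.db \
    --image_path data/CambridgeLandmarks/ShopFacade
colmap exhaustive_matcher \
    --database_path data/CambridgeLandmarks/ShopFacade/database.db
```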
Here, CambridgeLandmarks is the folder name and must be consistent with the dataset folder, so you can also use other dataset names such as 7Scenes if you have prepared that dataset in advance.

### Evaluate SIFT within our pipeline

Example of running sift+5pt on Cambridge Landmarks:
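The module name and flags below are assumptions; sift_5pt.sh contains the exact commands:

```bash
# Hypothetical invocation -- see sift_5pt.sh for the exact options used in the paper.
python -m pipeline.sift_5pt \
    --dataset CambridgeLandmarks --pair_txt test_pairs.txt
```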
For more evaluation examples, see sift_5pt.sh and check the example outputs. You can visualize SIFT correspondences using notebooks/visualize_sift_matches.ipynb.

## Learning-based: Direct Regression via EssNet

The pipeline.relapose_regressor module can be used for both training and testing the regression networks defined under networks/, e.g., EssNet, NCEssNet, RelaPoseNet... We provide training and testing examples in regression.sh.
The module allows flexible variations of the setting. For more details about the module options, run `python -m pipeline.relapose_regressor -h`.

### Training

Here we show an example of how to train an EssNet model on the ShopFacade scene.
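The following is a hypothetical sketch of the training command; the exact flags are in regression.sh:

```bash
# Hypothetical flags -- regression.sh contains the exact training commands.
python -m pipeline.relapose_regressor \
    --gpu 0 --network EssNet \
    --dataset CambridgeLandmarks --scene ShopFacade \
    --epochs 60 --batch_size 16 \
    --visenv essnet_shopfacade   # Visdom environment (remove if not using Visdom)
```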
The outputs produced by this command are available online here.

### Visdom (optional)

As seen in the example above, we use a Visdom server to visualize the training process. One can adapt the meters to plot inside utils/common/visdom.py.
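To use it, start a Visdom server before launching training, for example:

```bash
# Launch a Visdom server (the port is arbitrary; point the training script at it).
python -m visdom.server -port 9333
```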
If you don't want to use Visdom, just remove the last line of the training command above.

### Trained models and weights

We release all trained models used in our paper. One can download them from the pretrained regression models. We also provide some pretrained weights on MegaDepth/ScanNet.

### Testing

Here is a piece of code to test the example model above.
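The flags and checkpoint path below are hypothetical placeholders; regression.sh contains the exact testing commands:

```bash
# Hypothetical flags/paths -- see regression.sh for the exact testing commands.
python -m pipeline.relapose_regressor \
    --gpu 0 --network EssNet \
    --dataset CambridgeLandmarks --scene ShopFacade \
    --test --resume output/regression_models/essnet_shopfacade/checkpoint_60.pth
```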
The outputs of this testing code are shown in test_results.txt. For convenience, we also provide notebooks/eval_regression_models.ipynb to perform evaluation.

## Hybrid: Learnable Matching + 5-Point Solver

In this method, the NCNet code is taken from the original implementation: https://github.com/ignacio-rocco/ncnet. We use their pre-trained model, but only the weights for neighbourhood consensus (NC-Matching), i.e., the 4D-conv layer weights. For convenience, you can download our parsed version: nc_ivd_5ep.pth. If you want to test these models, the weights for feature extractor initialization need to be downloaded from the pretrained regression models in advance.

### Testing example for NC-EssNet(7S)+NCM+5Pt (Paper.Tab2)

In this example, we use an NCEssNet trained on 7Scenes for 60 epochs to extract features and use the pre-trained NC-Matching layer to obtain point matches. Finally, the 5-point solver computes the essential matrix. The model is evaluated on CambridgeLandmarks.
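A hypothetical invocation (the module and flag names are assumptions; check the repository's pipeline/ folder for the actual script):

```bash
# Hypothetical module/flags -- consult the repo for the actual hybrid-evaluation script.
python -m pipeline.ncmatch_5pt \
    --dataset CambridgeLandmarks \
    --ncn_weights nc_ivd_5ep.pth \
    --feat_weights output/regression_models/essncn_7sc_60ep.pth
```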
Example outputs are available in essncn_7sc_60ep+ncn.txt. If you don't want to save the extracted intermediate matches, remove the corresponding option from the command.