# Learning Open-World Object Proposals without Learning to Classify (RA-L and ICRA 2022)

Repository: mcahny/object_localization_network (https://github.com/mcahny/object_localization_network), written almost entirely in Python (99.9%).
PyTorch implementation of "Learning Open-World Object Proposals without Learning to Classify" (Dahun Kim, Tsung-Yi Lin, Anelia Angelova, In So Kweon, and Weicheng Kuo).

    @article{kim2021oln,
      title={Learning Open-World Object Proposals without Learning to Classify},
      author={Kim, Dahun and Lin, Tsung-Yi and Angelova, Anelia and Kweon, In So and Kuo, Weicheng},
      journal={IEEE Robotics and Automation Letters (RA-L)},
      year={2022}
    }

## Introduction

Humans can recognize novel objects in this image despite having never seen them before. "Is it possible to learn open-world (novel) object proposals?" In this paper we propose the Object Localization Network (OLN), which learns localization cues instead of foreground-vs-background classification. Although trained only on COCO, OLN is able to propose many novel objects (top) that are missed by Mask R-CNN (bottom) on an out-of-sample frame from an ego-centric video.

## Cross-category generalization on COCO

We train OLN on the COCO VOC categories and test on the non-VOC categories. Note that our AR@k evaluation does not count proposals on the 'seen' classes toward the budget (k), to avoid evaluating recall on seen-class objects; a minimal sketch of this budgeted evaluation is shown below.
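The sketch below illustrates the idea behind the budgeted AR@k evaluation described above: proposals that overlap a seen-class ground-truth box are discarded before the top-k budget is applied, so recall is measured only on unseen-class objects. It is an illustration under assumed conventions (boxes in (x1, y1, x2, y2) format, proposals pre-sorted by objectness score, IoU threshold 0.5), not the repository's evaluation code.

```python
import numpy as np

def iou_matrix(boxes1, boxes2):
    """Pairwise IoU between [N,4] and [M,4] boxes in (x1, y1, x2, y2) format."""
    area1 = (boxes1[:, 2] - boxes1[:, 0]) * (boxes1[:, 3] - boxes1[:, 1])
    area2 = (boxes2[:, 2] - boxes2[:, 0]) * (boxes2[:, 3] - boxes2[:, 1])
    lt = np.maximum(boxes1[:, None, :2], boxes2[None, :, :2])   # [N, M, 2]
    rb = np.minimum(boxes1[:, None, 2:], boxes2[None, :, 2:])   # [N, M, 2]
    wh = np.clip(rb - lt, 0, None)
    inter = wh[..., 0] * wh[..., 1]
    return inter / (area1[:, None] + area2[None, :] - inter + 1e-6)

def recall_at_k(proposals, seen_gt, unseen_gt, k=100, iou_thr=0.5):
    """Recall of unseen-class GT within a budget of k proposals, where
    proposals covering seen-class GT do not consume the budget.
    `proposals` is assumed to be sorted by objectness score (descending)."""
    if len(seen_gt) > 0:
        overlaps_seen = iou_matrix(proposals, seen_gt).max(axis=1)
        proposals = proposals[overlaps_seen < iou_thr]   # drop seen-class hits first
    proposals = proposals[:k]                            # then apply the budget
    if len(unseen_gt) == 0:
        return 1.0
    if len(proposals) == 0:
        return 0.0
    matched = iou_matrix(unseen_gt, proposals).max(axis=1) >= iou_thr
    return float(matched.mean())
```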
## Disclaimer

This repo is tested under Python 3.7, PyTorch 1.7.0, CUDA 11.0, and mmcv==1.2.5.

## Installation

This repo is built on top of mmdetection. You can use the following commands to create a conda environment with the related dependencies (see the sketch below).
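The exact commands were not preserved here; the following is a minimal sketch of a typical mmdetection-style setup matching the versions listed in the Disclaimer. The environment name `oln` and the editable-install step are assumptions, not the repository's official instructions.

```shell
# create and activate a conda environment (the name "oln" is an assumption)
conda create -n oln python=3.7 -y
conda activate oln

# PyTorch 1.7.0 with CUDA 11.0, matching the tested versions above
conda install pytorch=1.7.0 torchvision cudatoolkit=11.0 -c pytorch -y

# mmcv pinned to the version noted in the Disclaimer
pip install mmcv-full==1.2.5

# install this repo in editable mode (standard for mmdetection-based projects)
pip install -v -e .
```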
Please also refer to get_started.md for more details on installation.

## Prepare datasets

The COCO dataset is available from the official website. It is recommended to download and extract the dataset somewhere outside the project directory and symlink the dataset root to $OLN/data, as shown below.
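A sketch of the symlink and the directory layout that mmdetection-based repos usually expect; `/path/to/coco` is a placeholder for wherever you extracted the dataset.

```shell
# symlink an existing COCO download into the repo (path is a placeholder)
ln -s /path/to/coco $OLN/data/coco

# expected layout (standard mmdetection-style COCO structure):
# $OLN/data/coco/
# ├── annotations/    # instances_train2017.json, instances_val2017.json
# ├── train2017/
# └── val2017/
```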
## Testing

Our trained models are available for download here. Place the downloaded checkpoint where your test command expects it.
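A hedged example of how evaluation typically looks in mmdetection-based repos; the config path, checkpoint path, and GPU count below are placeholders rather than the repository's exact invocation.

```shell
# distributed testing on 4 GPUs (config and checkpoint paths are placeholders)
./tools/dist_test.sh configs/oln_box/oln_box.py \
    work_dirs/oln_box/latest.pth 4 --eval bbox
```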
## Training
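Training follows the same mmdetection pattern; again, the config path and GPU count below are assumptions for illustration.

```shell
# distributed training on 8 GPUs (config path is a placeholder)
./tools/dist_train.sh configs/oln_box/oln_box.py 8
```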
## Contact

If you have any questions regarding the repo, please contact Dahun Kim (mcahny01@gmail.com) or create an issue.