# DKM: Dense Kernelized Feature Matching for Geometry Estimation
Project Page | Paper
DKM: Dense Kernelized Feature Matching for Geometry Estimation
Johan Edstedt, Ioannis Athanasiadis, Mårten Wadenbäck, Michael Felsberg
CVPR 2023
## How to Use?
Our model outputs a dense warp and a per-pixel certainty:
- Warp: [B,H,W,4]. For each image in a batch of size B and each pixel in the HxW grid, we output the input coordinate and its matching coordinate, both in the normalized grid [-1,1]x[-1,1].
- Certainty: [B,H,W]. A per-pixel score indicating how matchable the pixel is.
See the demo folder for two demos of DKM.
See api.md for the API; a minimal usage sketch follows below.
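The sketch below shows typical usage under the assumption that the package exposes the `DKMv3_outdoor` constructor and the `match`/`sample`/`to_pixel_coordinates` methods documented in api.md; treat api.md as the authoritative reference.

```python
# Minimal usage sketch; see api.md for the authoritative interface.
# Assumes the dkm package exposes DKMv3_outdoor and the match/sample/
# to_pixel_coordinates methods; image paths are placeholders.
from PIL import Image
from dkm import DKMv3_outdoor

model = DKMv3_outdoor(device="cuda")

im_A_path, im_B_path = "im_A.jpg", "im_B.jpg"  # placeholder paths
W_A, H_A = Image.open(im_A_path).size
W_B, H_B = Image.open(im_B_path).size

# Dense warp [B,H,W,4] and per-pixel certainty [B,H,W]
warp, certainty = model.match(im_A_path, im_B_path)

# Sample sparse correspondences from the dense warp, weighted by certainty
matches, match_certainty = model.sample(warp, certainty)

# Convert normalized coordinates to pixel coordinates in each image
kpts_A, kpts_B = model.to_pixel_coordinates(matches, H_A, W_A, H_B, W_B)
```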
## Qualitative Results

## Benchmark Results

### Megadepth1500
| Method | AUC@5° | AUC@10° | AUC@20° |
|---|---|---|---|
| DKMv1 | 54.5 | 70.7 | 82.3 |
| DKMv2 | 56.8 | 72.3 | 83.2 |
| DKMv3 (paper) | 60.5 | 74.9 | 85.1 |
| DKMv3 (this repo) | 60.0 | 74.6 | 84.9 |
### Megadepth 8 Scenes
| Method | AUC@5° | AUC@10° | AUC@20° |
|---|---|---|---|
| DKMv3 (paper) | 60.5 | 74.5 | 84.2 |
| DKMv3 (this repo) | 60.4 | 74.6 | 84.3 |
### ScanNet1500
| Method | AUC@5° | AUC@10° | AUC@20° |
|---|---|---|---|
| DKMv1 | 24.8 | 44.4 | 61.9 |
| DKMv2 | 28.2 | 49.2 | 66.6 |
| DKMv3 (paper) | 29.4 | 50.7 | 68.3 |
| DKMv3 (this repo) | 29.8 | 50.8 | 68.3 |
## Navigating the Code
- Code for models can be found in `dkm/models`
- Code for benchmarks can be found in `dkm/benchmarks`
- Code for reproducing experiments from our paper can be found in `experiments/`
## Install
Run `pip install -e .` in the repository root.
## Demo
A demonstration of our method can be run by:

`python demo_match.py`

This runs our model trained on MegaDepth on two images taken from Sacre Coeur.
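For intuition about what the demo visualizes, here is a minimal sketch (not the demo's actual code) of how the dense warp and certainty can be used to resample image B into image A's frame; it assumes the image tensor has already been resized to the warp resolution.

```python
# A minimal sketch (not the demo's code): resample image B into image A's frame
# using the dense warp, and mask the result with the per-pixel certainty.
import torch
import torch.nn.functional as F

def warp_B_to_A(im_B: torch.Tensor, warp: torch.Tensor, certainty: torch.Tensor) -> torch.Tensor:
    # im_B:      [B, 3, H, W] image tensor, assumed resized to the warp resolution
    # warp:      [B, H, W, 4]; the last two channels are the matching coordinates in B, in [-1, 1]
    # certainty: [B, H, W] matchability per pixel
    grid = warp[..., 2:]  # sampling grid into image B
    im_B_in_A = F.grid_sample(im_B, grid, mode="bilinear", align_corners=False)
    return im_B_in_A * certainty.unsqueeze(1)  # down-weight unreliable pixels
```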
## Benchmarks
See Benchmarks for details.
## Training
See Training for details.
## Reproducing Results
Once the required benchmark or training dataset has been downloaded and unpacked, results can be reproduced by running the experiments in the `experiments/` folder.
## Using DKM matches for estimation
We recommend using the excellent Graph-Cut RANSAC algorithm: https://github.com/danini/graph-cut-ransac. A minimal estimation sketch follows after the table below.
| Method | AUC@5° | AUC@10° | AUC@20° |
|---|---|---|---|
| DKMv3 (RANSAC) | 60.5 | 74.9 | 85.1 |
| DKMv3 (GC-RANSAC) | 65.5 | 78.0 | 86.7 |
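As a self-contained starting point, the sketch below estimates an essential matrix from sampled DKM matches using OpenCV's RANSAC rather than GC-RANSAC; it assumes known intrinsics `K_A`, `K_B` and pixel keypoints such as those returned by `to_pixel_coordinates`. For the stronger GC-RANSAC results above, use the library linked above instead.

```python
# A minimal sketch (not our evaluation code): essential matrix estimation from
# sampled DKM matches with OpenCV's RANSAC. GC-RANSAC (linked above) is the
# recommended estimator; OpenCV is used here only to keep the example self-contained.
import cv2
import numpy as np

def estimate_essential(kpts_A, kpts_B, K_A, K_B):
    # kpts_A, kpts_B: (N, 2) pixel coordinates, e.g. from model.to_pixel_coordinates(...)
    # K_A, K_B:       (3, 3) camera intrinsics (assumed known)
    # Normalize points with each camera's intrinsics so a single identity camera can be used
    norm_A = cv2.undistortPoints(kpts_A.reshape(-1, 1, 2).astype(np.float64), K_A, None)
    norm_B = cv2.undistortPoints(kpts_B.reshape(-1, 1, 2).astype(np.float64), K_B, None)
    E, inlier_mask = cv2.findEssentialMat(
        norm_A, norm_B, np.eye(3),
        method=cv2.RANSAC, prob=0.99999, threshold=1e-3,  # threshold in normalized coordinates
    )
    return E, inlier_mask
```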
## Acknowledgements
We have used code and been inspired by https://github.com/PruneTruong/DenseMatching, https://github.com/zju3dv/LoFTR, and https://github.com/GrumpyZhou/patch2pix. We additionally thank the authors of ECO-TR for providing their benchmark.
## BibTeX
If you find our models useful, please consider citing our paper!
@inproceedings{edstedt2023dkm,
title={{DKM}: Dense Kernelized Feature Matching for Geometry Estimation},
author={Edstedt, Johan and Athanasiadis, Ioannis and Wadenbäck, Mårten and Felsberg, Michael},
booktitle={IEEE Conference on Computer Vision and Pattern Recognition},
year={2023}
}