
cavalli1234 / adalam

312 stars · 9 watchers · 42 forks · 3.29 MB

AdaLAM is a fully handcrafted realtime outlier filter integrating several best practices into a single efficient and effective framework. It detects inliers by searching for significant local affine patterns in image correspondences.

License: BSD 3-Clause "New" or "Revised" License

Python 100.00%
computervision image-matching

adalam's People

Contributors

cavalli1234 · dawars · ducha-aiki


adalam's Issues

About matching to the same point

Hello @cavalli1234, thank you for the great work.
I am trying the provided code, brilliant by the way, but it looks like several keypoints can be matched to the same point, right?

The following list shows the matches generated by my test.

# testing: disable the scale / orientation consistency checks,
# since no scale or orientation information is passed below
adamatcher = AdalamFilter()
adamatcher.config['scale_rate_threshold'] = None
adamatcher.config['orientation_difference_threshold'] = None
adamatcher.config['area_ratio'] = 10
adamatcher.config['search_expansion'] = 10

matches = adamatcher.match_and_filter(
    kpA, kpB, descA, descB,
    im1shape=None, im2shape=None,
    o1=None, o2=None, s1=None, s2=None).cpu().numpy()
 
# matches
[ 19 359]
 [ 23 429]
 [ 24 413]
 [ 53 389]
 [ 54 382]
 [ 55 385]
 [ 74 503]
 [ 84 403]
 [ 95 465]
 [ 99 526]
 [115 543] <---
 [118 490]
 [119 543] <---
 [124 543] <---
 [137 465]
 [147 420]
 [152 589]
 [153 495]
 [172 589]

Is this expected behaviour? Thank you.
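If one-to-one matches are needed, duplicates like the ones flagged above can be removed after the fact. A minimal NumPy sketch (an outside suggestion, not part of AdaLAM; the `matches` array stands in for the N×2 output shown above, and the policy of simply dropping every contested target is one choice among several):

```python
import numpy as np

# Hypothetical N x 2 match array like the one above: column 0 indexes
# keypoints in image 1, column 1 indexes keypoints in image 2.
matches = np.array([[115, 543], [118, 490], [119, 543], [124, 543]])

# Count how often each target keypoint appears, and keep only matches
# whose target is claimed exactly once.
targets, counts = np.unique(matches[:, 1], return_counts=True)
unique_targets = targets[counts == 1]
one_to_one = matches[np.isin(matches[:, 1], unique_targets)]
print(one_to_one)  # only [118, 490] survives
```

An alternative policy would be to keep, for each contested target, the match with the best descriptor score instead of discarding them all.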

setup env for cpu

We have a Mac; how do we set up the environment for CPU-only use? Any help would be appreciated.
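One approach worth trying, hedged as an assumption about the API: AdaLAM's configuration appears to expose a `device` entry, so selecting the CPU device there may be enough. A sketch:

```python
import torch

# Pick CPU when CUDA is unavailable (always the case on most Macs).
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Assumption: AdalamFilter's config dict accepts a 'device' key, e.g.
# from adalam import AdalamFilter
# matcher = AdalamFilter(custom_config={'device': device})
print(device)
```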

Some questions about the function confidence_based_inlier_selection

  • Thank you very much for your article and code; they have been a great help to me with outlier filtering. I have read every line of the code and the thesis. Both the idea and the implementation are among the best I have seen, but I have a small doubt about the function confidence_based_inlier_selection in ransac.py.
  • The question concerns line 43 of ransac.py: _, inv_indices, res_dup_counts = torch.unique_consecutive(sorted_res_sqr.half().float(), dim=1, return_counts=True, return_inverse=True)
  • I guess you want to remove duplicate values so that they do not disturb the search for inliers among uniformly scattered outliers. For example, after computing residuals I got the values (0.0000e+00, 0.0000e+00, 0.0000e+00, 1.9729e-05, 7.6246e-04, 1.8478e-02); to compare the real residuals against a continuous reference, I must collapse the repeated zeros so that the remaining values grow monotonically. Is that right?
  • But I find that every one of the 128 iterations uses the same duplicate indices found in the first iteration, yet each iteration should produce different residual values, so the indices of the duplicates should differ. Unless the offsets of these values in the samples are so small that any hypothesis A makes its residual tiny?
  • I am quite confused about this, so I wanted to ask you. Sorry for the trouble!
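For what it is worth, the behaviour of that call can be checked in isolation. A small sketch of torch.unique_consecutive on one sorted residual row (the values are the made-up ones from the question, not from the library):

```python
import torch

# One hypothesis's sorted squared residuals; three duplicates at the front.
res = torch.tensor([[0.0, 0.0, 0.0, 1.9729e-05, 7.6246e-04, 1.8478e-02]])

uniq, inv, counts = torch.unique_consecutive(
    res, dim=1, return_inverse=True, return_counts=True)

# uniq collapses the three leading zeros into a single column,
# counts records how many residuals share each collapsed value,
# inv maps every original column back to its collapsed column.
print(uniq)    # tensor([[0.0000e+00, 1.9729e-05, 7.6246e-04, 1.8478e-02]])
print(counts)  # tensor([3, 1, 1, 1])
print(inv)     # tensor([0, 0, 0, 1, 2, 3])
```

Note that when the call is made with dim=1 on a batch of rows (as in ransac.py), whole columns across all hypotheses are compared at once, not each row independently, which may explain why the indices look shared across iterations.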

Suggestion on batch processing & how to estimate the scale and orientation components of local features

Hello! Thank you very much for releasing the implementation of this work.
The runtime & performance are quite impressive.

  • I would like to ask for your suggestions on extending this work. Do you have a version that supports batch processing over multiple image pairs at the same time? Is that possible?

  • Also, could you please recommend how to estimate the scale and orientation components of local features? I am using SuperPoint as the local feature, and I am not sure how to obtain these two components.
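One option, hedged as an outside suggestion rather than the authors' answer: SuperPoint does not output scale or orientation at all, so instead of estimating them one can disable the corresponding consistency checks and pass None, exactly as the snippet earlier on this page does:

```python
# Sketch, assuming the adalam package is installed; kpA/kpB/descA/descB
# are hypothetical SuperPoint keypoints and descriptors.
# from adalam import AdalamFilter
# matcher = AdalamFilter()
# matcher.config['scale_rate_threshold'] = None
# matcher.config['orientation_difference_threshold'] = None
# matches = matcher.match_and_filter(kpA, kpB, descA, descB,
#                                    o1=None, o2=None, s1=None, s2=None)
```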

Question about sqlite3.OperationalError: unable to open database file?

I put the database.db file and the image_pairs_to_match.txt file into the examples folder, and when I run the match_colmap_database_example.py file in the examples folder using the command:

python match_colmap_database_example.py --database_path autodl-tmp/AdaLAM-master/examples --image_pairs_path autodl-tmp/AdaLAM-master/examples/image_pairs_to_match.txt

it produces the following error:

Traceback (most recent call last):
File "match_colmap_database_example.py", line 11, in
matcher.match_colmap_database(database_path=opt.database_path, image_pairs_path=opt.image_pairs_path)
File "/root/miniconda3/envs/adalam/lib/python3.7/site-packages/adalam/adalam.py", line 189, in match_colmap_database
connection = sqlite3.connect(database_path)
sqlite3.OperationalError: unable to open database file
What is the solution?
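One common cause, offered as a guess at the diagnosis: sqlite3.connect expects a path to a database *file*, and raises exactly this OperationalError when handed a directory, while the --database_path above points at the examples folder rather than at examples/database.db. The failure mode is easy to reproduce in isolation:

```python
import sqlite3
import tempfile

# sqlite3.connect opens (or creates) a database file; handing it a
# directory reproduces "unable to open database file".
dirpath = tempfile.mkdtemp()
try:
    sqlite3.connect(dirpath)
    opened = True
except sqlite3.OperationalError as e:
    opened = False
    print(e)  # unable to open database file
```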

Could you help explain extract_local_patterns line 60?

Hi @cavalli1234 @Dawars !

Thanks so much for sharing the code. I love reading your paper, and it proves that there are so many good combinations that can create a state of the art performance. I am learning about it from your paper.

Here, I tried to understand from the code how the outlier rejection works. However, I am stumped by extract_local_patterns line 60. I wonder what you are trying to do there; could you please explain?

Line:60 expanded_local_scores = scores[tokp1] + ransidx.type(scores.dtype)

I wonder why you have added them together?
The scores and ransidx variables are used for totally different purposes. scores contains the ratio of the lowest to the second-lowest distance (I guess it is related to Lowe's ratio test), while ransidx comes from line 51 and gives the rows of fnn_to_seed_local_consistency_map_corr whose number of neighbours exceeds a specified inlier count; specifically, it is:

**Line:51**    ransidx, tokp1 = torch.where(fnn_to_seed_local_consistency_map_corr)

Or is this a way to re-order things to be grouped by ransidx? Could you please kindly explain? :)
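One plausible reading, offered as a guess rather than the authors' explanation: since the ratio-test scores lie in [0, 1), adding the integer group index shifts each group into its own unit interval, so a single global sort orders correspondences first by group and then by score within each group. A small NumPy illustration of that trick (all data made up):

```python
import numpy as np

# Hypothetical ratio-test scores in [0, 1) and the group (seed) index of
# each correspondence, analogous to scores[tokp1] and ransidx.
scores = np.array([0.9, 0.2, 0.5, 0.7, 0.1])
group = np.array([1, 0, 1, 0, 0])

# Adding the integer group index makes one argsort yield a per-group
# ordering: groups stay contiguous, scores ascend within each group.
order = np.argsort(scores + group)
print(group[order])   # [0 0 0 1 1]
print(scores[order])  # [0.1 0.2 0.7 0.5 0.9]
```

This avoids a per-group loop on the GPU, which would fit AdaLAM's emphasis on batched operations.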

use ratio test when no correspondences survive

Hi, thank you for your great work.

While testing the code, I found there are some cases where no correspondences survive the filtering. Maybe the ratio test could be adopted as a remedy in this situation.
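Such a fallback can be sketched outside the library: when the filtered set comes back empty, fall back to plain ratio-test matches computed from the descriptor distances. A hedged NumPy sketch (the function name, the brute-force distance computation, and the 0.8 threshold are all assumptions, not AdaLAM code):

```python
import numpy as np

def ratio_test_matches(d1, d2, ratio=0.8):
    """Plain Lowe ratio test on L2 descriptor distances."""
    # Pairwise squared distances between the two descriptor sets.
    dist = ((d1[:, None, :] - d2[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(dist, axis=1)
    best, second = nn[:, 0], nn[:, 1]
    rows = np.arange(len(d1))
    # Keep a match when the best distance clearly beats the runner-up.
    keep = dist[rows, best] < ratio ** 2 * dist[rows, second]
    return np.stack([np.nonzero(keep)[0], best[keep]], axis=1)

# Tiny demo with made-up 2-D descriptors:
d1 = np.array([[0.0, 0.0], [1.0, 1.0]])
d2 = np.array([[0.0, 0.01], [5.0, 5.0], [1.0, 1.02]])
print(ratio_test_matches(d1, d2))  # matches 0 -> 0 and 1 -> 2
```

In practice one would call this only when the AdaLAM output is empty, since its matches carry no geometric verification.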

Questions about modifying the parameters

Hello! Thank you very much for releasing the implementation of this work.
The runtime & performance are quite impressive, and it works well for me for the image feature matching.

But how can I modify the parameters listed in the DEFAULT_CONFIG of class AdalamFilter? I have tried modifying it, but the output does not change.

The default parameter list:
Uploading AdaLAM1.PNG…

The parameter list after modified:
Uploading AdaLAM2.PNG…
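One possible explanation, hedged as an assumption about the API: if the filter copies the defaults at construction time, editing DEFAULT_CONFIG afterwards would have no effect; overrides would instead go through the constructor (or through matcher.config, as the snippet earlier on this page does). A sketch:

```python
# Build the override dict first, then hand it to the filter. The keys
# shown are ones that appear elsewhere on this page; values are made up.
overrides = {
    'area_ratio': 100,
    'search_expansion': 4,
    'scale_rate_threshold': None,
}

# Assumption: the constructor merges custom_config over DEFAULT_CONFIG:
# from adalam import AdalamFilter
# matcher = AdalamFilter(custom_config=overrides)
print(overrides)
```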

No match

Hello, why is it that when I import the .db file generated by your code into COLMAP, it shows there is no match? I used SuperPoint+SuperGlue for the feature points and descriptors.
(screenshot attached)

AdaLAM to kornia?

Hi,

Would you be interested in integrating AdaLAM into kornia? We are now developing a local features module, and AdaLAM would be a great fit.
It would also boost usage and citation of AdaLAM :)

--
Best, Dmytro.

Another question about the function confidence_based_inlier_selection

hi @cavalli1234 @Dawars

Thanks for sharing the code. I am amazed by its results, but I also have some questions, especially about the function confidence_based_inlier_selection.

Regarding formula (4) of the paper (screenshot attached): could you explain more about the second equation? What does the R stand for? Why can E_p be substituted by the following formula?

I also find it hard to map the formula to the corresponding code, as indicated in the second screenshot (attached).
I suppose that is the place where you apply the formula? If I am wrong, please tell me.

Thank you in advance
