import cv2
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
Brute-Force matching is simple. It takes the descriptor of one feature in the first set and matches it against all features in the second set using some distance calculation, and the closest one is returned.
For the BF matcher, we first create a BFMatcher object with cv2.BFMatcher(). It takes two optional parameters. The first is normType, which specifies the distance measure to be used. By default it is cv2.NORM_L2, which is good for SIFT, SURF, etc. (cv2.NORM_L1 is also available). For binary-string descriptors like ORB, BRIEF, and BRISK, cv2.NORM_HAMMING should be used, which uses Hamming distance as the measure. If ORB is run with WTA_K == 3 or 4, cv2.NORM_HAMMING2 should be used.
The second parameter is a boolean, crossCheck, which is false by default. If it is true, the matcher returns only those matches (i, j) such that the i-th descriptor in set A has the j-th descriptor in set B as its best match and vice versa; that is, the two features in the two sets must match each other. This gives consistent results and is a good alternative to the ratio test proposed by D. Lowe in the SIFT paper.
Once the matcher is created, two important methods are BFMatcher.match() and BFMatcher.knnMatch(). The first returns the single best match for each descriptor; the second returns the k best matches, where k is specified by the user. The latter is useful when we want to do additional filtering on the candidates.
Just as we used cv2.drawKeypoints() to draw keypoints, cv2.drawMatches() helps us draw the matches. It stacks the two images horizontally and draws lines from the first image to the second showing the best matches. There is also cv2.drawMatchesKnn(), which draws all k best matches; if k = 2, it draws two match lines for each keypoint, so we have to pass a mask if we want to draw selectively.
messi_original = cv2.imread("public-images/clahe.png")[:, :, ::-1]  # BGR -> RGB
# Swapping the row and column axes gives a transposed copy of the image,
# i.e. a 90-degree rotation combined with a mirror flip.
messi_rotated = messi_original.transpose((1, 0, 2))
fig, axes = plt.subplots(1, 2, figsize = (10, 5))
axes[0].imshow(messi_original)
axes[1].imshow(messi_rotated)
gray_original = cv2.cvtColor(messi_original, cv2.COLOR_RGB2GRAY)
gray_rotated = cv2.cvtColor(messi_rotated, cv2.COLOR_RGB2GRAY)
## using ORB
orb = cv2.ORB_create()  # cv2.ORB() in older OpenCV 2.x
kp_original, des_original = orb.detectAndCompute(gray_original, None)
kp_rotated, des_rotated = orb.detectAndCompute(gray_rotated, None)
plt.imshow(cv2.drawKeypoints(gray_original, kp_original, None))
plt.figure()
plt.imshow(cv2.drawKeypoints(gray_rotated, kp_rotated, None))
## using brute force matcher
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des_original, des_rotated)
# Sort by distance so matches[:10] gives the 10 strongest matches.
matches = sorted(matches, key=lambda m: m.distance)
plt.imshow(cv2.drawMatches(gray_original, kp_original,
                           gray_rotated, kp_rotated, matches[:10], None))
FLANN stands for Fast Library for Approximate Nearest Neighbors. It contains a collection of algorithms optimized for fast nearest-neighbor search in large datasets and for high-dimensional features. It works faster than BFMatcher for large datasets. We will see an example with the FLANN-based matcher.