How to quickly implement SIFT feature matching (source code included)

Blog homepage: virobotics' CSDN blog: LabVIEW deep learning and artificial intelligence blogger
Column: “LabVIEW deep learning practice”
Previous article: Using the LabVIEW AI Vision Toolkit to quickly implement SIFT feature detection (source code included)
If you find the blogger’s articles well written or helpful, your support is appreciated! You are welcome to follow, like, favorite, and subscribe to the column.

Article directory

  • Foreword
  • 1. Environment setup
    • 1.1 Environment used for this project
    • 1.2 LabVIEW toolkit download and installation
  • 2. Specific steps of SIFT feature matching
  • 3. The proposal of the SIFT matching method
  • 4. SIFT feature matching in practice
    • 4.1 SIFT feature matching in LabVIEW
    • 4.2 SIFT feature matching in Python
  • 5. Project source code
  • Summary

Foreword

The previous blog post showed how to implement SIFT feature detection in LabVIEW. This article shows how to implement SIFT feature matching in both LabVIEW and Python.

1. Environment setup

1.1 Environment used for this project

  • Operating system: Windows 10
  • Python: 3.6 or later
  • LabVIEW: 2018 or later, 64-bit
  • AI Vision Toolkit: techforce_lib_opencv_cpu-1.0.0.98.vip

1.2 LabVIEW toolkit download and installation

  • AI vision toolkit download and installation reference:
    https://blog.csdn.net/virobotics/article/details/123656523
  • AI Vision Toolkit Introduction:
    https://blog.csdn.net/virobotics/article/details/123522165

2. Specific steps of SIFT feature matching

The essence of the SIFT algorithm is to find key points (feature points) across different scale spaces, compute their size, orientation, and scale information, and use that information to build a descriptor for each key point. The key points SIFT looks for are “stable” feature points: points that stand out and are not affected by factors such as illumination changes, affine transformation, and noise, for example corner points, edge points, bright spots in dark regions, and dark spots in bright regions. Matching is the process of comparing these feature points, as illustrated in the figure below:
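As a rough illustration of what “comparing feature points” means, matching a descriptor amounts to finding its nearest neighbors by Euclidean distance. The sketch below uses randomly generated 128-dimensional descriptors as a stand-in for real SIFT output:

```python
import numpy as np

rng = np.random.default_rng(0)
des1 = rng.random((5, 128), dtype=np.float32)   # descriptors of image 1 (stand-in data)
des2 = rng.random((40, 128), dtype=np.float32)  # descriptors of image 2 (stand-in data)

# Euclidean distance from each descriptor in des1 to every descriptor in des2
dists = np.linalg.norm(des1[:, None, :] - des2[None, :, :], axis=2)

# For each query descriptor, the indices of the nearest and second-nearest candidates
order = np.argsort(dists, axis=1)
nearest, second = order[:, 0], order[:, 1]
print(nearest, second)
```

The nearest and second-nearest distances found this way are exactly the two values compared by the ratio test described in the next section.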

3. The proposal of the SIFT matching method

To exclude key points that have no true match because of image occlusion or background clutter, Lowe, the author of SIFT, proposed a matching method that compares the nearest-neighbor distance with the second-nearest-neighbor distance:

Take a SIFT key point in one image and find the two key points in the other image with the smallest Euclidean distances to it. If the ratio obtained by dividing the nearest distance by the second-nearest distance is less than some threshold T, accept the pair as a match. For a wrong match, because the feature space is high-dimensional, there may be many other wrong matches at similar distances, so its ratio tends to be high. Lowering the ratio threshold T reduces the number of SIFT matches but makes them more stable, and vice versa.

Lowe recommends a ratio threshold of 0.8, but other thresholds can be tried. In testing, a threshold below 0.4 yields very few matches, while a threshold above 0.6 produces a large number of false matches. The recommended ratio values are therefore:

  • ratio=0.4: for matching with high accuracy requirements;
  • ratio=0.6: for matching that needs a relatively large number of match points;
  • ratio=0.5: for ordinary circumstances.
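The ratio test itself is a one-line filter. A minimal sketch, using plain (nearest, second-nearest) distance pairs as a stand-in for the DMatch pairs that knnMatch returns:

```python
def ratio_test(match_pairs, threshold=0.5):
    """Keep a pair when the nearest/second-nearest distance ratio is below threshold."""
    return [pair for pair in match_pairs
            if pair[1] > 0 and pair[0] / pair[1] < threshold]

# (nearest, second-nearest) Euclidean distances; hypothetical values
pairs = [(0.2, 0.9), (0.5, 0.6), (0.1, 0.8)]
print(ratio_test(pairs))  # → [(0.2, 0.9), (0.1, 0.8)]
```

The ambiguous pair (0.5, 0.6), whose two candidates are almost equally close, is rejected; the two distinctive pairs survive.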

4. SIFT feature matching in practice

4.1 SIFT feature matching in LabVIEW

Introduction to related VIs and parameters

SIFT_Compute: computes the descriptors from the key points; the descriptors form a matrix.

DescriptorMatcher.new.vi: creates a matcher;

matcher_knnMatch.vi: finds the best k matches.

Parameter description:

  • d1, d2: the descriptors of the two images;
  • k: take the k key points with the smallest Euclidean distance; that is, compute the Euclidean distance between each descriptor in the first set and every descriptor in the second set, then keep the k pairs with the smallest distances. When k=1, the result is the same as the match method;
  • Array: an array of classes (containing DMatches and size). Each DMatch contains: queryIdx, the index of the descriptor in the first (query) image, i.e. the image used to search; trainIdx, the index of the descriptor in the second (train) image; and distance, the distance between the two descriptors; the lower the value, the closer the descriptors and the better the match;
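To make the DMatch fields concrete, here is a small stand-in (a plain Python dataclass with the same field names, not OpenCV's actual class) showing how queryIdx, trainIdx, and distance link the descriptors of the two images:

```python
from dataclasses import dataclass

@dataclass
class Match:           # stand-in for a DMatch, same field names
    queryIdx: int      # index of the descriptor in the first (query) image
    trainIdx: int      # index of the descriptor in the second (train) image
    distance: float    # distance between the two descriptors (smaller is better)

# Two hypothetical matches: query descriptor 0 matched train descriptor 17, etc.
matches = [Match(0, 17, 0.12), Match(1, 3, 0.45)]
best = min(matches, key=lambda m: m.distance)
print(best.queryIdx, best.trainIdx)  # → 0 17
```

In LabVIEW, the Array output described above carries exactly these three values for each match.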

Implementing SIFT feature matching

The implementation steps are as follows:

  • Read the images

  • Instantiate SIFT and detect the key points of the two images

  • Draw the key points and compute descriptors from them

  • Instantiate a feature-vector matcher and use knnMatch for nearest-neighbor matching

  • Apply the matching method proposed by Lowe, the author of SIFT, comparing the nearest-neighbor and second-nearest-neighbor distances to filter the matches

  • Use the event structure to plot the matching results

    The complete program VI is as follows:

Run results

Modify the id to view a single match result.

4.2 SIFT feature matching in Python

Implementing SIFT feature matching

import cv2

# 1 Read the images
img1 = cv2.imread("left.jpg")
img2 = cv2.imread("right.jpg")
g1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
g2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

# 2 SIFT key point detection and matching
# 2.1 Instantiate the SIFT object
sift = cv2.SIFT_create()
# 2.2 Key point detection: kp holds the key point information (position, scale,
#     orientation); des holds the key point descriptors
kp1, des1 = sift.detectAndCompute(g1, None)
kp2, des2 = sift.detectAndCompute(g2, None)

# 2.3 Feature matching
# Brute-force matcher (BFMatcher) with default parameters
bf = cv2.BFMatcher()
# knnMatch finds, for each descriptor, the k nearest descriptors in the other image
matches = bf.knnMatch(des1, des2, k=2)

goodMatch = []  # goodMatch holds the filtered high-quality matches
threshold = 0.8
for m, n in matches:
    # If the nearest distance divided by the second-nearest distance is below the
    # threshold T, the nearest match is likely a distinctive, unambiguous feature
    # point shared by the two images, so keep it.
    if m.distance / n.distance < threshold:
        goodMatch.append([m])
print("goodMatch:", len(goodMatch))

# 2.4 Draw the matching result on the image
img_out = cv2.drawMatchesKnn(img1,
                             kp1,
                             img2,
                             kp2,
                             goodMatch,
                             None,
                             flags=cv2.DrawMatchesFlags_NOT_DRAW_SINGLE_POINTS)
cv2.imshow('image_match', img_out)  # show the result
cv2.waitKey(0)                      # wait for a key press
cv2.destroyAllWindows()             # close all windows

Run results

5. Project source code

Project source code download: https://download.csdn.net/download/virobotics/87808362

Summary

That is all I wanted to share with you today; I hope you find it useful. See you in the next article~

If this article helped you, you are welcome to follow, like, favorite, and subscribe to the column.

Recommended reading

LabVIEW graphical AI vision development platform (not NI Vision), which greatly reduces the threshold of artificial intelligence development
LabVIEW Graphical AI Vision Development Platform (Non-NI Vision) VI Introduction
Basic usage and properties of LabVIEW AI vision toolkit OpenCV Mat
Teach you to use the LabVIEW artificial intelligence vision toolkit to quickly realize image reading and acquisition
For technical exchange, learning together, or consultation and sharing, please get in touch.