Is it possible to get the rotation and scale between two images with only a Surf Descriptor of each?

I'm using SURF for landmark recognition. This is the process I have in mind:

1) Save beforehand one SURF descriptor set for each landmark

2) A user takes a photo of a landmark (e.g. a building)

3) A Surf Descriptor is computed for this image (the photo)

4) This descriptor is compared against each stored landmark descriptor, and the landmark whose 11 closest feature points give the lowest DMatch.distance is chosen as the recognized landmark

5) I want to calculate the rotation and scale-ratio between the image obtained and the landmark image stored.

My understanding is that I can only get this rotation and scale ratio from the keypoints, because a feature descriptor is only a reduced, unique representation of a keypoint. As such I would have to save both the keypoints and the feature descriptors for each landmark. Is that right?
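If so, a minimal sketch of how I could store both with cv::FileStorage (assuming OpenCV 2.x; saveLandmark/loadLandmark and "landmark.yml" are just placeholder names for illustration):

#include <string>
#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>

// Save one landmark's keypoints and descriptors to a YAML file.
void saveLandmark(const std::string& path,
                  const std::vector<cv::KeyPoint>& keypoints,
                  const cv::Mat& descriptors)
{
    cv::FileStorage fs(path, cv::FileStorage::WRITE);
    cv::write(fs, "keypoints", keypoints);   // vector<cv::KeyPoint> overload
    fs << "descriptors" << descriptors;      // cv::Mat
    fs.release();
}

// Load them back before matching against a new photo.
void loadLandmark(const std::string& path,
                  std::vector<cv::KeyPoint>& keypoints,
                  cv::Mat& descriptors)
{
    cv::FileStorage fs(path, cv::FileStorage::READ);
    cv::read(fs["keypoints"], keypoints);
    fs["descriptors"] >> descriptors;
    fs.release();
}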

This is what I'm doing right now:

// Detect SURF keypoints in both images (Hessian threshold 4000).
cv::SurfFeatureDetector surf(4000);
..
surf.detect(image1, keypoints1);
surf.detect(image2, keypoints2);
..
// Compute one descriptor row per keypoint.
cv::SurfDescriptorExtractor surfDesc;
surfDesc.compute(image1, keypoints1, descriptor1);
surfDesc.compute(image2, keypoints2, descriptor2);
..
// Match image1's descriptors (query) against image2's (train).
vector<cv::DMatch> descriptorsMatch;
BruteForceMatcher<cv::L2<float> > brute;
brute.match(descriptor1, descriptor2, descriptorsMatch);

// Use only the 11 best matches (cv::DMatch::operator< compares distance).
nth_element( descriptorsMatch.begin(), descriptorsMatch.begin()+10, descriptorsMatch.end() );
descriptorsMatch.erase( descriptorsMatch.begin()+11, descriptorsMatch.end() );
..
for ( vector<cv::DMatch>::const_iterator it = descriptorsMatch.begin(); it != descriptorsMatch.end(); ++it )
{
  distanceAcumulator += it->distance;
  // cv::KeyPoint::angle is a float in degrees, so use fmod rather than %.
  angleAcumulator += fmod( fabs(keypoints1[it->queryIdx].angle - keypoints2[it->trainIdx].angle), 180.0f );
  scaleAcumulator1 += keypoints1[it->queryIdx].size;
  scaleAcumulator2 += keypoints2[it->trainIdx].size;
}
angleBetweenImages = angleAcumulator/11;
scaleBetweenImages = scaleAcumulator1/scaleAcumulator2;
similarityBetweenImages = distanceAcumulator/11;
..


Just comparing the scale and rotation of matched keypoints is naive and generally will not work. What you need is to apply a geometric model to find the relationship between the photos. See this book, especially Part II: Two-View Geometry: http://www.robots.ox.ac.uk/~vgg/hzbook/
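For instance, a rough sketch of that idea (not the only possible geometric model; assuming OpenCV 2.x, and estimateRotationAndScale is a made-up helper name): fit a homography to the matched points with RANSAC so that mismatches are rejected, and, if the mapping between the two photos is close to a similarity transform, read an approximate rotation and scale from it.

#include <cmath>
#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/calib3d/calib3d.hpp>

// Estimate rotation (degrees) and scale between two images from their
// matched SURF keypoints. Returns false if the fit fails.
bool estimateRotationAndScale(const std::vector<cv::KeyPoint>& keypoints1,
                              const std::vector<cv::KeyPoint>& keypoints2,
                              const std::vector<cv::DMatch>& matches,
                              double& angleDeg, double& scale)
{
    if (matches.size() < 4)   // findHomography needs at least 4 point pairs
        return false;

    std::vector<cv::Point2f> pts1, pts2;
    for (size_t i = 0; i < matches.size(); ++i)
    {
        pts1.push_back(keypoints1[matches[i].queryIdx].pt);
        pts2.push_back(keypoints2[matches[i].trainIdx].pt);
    }

    // RANSAC discards outlier matches that would corrupt a plain average.
    cv::Mat H = cv::findHomography(pts1, pts2, CV_RANSAC, 3.0);
    if (H.empty())
        return false;

    // If the true transform is close to a similarity (rotation + uniform
    // scale + translation), the top-left 2x2 block of H is approximately
    // s * [cos a, -sin a; sin a, cos a].
    double h00 = H.at<double>(0, 0), h10 = H.at<double>(1, 0);
    angleDeg = std::atan2(h10, h00) * 180.0 / CV_PI;
    scale    = std::sqrt(h00 * h00 + h10 * h10);
    return true;
}

With only 11 matches RANSAC has little to work with, so it may be better to keep more matches (or all of them) and let RANSAC do the outlier filtering instead of the nth_element truncation.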
