In this work, we present a novel and practical approach to a longstanding problem in computer vision: 2D and 3D affine-invariant feature matching. Our Grassmannian Graph (GrassGraph) framework employs a two-stage procedure that robustly recovers correspondences between two unorganized, affinely related feature (point) sets. The first stage maps the feature sets to an affine-invariant Grassmannian representation, where the features are mapped into the same subspace; the coordinate representations extracted from the Grassmannian differ only by an arbitrary orthonormal matrix. In the second stage, approximating the Laplace-Beltrami operator (LBO) on these coordinates nullifies this extra orthonormal factor, yielding truly affine-invariant coordinates from which we recover correspondences via simple nearest-neighbor relations. The resulting GrassGraph algorithm is empirically shown to work well in non-ideal scenarios with noise, outliers, and occlusions. Our validation benchmarks use an unprecedented 440,000+ experimental trials performed on 2D and 3D datasets, with a variety of parameter settings and competing methods. State-of-the-art performance in the majority of these extensive evaluations confirms the utility of our method.
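A minimal sketch of this two-stage pipeline, assuming a NumPy/SciPy implementation, is given below. The function names, the k-nearest-neighbor graph Laplacian used here as the LBO approximation, and the parameter choices (`k_nn`, `n_eig`) are illustrative assumptions rather than the authors' exact construction, and eigenvector sign/ordering ambiguities are ignored for brevity.

```python
import numpy as np
from scipy.spatial import cKDTree

def grassmannian_coords(X):
    # Stage 1: orthonormal basis of the centered point set (thin SVD).
    # Affinely related point sets span the same subspace after centering,
    # so these coordinates agree up to an unknown orthonormal matrix.
    U, _, _ = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return U

def lbo_embedding(U, k_nn=10, n_eig=6):
    # Stage 2: k-NN graph Laplacian as a crude Laplace-Beltrami approximation;
    # its low-order eigenvectors cancel the residual orthonormal factor
    # (eigenvector sign/order ambiguities are ignored in this sketch).
    tree = cKDTree(U)
    dist, idx = tree.query(U, k=k_nn + 1)
    n = U.shape[0]
    sigma = np.median(dist[:, 1:]) + 1e-12
    W = np.zeros((n, n))
    for i in range(n):
        W[i, idx[i, 1:]] = np.exp(-dist[i, 1:] ** 2 / sigma ** 2)
    W = np.maximum(W, W.T)                      # symmetrize the affinity graph
    L = np.diag(W.sum(axis=1)) - W              # unnormalized graph Laplacian
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:n_eig + 1]                 # drop the constant eigenvector

def match_points(X, Y):
    # Nearest-neighbor correspondences in the (approximately) invariant embedding.
    EX = lbo_embedding(grassmannian_coords(X))
    EY = lbo_embedding(grassmannian_coords(Y))
    return cKDTree(EY).query(EX)[1]             # for each point of X, its match in Y
```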
The past twenty years have seen the explosion of the "shape zoo": myriad shape representations, each with pros and cons. Of the varied denizens, distance transforms and density-function shape representations have proven to be the most utile. Distance transforms inherit the numerous geometric advantages of implicit curve representations, while density functions are unmatched in their approach to modeling uncertainty and noise in shape features. To date, there has been little rapprochement between these two representations. In this work, we introduce a complex wave representation (CWR) of shape that simultaneously carries probabilistic information via its magnitude and geometric information via its phase, achieving an integration of distance transforms and density-function shape representations. The CWR is a parametric representation with cluster centers akin to a mixture model and curve normal information akin to signed distance functions. We demonstrate the perceptual gains of the CWR, highlight the advantages of its probabilistic aspect for noisy shape alignment via a likelihood approach, and, fusing both aspects, show that the CWR leads to a feature space in which kernel PCA yields approximate closed curves and probability density functions.
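The sketch below is only a rough illustration of how a single complex-valued field can carry both kinds of information: its magnitude is a Gaussian-mixture-style density over hypothetical cluster centers and its phase grows along hypothetical curve normals. The functional form, the parameters `sigma` and `tau`, and all names are assumptions for illustration, not the paper's exact CWR.

```python
import numpy as np

def cwr_like_field(points, centers, normals, sigma=0.05, tau=0.02):
    """Evaluate an illustrative complex wave at `points` (n, 2).

    centers : (k, 2) cluster centers (mixture-model-like magnitude)
    normals : (k, 2) unit curve normals (signed-distance-like phase)
    """
    diff = points[:, None, :] - centers[None, :, :]           # (n, k, 2) offsets
    sq = (diff ** 2).sum(axis=-1)                             # squared distances to centers
    mag = np.exp(-sq / (2 * sigma ** 2)).sum(axis=1)          # mixture-style magnitude
    mag /= mag.max() + 1e-12
    nearest = sq.argmin(axis=1)                               # closest center per point
    signed = np.einsum('nd,nd->n',
                       diff[np.arange(len(points)), nearest],
                       normals[nearest])                      # offset along that center's normal
    return np.sqrt(mag) * np.exp(1j * signed / tau)           # |psi|^2 ~ density, arg(psi) ~ geometry
```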
For a twice continuously differentiable function S, we define the density function of its gradient (derivative in one dimension) s = ∇S as a random variable transformation of a uniformly distributed random variable, using s as the transformation function. Given N values of S sampled at equally spaced locations, we demonstrate, using the method of stationary phase, that the approximation error between the integral of the scaled, discrete power spectrum of the wave function and the integral of the true density function of s over an arbitrarily small interval is bounded above by O(1/N) as N → ∞ (τ → 0). In addition to its easy implementation and fast O(N log N) computation, requiring only the discrete Fourier transform, our framework for obtaining the derivative density does not involve any parameter selection, such as the number of histogram bins, the width of the histogram bins, the kernel width parameter, or the number of mixture components, as required by other widely applied methods like histograms and Parzen windows.
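A minimal NumPy sketch of the kind of computation described is given below; the specific wave-function form exp(iS/τ), the bin-to-derivative mapping, and the normalization are assumptions made for illustration.

```python
import numpy as np

def derivative_density(S_values, dx, tau):
    """Estimate the density of s = dS/dx from N uniform samples of S.

    Assumed wave function: phi(x) = exp(i S(x) / tau). By stationary phase,
    its power spectrum concentrates near frequencies u with S'(x) = 2*pi*tau*u,
    so each DFT bin is mapped to the derivative value s = 2*pi*tau*u.
    """
    N = len(S_values)
    phi = np.exp(1j * S_values / tau)
    power = np.abs(np.fft.fftshift(np.fft.fft(phi))) ** 2 / N   # O(N log N) via the FFT
    freqs = np.fft.fftshift(np.fft.fftfreq(N, d=dx))            # cycles per unit length
    s_grid = 2 * np.pi * tau * freqs                            # derivative value per bin
    density = power / (power.sum() * (s_grid[1] - s_grid[0]))   # normalize to unit area
    return s_grid, density

# Example: S(x) = sin(x) on [0, 2*pi), so s = cos(x) has density
# p(s) = 1 / (pi * sqrt(1 - s^2)) on (-1, 1).
x = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
s_grid, density = derivative_density(np.sin(x), dx=x[1] - x[0], tau=1e-3)
```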