Hyperspectral imaging is a cutting-edge remote sensing technique used for mapping vegetation properties, rock minerals and other materials. A major drawback of hyperspectral imaging devices is their intrinsically low spatial resolution. In this paper, we propose a method for increasing the spatial resolution of a hyperspectral image by fusing it with an image of higher spatial resolution that was obtained with a different imaging modality. This is accomplished by solving a variational problem in which the regularization functional is the directional total variation. To accommodate possible mis-registrations between the two images, we consider a non-convex blind super-resolution problem in which both the fused image and the corresponding convolution kernel are estimated. Using this approach, our model can realign the given images if needed. Our experimental results indicate that the non-convexity is negligible in practice and that reliable solutions can be computed using a variety of different optimization algorithms. Numerical results on real remote sensing data from plant sciences and urban monitoring show the potential of the proposed method and suggest that it is robust with respect to the regularization parameters, mis-registration and the shape of the kernel.

AMS classification: 49M37, 65K10, 90C30, 90C90. PACS: 42.30.Va, 42.68.Wt, 95.75.Pq, 95.75.Rs.

Figure 1. Three example data sets for image fusion in remote sensing. Each consists of a hyperspectral image (small image, only one channel shown) and an image of higher spatial resolution (large image). The goal is to create an image that has both high spatial and high spectral resolution.
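The following Python sketch only illustrates the flavor of such a variational fusion problem. It is not the paper's model: it uses an isotropic, smoothed total variation in place of the directional total variation, a fixed block-averaging operator instead of a jointly estimated blur kernel, and plain gradient descent; all function names, operators and parameter values are illustrative assumptions.

```python
import numpy as np

def downsample(x, s):
    """Block-average by a factor s (stand-in for the hyperspectral sensor's
    coarser sampling); assumes the image size is a multiple of s."""
    h, w = x.shape
    return x.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def fuse(hyper_band, guide, s=4, lam=0.05, eps=0.05, lr=0.1, iters=1000):
    """Gradient descent on
        0.5 * ||downsample(u, s) - hyper_band||^2 + lam * TV_eps(u),
    initialized with the high-resolution guide image."""
    u = guide.astype(float)
    for _ in range(iters):
        # data-fidelity gradient: adjoint of block-averaging applied to the residual
        r = downsample(u, s) - hyper_band
        grad_fid = np.kron(r, np.ones((s, s))) / s**2
        # gradient of the smoothed TV term: -div( grad(u) / |grad(u)|_eps )
        gx = np.diff(u, axis=0, append=u[-1:, :])
        gy = np.diff(u, axis=1, append=u[:, -1:])
        norm = np.sqrt(gx**2 + gy**2 + eps**2)
        px, py = gx / norm, gy / norm
        div = np.diff(px, axis=0, prepend=0.0) + np.diff(py, axis=1, prepend=0.0)
        u = u - lr * (grad_fid - lam * div)
    return u

# Toy usage: a random high-resolution "guide" and its noisy, downsampled version.
rng = np.random.default_rng(0)
guide = rng.random((64, 64))
hyper_band = downsample(guide, 4) + 0.01 * rng.standard_normal((16, 16))
fused = fuse(hyper_band, guide)
```

In the method described above, the regularizer would additionally be steered by the gradient directions of the guide image, and the downsampling would involve a convolution kernel estimated jointly with the fused image (the blind part that allows realignment).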
This work is concerned with the gradient flow of absolutely p-homogeneous convex functionals on a Hilbert space, which we show to exhibit finite (p < 2) or infinite (p ≥ 2) extinction time. We give upper bounds for the finite extinction time and establish convergence rates of the flow. Moreover, we study next-order asymptotics and prove that asymptotic profiles of the solution are eigenfunctions of the subdifferential operator of the functional. To this end, we compare with solutions of an ordinary differential equation which describes the evolution of eigenfunctions under the flow. Our work applies, for instance, to local and nonlocal versions of PDEs like p-Laplacian evolution equations, the porous medium equation, and fast diffusion equations, thereby generalizing many results from the literature to an abstract setting. We also demonstrate how our theory extends to general homogeneous evolution equations which are not necessarily gradient flows. Here we discover an interesting integrability condition which characterizes whether or not asymptotic profiles are eigenfunctions.

These equations can also be studied as a fourth-order gradient flow in H⁻¹. Another class of examples are the fast diffusion equations for 1 < p < 2, the linear heat equation for p = 2, and the porous medium equation for p > 2, i.e.

∂_t u = Δ(|u|^{p−2} u),

which, complemented with suitable boundary conditions, can also be interpreted as Hilbert space gradient flows (cf. [29] for the porous medium / fast diffusion case). Furthermore, as long as homogeneity is preserved, our general model covers non-local versions of the equations above as well. Remarkably, we can also address an eigenvalue problem of the ∞-Laplacian operator [24, 31] with our framework by regarding it from a purely energetic point of view. That means we set J(u) = ‖∇u‖_∞ for u ∈ W^{1,∞} ∩ L² and J(u) = +∞ else, which meets all our assumptions under sufficient regularity of the domain, and interpret the ∞-Laplacian as the L²-subdifferential operator of J.

The main objective of this work is to prove that asymptotic profiles of the gradient flow (GF) are eigenfunctions of the subdifferential operator ∂J. By an asymptotic profile we refer to a suitably rescaled version of the actual solution u(t) of the gradient flow. More precisely, we look for a rescaling a(t) such that u(t)/a(t) converges to some w* as t tends to the extinction time of the flow (respectively t → ∞). Here w* is an eigenfunction of ∂J, meaning that λw* ∈ ∂J(w*) for some λ ∈ R, and by "extinction time" we refer to the (finite or infinite) time at which the solution of the gradient flow stops changing, meaning ∂_t u(t) = 0 (respectively the minimal time such that J(u(t)) = 0). The rescaling is chosen in such a way that it amplifies the shape of u(t) immediately before it reaches the state of lowest energy as described by the functional J. Furthermore, it should be noted that eigenfunctions of ∂J are self-similar in the sense that they only shrink under the gradient flow (GF) without changing their shape. If the energy is a quadratic ...
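To make the connection between p-homogeneity, eigenfunctions and (finite or infinite) extinction time concrete, here is a short separable-solution computation along the lines of the ODE mentioned in the abstract; the normalization a(0) = 1 and the assumption λ > 0 are mine.

```latex
% If J is absolutely p-homogeneous, its subdifferential is (p-1)-homogeneous:
%   \partial J(cu) = c^{p-1}\,\partial J(u) for c > 0.
% Let w be an eigenfunction, \lambda w \in \partial J(w) with \lambda > 0, and try
% the ansatz u(t) = a(t)\,w in the gradient flow \partial_t u \in -\partial J(u):
\[
  a'(t)\,w = -a(t)^{p-1}\,\lambda\, w
  \quad\Longleftrightarrow\quad
  a'(t) = -\lambda\, a(t)^{p-1}, \qquad a(0) = 1 .
\]
% Solving this ODE gives the three regimes stated in the abstract:
\[
  a(t) =
  \begin{cases}
    \bigl(1 - (2-p)\lambda t\bigr)^{\frac{1}{2-p}}, & p < 2
      \quad\text{(extinction at } T = \tfrac{1}{(2-p)\lambda} < \infty),\\[4pt]
    e^{-\lambda t}, & p = 2,\\[4pt]
    \bigl(1 + (p-2)\lambda t\bigr)^{-\frac{1}{p-2}}, & p > 2
      \quad\text{(infinite extinction time)}.
  \end{cases}
\]
% In all three cases u(t) = a(t) w only shrinks without changing its shape,
% which is the self-similarity of eigenfunctions mentioned above, and
% u(t)/a(t) = w is trivially the asymptotic profile.
```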
In this work we analyse the functional J(u) = ‖∇u‖_∞ defined on Lipschitz functions with homogeneous Dirichlet boundary conditions. Our analysis is performed directly on the functional, without the need to approximate it with smooth p-norms. We prove that its ground states coincide with multiples of the distance function to the boundary of the domain. Furthermore, we compute the L²-subdifferential of J and characterize the distance function as the unique non-negative eigenfunction of the subdifferential operator. We also study properties of general eigenfunctions, in particular their nodal sets. Moreover, we prove that the distance function can be computed as the asymptotic profile of the gradient flow of J and construct analytic solutions of fast marching type. In addition, we give a geometric characterization of the extreme points of the unit ball of J. Finally, we transfer many of these results to a discrete version of the functional defined on a finite weighted graph. Here, we analyze properties of distance functions on graphs and their gradients. The main difference between the continuum and the discrete setting is that on a graph the distance function is not the unique non-negative eigenfunction.
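Since the discrete part of this work revolves around distance functions on weighted graphs, the following minimal Python sketch computes a distance-to-boundary function by multi-source Dijkstra. The graph encoding (adjacency lists of (neighbour, edge length) pairs) and the interpretation of weights directly as lengths are my assumptions and need not match the paper's conventions.

```python
import heapq

def distance_to_boundary(vertices, edges, boundary):
    """Multi-source Dijkstra: shortest-path distance from every vertex to the
    nearest boundary vertex.  `edges[v]` is a list of (neighbour, length)
    pairs with non-negative lengths."""
    dist = {v: float("inf") for v in vertices}
    heap = [(0.0, b) for b in boundary]
    for b in boundary:
        dist[b] = 0.0
    heapq.heapify(heap)
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist[v]:
            continue  # stale heap entry
        for w, length in edges.get(v, []):
            nd = d + length
            if nd < dist[w]:
                dist[w] = nd
                heapq.heappush(heap, (nd, w))
    return dist

# Path graph 0-1-2-3-4 with unit edge lengths and boundary {0, 4}:
V = list(range(5))
E = {v: [(v - 1, 1.0), (v + 1, 1.0)] for v in V}
E[0], E[4] = [(1, 1.0)], [(3, 1.0)]
print(distance_to_boundary(V, E, {0, 4}))  # {0: 0, 1: 1, 2: 2, 3: 1, 4: 0}
```

On this path graph the result is the discrete "tent" profile, i.e. the graph analogue of the distance function to the boundary discussed above.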
We consider a family of variational regularization functionals for a generic inverse problem, where the data fidelity and regularization term are given by powers of a Hilbert norm and an absolutely one-homogeneous functional, respectively. We investigate the small and large time behavior of the associated solution paths and, in particular, prove finite extinction time for a large class of functionals. Depending on the powers, we also show that the solution paths are of bounded variation or even Lipschitz continuous. In addition, it turns out that the models are "almost" mutually equivalent in terms of the minimizers they admit. We then apply our results to define and compare two different nonlinear spectral representations of data and show that only one of them is able to decompose a linear combination of nonlinear eigenvectors into the individual eigenvectors. Finally, we also briefly address piecewise affine solution paths.
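The abstract does not spell out the functionals; one generic way to write such a family, in my own notation (forward operator A, datum f, Hilbert norm ‖·‖_H, powers r, s ≥ 1 and regularization parameter t > 0), is sketched below as an assumption rather than the paper's exact model.

```latex
% Variational regularization with powered fidelity and powered regularizer:
\[
  u(t) \in \operatorname*{arg\,min}_{u}\;
    \frac{1}{r}\,\| A u - f \|_{H}^{\,r} \;+\; t\, J(u)^{s},
  \qquad r,\, s \ge 1,
\]
% where J is convex and absolutely one-homogeneous, i.e. J(cu) = |c| J(u).
% The solution path is the map t \mapsto u(t); "finite extinction time" means
% there is T < \infty such that u(t) no longer changes for t \ge T.
% A nonlinear eigenvector is commonly defined by the inclusion
% \lambda u \in \partial J(u) for some \lambda > 0.
```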
Lipschitz learning is a graph-based semi-supervised learning method in which labels are extended from a labeled to an unlabeled data set by solving the infinity Laplace equation on a weighted graph. In this work we prove uniform convergence rates for solutions of the graph infinity Laplace equation as the number of vertices grows to infinity. Their continuum limits are absolutely minimizing Lipschitz extensions (AMLEs) with respect to the geodesic metric of the domain from which the graph vertices are sampled. We work under very general assumptions on the graph weights, the set of labeled vertices and the continuum domain. Our main contribution is that we obtain quantitative convergence rates even for very sparsely connected graphs, as they typically appear in applications like semi-supervised learning. In particular, our framework allows for graph bandwidths down to the connectivity radius. To prove this, we first show a quantitative convergence statement for graph distance functions to geodesic distance functions in the continuum. Using the 'comparison with distance functions' principle, we can then pass these convergence statements to infinity harmonic functions and AMLEs.
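As a toy illustration of Lipschitz learning, the Python sketch below extends labels on a graph by iterating the local midpoint-of-extremes update. This corresponds to the unit-weight graph infinity Laplace equation and a simple Gauss–Seidel-type solver, both of which are simplifying assumptions rather than the weighted setting analyzed in the paper.

```python
import numpy as np

def lipschitz_learning(adj, labels, tol=1e-8, max_iter=100_000):
    """Extend the values in `labels` (dict: vertex index -> label) to all
    vertices of an undirected graph given by adjacency lists `adj`, by
    repeatedly setting, at every unlabeled vertex x,
        u(x) <- ( max_{y ~ x} u(y) + min_{y ~ x} u(y) ) / 2.
    At a fixed point the unit-weight graph infinity Laplacian vanishes
    on the unlabeled vertices."""
    n = len(adj)
    u = np.zeros(n)
    for v, val in labels.items():
        u[v] = val
    unlabeled = [v for v in range(n) if v not in labels]
    for _ in range(max_iter):
        change = 0.0
        for v in unlabeled:
            neighbours = u[adj[v]]
            new = 0.5 * (neighbours.max() + neighbours.min())
            change = max(change, abs(new - u[v]))
            u[v] = new
        if change < tol:
            break
    return u

# Path graph 0-1-2-3 with labels u(0)=0 and u(3)=1; the extension is linear.
adj = [[1], [0, 2], [1, 3], [2]]
print(lipschitz_learning(adj, {0: 0.0, 3: 1.0}))  # approx. [0, 1/3, 2/3, 1]
```

On this path graph the fixed point interpolates the two labels linearly, matching the intuition that AMLEs extend boundary data with the smallest possible local Lipschitz constant.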