The solution of an elliptic boundary value problem is an infinitely differentiable function of the coefficient in the partial differential equation. When the (coefficient-dependent) energy norm is used, the result is a smooth, convex output least-squares functional. Using total variation regularization, it is possible to estimate discontinuous coefficients from interior measurements. The minimization problem is guaranteed to have a solution, which can be obtained in the limit from finite-dimensional discretizations of the problem. These properties hold in an abstract framework that encompasses several interesting problems: the standard (scalar) elliptic BVP in divergence form, the system of isotropic elasticity, and others.
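To make the output least-squares functional with total variation regularization concrete, here is a minimal 1D finite-difference sketch for estimating the coefficient a in -(a u')' = f. Everything here is an illustrative assumption, not taken from the paper: the function names, the grid, and the parameter values are hypothetical, and the misfit uses a plain L² norm rather than the coefficient-dependent energy norm described in the abstract.

```python
import numpy as np

def solve_bvp(a, f, h):
    """Finite-difference solve of -(a u')' = f on (0,1), u(0) = u(1) = 0.

    a: coefficient at the n cell midpoints; f: source at the n-1 interior nodes.
    """
    main = (a[:-1] + a[1:]) / h**2          # diagonal of the stiffness matrix
    off = -a[1:-1] / h**2                   # off-diagonal entries
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(A, f)

def tv_ols(a, z, f, h, beta=1e-3, eps=1e-8):
    """L2 output least-squares misfit plus smoothed total variation of a."""
    u = solve_bvp(a, f, h)
    misfit = 0.5 * h * np.sum((u - z) ** 2)
    tv = beta * h * np.sum(np.sqrt((np.diff(a) / h) ** 2 + eps))
    return misfit + tv

# Synthetic data: with a = 1 and f = pi^2 sin(pi x), the exact solution is sin(pi x).
n = 50
h = 1.0 / n
x = h * np.arange(1, n)
f = np.pi**2 * np.sin(np.pi * x)
z = solve_bvp(np.ones(n), f, h)             # "measured" interior data
```

The smoothing parameter `eps` makes the total variation term differentiable at zero, a standard device when the functional is to be minimized by gradient-based methods; evaluating `tv_ols` at the true coefficient gives a smaller value than at a perturbed one.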
The coefficient in a linear elliptic partial differential equation can be estimated from interior measurements of the solution. Posing the estimation problem as a constrained optimization problem with the PDE as the constraint allows the use of the augmented Lagrangian method, which is guaranteed to converge. Moreover, the convergence analysis encompasses discretization by finite element methods, so the proposed algorithm can be implemented and will produce a solution to the constrained minimization problem. All of these properties hold in an abstract framework that encompasses several interesting problems: the standard (scalar) elliptic BVP in divergence form, the system of isotropic elasticity, and others. In addition, the analysis allows for the use of total variation regularization, so rapidly varying or even discontinuous coefficients can be estimated.
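The augmented Lagrangian iteration at the heart of this approach can be sketched on a toy equality-constrained problem. This is a hedged illustration only: the function names and the toy instance are hypothetical, and in the paper the constraint is the discretized PDE rather than a scalar equation.

```python
import numpy as np

def augmented_lagrangian(x0, grad_f, c, grad_c, mu=10.0, outer=20, inner=500, lr=1e-2):
    """Augmented Lagrangian loop for min f(x) subject to c(x) = 0.

    The inner subproblem is solved by plain gradient descent, a stand-in
    for the Newton or CG solves one would use in practice.
    """
    x, lam = x0.astype(float), 0.0
    for _ in range(outer):
        for _ in range(inner):
            # gradient of f(x) + lam * c(x) + (mu/2) * c(x)^2
            x = x - lr * (grad_f(x) + (lam + mu * c(x)) * grad_c(x))
        lam += mu * c(x)                    # first-order multiplier update
    return x, lam

# Toy instance: min x^2 + y^2 subject to x + y = 1;
# the minimizer is (1/2, 1/2) with Lagrange multiplier -1.
x_star, lam_star = augmented_lagrangian(
    np.zeros(2),
    grad_f=lambda x: 2.0 * x,
    c=lambda x: x[0] + x[1] - 1.0,
    grad_c=lambda x: np.ones(2),
)
```

The multiplier update is what distinguishes the method from a pure quadratic penalty: the constraint is satisfied in the limit without sending the penalty parameter `mu` to infinity.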
In this short note, we investigate the inverse problem of parameter identification in quasi-variational inequalities. We develop an abstract nonsmooth regularization approach that subsumes total variation regularization and permits the identification of discontinuous parameters. We study the inverse problem in an optimization setting using the output least-squares formulation. We prove the existence of a global minimizer and establish convergence results for the optimization problem. We also discretize the identification problem for quasi-variational inequalities and provide a convergence analysis for the discrete problem. We give an application to the gradient obstacle problem.
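As a rough illustration of the forward variational-inequality solve underlying such identification problems, here is a projected-gradient iteration for a classical obstacle problem, a simpler cousin of the gradient obstacle problem mentioned above (there the constraint is on the gradient of the state, not the state itself). All names and parameter values are illustrative assumptions, not from the paper.

```python
import numpy as np

def obstacle_projected_gradient(n=100, iters=20000):
    """Projected gradient for the discrete obstacle problem

        min 1/2 u^T A u - f^T u  subject to u >= psi,

    with A the 1D Dirichlet Laplacian on the n-1 interior nodes.
    """
    h = 1.0 / n
    A = (2 * np.eye(n - 1) - np.eye(n - 1, k=1) - np.eye(n - 1, k=-1)) / h**2
    f = -np.ones(n - 1)              # uniform downward load on the membrane
    psi = -0.1 * np.ones(n - 1)      # flat obstacle below the membrane
    tau = h**2 / 4                   # step size below 2 / lambda_max(A)
    u = np.zeros(n - 1)
    for _ in range(iters):
        # gradient step on the energy, then projection onto {u >= psi}
        u = np.maximum(psi, u - tau * (A @ u - f))
    return u, psi
```

Without the obstacle the membrane would sag to -x(1-x)/2, dipping below -0.1 at the center, so the converged iterate touches the obstacle in a contact region around the midpoint while staying strictly above it near the boundary.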