Diffeological spaces, first introduced by J.M. Souriau in the 1980s, are a natural generalization of smooth manifolds. In this paper, we give a short overview of the objects necessary for optimization techniques on diffeological spaces. More precisely, we generalize the concepts of tangent spaces, Riemannian structures, retractions, and Levi-Civita connections.
Diffeological spaces, first introduced by J.M. Souriau in the 1980s, are a natural generalization of smooth manifolds. However, optimization techniques are so far only known on manifolds. Generalizing these techniques to diffeological spaces is challenging for several reasons. One of the main reasons is that there are various definitions of tangent spaces which do not coincide. Additionally, one needs a generalization of a Riemannian space in order to define gradients, which are indispensable for optimization methods. This paper is devoted to a definition of a tangent space suitable for optimization methods. Based on this definition, we present a diffeological Riemannian space and a diffeological gradient, which we need for an optimization algorithm on diffeological spaces. We give examples of these novel objects. Moreover, in order to be able to update the iterates in an optimization algorithm on diffeological spaces, we present a diffeological retraction and the Levi-Civita connection on diffeological spaces.
In this poster, we present optimization techniques on diffeological spaces. Diffeological spaces, first introduced by J.M. Souriau in the 1980s, are a natural generalization of smooth manifolds. In order to generalize optimization methods known on manifolds to diffeological spaces, we define various objects such as a diffeological tangent space, a diffeological Riemannian space, and a diffeological gradient. In addition, we give the definition of a diffeological retraction. These objects are necessary for formulating the steepest descent method on diffeological spaces. We present this method and apply it to an example.
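The steepest descent method described above follows the classical manifold template: at each iterate, compute the Riemannian gradient and step along its negative via a retraction. A minimal sketch of this template on the unit sphere (a smooth manifold, the special case that diffeological spaces generalize) is given below; the cost function, step size, and helper names are our own illustrative choices, not the objects defined in the abstracts.

```python
import numpy as np

def sphere_retraction(x, v):
    """Retract a tangent vector v at x back onto the unit sphere
    by moving to x + v and renormalizing."""
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_gradient(x, egrad):
    """Project the Euclidean gradient onto the tangent space at x,
    i.e. remove the component along x."""
    return egrad - np.dot(x, egrad) * x

# Example cost f(x) = x^T A x restricted to the sphere; its minimum
# is the smallest eigenvalue of A, attained at the eigenvector.
A = np.diag([3.0, 2.0, 1.0])
f = lambda x: x @ A @ x

x = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
for _ in range(200):
    g = riemannian_gradient(x, 2.0 * A @ x)  # Euclidean grad of f is 2Ax
    x = sphere_retraction(x, -0.1 * g)       # steepest descent step

# f(x) approaches 1.0, the smallest eigenvalue of A
```

The retraction replaces the exponential map with a cheap first-order approximation; this is exactly the role a diffeological retraction plays when the iterates live in a diffeological space rather than a manifold.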
Shape optimization problems constrained by variational inequalities (VIs) are non-smooth and non-convex optimization problems. The non-smoothness arises from the variational inequality constraint, which makes it challenging to derive optimality conditions. Besides the non-smoothness, there are complementarity aspects due to the VIs as well as distributed, non-linear, non-convex, and infinite-dimensional aspects due to the shapes, which complicate setting up an optimality system and, thus, developing fast and higher-order solution algorithms. In this paper, we consider Newton derivatives in order to formulate optimality conditions. In this context, we set up a Newton-shape derivative scheme. Examples show the application of the proposed scheme.
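The Newton derivatives mentioned above are the generalized derivatives that make semismooth Newton methods applicable to non-smooth equations such as those arising from complementarity conditions. A minimal one-dimensional illustration, with an example equation of our own choosing (not the paper's Newton-shape derivative scheme), is:

```python
# Semismooth Newton method for the piecewise-linear equation
#   F(x) = max(x, 0) + 2x - 3 = 0,
# whose root is x = 1. The max operator is not differentiable at 0,
# but it admits a Newton derivative, which suffices for Newton's method.

def F(x):
    return max(x, 0.0) + 2.0 * x - 3.0

def newton_derivative(x):
    # A Newton derivative of max(x, 0) is 1 for x > 0 and 0 otherwise
    # (any value in [0, 1] may be chosen at the kink x = 0).
    return (1.0 if x > 0 else 0.0) + 2.0

x = -5.0
for _ in range(20):
    if abs(F(x)) < 1e-12:
        break
    x = x - F(x) / newton_derivative(x)  # semismooth Newton step

# x reaches the root 1.0 after two steps: piecewise-linear equations
# are solved in finitely many semismooth Newton iterations
```

The same mechanism, applied in function space to the complementarity reformulation of a VI, is what allows optimality conditions to be formulated despite the non-smoothness of the constraint.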