We conclude that DM is associated with increased formation of F2-isoprostanes, as a correlate of impaired glycemic control and enhanced lipid peroxidation. This may provide an important biochemical link between impaired glycemic control and persistent platelet activation. These results provide a rationale for dose-finding studies of antioxidant treatment in diabetes.
Tensor-based methods are receiving growing interest in scientific computing for the numerical solution of problems defined in high-dimensional tensor product spaces. A family of methods, called proper generalized decomposition (PGD) methods, has recently been introduced for the a priori construction of tensor approximations of the solutions of such problems. In this paper, we give a mathematical analysis of a family of progressive and updated PGDs for a particular class of problems associated with the minimization of a convex functional over a reflexive tensor Banach space.
In this paper we study the convergence of the well-known Greedy Rank-One Update Algorithm, which is used to construct a rank-one series solution for full-rank linear systems. The existence of the rank-one approximations is not new, but surprisingly the focus in the literature has been more on the applications side than on the convergence analysis. Our main contribution is to prove the convergence of the algorithm; we also study the rank-one approximation required at each step. We give some numerical examples and describe the algorithm's relationship with the Finite Element Method for High-Dimensional Partial Differential Equations based on the tensor product of one-dimensional bases. We illustrate this situation taking as a model problem the multidimensional Poisson equation with homogeneous Dirichlet boundary conditions.
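For concreteness, the following is a minimal numerical sketch, not code from the paper, of a greedy rank-one update for the 2D model Poisson problem mentioned above, with homogeneous Dirichlet boundary conditions and a separable operator A = A1 ⊗ I + I ⊗ A2. The discretization (1D finite differences rather than finite elements), the function names (laplacian_1d, als_rank_one, greedy_rank_one_solve), and the iteration counts are illustrative assumptions.

```python
# Greedy rank-one update sketch for -Laplace(x) = f on the unit square,
# homogeneous Dirichlet BC, 1D finite differences (illustrative assumptions).
import numpy as np

def laplacian_1d(n, h):
    """1D Dirichlet finite-difference Laplacian on n interior nodes."""
    return (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1)) / h**2

def als_rank_one(A1, A2, R, iters=50):
    """Alternating least squares for the rank-one energy correction u v^T
    of the current residual R (the operator is symmetric positive definite)."""
    n1, n2 = R.shape
    v = np.random.default_rng(0).standard_normal(n2)
    for _ in range(iters):
        # minimize over u with v fixed: ((v.v) A1 + (v, A2 v) I) u = R v
        u = np.linalg.solve((v @ v) * A1 + (v @ A2 @ v) * np.eye(n1), R @ v)
        # minimize over v with u fixed: ((u.u) A2 + (u, A1 u) I) v = R^T u
        v = np.linalg.solve((u @ u) * A2 + (u @ A1 @ u) * np.eye(n2), R.T @ u)
    return u, v

def greedy_rank_one_solve(A1, A2, F, n_terms=20):
    """Progressive construction of X ~ sum_k u_k v_k^T with A X ~ F."""
    X, R = np.zeros_like(F), F.copy()
    for _ in range(n_terms):
        u, v = als_rank_one(A1, A2, R)
        X += np.outer(u, v)
        R -= A1 @ np.outer(u, v) + np.outer(u, A2 @ v)   # residual update
    return X

n, h = 40, 1.0 / 41
A1 = A2 = laplacian_1d(n, h)
F = np.ones((n, n))                       # separable right-hand side f = 1
X = greedy_rank_one_solve(A1, A2, F)
print(np.linalg.norm(A1 @ X + X @ A2.T - F) / np.linalg.norm(F))
```

Each greedy step only requires solving two small one-dimensional linear systems per ALS sweep, which is the point of the separated representation: the full 2D (or d-dimensional) system is never assembled.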
The Proper Generalized Decomposition (PGD) is a methodology initially proposed for the solution of partial differential equations (PDEs) defined in tensor product spaces. It consists of constructing a separated representation of the solution of a given PDE. In this paper we consider the mathematical analysis of this framework for a larger class of problems in an abstract setting. In particular, we introduce a generalization of the Eckart-Young theorem which allows us to prove the convergence of the so-called progressive PGD for a large class of linear problems defined in tensor product Hilbert spaces.
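As background for the generalization mentioned above, the classical Eckart-Young theorem says that the truncated SVD gives the best rank-k approximation of a matrix in the Frobenius norm, with error equal to the l2 norm of the discarded singular values. The snippet below is a minimal numpy illustration of that classical statement only, not of the paper's extension to tensor Hilbert spaces; the matrix and rank are arbitrary choices.

```python
# Classical Eckart-Young illustration (not the paper's generalization).
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((60, 40)) @ rng.standard_normal((40, 40))
U, s, Vt = np.linalg.svd(M, full_matrices=False)

k = 5
Mk = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]           # best rank-k approximation
err = np.linalg.norm(M - Mk)                         # Frobenius-norm error
print(np.isclose(err, np.sqrt(np.sum(s[k:] ** 2))))  # Eckart-Young identity
```

The progressive PGD can be read as a generalization of this idea: instead of truncating an exact decomposition, it adds one separated (rank-one) term at a time to the approximation.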
In this paper we introduce and develop the notion of minimal subspaces in the framework of algebraic and topological tensor product spaces. This mathematical structure arises naturally in the study of tensor representations. We use minimal subspaces to prove the existence of a best approximation, for any element in a Banach tensor space, by means of a tensor given in a typical representation format (Tucker, hierarchical, or tensor train). We show that this result holds in a tensor Banach space with a norm stronger than the injective norm, and in an intersection of finitely many Banach tensor spaces satisfying some additional conditions. Examples using topological tensor products of standard Sobolev spaces are given.
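To make one of the representation formats mentioned above concrete, the sketch below builds a tensor-train (TT) representation of a full array by sequential truncated SVDs (the TT-SVD construction). It is an illustrative finite-dimensional example, not code or a result from the paper; the function names (tt_svd, tt_to_full), the fixed maximal rank, and the test tensor are assumptions made for the example.

```python
# Illustrative TT-SVD construction for a full 3-way array (assumed example).
import numpy as np

def tt_svd(tensor, max_rank):
    """Build TT cores G_k of shape (r_{k-1}, n_k, r_k) by sequential truncated SVDs."""
    dims = tensor.shape
    cores, r = [], 1
    C = tensor.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r_new = min(max_rank, len(s))
        cores.append(U[:, :r_new].reshape(r, dims[k], r_new))
        C = (np.diag(s[:r_new]) @ Vt[:r_new, :]).reshape(r_new * dims[k + 1], -1)
        r = r_new
    cores.append(C.reshape(r, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the TT cores back to a full array (only to check the error)."""
    full = cores[0]
    for G in cores[1:]:
        full = np.tensordot(full, G, axes=([-1], [0]))
    return full[0, ..., 0]   # drop the dummy boundary ranks r_0 = r_d = 1

# Nearly rank-one test tensor: u x v x w plus a small perturbation.
rng = np.random.default_rng(2)
u, v, w = rng.standard_normal(8), rng.standard_normal(9), rng.standard_normal(10)
T = np.einsum('i,j,k->ijk', u, v, w) + 1e-6 * rng.standard_normal((8, 9, 10))
cores = tt_svd(T, max_rank=2)
print(np.linalg.norm(tt_to_full(cores) - T) / np.linalg.norm(T))
```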