The scalar shrinkage-thresholding operator (SSTO) is a key ingredient of many modern statistical signal processing algorithms, including sparse inverse problem solutions, wavelet denoising, and JPEG2000 image compression. In these applications, it is customary to select the threshold of the operator by solving a scalar sparsity-penalized quadratic optimization problem. In this work, we present a natural multidimensional extension of the SSTO. As in the scalar case, the threshold is determined by minimizing a convex quadratic form plus a Euclidean norm penalty; here, however, the optimization is performed over a domain of dimension N ≥ 1. The solution to this convex optimization problem is called the multidimensional shrinkage-thresholding operator (MSTO). The MSTO reduces to the standard SSTO in the special case N = 1. In the general case N > 1, the optimal MSTO threshold can be found by a simple convex line search. We present three illustrative applications of the MSTO in the context of non-linear regression: ℓ2-penalized linear regression, Group LASSO linear regression, and Group LASSO logistic regression.

In this paper, we introduce a multidimensional generalization of the scalar shrinkage-thresholding operator, defined as the minimizer of a convex quadratic form plus a Euclidean norm penalty. We analyze this nondifferentiable optimization problem and discuss its properties. In particular, in analogy to the scalar shrinkage operator, we show that this generalization yields a multidimensional shrinkage-thresholding operator (MSTO) that takes a vector as input and shrinks or thresholds it, depending on its Euclidean norm. For this purpose, we reformulate the problem as a constrained quadratic problem with a conic constraint. This reformulation leads to a theoretical result that transforms the multidimensional optimization problem into a simple line search, which can be implemented efficiently. We show by simulations that evaluating the MSTO using line search is competitive with state-of-the-art convex solvers. In the second part of the paper, we discuss applications of the MSTO to statistical regression. First, we consider Euclidean-norm penalized least squares and discuss its relation to ridge regression and robust regression [5]. Next, we address Group LASSO linear regression [6]. In the special case of a block-orthogonal design matrix, we show that the problem can be reduced to evaluating the MSTO for each block. For other Group LASSO problems, we propose two iterative applications of the MSTO. In the first approach, we use Block Coordinate Descent to solve the linear regression problem with an arbitrary design matrix. The second approach tackles more complicated cost functions, such as the logistic regression objective. Due to its similarity to the well-known class of Iterative Thresholding Algorithms [1], we name the latter Iterative Group Shrinkage-Thresholding (IGST). In both cases, the MSTO enables one to solve large-scale Group LASSO...
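To make these constructions concrete, the sketch below evaluates the MSTO under one plausible formalization: MSTO(H, g, λ) = argmin_x ½ xᵀHx − gᵀx + λ‖x‖₂ with H symmetric positive definite. The function names and the bisection scheme are illustrative assumptions, not the paper's implementation. For a nonzero minimizer, stationarity gives (H + (λ/t)I)x = g with t = ‖x‖₂, so the N-dimensional problem collapses to a scalar line search over t; the operator thresholds to exactly zero when ‖g‖₂ ≤ λ.

```python
import numpy as np

def msto(H, g, lam, iters=60):
    """Illustrative MSTO sketch: argmin_x 0.5 x'Hx - g'x + lam*||x||_2,
    with H symmetric positive definite (hypothetical helper, not the
    authors' code)."""
    g = np.asarray(g, dtype=float)
    if np.linalg.norm(g) <= lam:          # thresholding region: 0 is optimal
        return np.zeros_like(g)
    n = len(g)
    x_of = lambda t: np.linalg.solve(H + (lam / t) * np.eye(n), g)
    # phi(t) = ||x(t)|| - t is positive near t = 0 (because ||g|| > lam)
    # and negative at t = ||H^{-1} g||, so a root is bracketed in between.
    lo, hi = 1e-12, float(np.linalg.norm(np.linalg.solve(H, g)))
    for _ in range(iters):                # bisection line search over t
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if np.linalg.norm(x_of(mid)) > mid else (lo, mid)
    return x_of(0.5 * (lo + hi))
```

When H = I (and hence for each block of a block-orthogonal design), the operator has a closed form, the multivariate analog of scalar soft thresholding, and a Block Coordinate Descent pass for Group LASSO simply applies the MSTO group by group. Continuing the sketch above (the helper names and the exact update loop are again assumptions):

```python
def block_soft_threshold(g, lam):
    """Closed form of the MSTO for H = I: shrink g radially by lam,
    or threshold it to zero when ||g||_2 <= lam."""
    g = np.asarray(g, dtype=float)
    nrm = np.linalg.norm(g)
    return np.zeros_like(g) if nrm <= lam else (1.0 - lam / nrm) * g

def group_lasso_bcd(X, y, groups, lam, sweeps=50):
    """Sketch of Block Coordinate Descent for Group LASSO,
    min_b 0.5*||y - X b||^2 + lam * sum_g ||b_g||_2,
    where `groups` is a list of column-index arrays.  Each step solves
    exactly for one group via the MSTO; assumes every X[:, idx] has
    full column rank so X_g' X_g is positive definite."""
    b = np.zeros(X.shape[1])
    for _ in range(sweeps):               # a real solver would test convergence
        for idx in groups:
            Xg = X[:, idx]
            r = y - X @ b + Xg @ b[idx]   # residual with group g removed
            b[idx] = msto(Xg.T @ Xg, Xg.T @ r, lam)
    return b
```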
We present a novel factor analysis method that can be applied to the discovery of common factors shared among trajectories in multivariate time series data. These factors satisfy a precedence-ordering property: certain factors are recruited only after other factors are activated. Precedence orderings arise in applications where variables are activated in a specific but unknown order. The proposed method is based on a linear model that accounts for each factor's inherent delays and relative order. We present an algorithm to fit the model in an unsupervised manner using techniques from convex and nonconvex optimization that enforce sparsity of the factor scores and a consistent precedence order of the factor loadings. We illustrate the order-preserving factor analysis (OPFA) method for the problem of extracting precedence-ordered factors from a longitudinal (time-course) study of gene expression data.
Index Terms: Dictionary learning, genomic signal processing, misaligned data processing, structured factor analysis.
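As a generic illustration of the sparsity-enforcing alternating scheme described above (and only that: the precedence-order and delay constraints that distinguish OPFA are omitted, and all names below are hypothetical), one could alternate least-squares updates of the loadings and scores with an ℓ1 shrinkage on the scores:

```python
import numpy as np

def sparse_factor_fit(Y, k, lam, iters=100, seed=0):
    """Generic sparse factor analysis sketch (not the authors' OPFA):
    fit Y ~= A @ S with l1-sparse scores S by alternating least squares
    plus soft thresholding.  OPFA additionally constrains the loadings
    to share a consistent precedence order and per-factor delays."""
    m, n = Y.shape
    A = np.random.default_rng(seed).standard_normal((m, k))
    for _ in range(iters):
        S = np.linalg.lstsq(A, Y, rcond=None)[0]            # scores update
        S = np.sign(S) * np.maximum(np.abs(S) - lam, 0.0)   # l1 shrinkage
        A = np.linalg.lstsq(S.T, Y.T, rcond=None)[0].T      # loadings update
        A /= np.maximum(np.linalg.norm(A, axis=0), 1e-12)   # unit columns
    return A, S
```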