A new multivariate concept of quantile, based on a directional version of Koenker and Bassett's traditional regression quantiles, is introduced for multivariate location and multiple-output regression problems. In their empirical version, those quantiles can be computed efficiently via linear programming techniques. Consistency, Bahadur representation and asymptotic normality results are established. Most importantly, the contours generated by those quantiles are shown to coincide with the classical halfspace depth contours associated with the name of Tukey. This relation not only allows for efficient depth contour computations by means of parametric linear programming, but also for transferring asymptotic results, such as Bahadur representations, from the quantile universe to the depth universe. Finally, linear programming duality opens the way to promising developments in depth-related multivariate rank-based inference.

The main ideas of this definition were exposed in an unpublished master's thesis by Laine [21], quoted in [16]. In this paper, we carefully revive Laine's ideas, and systematically develop and prove the main properties of the concept he introduced. A huge literature has been devoted to the problem of extending the fundamental one-dimensional concept of quantile to a multivariate setting; see, for instance, [1, 3-7, 10, 15, 19, 34] and [37], or [33] for a recent survey. An equally huge literature deals with the concept of (location) depth; see [9, 22, 39] and [40] for a comprehensive account. The philosophies underlying those two concepts are, at first sight, quite different and even, to some extent, opposite. While quantiles resort to analytical characterizations through inverse distribution functions or L1 optimization, depth often derives from more geometric considerations such as halfspaces, simplices, ellipsoids and projections. Both carry advantages and some drawbacks. Analytical definitions usually bring in efficient algorithms and tractable asymptotics. The geometric ones enjoy attractive equivariance properties and intuitive content, but their probabilistic study and asymptotics are generally trickier, while their implementation, as a rule, leads to heavy combinatorial algorithms; a highly elegant analytical approach to depth has been proposed in [24], but does not help much in that respect.

Yet, beyond those sharp methodological differences, quantiles and depth obviously exhibit a close conceptual kinship. In the univariate case, all definitions basically agree that the depth of a point x ∈ R with respect to a probability distribution P with strictly monotone distribution function F should be min(F(x), 1 − F(x)), so that the only points with depth d are x_d := F^{-1}(d) and x_{1-d} := F^{-1}(1 − d), the quantiles of orders d and 1 − d.
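The empirical building block behind those directional quantiles is Koenker and Bassett's check-function quantile, whose sample version is a small linear program. As a hedged illustration of the linear-programming claim above, the following Python sketch recovers a univariate sample tau-quantile from that program; the function name sample_quantile_lp and the use of scipy.optimize.linprog are illustrative choices made here, not part of the original paper, which works with directional, multiple-output versions of this problem.

```python
import numpy as np
from scipy.optimize import linprog

def sample_quantile_lp(y, tau):
    """Empirical tau-quantile of a univariate sample via the Koenker-Bassett
    linear program: minimize sum_i rho_tau(y_i - a) over a, after splitting
    each residual y_i - a into positive and negative parts r+_i and r-_i."""
    n = len(y)
    # decision vector: [a, r+_1..r+_n, r-_1..r-_n]
    c = np.concatenate(([0.0], tau * np.ones(n), (1.0 - tau) * np.ones(n)))
    # equality constraints: a + r+_i - r-_i = y_i
    A_eq = np.hstack((np.ones((n, 1)), np.eye(n), -np.eye(n)))
    bounds = [(None, None)] + [(0, None)] * (2 * n)  # a free, residual parts >= 0
    res = linprog(c, A_eq=A_eq, b_eq=np.asarray(y, dtype=float),
                  bounds=bounds, method="highs")
    return res.x[0]

rng = np.random.default_rng(0)
y = rng.normal(size=200)
# should be close to the usual empirical quantile (up to interpolation convention)
print(sample_quantile_lp(y, 0.25), np.quantile(y, 0.25))
```

In the directional, multiple-output setting of the paper, the scalar location a is replaced by the coefficients of a hyperplane fitted to the data projected onto a chosen direction; the same residual-splitting device keeps the empirical problem a linear program, which is what makes parametric linear programming over the quantile order applicable.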
This paper sheds some new light on the multivariate (projectional) quantiles recently introduced in Kong and Mizera (2008). Contrary to the sophisticated set analysis used there, we adopt a more parametric approach and study the subgradient conditions associated with these quantiles. In this setup, we introduce Lagrange multipliers which can be interpreted in various interesting ways. We also link these quantiles with portfolio optimization and present an alternative proof that the resulting quantile regions coincide with the halfspace depth ones. Our proof makes the link between halfspace depth contours and univariate quantiles of projections more explicit and results in an exact computation of sample quantile regions (hence also of halfspace depth regions) from projectional quantiles. Throughout, we systematically consider the regression case, which was barely touched on in Kong and Mizera (2008). Above all, we study the projectional regression quantile regions and compare them with those resulting from the approach considered in Hallin, Paindaveine, and Šiman (2009). To gain in generality and to make the comparison between both concepts easier, we present a general framework for directional multivariate (regression) quantiles which includes both approaches as particular cases and is of interest in itself.
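Since the projectional tau-quantile in a direction u is simply the univariate tau-quantile of the projected data u'Y, the associated quantile region is an intersection of halfspaces over directions, which is what ties it to halfspace depth regions. The Python sketch below approximates membership in that region over a finite grid of directions in dimension two; the function name and the grid-based approximation are assumptions made for illustration here, not the exact-computation result discussed above.

```python
import numpy as np

def projectional_region_membership(points, data, tau, n_dirs=360):
    """Approximate membership in the projectional tau-quantile region
    (equivalently, a halfspace depth region) by intersecting, over a grid
    of unit directions u, the halfspaces {x : u'x >= tau-quantile of u'data}."""
    angles = np.linspace(0, 2 * np.pi, n_dirs, endpoint=False)
    U = np.column_stack((np.cos(angles), np.sin(angles)))  # unit directions
    cutoffs = np.quantile(data @ U.T, tau, axis=0)          # one cutoff per direction
    return np.all(points @ U.T >= cutoffs, axis=1)

rng = np.random.default_rng(1)
data = rng.normal(size=(500, 2))
grid = rng.uniform(-3, 3, size=(1000, 2))
inside = projectional_region_membership(grid, data, 0.2)
print(inside.sum(), "of", len(grid), "test points fall in the approximate region")
```

Because only finitely many directions are used, the grid-based intersection can only be larger than the full intersection over all directions; refining the grid tightens the approximation from the outside, whereas the exact computation mentioned in the abstract avoids any such discretization.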
We describe in detail the algorithm solving the parametric programming problem involved, and illustrate the resulting procedure on simulated and real data. We also evaluate the efficiency of our Matlab implementation of the algorithm through extensive simulations. To the best of our knowledge, our code is the first one that allows for computing halfspace depth regions beyond dimension two.
A new quantile regression concept, based on a directional version of Koenker and Bassett's traditional single-output one, has been introduced in [Ann. Statist. (2010) 38 635-669] for multiple-output location/linear regression problems. The polyhedral contours provided by the empirical counterpart of that concept, however, cannot adapt to unknown nonlinear and/or heteroskedastic dependencies. This paper therefore introduces local constant and local linear (actually, bilinear) versions of those contours, which both allow one to asymptotically recover the conditional halfspace depth contours that completely characterize the response's conditional distributions. Bahadur representation and asymptotic normality results are established. Illustrations are provided on both simulated and real data.
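To convey the local constant idea in its simplest form, the sketch below computes a kernel-weighted conditional tau-quantile for a single-output response by minimizing a weighted check-function loss, again as a linear program. This is only a hedged, single-output illustration of the local-constant weighting device; the paper's actual objects are multiple-output (bilinear) contours, and the Gaussian kernel, the bandwidth h and the function name are assumptions made here for the example.

```python
import numpy as np
from scipy.optimize import linprog

def local_constant_quantile(x0, X, Y, tau, h):
    """Local constant conditional tau-quantile of Y given X = x0 in the
    single-output case: minimize sum_i w_i * rho_tau(Y_i - a) over a,
    with Gaussian kernel weights w_i = exp(-((X_i - x0)/h)^2 / 2).
    Hypothetical sketch only, not the paper's multiple-output contours."""
    w = np.exp(-0.5 * ((np.asarray(X) - x0) / h) ** 2)
    n = len(Y)
    # decision vector: [a, r+_1..r+_n, r-_1..r-_n]; weighted check loss
    c = np.concatenate(([0.0], tau * w, (1.0 - tau) * w))
    A_eq = np.hstack((np.ones((n, 1)), np.eye(n), -np.eye(n)))
    bounds = [(None, None)] + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=np.asarray(Y, dtype=float),
                  bounds=bounds, method="highs")
    return res.x[0]

rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, 400)
Y = np.sin(X) + (0.2 + 0.1 * X ** 2) * rng.normal(size=400)  # heteroskedastic noise
print(local_constant_quantile(0.0, X, Y, tau=0.75, h=0.3))
```

The point of the weighting is that the fitted quantile, and hence the contour built from such fits, changes with the covariate value x0, which is exactly the adaptivity to nonlinear and heteroskedastic dependence that the fixed polyhedral contours lack.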