In the general regression model $y_i = x_i'\beta + e_i$, for $i = 1, \ldots, n$ and $x_i \in \mathbf{R}^p$, the "regression quantile" $\hat\beta(\theta)$ estimates the coefficients of the linear regression function parallel to $x\beta$ and roughly lying above a fraction $\theta$ of the data. As introduced by Koenker and Bassett [Econometrica, 46 (1978), pp. 33-50], these regression quantiles are analogous to order statistics and provide a natural and appealing approach to the analysis of the general linear model. Computation of $\hat\beta(\theta)$ can be expressed as a parametric linear programming problem with $J_n$ distinct extremal solutions as $\theta$ goes from zero to one. That is, there will be $J_n$ breakpoints $\{\theta_j\}$, for $j = 1, \ldots, J_n$, such that $\hat\beta(\theta_j)$ is obtained from $\hat\beta(\theta_{j-1})$ by a single simplex pivot. Each $\hat\beta(\theta_j)$ is characterized by a specific subset of $p$ observations. Although no previous result restricts $J_n$ to be less than the upper bound $\binom{n}{p} = O(n^p)$, practical experience suggests that $J_n$ grows roughly linearly with $n$. Here it is shown that, in fact, $J_n = O(n \log n)$ in probability, where the distributional assumptions are those typical of multiple regression situations. The result is based on a probabilistic rather than combinatoric approach that should have general application to the probabilistic behavior of the number of pivots in (parametric) linear programming problems. The conditions are roughly that the constraint coefficients form random independent vectors and that the number of variables is fixed while the number of constraints tends to infinity.
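The parametric simplex method referred to above pivots directly from one breakpoint $\theta_j$ to the next; the sketch below is not that algorithm. It is only an illustrative Python sketch, assuming SciPy's linprog, that solves the Koenker-Bassett linear program for $\hat\beta(\theta)$ at each point of a fixed grid of $\theta$ values and counts changes in the set of $p$ exactly fitted observations that characterizes each solution. The function names rq_fit and count_breakpoints, the grid size, and the residual tolerance are choices made here for illustration only.

```python
import numpy as np
from scipy.optimize import linprog


def rq_fit(X, y, theta):
    """Regression quantile beta_hat(theta) via the standard LP formulation:
    minimize sum_i [theta*u_i + (1-theta)*v_i]
    subject to y_i - x_i'beta = u_i - v_i, with u_i, v_i >= 0 and beta free."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), theta * np.ones(n), (1 - theta) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    if not res.success:
        raise RuntimeError("LP solve failed at theta = %g" % theta)
    return res.x[:p]


def count_breakpoints(X, y, grid=500, tol=1e-7):
    """Rough count of J_n: scan theta over a grid and count changes in the
    set of observations fitted exactly (zero residual) by beta_hat(theta).
    This undercounts when breakpoints are closer together than the grid."""
    n, p = X.shape
    count, prev_basis = 0, None
    for theta in np.linspace(1.0 / n, 1.0 - 1.0 / n, grid):
        beta = rq_fit(X, y, theta)
        basis = frozenset(np.where(np.abs(y - X @ beta) < tol)[0])
        if prev_basis is not None and basis != prev_basis:
            count += 1
        prev_basis = basis
    return count


# Small simulated example: the observed count is far below the
# combinatorial bound binom(n, p) and roughly of order n log n.
rng = np.random.default_rng(0)
n, p = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
print(count_breakpoints(X, y))
```

Because the grid scan can register at most one basis change between consecutive grid points, it yields only a lower bound on $J_n$; an exact count requires following the parametric program from breakpoint to breakpoint by simplex pivots, as in the computation the abstract describes.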