JOURNAL ARTICLE

Min-Max Bias Robust Regression

R. Douglas Martin, V. J. Yohai, Ruben H. Zamar

Journal: The Annals of Statistics  Year: 1989  Vol: 17 (4)  Publisher: Institute of Mathematical Statistics

Abstract

This paper considers the problem of minimizing the maximum asymptotic bias of regression estimates over $\varepsilon$-contamination neighborhoods for the joint distribution of the response and carriers. Two classes of estimates are treated: (i) $M$-estimates with bounded function $\rho$ applied to the scaled residuals, using a very general class of scale estimates, and (ii) bounded influence function type generalized $M$-estimates. Estimates in the first class are obtained as the solution of a minimization problem, while estimates in the second class are specified by an estimating equation. The first class of $M$-estimates is sufficiently general to include both Huber Proposal 2 simultaneous estimates of regression coefficients and residual scale, and Rousseeuw-Yohai $S$-estimates of regression. It is shown that an $S$-estimate based on a jump-function type $\rho$ solves the min-max bias problem for the class of $M$-estimates with very general scale. This estimate is obtained by minimizing the $\alpha$-quantile of the squared residuals, where $\alpha = \alpha(\varepsilon)$ depends on the fraction of contamination $\varepsilon$. As $\varepsilon \rightarrow 0.5$, $\alpha(\varepsilon) \rightarrow 0.5$ and the min-max estimator approaches the least median of squared residuals estimator introduced by Rousseeuw. For the bounded influence class of $GM$-estimates, it is shown that the "sign" type nonlinearity yields the min-max estimate. This estimate coincides with the minimum gross-error sensitivity $GM$-estimate. For $p = 1$, the optimal $GM$-estimate is in fact optimal among the class of all equivariant regression estimates. The min-max $S$-estimator has a breakdown point which is independent of the number of carriers $p$ and tends to 0.5 as $\varepsilon$ increases to 0.5, but has a slow rate of convergence. The min-max $GM$-estimate has the usual rate of convergence, but a breakdown point which decreases to zero with increasing $p$. Finally, we compare the min-max biases for both types of estimates for the case where the nominal model is multivariate normal.
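The abstract's min-max $S$-estimate minimizes the $\alpha$-quantile of the squared residuals, reducing to Rousseeuw's least median of squares (LMS) when $\alpha = 0.5$. The criterion can be illustrated with a minimal Python sketch for simple linear regression; the synthetic data, the random-search minimizer, and all names here are hypothetical illustrations (practical $S$-estimators use subsampling algorithms rather than this crude search):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical contaminated data: y = 1 + 2x plus 20% gross outliers.
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + 0.1 * rng.normal(size=n)
outliers = rng.random(n) < 0.2
y[outliers] += 10.0  # shift a fraction of the responses far off the line

def quantile_s_objective(b0, b1, x, y, alpha=0.5):
    """alpha-quantile of squared residuals; alpha = 0.5 is the LMS criterion."""
    r2 = (y - b0 - b1 * x) ** 2
    return np.quantile(r2, alpha)

# Crude random search over the (intercept, slope) plane, for illustration only.
best, best_val = (0.0, 0.0), np.inf
for _ in range(5000):
    b0, b1 = rng.uniform(-5, 5, size=2)
    val = quantile_s_objective(b0, b1, x, y)
    if val < best_val:
        best, best_val = (b0, b1), val

print(best)  # typically lands near the true coefficients (1.0, 2.0)
```

Because the criterion depends on a quantile of the squared residuals rather than their sum, a minority of arbitrarily bad points cannot drag the fit away from the bulk of the data, which is the source of the high breakdown point the abstract describes (at the cost of a slow rate of convergence).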

Keywords:
Mathematics, Estimator, Bounded function, Quantile, Statistics, Mean squared error, Applied mathematics, Mathematical analysis

Metrics

Cited By: 146
FWCI (Field Weighted Citation Impact): 7.20
Refs: 20
Citation Normalized Percentile: 0.98 (top 1%)


Topics

Advanced Statistical Methods and Models (Physical Sciences → Mathematics → Statistics and Probability)
Statistical Methods and Inference (Physical Sciences → Mathematics → Statistics and Probability)
Advanced Statistical Process Monitoring (Social Sciences → Decision Sciences → Statistics, Probability and Uncertainty)

Related Documents

JOURNAL ARTICLE

Min–max–min robust combinatorial optimization

Christoph Buchheim, Jannis Kurtz

Journal: Mathematical Programming  Year: 2016  Vol: 163 (1-2)  Pages: 1-23
JOURNAL ARTICLE

Robust min-max localization a

Ryan A. Parker, Shahrokh Valaee

Year: 2006 Vol: 36 Pages: 1000-1005
JOURNAL ARTICLE

Min max min robust (relative) regret combinatorial optimization

Alejandro Crema

Journal: Mathematical Methods of Operations Research  Year: 2020  Vol: 92 (2)  Pages: 249-283
BOOK-CHAPTER

Evolving Granular Fuzzy Min-Max Regression

Alisson Porto, Fernando Gomide

Series: Advances in Intelligent Systems and Computing  Year: 2017  Pages: 162-171
JOURNAL ARTICLE

Robust min-max regret covering problems

Amadeu Almeida Coco, Andréa Cynthia Santos, Thiago F. Noronha

Journal: Computational Optimization and Applications  Year: 2022  Vol: 83 (1)  Pages: 111-141