Janet Iyabo Idowu, A. T. Owolabi, Olasunkanmi James Oladapo, Kayode Ayinde, O. A. Oshuoporu, A. N. Alao
This study examines the challenges faced by the ordinary least squares (OLS) estimator, traditionally regarded as the best linear unbiased estimator in the classical linear regression model. Despite its reliability under specific conditions, OLS falters in the presence of multicollinearity, a problem frequently encountered in regression analyses. To combat this issue, various ridge regression estimators have been developed, characterized as one-parameter and two-parameter ridge-type estimators. In this context, our research introduces novel two-parameter estimators, building on a recently developed one-parameter ridge estimator, to mitigate the impact of multicollinearity in linear regression models. Theoretical analysis and simulation experiments were conducted to assess the performance of the proposed estimators. The results reveal that, under certain conditions, the new estimators outperform existing estimators, attaining a significantly smaller mean squared error. Real-life data were also employed to validate these findings, and the outcomes align with those of the theoretical analysis and simulations.
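The abstract contrasts OLS, which is unbiased but unstable under multicollinearity, with ridge-type shrinkage estimators. The sketch below illustrates that contrast with the standard one-parameter ridge form (X'X + kI)^(-1) X'y; the simulated design, the ridge parameter k = 1.0, and the comparison of coefficient mean squared errors are illustrative assumptions, not the paper's proposed two-parameter estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a design matrix whose columns are nearly identical,
# so X'X is close to singular (severe multicollinearity).
n, p = 100, 4
z = rng.normal(size=(n, 1))
X = z + 0.01 * rng.normal(size=(n, p))
beta = np.ones(p)                               # true coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)

def ols(X, y):
    # OLS estimator: (X'X)^{-1} X'y — high variance when X'X is ill-conditioned.
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    # One-parameter ridge estimator: (X'X + kI)^{-1} X'y — biased but stabilized.
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

def coef_mse(b):
    # Squared estimation error of the coefficient vector.
    return float(np.sum((b - beta) ** 2))

b_ols = ols(X, y)
b_ridge = ridge(X, y, k=1.0)     # k chosen arbitrarily for illustration
print("OLS coefficient MSE:  ", coef_mse(b_ols))
print("Ridge coefficient MSE:", coef_mse(b_ridge))
```

With near-duplicate columns, the OLS coefficients vary wildly while ridge shrinkage keeps them near the truth, which is the trade-off (a small bias for a large variance reduction) that motivates the estimators studied in the paper.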