The Superiority of a Class of Linear Estimators of the Regression Coefficient under the Pitman Closeness Criterion
Abstract
Let the linear regression model be Y_{n\times 1}=X_{n\times p}\beta_{p\times 1}+\varepsilon_{n\times 1}, where n\ge p, \operatorname{rank}(X)=s, and \varepsilon \sim N_n\left(0, \sigma^2 I\right). Suppose that the LS solution and the linear estimator of the regression coefficient are \widehat\beta=\left(X^\prime X\right)^{-} X^\prime y and \widetilde\beta_\rho=\left(X^\prime X+\rho \Sigma_0\right)^{-1} X^\prime y, respectively, where \rho>0 is a constant and \Sigma_0 is a positive definite matrix. In this paper we prove that, under suitable conditions, the linear estimator \widetilde\beta_\rho is better than \widehat\beta under the Pitman closeness criterion, and we apply this result to ridge estimators, generalized ridge estimators, shrinkage estimators, and Bayes estimators.
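The two estimators compared in the abstract can be computed directly. The following is a minimal numerical sketch (not from the paper; the data, \rho, and the ridge-type choice \Sigma_0 = I are illustrative assumptions): the LS solution uses a generalized inverse of X'X, while the linear estimator adds \rho\Sigma_0 before inverting.

```python
import numpy as np

# Illustrative sketch (assumed data, not from the paper): compare the LS
# solution beta_hat = (X'X)^- X'y with the linear estimator
# beta_rho = (X'X + rho*Sigma0)^{-1} X'y.
rng = np.random.default_rng(0)
n, p = 50, 4
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ beta_true + rng.standard_normal(n)

XtX = X.T @ X
# LS solution via a generalized inverse (handles rank(X) = s < p as well)
beta_hat = np.linalg.pinv(XtX) @ X.T @ y

rho = 1.0                 # assumed constant rho > 0
Sigma0 = np.eye(p)        # Sigma0 = I gives the ordinary ridge estimator
beta_rho = np.linalg.solve(XtX + rho * Sigma0, X.T @ y)
```

With \Sigma_0 = I the linear estimator reduces to the ordinary ridge estimator, which shrinks the LS solution toward zero; the generalized ridge and Bayes cases in the paper correspond to other positive definite choices of \Sigma_0.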