The Superiority of a Class of Linear Estimators of Regression Coefficients under the Pitman Closeness Criterion

Abstract: Let the linear regression model be $Y_{n \times 1} = X_{n \times p}\,\beta_{p \times 1} + \varepsilon_{n \times 1}$, where $n \ge p$, $\operatorname{rank}(X) = s$ with $0 < s \le p$, and $\varepsilon \sim N_n(0, \sigma^2 I)$. Let the least squares (LS) solution and a class of linear estimators of the regression coefficients be $\widehat{\beta} = (X'X)^- X'y$ and $\widetilde{\beta}_\rho = (X'X + \rho \Sigma_0)^{-1} X'y$, respectively, where $\rho > 0$ is a constant and $\Sigma_0$ is a positive definite matrix. In this paper we prove that, under suitable conditions, the linear estimator $\widetilde{\beta}_\rho$ is superior to $\widehat{\beta}$ under the Pitman closeness (PC) criterion, and we apply this result to ridge estimators, generalized ridge estimators, shrinkage estimators, and Bayes estimators.
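The two estimators compared in the abstract can be computed directly. The following is a minimal numerical sketch (not from the paper; the data, dimensions, and the choices $\rho = 0.5$ and $\Sigma_0 = I$ are illustrative assumptions) showing the LS solution $(X'X)^- X'y$ via a generalized inverse and the linear estimator $(X'X + \rho\Sigma_0)^{-1} X'y$:

```python
import numpy as np

# Illustrative data: n >= p as required by the model Y = X beta + eps,
# eps ~ N_n(0, sigma^2 I). All specific values here are assumptions.
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# LS solution beta_hat = (X'X)^- X'y; the Moore-Penrose pseudoinverse
# also covers the rank-deficient case rank(X) = s < p.
beta_hat = np.linalg.pinv(X.T @ X) @ X.T @ y

# Linear estimator beta_rho = (X'X + rho * Sigma0)^{-1} X'y with rho > 0
# and Sigma0 positive definite; Sigma0 = I gives the ordinary ridge estimator.
rho = 0.5
Sigma0 = np.eye(p)
beta_rho = np.linalg.solve(X.T @ X + rho * Sigma0, X.T @ y)
```

With $\Sigma_0 = I$ this is the classical ridge estimator, one of the special cases to which the paper's PC-superiority result is applied; note that it shrinks the LS solution toward zero.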

     
