A Note on Optimal Smoothing in Partial Linear Models with the Penalized Least Squares Estimator

A Note on the Optimal Smoothing in Partial Linear Models with Penalized Least Squares Estimator

  • Abstract: In a partially linear model, the response is linear in one or more covariates but depends nonlinearly on the remaining covariates. Penalized least squares is one of the principal methods for estimating the parametric and nonparametric components of such models, and generalized cross-validation (GCV) provides a way to choose the smoothing parameter for this estimator. However, the optimality of GCV for selecting the smoothing parameter in partially linear models has not been proved. This paper establishes the optimality of GCV for choosing the smoothing parameter when a partially linear model is estimated by penalized least squares. Simulations confirm that the proposed GCV-based selection of the smoothing parameter performs well, and the simulation study also compares GCV with least squares cross-validation.

     

    Abstract: Partially linear models assume that the response is linearly related to one or more variables, while its relation to an additional variable or variables is not assumed to have a simple parametric form. One primary approach to estimating the parametric and nonparametric parts is penalized least squares, and the generalized cross-validation (GCV) approach is a popular method for selecting the smoothing parameter. However, the optimality of GCV in the partial linear model with penalized least squares has not been proved. In this article, we provide support for using GCV by establishing its optimality for selecting the smoothing parameter. Simulation studies are employed to investigate the empirical performance of generalized cross-validation, with cross-validation included for comparison.
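To illustrate the procedure described in the abstract, the following is a minimal sketch of GCV-based smoothing parameter selection for a partially linear model fitted by penalized least squares. It is not the paper's exact construction: it assumes a penalized regression spline (truncated power basis) for the nonparametric part, a ridge-type penalty on the knot coefficients, and a simple grid search over the smoothing parameter lambda; all variable names and the simulated data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate a partially linear model: y = X @ beta + f(t) + noise ---
n = 200
t = np.sort(rng.uniform(0.0, 1.0, n))
X = rng.normal(size=(n, 2))                      # linear covariates
beta_true = np.array([1.0, -0.5])
f_true = np.sin(2.0 * np.pi * t)                 # smooth nonparametric part
y = X @ beta_true + f_true + rng.normal(scale=0.3, size=n)

# --- spline basis for f (truncated power basis; an assumed choice) ---
knots = np.quantile(t, np.linspace(0.05, 0.95, 20))
B = np.column_stack([np.ones(n), t, t**2, t**3] +
                    [np.clip(t - k, 0.0, None)**3 for k in knots])

Z = np.hstack([X, B])                            # joint design matrix
p = Z.shape[1]
# penalize only the knot coefficients, not beta or the polynomial part
Omega = np.zeros((p, p))
Omega[X.shape[1] + 4:, X.shape[1] + 4:] = np.eye(len(knots))

def gcv_score(lam):
    """Generalized cross-validation score for one smoothing parameter."""
    A = Z.T @ Z + lam * Omega
    H = Z @ np.linalg.solve(A, Z.T)              # hat ("smoother") matrix
    resid = y - H @ y
    df = np.trace(H)                             # effective degrees of freedom
    return n * (resid @ resid) / (n - df) ** 2

# --- choose lambda by minimizing GCV over a grid ---
lams = np.logspace(-4, 4, 60)
lam_hat = lams[np.argmin([gcv_score(l) for l in lams])]

# refit at the selected lambda and read off the parametric estimate
coef = np.linalg.solve(Z.T @ Z + lam_hat * Omega, Z.T @ y)
beta_hat = coef[:X.shape[1]]
print("selected lambda:", lam_hat)
print("estimated beta :", beta_hat, "(true:", beta_true, ")")
```

The quantity minimized is the usual GCV criterion n·RSS(λ)/(n − tr H(λ))², where H(λ) is the hat matrix of the penalized least squares fit; least squares cross-validation could be compared by replacing this score with a leave-one-out residual sum of squares.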

     
