New Bernstein's Inequalities for Dependent Observations and Applications to Learning Theory

Abstract: Classical concentration inequalities describe the deviations of functions of independent and identically distributed (i.i.d.) random variables from their expectations, and these inequalities have numerous important applications in statistics and machine learning theory. In this paper we go beyond this classical framework by establishing two new Bernstein-type concentration inequalities for φ-mixing sequences and uniformly ergodic Markov chains. As applications of these Bernstein inequalities, we also obtain bounds on the rate of uniform deviation of empirical risk minimization (ERM) algorithms based on φ-mixing observations.
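
For reference, the classical baseline that these results extend is Bernstein's inequality for independent random variables; a standard form (with notation chosen here for illustration, not taken from the paper) is

\[
\mathbb{P}\left( \sum_{i=1}^{n} X_i \ge t \right)
\le \exp\left( - \frac{t^{2}}{2 \left( \sum_{i=1}^{n} \mathbb{E}[X_i^{2}] + M t / 3 \right)} \right),
\qquad t > 0,
\]

where \(X_1, \dots, X_n\) are independent random variables with \(\mathbb{E}[X_i] = 0\) and \(|X_i| \le M\) almost surely. Results for dependent observations, such as those established in this paper, replace the independence assumption with a quantitative mixing or ergodicity condition, typically at the cost of constants or an effective sample size that depend on the mixing rate.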

     

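The uniform deviation controlled by the ERM bounds can be written, in standard learning-theory notation (again illustrative rather than the paper's own), as

\[
\sup_{f \in \mathcal{F}} \bigl| R(f) - R_n(f) \bigr|,
\qquad
R(f) = \mathbb{E}\,\ell(f, Z),
\quad
R_n(f) = \frac{1}{n} \sum_{i=1}^{n} \ell(f, z_i),
\]

where \(z_1, \dots, z_n\) come from a stationary φ-mixing process (or a uniformly ergodic Markov chain), \(\mathcal{F}\) is the hypothesis class, and \(\ell\) is the loss function. Any bound on this supremum immediately controls the excess risk of the empirical risk minimizer \(f_n = \arg\min_{f \in \mathcal{F}} R_n(f)\), via

\[
R(f_n) - \inf_{f \in \mathcal{F}} R(f)
\le 2 \sup_{f \in \mathcal{F}} \bigl| R(f) - R_n(f) \bigr|.
\]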
