Chinese Journal of Applied Probability and Statistics, 2014, 30(6): 570-584. DOI:    ISSN: 1001-4268   CN: 31-1256

New Bernstein's Inequalities for Dependent Observations and Applications to Learning Theory
Zou Bin, Tang Yuanyan, Li Luoqing, Xu Jie
Faculty of Mathematics and Statistics, Hubei University; Faculty of Science and Technology, University of Macau; School of Computer Science and Information Engineering, Hubei University

The classical concentration inequalities deal with the deviations of functions of independent and identically distributed (i.i.d.) random variables from their expectations, and these inequalities have numerous important applications in statistics and machine learning theory. In this paper we go far beyond this classical framework by establishing two new Bernstein-type concentration inequalities for -mixing sequences and uniformly ergodic Markov chains. As applications of these Bernstein inequalities, we also obtain bounds on the rate of uniform deviation of empirical risk minimization (ERM) algorithms based on -mixing observations.
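For context, the classical i.i.d. result that the paper generalizes is the standard Bernstein inequality (stated here in its usual textbook form; the paper's new inequalities for dependent observations carry different constants reflecting the mixing coefficients). For random variables $X_1, \dots, X_n$ that are i.i.d. with $|X_i - \mathbb{E}X_i| \le M$ and $\mathrm{Var}(X_i) = \sigma^2$, it reads:

```latex
\mathbb{P}\!\left(\frac{1}{n}\sum_{i=1}^{n}\bigl(X_i - \mathbb{E}X_i\bigr) \ge \varepsilon\right)
\;\le\; \exp\!\left(-\frac{n\varepsilon^2}{2\sigma^2 + \tfrac{2}{3}M\varepsilon}\right),
\qquad \varepsilon > 0.
```

The uniform deviation bounded for ERM is, in its standard form, $\sup_{f \in \mathcal{F}} \bigl|\mathbb{E}f(z) - \tfrac{1}{n}\sum_{i=1}^{n} f(z_i)\bigr|$ over a hypothesis class $\mathcal{F}$, where here the sample $z_1, \dots, z_n$ is drawn from a mixing sequence rather than an i.i.d. source.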

Keywords: Concentration inequality, -mixing, Markov chains, uniform deviation, empirical risk minimization.

Corresponding author: Zou Bin


Copyright by Chinese Journal of Applied Probability and Statistics