Abstract:
In this paper the limiting properties of the relative entropy densities $(f_n)_{n\ge 1}$ are discussed. From these, a sufficient condition for $(X_n)_{n\ge 1}$ to obey the strong law of large numbers is derived.

Definition. A binary information source is a sequence of random variables $(X_n)_{n\ge 1}$ such that

1. each $X_n$ takes values in $\{0, 1\}$;
2. for all $n \ge 1$,
$$p(x_1, \dots, x_n) = P(X_1 = x_1, \dots, X_n = x_n) > 0, \qquad x_i \in \{0, 1\}, \quad 1 \le i \le n.$$

In the following theorems we assume that
$(X_n)_{n\ge 1}$ is an arbitrary binary source,
$$S_n(\omega) = X_1(\omega) + \cdots + X_n(\omega),$$
and $H(x, 1-x)$ is the entropy of the Bernoulli distribution, that is,
$$H(x, 1-x) = -x \log x - (1-x) \log (1-x),$$
where the logarithm base is 2. Let
$$\varphi_n(\omega) = \frac{1}{n} \sum_{i=1}^{n}\left[X_i \log p_i + \left(1 - X_i\right) \log \left(1 - p_i\right)\right] - \frac{1}{n} \log p\left(X_1, \cdots, X_n\right),$$
where $p_n \in (0, 1)$, $n = 1, 2, \dots$ is a given sequence of real numbers.

Theorem 1.
$$\limsup_{n \rightarrow \infty} \varphi_n(\omega) \leqslant 0 \quad \text{a.e.}$$

Corollary 1.
$$\limsup_{n \rightarrow \infty}\left[-\frac{1}{n} \log p\left(X_1, \cdots, X_n\right)\right] \leqslant 1 \quad \text{a.e.}$$

Corollary 2. If
$$\liminf_{n \rightarrow \infty} \frac{S_n(\omega)}{n} \geqslant p \quad \text{a.e.}, \qquad \frac{1}{2} \leqslant p \leqslant 1,$$
or
$$\limsup_{n \rightarrow \infty} \frac{S_n(\omega)}{n} \leqslant p \quad \text{a.e.}, \qquad 0 \leqslant p \leqslant \frac{1}{2},$$
then
$$\limsup_{n \rightarrow \infty}\left[-\frac{1}{n} \log p\left(X_1, \cdots, X_n\right)\right] \leqslant H(p, 1-p) \quad \text{a.e.}$$

Theorem 2. If
$$\liminf_{n \rightarrow \infty}\left[-\frac{1}{n} \log p\left(X_1, \cdots, X_n\right)\right] \geqslant C \quad \text{a.e.},$$
then
$$\liminf_{n \rightarrow \infty} H\left(\frac{S_n(\omega)}{n}, \ 1-\frac{S_n(\omega)}{n}\right) \geqslant C \quad \text{a.e.}$$

Theorem 3. Letting
$$D = \left\{\omega : \liminf_{n \rightarrow \infty} \varphi_n(\omega) \geqslant 0\right\}, \qquad S_n(\omega) = X_1(\omega) + \cdots + X_n(\omega),$$
we have
$$\lim_{n \rightarrow \infty} \frac{S_n(\omega) - \left(p_1 + \cdots + p_n\right)}{n} = 0 \quad \text{a.e. in } D.$$
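As a concrete sanity check (not part of the paper), consider an independent source with $P(X_i = 1) = p_i$. Then $p(X_1, \dots, X_n) = \prod_{i=1}^{n} p_i^{X_i}(1-p_i)^{1-X_i}$, so $\varphi_n \equiv 0$, every $\omega$ lies in the set $D$ of Theorem 3, and the conclusion reduces to the classical strong law for independent Bernoulli variables. The following Python sketch illustrates this under that independence assumption; all function names here are illustrative, not from the paper.

```python
import math
import random

def bernoulli_entropy(x):
    """H(x, 1-x) with base-2 logarithms; H(0, 1) = H(1, 0) = 0 by convention."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def phi_n(xs, ps):
    """phi_n(omega) for an *independent* source with P(X_i = 1) = p_i.

    Under independence, log2 p(X_1, ..., X_n) equals the first sum term
    by term, so phi_n vanishes identically; this is only a consistency check.
    """
    n = len(xs)
    term = sum(x * math.log2(p) + (1 - x) * math.log2(1 - p)
               for x, p in zip(xs, ps))
    log_p = sum(math.log2(p if x == 1 else 1 - p)
                for x, p in zip(xs, ps))
    return (term - log_p) / n

random.seed(0)
n = 100_000
ps = [0.2 + 0.6 * (i % 2) for i in range(n)]        # p_i alternates 0.2, 0.8
xs = [1 if random.random() < p else 0 for p in ps]  # X_i ~ Bernoulli(p_i)

# phi_n == 0, so liminf phi_n >= 0 and omega is in D; Theorem 3 then
# predicts (S_n - (p_1 + ... + p_n)) / n -> 0.
drift = (sum(xs) - sum(ps)) / n
print(bernoulli_entropy(0.5))  # 1.0
print(abs(drift) < 0.01)       # True for n this large
```

Note that $H(1/2, 1/2) = 1$ is exactly the bound of Corollary 1, attained by the fair-coin source; for the non-identically-distributed source above, the observed drift is small because the standard deviation of $S_n - \sum_i p_i$ grows only like $\sqrt{n}$.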