Equations of States in Singular Statistical Estimation

Return to Sumio Watanabe Homepage

Equations of States in Singular Statistical Estimation, arXiv:0712.0653

We obtained a new result, Equations of States in Singular Statistical Estimation. We proved a formula that relates the Bayes generalization, Bayes training, Gibbs generalization, and Gibbs training errors.

Bg : Bayes Generalization Error,
Bt : Bayes Training Error,
Gg : Gibbs Generalization Error,
Gt : Gibbs Training Error,
b : Inverse Temperature of the a posteriori distribution.

Equations of States : E[Bg] - E[Bt] = E[Gg] - E[Gt] = 2b ( E[Gt] - E[Bt] )

where E[ ] denotes the expectation value. The formula holds for any true distribution, any learning machine, any a priori distribution, and any singularities. Hence we can predict the Bayes and Gibbs generalization errors from the Bayes and Gibbs training errors without any knowledge of the true distribution.

E[Bg] = 2b ( E[Gt] - E[Bt] ) + E[Bt]
E[Gg] = 2b ( E[Gt] - E[Bt] ) + E[Gt]

These formulas give what is called a widely applicable information criterion (WAIC). If a learning machine is regular (the Fisher information matrix is positive definite), then 2b ( E[Gt] - E[Bt] ) is equal to the dimension of the parameter space. The proofs of the theorems are based on singular learning theory. (5/Dec/2007).
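The equations of state above can be illustrated numerically. The sketch below is a minimal toy example, not the experiment from the paper: it assumes a regular one-parameter model (a normal distribution with unknown mean and a conjugate normal prior, at inverse temperature b = 1), draws posterior samples, computes the Bayes and Gibbs training errors as empirical log losses, and then predicts the generalization errors with the two formulas. In the paper the errors are defined relative to the entropy of the true distribution; that constant cancels in the differences used here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regular model (assumption, not from the paper): x ~ N(mu, 1),
# unknown mean mu, conjugate prior mu ~ N(0, 100), beta = 1.
n = 200
x = rng.normal(0.5, 1.0, size=n)

# Exact conjugate posterior over mu, then posterior samples.
prior_var = 100.0
post_var = 1.0 / (n + 1.0 / prior_var)
post_mean = post_var * x.sum()
mus = rng.normal(post_mean, np.sqrt(post_var), size=5000)

# log p(x_i | mu_k) for every data point i and posterior sample k.
loglik = -0.5 * np.log(2 * np.pi) - 0.5 * (x[:, None] - mus[None, :]) ** 2

# Bt: empirical log loss of the Bayes predictive distribution
# (average over the posterior inside the log).
Bt = -np.mean(np.log(np.mean(np.exp(loglik), axis=1)))
# Gt: Gibbs training error (average over the posterior outside the log).
Gt = -np.mean(loglik)

beta = 1.0
# Equations of state: predict generalization errors from training errors.
Bg_hat = Bt + 2 * beta * (Gt - Bt)
Gg_hat = Gt + 2 * beta * (Gt - Bt)

print(f"Bt = {Bt:.4f}, Gt = {Gt:.4f}")
print(f"predicted Bg = {Bg_hat:.4f}, predicted Gg = {Gg_hat:.4f}")
```

By Jensen's inequality Gt >= Bt always holds, so both predicted generalization errors lie above the corresponding training errors, as expected.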


Errata:

P.8, eq.(6): lim_{\beta\rightarrow\infty} should be removed.

P.9, Table 1: the theoretical Bayes generalization error should be
0.0135, 0.0150, 0.0160, 0.0170.

International Conference Paper

Sumio Watanabe, ``A formula of equations of states in singular learning machines," Proceedings of the IEEE World Congress on Computational Intelligence, (Hong Kong, China) 2008.