We propose that neither the maximum likelihood method nor the maximum a posteriori method is appropriate for statistical estimation in singular learning machines: the maximum likelihood estimator often does not exist, and even when it does, its generalization error is far larger than that of Bayes estimation.
This is caused by the fact that the sup-norm is not an appropriate functional topology for singular likelihood ratio functions. We propose that the likelihood ratio function should instead be studied as a Schwartz distribution or a Sato hyperfunction.
However, we can derive the relation between training errors and generalization errors based on algebraic geometry. Our theorem shows that, if the training error is made smaller, the generalization error becomes larger.
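This trade-off (driving the training error down tends to push the generalization error up) can be illustrated with a toy experiment. The sketch below is only a hypothetical illustration of the general phenomenon, not the theorem for singular machines itself: it fits polynomials of two different degrees by least squares and compares training error with error against the true function. All names and parameters here (the sine target, noise level, degrees) are assumptions chosen for the demonstration.

```python
# Toy illustration (not Watanabe's theorem): driving the training error
# down by over-parameterizing a model tends to push the generalization
# error up. Setup: noisy observations of sin(x), polynomial least squares.
import numpy as np

rng = np.random.default_rng(0)

f = np.sin                                   # true regression function
x_train = np.linspace(0.0, 3.0, 12)
y_train = f(x_train) + 0.3 * rng.standard_normal(x_train.size)
x_test = np.linspace(0.0, 3.0, 200)          # dense evaluation grid

def errors(degree):
    """Fit a polynomial of the given degree by least squares and return
    (training MSE, test MSE measured against the noiseless truth)."""
    # polyfit may warn that the high-degree fit is poorly conditioned;
    # that is expected for the interpolating case below.
    coef = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coef, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coef, x_test) - f(x_test)) ** 2)
    return train_mse, test_mse

train_lo, test_lo = errors(3)    # modest model: small but nonzero train error
train_hi, test_hi = errors(11)   # interpolates all 12 noisy points

print(f"degree  3: train={train_lo:.4f}  test={test_lo:.4f}")
print(f"degree 11: train={train_hi:.4f}  test={test_hi:.4f}")
```

The degree-11 fit makes the training error essentially zero, yet its error on the true function is far larger than that of the degree-3 fit, which is the qualitative pattern the statement above describes.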
These results are presented in the following papers.
Sumio Watanabe, "Algebraic geometry of singular learning machines and symmetry of generalization and training errors," Neurocomputing, Vol. 67, pp. 198-213, 2005.
Sumio Watanabe, "Almost all learning machines are singular," invited paper, IEEE International Symposium on Foundations of Computational Intelligence (FOCI 2007).