* The following papers may have been revised if I found typos after publication.
Thesis
* The following items are listed roughly in reverse chronological order, so please refer to the descriptions of earlier (lower) works for any unfamiliar terms, which may be explained or defined there.
This research provides a theoretical discussion of the work by Professor Masaaki Sato of ATR,
Japan, referred to as [4] in the ICANN2006 paper, where the VB approach was applied to the
automatic relevance determination (ARD) model for brain current estimation from MEG
data, i.e., a linear inverse problem. In this case as well, the VB approach is closely
related to the positive-part James-Stein type shrinkage estimation.
Conference : ICANN2007(Porto, Portugal, 2007.9.9-13)
''Generalization Error of Automatic Relevance Determination.''
Conference : ICANN2006(Athens, Greece, 2006.9.10-14)
''Analytic Solution of Hierarchical Variational Bayes in Linear Inverse Problem.''
<paper, presentation>
Meeting : IBIS2006(Osaka, 2006.10.31-11.2)
''Difference between Variational Bayes and Shrinkage Estimation in Linear Inverse Problem.''
Meeting : IEICE Neurocomputing Technical Meeting (Tokyo, 2006.3.15-17)
''Analysis of Hierarchical Variational Bayes Approach in Linear Inverse Problem.''
- Variational Bayes Approach in Linear Neural Networks
The generalization error and the training error, as well as the free energy,
of the variational Bayes (VB) approach were clarified. This is the first time that
the generalization error of the VB approach in any singular model has been
theoretically derived, although the VB free energy in mixture models was previously
clarified by Kazuho Watanabe. By solving the variational condition, the VB approach
was shown to be asymptotically equivalent to the positive-part James-Stein type
shrinkage estimation, like the SB approach below. Thus, the generalization
properties were derived in the same way as in the analysis of the SB approach.
It was also shown that the asymptotic behavior of the free energy and that of
the generalization error are not simply related to each other in the VB approach,
unlike in Bayes estimation. Hence, a variant of the VB approach providing smaller
free energy does not necessarily provide better generalization performance.
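To make the shrinkage connection concrete, here is a minimal sketch of the classical positive-part James-Stein estimator for a Gaussian mean, the kind of estimator the VB and SB solutions are asymptotically equivalent to. The function name, the unit-variance setting, and the sample values are illustrative assumptions, not taken from the papers above.

```python
import numpy as np

def positive_part_james_stein(y, sigma2=1.0):
    """Positive-part James-Stein estimate of theta from y ~ N(theta, sigma2 * I).

    The observation is shrunk toward the origin by the factor
    max(0, 1 - (d - 2) * sigma2 / ||y||^2); the positive part clips the
    factor at zero so the estimate never flips sign.
    """
    d = y.size                      # dimension; shrinkage dominates ML for d >= 3
    norm2 = float(np.dot(y, y))     # squared norm of the observation
    factor = max(0.0, 1.0 - (d - 2) * sigma2 / norm2)
    return factor * y

# A small-norm observation is shrunk all the way to zero,
# while a large-norm observation is only mildly shrunk.
print(positive_part_james_stein(np.array([0.5, -0.3, 0.2, 0.4])))   # -> all zeros
print(positive_part_james_stein(np.array([10.0, 0.0, 0.0, 0.0])))   # factor 0.98
```

The clipping at zero is what "positive-part" refers to: components with weak evidence are suppressed entirely, which is the mechanism behind the automatic relevance determination behavior discussed above.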
Journal : Neural Computation, vol.19, no.4, pp.1112-1153, 2007
''Variational Bayes Solution of Linear Neural Networks and its Generalization Performance.''
<paper>
Conference : ICONIP2005(Taipei, Taiwan, 2005.10.30-11.2)
''Generalization Error and Free Energy of Linear Neural Networks in Variational Bayes Approach.''
Meeting : IBIS2005(Tokyo, 2005.11.9-11)
''Generalization Properties of Variational Bayes Approach in Linear Neural Networks.''
Meeting : IEICE Neurocomputing Technical Meeting (Tokyo, 2005.3.28-30)
''Generalization Error of Variational Bayes Approach in Reduced Rank Regression.''
- Subspace Bayes Approach (an Empirical Bayes Approach) in Linear Neural Networks
We proposed and analyzed a subspace Bayes (SB) approach,
an empirical Bayes (EB) approach in which some of the parameters are regarded as
hyperparameters.
The SB solution in linear neural networks was shown to be asymptotically equivalent
to the positive-part James-Stein shrinkage estimator.
The generalization error and the training error were
derived in a similar fashion to the analysis of the ML estimation, referred to as [14] in the
IEICE transactions paper.
Journal : IEICE Transactions, vol.E89-D, No.3, pp.1128-1138, 2006
''Generalization Performance of Subspace Bayes Approach in Linear Neural Networks.''
<paper>
Conference : IJCAI2005(Edinburgh, U.K., 2005.8.2-5)
''Generalization Error of Linear Neural Networks in an Empirical Bayes Approach.''
<paper, presentation>
Meeting : IEICE Neurocomputing Technical Meeting (ATR, 2005.10.17-18)
''Analysis of Subspace Bayes Approach in Linear Neural Networks.''
Meeting : IBIS2004(Tokyo, 2004.11.8-10)
''Generalization Error of an Empirical Bayes Approach.''
Meeting : IEICE Neurocomputing Technical Meeting (Tokyo, 2004.7.26-27)
''Generalization Error of the Hyperparameter Optimization Method.''
Meeting : IBIS2003(Kyoto, 2003.11.11-12)
''Outlier Rejection with Mixture Models in Alignment for Lithography.''
Meeting : IEICE Neurocomputing Technical Meeting (Tokyo, 2003.7.28-29)
''Outlier Rejection in Alignment for Lithography.''