(2) S.Watanabe, M.Yoneyama, "A three-dimensional object recognition method using acoustical imaging and neural networks," The Journal of the Acoustical Society of Japan, Vol.47, No.11, pp.825-833, 1991.

(3) S.Watanabe, M.Yoneyama, "An Ultrasonic 3-D Visual Sensor Using Neural Networks," IEEE Trans. on Robotics and Automation, Vol.6, No.2, pp.240-249, 1992.

(4) S.Watanabe, M.Yoneyama, "A restoration method of acoustic images using a neural network," The Journal of the Acoustical Society of Japan, Vol.48, No.10, pp.711-719, 1992.

(5) S.Watanabe, M.Yoneyama, "A classification method of 3-D objects by a neuro-ultrasonic visual sensor using position and rotation invariant feature values," The Journal of the Acoustical Society of Japan, Vol.48, No.10, pp.720-726, 1992.

(6) K.Takatsu, H.Sawai, S.Watanabe, M.Yoneyama, "Genetic algorithms applied to Bayesian image restoration," IEICE Trans., Vol.J77-D-2, No.9, pp.1768-1777, 1994.

(7) S.Watanabe, K.Fukumizu, "Probabilistic design of Layered Neural networks based on their unified framework," IEEE Transactions on Neural Networks, Vol.6, No.3, pp.691-702, 1995.

(8) S.Watanabe, "A modified information criterion for automatic model and parameter selection in neural network learning," IEICE Transactions, Vol.E78-D, No.4, pp.490-499, 1995.

(9) S.Ishii, K.Fukumizu, S.Watanabe, "A Network of Chaotic Elements for Information Processing," Neural Networks, Vol.9, No.1, pp.25-40, 1996.

(10) S.Watanabe, "Solvable models of layered neural networks based on their differential structure," Advances in Computational Mathematics, Vol.5, No.1, pp.205-231, 1996.

(11) K.Fukumizu, S.Watanabe, "Optimal training data and predictive error of polynomial approximation," IEICE Trans., Vol.J79-A, No.5, pp.1100-1108, 1996.

(12) S.Watanabe, "A finite wavelet decomposition method," IEICE Trans., Vol.J79-A, No.12, pp.1948-1956, 1996.

(13) N.Ishimasa, Y.Yokota, S.Watanabe, "Route Optimization of Mobile Service Station by Genetic Algorithms with the Variable Number of Genes," IEICE Trans., Vol.J81-A, No.9, pp.1221-1229, 1998.

(14) S.Watanabe, "On the generalization error by a layered statistical model with Bayesian estimation," IEICE Trans., Vol.J81-A, No.10, pp.1442-1452, 1998.

English version: Electronics and Communications in Japan, (2000), pp.95-104.

(15) M.Yoneyama, K.Yuasa, S.Watanabe, "Identification of system characteristics of the ultrasonic imaging system using genetic algorithm," The Journal of the Acoustical Society of Japan, Vol.55, No.1, pp.3-11, 1999.

(16) S.Watanabe, "Algebraic Analysis for Non-identifiable Learning Machines," Neural Computation, Vol.13, No.4, pp.899-933, 2001. This paper clarified the complete asymptotic form of the stochastic complexity, or free energy, which is different from that of a regular statistical model. Based on algebraic analysis, algebraic geometry, and the theory of functions of several complex variables, it clarifies how the algebraic structure of the Fisher metric determines the learning efficiency of a complex learning machine. If you are interested in mathematical information theory, this is a paper worth reading. Even if you consider that almost all papers in neural computing make no essential advances, I promise you that this paper truly discovers a new structure in complex learning machines. The relation between algebraic geometry and complex learning machines was first discovered here. The communicating editor of this paper was Professor David Mumford, who received the 1974 Fields Medal for his research in algebraic geometry.
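For readers who want the punchline of (16), the main theorem can be summarized in one asymptotic expansion. The notation below (F_n, λ, m, d) follows the standard presentation of singular learning theory; this is my paraphrase of the commonly stated result, not a quotation from the paper.

```latex
% Stochastic complexity (Bayes free energy) of n i.i.d. samples X_1,...,X_n:
F_n = -\log \int \prod_{i=1}^{n} p(X_i \mid w)\,\varphi(w)\,dw .
% For a regular statistical model with d parameters,
F_n = \frac{d}{2}\log n + O_p(1) ,
% whereas for a non-identifiable (singular) model,
F_n = \lambda \log n - (m-1)\log\log n + O_p(1) ,
% where the rational number \lambda and its multiplicity m are determined by
% resolution of singularities of the Kullback function, and \lambda \le d/2 .
```

Since λ ≤ d/2, a singular model can have strictly smaller stochastic complexity than a regular model of the same dimension, which is the sense in which its learning efficiency differs.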

(17) S.Watanabe, "Training and generalization error of learning machines with algebraic singularities," IEICE Trans., Vol.J84-A, No.1, pp.99-108, Jan. 2001.

(18) S. Watanabe, "Algebraic geometry of learning machines with singularities and their prior distributions," Japanese Journal of Artificial Intelligence, Vol.16, No.2, pp.308-315, March, 2001.

(19) S.Watanabe, "Algebraic geometrical methods for hierarchical learning machines," Neural Networks, Vol.14, No.8, pp.1049-1060, 2001.

(20) S.Watanabe, "Learning efficiency of redundant neural networks in Bayesian estimation," IEEE Transactions on Neural Networks, Vol.12, No.6, pp.1475-1486, 2001.

Errata: IEEE Transactions on Neural Networks, Vol.13, No.1, p.254, 2002.

(21) K.Yamazaki, S.Watanabe, "A probabilistic algorithm to calculate the learning curves of hierarchical learning machines with singularities," IEICE Trans., Vol.J85-D-II, No.3, pp.363-372, Mar. 2002.

(22) K.Nishiue, S.Watanabe, "Effects of priors in model selection of learning machines with singularities," to appear in IEICE Trans., Vol.J86-D-II, No.1, Jan. 2003.

(23) K.Watanabe, S.Watanabe, "On the Bayes generalization error of the reduced rank regression," to appear in IEICE Trans., Vol.J86-A, No.3, 2003.

(24) S.Watanabe, S.-I.Amari, "Learning coefficients of layered models when the true distribution mismatches the singularities," Neural Computation, Vol.15, No.5, pp.1013-1033, 2003.

(25) K.Yamazaki, S.Watanabe, "Singularities in mixture models and upper bounds of stochastic complexity," Neural Networks, Vol.16, No.7, pp.1029-1038, 2003.

(26) K.Yamazaki, S.Watanabe, "Singularities in complete bipartite graph-type Boltzmann machines and upper bounds of stochastic complexities," IEEE Trans. on Neural Networks, Vol.16, No.2, pp.312-324, 2005.

(27) S.Watanabe, K.Fukumizu, K.Hagiwara, S.Amari, "Learning Theory of Singular Statistical Models," IEICE Trans., Vol.J88-D-II, No.2, pp.159-169, 2005. (Survey paper)

(28) K.Yamazaki, S.Watanabe, "Algebraic geometry and stochastic complexity of hidden Markov models," Neurocomputing, Vol.69, pp.62-84, 2005.

(29) S.Watanabe, "Algebraic geometry of singular learning machines and symmetry of generalization and training errors," Neurocomputing, Vol.67, pp.198-213, 2005.

(30) M.Aoyagi, S.Watanabe, "Stochastic complexities of reduced rank regression in Bayesian estimation," Neural Networks, Vol.18, No.7, pp.924-933, 2005.

(31) K.Nagata, S.Watanabe, "A method to estimate the generalization error of singular learning machines by decomposition of Kullback information," IEICE Trans., Vol.J88-D-II, No.6, pp.994-1002, 2005.

(32) N.Nakano, K.Takahashi, S.Watanabe, "A method to estimate the efficiency of Markov chain Monte Carlo in singular learning machines," IEICE Trans., Vol.J88-D-II, No.10, pp.2011-2020, 2005.

(33) M.Aoyagi, S.Watanabe, "Resolution of singularities and generalization error with Bayesian estimation for layered neural network," IEICE Trans., Vol.J88-D-II, No.10, pp.2112-2124, 2005.

(34) S.Nakajima, S.Watanabe, "Generalization performance of subspace Bayes approach in linear neural networks," IEICE Trans., Vol.E89-D, No.3, pp.1128-1138, 2006.

(35) T.Hosino, K.Watanabe, S.Watanabe, "Stochastic complexity of Hidden Markov Models on the Variational Bayesian Learning," IEICE Trans., Vol.J89-D-II, No.6, pp.1279-1287, 2006.

(36) Kazuho Watanabe, Sumio Watanabe, "Stochastic complexities of Gaussian mixtures in variational Bayesian approximation," Journal of Machine Learning Research, Vol.7, pp.625-644, 2006.

(38) Kazuho Watanabe, S.Watanabe, "Stochastic complexities of general mixture models in variational Bayesian learning," Neural Networks, Vol.20, No.2, pp.210-217, March 2007. (Best Paper Award of the Japanese Neural Network Society, 2008)

(39) S.Nakajima, S.Watanabe, "Variational Bayes Solution of Linear Neural Networks and its Generalization Performance," Neural Computation, Vol.19, No.4, pp.1112-1153, 2007.

(40) K.Watanabe, S.Watanabe, "Estimating the Data Region Using Minimum and Maximum Values," Interdisciplinary Information Sciences, Vol.13, No.2, pp.151-161, 2007.

(41) K.Watanabe, S.Watanabe, "Stochastic complexity for mixture of exponential families in generalized variational Bayes," Theoretical Computer Science, Vol.387, pp.4-17, 2007.

(42) Kenji Nagata, Sumio Watanabe, "Asymptotic Behavior of Exchange Ratio in Exchange Monte Carlo Method," Neural Networks, Vol.21, No.7, pp.980-988, 2008.

(43) Kenji Nagata, Sumio Watanabe, "Exchange Monte Carlo Sampling From Bayesian Posterior for Singular Learning Machines," IEEE Transactions on Neural Networks, Vol.19, No.7, pp.1253-1266, 2008.

(44) Kenji Nagata, Sumio Watanabe, "Theory and Experiments of Exchange Ratio for Exchange Monte Carlo Method," Neural Information Processing - Letters and Reviews, Vol.12, No.1-3, pp.21-30, 2008.

(45) K.Watanabe, M.Shiga, S.Watanabe, "Upper bound for variational free energy of Bayesian networks," Machine Learning, Vol.75, No.2, pp.199-215, 2009.

(46) Y.Nishiyama, S.Watanabe, "Accuracy of Loopy Belief Propagation in Gaussian Models," Neural Networks, Vol.22, No.4, pp.385-394, May 2009.

(47) K.Yamazaki, M.Aoyagi, S.Watanabe, "Asymptotic Analysis of Bayesian Generalization Error with Newton Diagram," Neural Networks, Vol.23, No.1, pp.35-43, 2010.

(48) Sumio Watanabe, "Equations of states in singular statistical estimation", Neural Networks, Vol.23, No.1, pp.20-34, 2010.

(49) Sumio Watanabe, "Equations of states in statistical learning for an unrealizable and regular case," IEICE Transactions, Vol.E93-A, No.3, pp.617-626, 2010.

(50) Sumio Watanabe, "A limit theorem in singular regression problem," Advanced Studies in Pure Mathematics, Vol.57, pp.473-492, 2010.

(52) Daisuke Kaji, Kazuho Watanabe, Sumio Watanabe, "Phase transition of variational Bayes learning in Bernoulli mixture," Australian Journal of Intelligent Information Processing Systems, Vol.11, No.4, pp.35-41, 2010.

(53) Daisuke Kaji, Sumio Watanabe, "Two Design Methods of Hyperparameters in Variational Bayes Learning for Bernoulli Mixtures," Neurocomputing, to appear.

(54) Sumio Watanabe, "Asymptotic Equivalence of Bayes Cross Validation and Widely Applicable Information Criterion in Singular Learning Theory," Journal of Machine Learning Research, to appear.

International Conference Papers
(2) S.Watanabe, M.Yoneyama, ``The Ultrasonic Robot Eye System Using Neural Network", Proc. of 13th Intern. Cong. on Acoustics, (Belgrade), pp.91-95, 1989.

(3) S.Watanabe, M.Yoneyama, ``An Ultrasonic Robot Eye for Object Recognition Using Neural Network", Proc. IEEE Ultrason. Symp., (Montreal), pp.1083-1086, 1989.

(4) S.Watanabe, M.Yoneyama, ``An Ultrasonic Robot Eye for Three-Dimensional Object Recognition Using Neural Network", Proc. of EUSIPCO-90, (Barcelona), pp.1687-1690, 1990.

(5) S.Watanabe, M.Yoneyama, ``Three-Dimensional Object Recognition System Combining Acoustical Imaging with Neural Network", Proc. of ISITA-90, (Honolulu), pp.655-658, 1990.

(6) S.Watanabe, M.Yoneyama, ``An Ultrasonic Robot Eye System for Three-Dimensional Object Recognition Using Neural Network", Proc. of IEEE Ultrasonics Symp., (Honolulu), pp.351-354, 1990.

(7) S.Watanabe, H.Watanabe, A.Saitou, M.Yoneyama, ``An Application of Neural Networks to an Ultrasonic 3-D Visual Sensor", Proc. of IJCNN, (Singapore), pp.1397-1402, 1991.

(8) S.Watanabe, M.Yoneyama, ``A 3-D Visual Sensor Using Neural Networks and Its Application for Factory Automation", Proc. of FENDT91, (Seoul), pp.379-386, 1991.

(9) S.Watanabe, M.Yoneyama, ``An Ultrasonic Visual Sensor Using a Neural Network and Its Application for Automatic Object Recognition," IEEE Ultrasonics Symp. (Florida) pp.781-784, 1991.

(10) S.Watanabe, K.Fukumizu, ``The Unified Neural Network Theory and Proposal of New Models", 2nd Int. Conf. on Fuzzy Logic and Neural Networks, (Iizuka), pp.725-728, 1992.

(11) S.Watanabe, M.Yoneyama, ``An Ultrasonic Robot Eye Using Neural Networks", Acoustical Imaging, Plenum Press, New York, Vol.18, pp.83-95, 1992.

(12) K.Takatsu, S.Watanabe, H.Sawai, M.Yoneyama, ``A Proposal of image restoration using Genetic Algorithms," Proc. of IJCNN (Beijing), Vol.1, pp.642-647, 1992.

(13) S.Watanabe, K.Fukumizu, ``The Unified Neural Network Theory and Its Application to New Models," Proc. of IJCNN (Beijing), Vol.2, pp.381-386, 1992.

(14) S.Watanabe, M.Yoneyama, ``An Ultrasonic 3-D Object Recognition Method Based on the Unified Neural Network Theory," Proc. of IEEE US Symp. (Tucson, Arizona), pp.1191-1194, 1992.

(15) S.Watanabe, M.Yoneyama, S. Ueha, ``An ultrasonic 3-D object identification system combining ultrasonic imaging with a probability competition neural network," Proc. of Ultrasonics International 93 conference, (Vienna), pp.767-770, 1993.

(16) S.Watanabe, ``Differential equations accompanying neural networks and solvable nonlinear learning machines," Proc. of IJCNN (Nagoya), pp.2968-2971, 1993.

(17) K.Fukumizu, S.Watanabe, ``Probability density estimation by regularization method," Proc. of IJCNN (Nagoya), pp.1727-1730, 1993.

(18) S.Ishii, K.Fukumizu, S.Watanabe, ``Associative memory using spatiotemporal chaos," Proc. of IJCNN (Nagoya), pp.2638-2641, 1993.

(19) S.Ishii, K.Fukumizu, S.Watanabe, ``Globally coupled map model for information processing," Proc. of International Symp. on Nonlinear Theory and Its Applications, (Honolulu), pp.1157-1160, 1993.

(20) K.Fukumizu, S.Watanabe, ``Error estimation and learning data arrangement for neural networks," Proc. of IEEE World Congress on Computational Intelligence, (Florida), Vol.2, pp.777-780, 1994.

(21) S.Watanabe, ``Solvable models of artificial neural networks," Advances in Neural Information Processing Systems, Morgan Kaufmann, New York, Vol.6, pp.423-430, 1994.

(22) S.Watanabe, M.Yoneyama, ``A 3-D Object Classification Method Combining Acoustical Imaging with Probability Competition Neural Networks," Acoustical Imaging, Plenum Press, New York, Vol.20, pp.65-72, 1994.

(23) S.Watanabe, ``An optimization method of artificial neural networks based on the modified information criterion," Advances in Neural Information Processing Systems, Morgan Kaufmann, New York, Vol.6, pp.293-300, 1994.

(24) S.Watanabe, ``A generalized Bayesian framework for neural networks with singular Fisher information matrices," Proc. of International Symposium on Nonlinear Theory and Its Applications, (Las Vegas), pp.207-210, 1995.

(25) S.Watanabe, M.Yoneyama, ``A nonlinear ultrasonic imaging method based on the modified information criterion," Acoustical Imaging, Vol.22, Plenum Press, New York, pp.549-554, 1996.

(26) S.Watanabe, "On the essential difference between neural networks and regular statistical models," Proc. of Int. Conf. on Computational Intelligence and Neuroscience, Vol.2, pp.149-152, 1997.

(27) S.Watanabe, "Realizable approximation bounds for a solvable neural network," Approximation Theory, Vol.1, Vanderbilt University Press, pp.347-354, 1998.

(28) S.Watanabe, "Inequalities of Generalization Errors for Layered Neural Networks in Bayesian Learning," Proc. of Int. Conf. on Neural Information Processing, pp.59-62, 1998.

(29) S.Watanabe, "Approximation bounds for layered learning machines and environmental probability measures," Proc. of Int. Conf. on Computational Intelligence and Neuroscience, Vol.2, pp.135-138, 1998.

(30) S.Watanabe, "Algebraic analysis for neural network learning," Proc. of IEEE SMC Symp., 1999.

(31) S.Watanabe, "Algebraic analysis for singular statistical estimation," Lecture Notes in Computer Science, Vol.1720, pp.39-50, 1999.

(32) S.Watanabe, "Algebraic analysis for non-regular learning machines," Advances in Neural Information Processing Systems, Vol.12, pp.356-362, 2000.

(33) S.Watanabe, "Algebraic information geometry for learning machines with singularities," Advances in Neural Information Processing Systems, (Denver, USA), pp.329-336, 2001.

(34) Sumio Watanabe, "Algebraic geometry of neural network learning," special session at the AMS 2002 Fall Central Section Meeting, University of Wisconsin, Madison, Wisconsin, October 12-13, 2002.

(35) Keisuke Yamazaki and Sumio Watanabe, ``Resolution of singularities in mixture models and the upper bounds of the stochastic complexity," Proc. of International Conference on Neural Information Processing, CD-ROM, 2002.

(36) Sumio Watanabe and Shun-ichi Amari, ``Singularities in Neural Networks Make Bayes Generalization Errors Smaller Even if They Do Not Contain the True," Proc. of International Conference on Neural Information Processing, CD-ROM, 2002.

(37) S.Watanabe, S.-I.Amari, "The effect of singularities when the true parameter does not lie on such singularities," NIPS*2002, Vancouver, Canada, 2002.

To be continued.