IEEE Transactions on Neural Networks, vol. 13, no. 1, pp. 160–187
Minor component analysis (MCA) deals with the recovery of the eigenvector associated with the smallest eigenvalue of the autocorrelation matrix of the input data and is an important tool for signal processing and data analysis. It is almost exclusively solved by linear neurons. This paper presents a linear neuron endowed with a novel learning law, called MCA EXIN, and analyzes its features. The neural literature on MCA is sparse: little theoretical grounding is given (almost always restricted to the asymptotic ODE approximation), and only experiments on toy problems (at most four-dimensional) are presented, without any numerical analysis. This work addresses these shortcomings and lays sound theoretical foundations for the neural MCA theory. In particular, it classifies the MCA neurons according to the Riemannian metric and, from an analysis of the degeneracy of the error cost, explains their different behavior in approaching convergence. The cost landscape is studied and used as a basis for the analysis of the asymptotic behavior. All phases of the dynamics of the MCA algorithms are investigated in detail and, together with the numerical analysis, lead to the identification of three possible kinds of divergence, here called sudden, dynamic, and numerical. The importance of choosing low initial conditions is also explained. Particular emphasis is placed on the experimental part, where simulations on high-dimensional problems are presented and analyzed. The orthogonal regression, or total least squares (TLS), technique is also presented, together with a real-world application to the identification of the parameters of an electrical machine. It can be concluded that MCA EXIN is the best MCA neuron in terms of stability (no finite-time divergence), speed, and accuracy.
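To make the problem statement concrete, here is a minimal NumPy sketch (not the paper's MCA EXIN learning law, just the defining computation): the minor component is the eigenvector of the data autocorrelation matrix associated with its smallest eigenvalue. The data dimensions and variances below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 3-D data whose axes are the principal directions; the last
# axis is given the smallest variance, so it is the minor component.
n = 10_000
data = rng.normal(size=(n, 3)) * np.array([3.0, 2.0, 0.1])

R = data.T @ data / n                  # autocorrelation matrix of the input
eigvals, eigvecs = np.linalg.eigh(R)   # eigenvalues in ascending order
minor = eigvecs[:, 0]                  # eigenvector of the smallest eigenvalue

print(np.abs(minor))                   # should align with the z-axis
```

A neural MCA algorithm such as MCA EXIN extracts this same direction iteratively from streaming samples, without ever forming or diagonalizing R; this is also the quantity needed for TLS orthogonal regression, where the minor component of the augmented data matrix gives the fitted hyperplane normal.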