Some mathematical aspects of neural self-organization and multilayer perceptrons.

Authors
Publication date
1992
Publication type
Thesis
Summary
This thesis is devoted to the mathematical study of two types of neural networks and is composed of two distinct parts.

The first part focuses on the multilayer perceptron, and more specifically on the convexity of the error function. We show that weak conditions suffice to ensure convexity for all perceptrons without hidden layers, whereas convexity becomes impossible for perceptrons with one or more hidden layers, even when the group of geometric transformations that leave the error invariant is taken into account. We then propose a new learning algorithm that exploits this information, and which we have tested on the classical problem of printed-character recognition.

The second part is based on the work of R. Linsker on vision. Since 1986, this researcher has proposed a rather simple model (based on a Hebbian learning rule) that explains the emergence of orientation-selective cells in prenatal vision. We propose a mathematical study of this model. The operator that governs the time evolution of the cortical connections depends on two real parameters and is diagonalizable in an orthonormal basis of eigenfunctions. We show that, depending on the values of these parameters, these eigenfunctions are, after renormalization, either Hermite functions in two variables or sums of such Hermite functions. In order to determine the asymptotic behavior of the synaptic weight function, we then study the relative position of the eigenvalues as a function of the parameters. This finally allows us to derive explicit conditions under which a given mode (symmetric or not) becomes dominant over time.
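As an illustration of the convexity statement in the first part, one classical set of "weak conditions" (our example, not necessarily the one used in the thesis) is a single logistic output unit trained with the cross-entropy error. Writing \(\sigma\) for the logistic function, the error

\[
E(w) \;=\; -\sum_{i}\Big[\,t_i \log \sigma(w^\top x_i) + (1-t_i)\log\big(1-\sigma(w^\top x_i)\big)\Big]
\]

has Hessian

\[
\nabla^2 E(w) \;=\; \sum_{i} \sigma(w^\top x_i)\big(1-\sigma(w^\top x_i)\big)\, x_i x_i^\top \;\succeq\; 0,
\]

so \(E\) is convex in the weights \(w\); no comparable argument is available once a hidden layer is inserted, which is consistent with the thesis's negative result for such networks.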
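For the second part, a hedged sketch of the type of evolution equation involved (a common way of writing Linsker's Hebbian rule, not necessarily the exact operator studied in the thesis; here \(k_1\) and \(k_2\) stand for the two real parameters, \(Q\) for a Gaussian correlation function of the presynaptic activities, and \(A\) for a Gaussian arbor density):

\[
\frac{\partial w}{\partial t}(x,t) \;=\; \int A(y)\,\big[\,Q(x,y) + k_2\,\big]\, w(y,t)\, dy \;+\; k_1 .
\]

With Gaussian \(Q\) and \(A\), the linear operator on the right-hand side is self-adjoint on a suitable weighted \(L^2\) space, and its eigenfunctions are, after a Gaussian renormalization, Hermite functions of the two retinal coordinates, as in the quantum harmonic oscillator. The asymptotic profile of \(w\) is then governed by whichever eigenvalue is largest, which is why the eigenvalues are compared as functions of the two parameters to decide which mode (symmetric or not) eventually dominates.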