Large deviations for slow learning processes with discontinuous statistics on a surface.

Publication date: 1995
Publication type: Thesis
Summary: We establish a large deviations principle for Markov chains whose transition probabilities differ according to which side of a surface the process lies on. The large deviations are observed over a fixed time interval for the suitably normalized process. For each of the two probability fields we make the continuity assumptions that usually yield the large deviations principle; this continuity is lost at the boundary. Such dynamics arise in many stochastic algorithms, in particular in certain learning algorithms for neural networks. Depending on the configuration of the supports of the measures in the vicinity of the boundary, the process can exhibit two types of behavior. In the first, the number of boundary crossings is unbounded; a new cost function appears, combining the Cramér transforms of the two fields, and corresponding to a mixture of the fields. The action functional is then computed only on paths for which there is a finite number of intervals on which they remain either in one of the two half-spaces or on the boundary. In the second type of behavior, the process locally crosses the boundary at most once, and the cost functional is obtained by integrating each Cramér transform in succession. In the last chapter, we give the equations satisfied by a minimum-cost trajectory between two points; their solution can be used for accelerated simulation of rare events.
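The dynamics described above can be sketched as a slow process X_{n+1} = X_n + ε·ξ_n whose increment law switches across the hyperplane {x = 0}. The snippet below is a minimal illustrative simulation, not the thesis's construction: the increment distributions, the drift values, and the step size ε are all hypothetical choices. Both drifts point toward the boundary, which corresponds to the regime in which the number of boundary crossings is unbounded.

```python
import random


def simulate(eps=0.01, n_steps=2000, seed=0):
    """Simulate X_{n+1} = X_n + eps * xi_n, where the law of xi_n
    depends on which side of the boundary {x = 0} the process is on.

    Illustrative parameters only: Gaussian increments whose mean
    points toward the boundary from either side, so the process is
    attracted to the boundary and crosses it repeatedly.
    """
    rng = random.Random(seed)
    x = 1.0                      # start in the right half-space
    path, crossings = [x], 0
    for _ in range(n_steps):
        drift = -0.5 if x > 0 else 0.5   # side-dependent mean increment
        step = eps * (drift + rng.gauss(0.0, 1.0))
        if (x > 0) != (x + step > 0):    # count boundary crossings
            crossings += 1
        x += step
        path.append(x)
    return path, crossings
```

With both drifts attracting, the trajectory first travels to the boundary and then oscillates around it, so the crossing count grows with the time horizon; flipping the signs of the two drifts would instead give the repelling configuration, in which the process locally crosses the boundary at most once.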