CHOPIN Nicolas

Affiliations
  • 2012 - 2017
    Centre de recherche en économie et statistique de l'Ensae et l'Ensai
  • 2012 - 2017
    Centre de recherche en économie et statistique
  • 2002 - 2003
    Université Paris 6 Pierre et Marie Curie
  • Theoretical contributions to Monte Carlo methods, and applications to Statistics.

    Lionel RIOU DURAND, Nicolas CHOPIN, Christian P. ROBERT, Arnaud GUILLIN, Pierre E. JACOB, Arnak S. DALALYAN, Stephanie ALLASSONNIERE
    2019
    The first part of this thesis concerns the inference of unnormalized statistical models. We study two inference methods based on random sampling: Monte Carlo MLE (Geyer, 1994) and Noise Contrastive Estimation (Gutmann and Hyvarinen, 2010). The latter method had been supported by numerical evidence of better stability, but no theoretical results had yet been established. We prove that Noise Contrastive Estimation is more robust to the choice of the sampling distribution, and we evaluate the gain in accuracy as a function of the computational budget. The second part of this thesis concerns approximate random sampling from high-dimensional distributions. The performance of most sampling methods deteriorates rapidly as the dimension increases, but several methods have proven their efficiency (e.g. Hamiltonian Monte Carlo, Langevin Monte Carlo). Following recent work (Eberle et al., 2017; Cheng et al., 2018), we study discretizations of a process known as the kinetic Langevin diffusion. We establish explicit convergence rates to the sampling distribution, with polynomial dependence on the dimension. Our work improves and extends the results of Cheng et al. for log-concave densities.
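To make the kinetic Langevin diffusion concrete, below is a minimal Euler-type discretization for a standard Gaussian target. It is an illustrative sketch only: the target, step size, and friction parameter are arbitrary choices, and it is not one of the specific schemes analysed in the thesis.

```python
import numpy as np

# Kinetic (underdamped) Langevin dynamics for a target density exp(-U(x)):
#   dx = v dt,   dv = -gamma * v dt - grad_U(x) dt + sqrt(2 * gamma) dB_t
# Illustrative discretization on a standard Gaussian target, where
# U(x) = ||x||^2 / 2 and grad_U(x) = x.

def grad_U(x):
    return x  # gradient of the potential of a standard Gaussian

def kinetic_langevin(n_steps=10_000, dim=10, step=0.05, gamma=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)          # position
    v = np.zeros(dim)          # velocity
    samples = np.empty((n_steps, dim))
    for t in range(n_steps):
        noise = rng.standard_normal(dim)
        v = v - step * (gamma * v + grad_U(x)) + np.sqrt(2.0 * gamma * step) * noise
        x = x + step * v
        samples[t] = x
    return samples

if __name__ == "__main__":
    chain = kinetic_langevin()
    # After burn-in, the per-coordinate variance should be close to 1.
    print("empirical variance per coordinate:", chain[2000:].var(axis=0).mean())
```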
  • Computational learning noise in human decision-making.

    Charles FINDLING, Etienne KOECHLIN, Nicolas CHOPIN, Jean DAUNIZEAU, Christopher SUMMERFIELD, Alexandre POUGET, Mate LENGYEL, Adam SANBORN
    2018
    In uncertain and changing environments, making decisions requires the analysis and weighting of past and present information. To model human behavior in such environments, computational approaches to learning have been developed based on reinforcement learning or Bayesian inference. In order to better account for behavioral variability, these approaches assume noise in the selection of the action. In the first part of my work, I argue that noise in action selection is insufficient to explain behavioral variability, and I show the presence of learning noise reflecting computational imprecision. To this end, I introduce noise into the learning algorithm by allowing random deviations from the noise-free update rule. The addition of this noise provides a better explanation of human behavioral performance (Findling C., Skvortsova V., et al., 2018a, in preparation). In the second part of my work, I show that this noise has virtuous adaptive properties in learning processes elicited by changing (volatile) environments. Using the Bayesian modeling framework, I show that a simple learning model assuming stable external contingencies, but with noise in the learning process, performs as well as the optimal Bayesian model that infers the volatility of the environment. Furthermore, I establish that this noise model better explains human behavior in changing environments (Findling C. et al., 2018b, in preparation).
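As a purely illustrative sketch of the idea of learning noise, the snippet below corrupts a standard delta-rule update with random deviations whose magnitude scales with the prediction error; the scaling rule and parameter values are hypothetical and not taken from the thesis.

```python
import numpy as np

def noisy_delta_rule(rewards, alpha=0.3, noise_scale=0.5, seed=0):
    """Track the reward probability of one option with a delta rule whose
    update is corrupted by zero-mean Gaussian noise (illustrative only)."""
    rng = np.random.default_rng(seed)
    q = 0.5                                              # initial value estimate
    trajectory = []
    for r in rewards:
        delta = r - q                                    # prediction error
        noise = rng.normal(0.0, noise_scale * abs(delta))  # deviation grows with update size
        q = q + alpha * delta + noise                    # noise-free rule plus random deviation
        q = min(max(q, 0.0), 1.0)                        # keep the estimate a valid probability
        trajectory.append(q)
    return np.array(trajectory)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    rewards = rng.binomial(1, 0.8, size=200)             # stable environment, p(reward) = 0.8
    print(noisy_delta_rule(rewards)[-5:])
```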
  • High dimensional Bayesian computation.

    Alexander BUCHHOLZ, Nicolas CHOPIN, Christian P. ROBERT, Sylvia RICHARDSON, Kerrie L. MENGERSEN, Robin RYDER, Estelle KUHN
    2018
    Computational Bayesian statistics builds approximations of the posterior distribution either by sampling or by constructing tractable approximations. The contribution of this thesis to the field of Bayesian statistics is the development of new methodology by combining existing methods; our approaches are better adapted to the dimension or reduce the computational cost compared to existing methods. Our first contribution improves approximate Bayesian computation (ABC) by using quasi-Monte Carlo (QMC). ABC allows Bayesian inference in models with intractable likelihoods. QMC is a variance reduction technique that provides more accurate estimators of integrals. Our second contribution uses QMC for variational inference (VI). VI is a method for constructing tractable approximations of the posterior distribution. The third contribution develops an approach for adapting sequential Monte Carlo (SMC) samplers that use Hamiltonian Monte Carlo (HMC) mutation kernels. SMC samplers allow an unbiased estimation of the model evidence, but they tend to lose performance as the dimension increases. HMC is a Markov chain Monte Carlo technique that has attractive properties when the dimension of the target space increases, but it is difficult to tune. By combining the two, we build a sampler that takes advantage of both.
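The variance-reduction role of QMC mentioned above can be illustrated on a toy integral. The sketch below compares plain Monte Carlo with a randomized Sobol' sequence (via scipy.stats.qmc) on an arbitrary test function; it is unrelated to the specific ABC, VI, and SMC constructions of the thesis.

```python
import numpy as np
from scipy.stats import qmc, norm

# Toy comparison: estimate E[f(X)] for X ~ N(0, I_d) with f(x) = exp(mean(x)),
# using (i) plain Monte Carlo and (ii) randomized quasi-Monte Carlo (scrambled Sobol').

def f(x):
    return np.exp(x.mean(axis=1))

def mc_estimate(n, dim, rng):
    x = rng.standard_normal((n, dim))
    return f(x).mean()

def rqmc_estimate(n, dim, seed):
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)
    u = np.clip(sampler.random(n), 1e-12, 1 - 1e-12)  # low-discrepancy points in (0, 1)^d
    x = norm.ppf(u)                                   # map to Gaussian quantiles
    return f(x).mean()

if __name__ == "__main__":
    dim, n, reps = 5, 2**10, 50
    rng = np.random.default_rng(0)
    mc = [mc_estimate(n, dim, rng) for _ in range(reps)]
    rqmc = [rqmc_estimate(n, dim, seed=r) for r in range(reps)]
    print("MC   std of estimator:", np.std(mc))
    print("RQMC std of estimator:", np.std(rqmc))
```

On this toy problem the randomized QMC estimator typically shows a noticeably smaller standard deviation across replications, which is the effect exploited in more elaborate settings.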
  • Leave Pima Indians Alone: Binary Regression as a Benchmark for Bayesian Computation.

    Nicolas CHOPIN, James RIDGWAY
    Statistical Science | 2017
    No summary available.
  • Theoretical study of some statistical procedures applied to complex data.

    Vincent R. COTTET, Nicolas CHOPIN, Pierre ALQUIER, Arnaud GUYADER, Ismaël CASTILLO, Peter D. GRÜNWALD, Olivier CATONI
    2017
    The main part of this thesis focuses on developing the theoretical and algorithmic aspects of three distinct statistical procedures. The first problem is the completion of binary matrices. We propose an estimator based on a variational pseudo-Bayesian approximation, using a loss function different from those used previously, and we derive non-asymptotic bounds on the integrated risk. The proposed estimator is much faster to compute than an MCMC-type estimator, and we show on examples that it is efficient in practice. The second problem is the study of the theoretical properties of the empirical risk minimizer for Lipschitz loss functions. We then apply the main results to logistic regression with SLOPE penalization and to matrix completion. The third chapter develops an Expectation-Propagation approximation for cases where the likelihood is not explicit, combined with an ABC approximation in a second step. This procedure can be applied to many models and is considerably faster and more accurate. It is applied, as an example, to a model of spatial extremes.
  • The stability of the nonlinear filter in continuous time.

    Van Bien BUI, Sylvain RUBENTHALER, Eric MOULINES, Nicolas CHOPIN, Cedric BERNARDIN, Francois DELARUE, Bruno REMILLARD, Dan CRISAN
    2016
    The filtering problem consists in estimating the state of a dynamic system, called the signal, which is often a Markov process, from noisy observations of past states of the system. In this work, we consider a continuous-time filtering model for a diffusion process. The goal is to study the stability of the optimal filter with respect to its initial condition, beyond the (strong) mixing assumption on the transition kernel and without relying on the ergodicity of the signal.
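For readers unfamiliar with the filtering setup, here is a minimal bootstrap particle filter on a time-discretized linear-Gaussian toy model. The model, parameter values, and particle count are illustrative choices and do not reproduce the continuous-time setting studied in the thesis.

```python
import numpy as np

# Bootstrap particle filter for a toy discretized signal/observation model:
#   signal:      x_t = a * x_{t-1} + sigma_x * e_t,    e_t  ~ N(0, 1)
#   observation: y_t = x_t + sigma_y * eta_t,          eta_t ~ N(0, 1)

def bootstrap_filter(y, n_particles=1000, a=0.9, sigma_x=0.5, sigma_y=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, size=n_particles)          # particles from the initial law
    means = []
    for obs in y:
        x = a * x + sigma_x * rng.standard_normal(n_particles)   # propagate
        logw = -0.5 * ((obs - x) / sigma_y) ** 2                  # Gaussian log-likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                               # filtering mean estimate
        idx = rng.choice(n_particles, size=n_particles, p=w)      # multinomial resampling
        x = x[idx]
    return np.array(means)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_x, y, x = [], [], 0.0
    for _ in range(100):
        x = 0.9 * x + 0.5 * rng.standard_normal()
        true_x.append(x)
        y.append(x + rng.standard_normal())
    est = bootstrap_filter(np.array(y))
    print("RMSE of filtering means:", np.sqrt(np.mean((est - np.array(true_x)) ** 2)))
```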
  • On the properties of variational approximations of Gibbs posteriors.

    Nicolas CHOPIN, Pierre ALQUIER, James RIDGWAY
    Journal of Machine Learning Research | 2016
    No summary available.
  • On some recent advances on high dimensional Bayesian statistics.

    Nicolas CHOPIN, Sebastien GADAT, Benjamin GUEDJ, Arnaud GUYADER, Elodie VERNET
    ESAIM: Proceedings and Surveys | 2015
    This paper reviews some recent developments in Bayesian statistics for high-dimensional data. After brief motivations in a short introduction, we describe new advances in the understanding of Bayesian posterior computation as well as theoretical contributions in nonparametric and high-dimensional Bayesian approaches. From an applied point of view, we describe the so-called SQMC particle method for computing Bayesian posterior distributions, and provide a nonparametric analysis of the widespread ABC method. On the theoretical side, we describe some recent advances in Bayesian consistency for a nonparametric hidden Markov model as well as new PAC-Bayesian results for different models of high-dimensional regression.
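As a pointer for the ABC method cited above, the following is a minimal ABC rejection sampler for a toy Gaussian-mean problem; the prior, summary statistic, and tolerance are arbitrary illustrative choices.

```python
import numpy as np

# Minimal ABC rejection sampler: infer the mean mu of a N(mu, 1) model
# from observed data, using the sample mean as summary statistic.

def abc_rejection(y_obs, n_draws=100_000, tol=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y_obs)
    s_obs = y_obs.mean()
    mu = rng.normal(0.0, 5.0, size=n_draws)                   # draws from a N(0, 25) prior
    y_sim = rng.standard_normal((n_draws, n)) + mu[:, None]   # simulate pseudo-data
    keep = np.abs(y_sim.mean(axis=1) - s_obs) < tol           # accept close summaries
    return mu[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y_obs = rng.normal(2.0, 1.0, size=50)
    post = abc_rejection(y_obs)
    print("accepted draws:", post.size, " approximate posterior mean:", post.mean())
```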
  • Sequential Monte Carlo on large binary sampling spaces.

    Christian SCHAFER, Nicolas CHOPIN
    Statistics and Computing | 2013
    No summary available.
  • Monte Carlo methods for sampling high-dimensional binary vectors.

    Christian SCHAFER, Nicolas CHOPIN
    2012
    This thesis is devoted to the study of Monte Carlo methods for sampling high-dimensional binary vectors from complex target laws. When the state space is too large for exhaustive enumeration, these methods make it possible to estimate the expectation of a function of interest with respect to a given law. Standard approaches are mainly based on random-walk Markov chain Monte Carlo methods, where the stationary law of the chain is the distribution of interest and the trajectory mean converges to the expectation by the ergodic theorem. We propose a new sampling algorithm based on sequential Monte Carlo methods, which is more robust to the multimodality problem thanks to a simulated annealing step. The performance of the sequential Monte Carlo sampler depends on the ability to sample from auxiliary laws that are, in some sense, close to the law of interest. The main work of this thesis presents strategies for constructing parametric families for sampling binary vectors with dependencies. The usefulness of this approach is demonstrated in the context of Bayesian variable selection and combinatorial optimization of pseudo-Boolean functions.
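To give a flavor of sequential Monte Carlo with tempering on a binary state space, here is a small sketch targeting an arbitrary pseudo-Boolean function through a sequence of annealed distributions. The target function, temperature schedule, and bit-flip mutation kernel are illustrative assumptions, not the adaptive parametric families developed in the thesis.

```python
import numpy as np

# Sequential Monte Carlo with tempering on {0, 1}^d.
# Target: pi(x) proportional to exp(f(x)) for a pseudo-Boolean function f, reached
# through the annealed sequence pi_t(x) ~ exp(lambda_t * f(x)), lambda_t : 0 -> 1.

def f(x):
    # Arbitrary illustrative pseudo-Boolean function (with pairwise interaction).
    return 3.0 * x[:, 0] * x[:, 1] - 2.0 * x[:, 2] + x.sum(axis=1)

def smc_binary(n_particles=2000, dim=10, n_temps=20, n_moves=5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, size=(n_particles, dim)).astype(float)  # uniform start
    lambdas = np.linspace(0.0, 1.0, n_temps + 1)
    for lam_prev, lam in zip(lambdas[:-1], lambdas[1:]):
        # Reweight for the new temperature, then resample.
        logw = (lam - lam_prev) * f(x)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
        # Metropolis bit-flip moves targeting pi_lam (symmetric proposal).
        for _ in range(n_moves):
            j = rng.integers(0, dim, size=n_particles)
            prop = x.copy()
            prop[np.arange(n_particles), j] = 1.0 - prop[np.arange(n_particles), j]
            accept = np.log(rng.random(n_particles)) < lam * (f(prop) - f(x))
            x[accept] = prop[accept]
    return x

if __name__ == "__main__":
    particles = smc_binary()
    print("estimated inclusion probabilities:", particles.mean(axis=0).round(2))
```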
  • Applications of sequential Monte Carlo methods to Bayesian statistics.

    Nicolas CHOPIN, Christian p. ROBERT
    2003
    No summary available.
Affiliations are detected from the signatures of publications identified in scanR. An author can therefore appear to be affiliated with several structures or supervisors according to these signatures. The dates displayed correspond only to the dates of the publications found. For more information, see https://scanr.enseignementsup-recherche.gouv.fr