BRUNEL Nicolas

Affiliations
  • 2012 - 2015
    University of Chicago
  • 1992 - 1993
    Université Paris 6 Pierre et Marie Curie
Publications
  • Author response: Cerebellar learning using perturbations.

    Guy BOUVIER, Johnatan ALJADEFF, Claudia CLOPATH, Celian BIMBARD, Jonas RANFT, Antonin BLOT, Jean-Pierre NADAL, Nicolas BRUNEL, Vincent HAKIM, Boris BARBOUR
    2018
    No summary available.
  • Population Density Model.

    Nicolas BRUNEL, Vincent HAKIM
    Encyclopedia of Computational Neuroscience | 2015
    No summary available.
  • Fokker-Planck Equation.

    Nicolas BRUNEL, Vincent HAKIM
    Encyclopedia of Computational Neuroscience | 2015
    No summary available.
  • Methods and models in neurophysics.

    Alla BORISYUK, Y. ROUDI, O. L. WHITE, Emery N. BROWN, Eve MARDER, Mike SHELLEY, Haim SOMPOLINSKY, Naftali TISHBY, Alessandro TREVES, Misha TSODYKS, Fred WOLF, Christophe POUZAT, Carl VAN VREESWIJK, David GOLOMB, David H. TERMAN, German MATO, John RINZEL, Nicolas BRUNEL, Paul C. BRESSLOFF, Carson C. CHOW, Boris GUTKIN, David HANSEL, Claude MEUNIER, Jean DALIBARD
    2014
    No summary available.
  • Memory Maintenance in Synapses with Calcium-Based Plasticity in the Presence of Background Activity.

    David HIGGINS, Michael GRAUPNER, Nicolas BRUNEL
    PLoS Computational Biology | 2014
    Most models of learning and memory assume that memories are maintained in neuronal circuits by persistent synaptic modifications induced by specific patterns of pre- and postsynaptic activity. For this scenario to be viable, synaptic modifications must survive the ubiquitous ongoing activity present in neural circuits in vivo. In this paper, we investigate the time scales of memory maintenance in a calcium-based synaptic plasticity model that has recently been shown to fit different experimental data sets from hippocampal and neocortical preparations. We find that in the presence of background activity on the order of 1 Hz, parameters that fit data from neocortical layer 5 pyramidal cells lead to a very fast decay of synaptic efficacy, with time scales of minutes. We then identify two ways in which this memory time scale can be extended: (i) the extracellular calcium concentration in the experiments used to fit the model is larger than estimated concentrations in vivo; lowering the extracellular calcium concentration to in vivo levels increases memory time scales by several orders of magnitude. (ii) Adding a bistability mechanism, so that each synapse has two stable states at sufficiently low background activity, boosts the memory time scale further, since memory decay is no longer described by an exponential relaxation from an initial state but by an escape from a potential well. We argue that both features are expected to be present in synapses in vivo. These results are obtained first for a single synapse connecting two independent Poisson neurons, and then in simulations of a large network of excitatory and inhibitory integrate-and-fire neurons. Our results emphasise the need for studying plasticity at physiological extracellular calcium concentration, and highlight the role of synaptic bi- or multistability in the stability of learned synaptic structures. (A minimal simulation sketch of the single-synapse scenario appears after the publication list.)
  • Fokker-Planck Equation.

    Nicolas BRUNEL, Vincent HAKIM
    Encyclopedia of Computational Neuroscience | 2014
    No summary available.
  • Population Density Models.

    Nicolas BRUNEL, Vincent HAKIM
    Encyclopedia of Computational Neuroscience | 2013
    No summary available.
  • Memory capacity of networks with stochastic binary synapses.

    Alexis DUBREUIL, Yali AMIT, Nicolas BRUNEL
    BMC Neuroscience | 2013
    No summary available.
  • Fokker-Planck Equation.

    Nicolas BRUNEL, Vincent HAKIM
    Encyclopedia of Computational Neuroscience | 2013
    No summary available.
  • Neural networks: from statistical physics to neurophysiology.

    Nicolas BRUNEL, Jean-Pierre NADAL
    1993
    Several works on neural networks are presented. In the first part, statistical physics techniques are applied to several situations, among them the study of categorization and of the occurrence of prosopagnosia in neural networks with structured attractors. In the second part, more realistic neural networks are studied. First, the conditions that input stimuli must satisfy in order to be learned are determined; then a model of an autonomously learning network is studied. This network comprises analog neurons and stochastic learning dynamics acting on the synaptic efficacies. Finally, the same type of network, studied in the context of learning a fixed sequence of stimuli, allows a fruitful comparison of the model with the neurophysiology experiments of Miyashita.
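
Sketch of the single-synapse scenario described in the Higgins, Graupner & Brunel (2014) abstract above: a calcium-based plasticity rule at one synapse driven by independent pre- and postsynaptic Poisson spike trains. The Python snippet below is a minimal illustrative simulation in the spirit of the Graupner–Brunel family of calcium models the paper builds on (calcium jumps per spike, potentiation and depression thresholds, an optional bistable term); every parameter value, the noise scaling, and the variable names are placeholders chosen for illustration, not the fitted values from the paper.

    import numpy as np

    # Minimal sketch of a calcium-based plasticity rule at a single synapse
    # driven by independent pre- and postsynaptic Poisson spike trains.
    # All parameter values below are illustrative placeholders.

    rng = np.random.default_rng(0)

    dt       = 1e-3     # time step (s)
    T        = 600.0    # simulated duration (s)
    rate     = 1.0      # background firing rate of pre and post neurons (Hz)

    tau_ca   = 0.020    # calcium decay time constant (s)
    c_pre    = 0.6      # calcium jump per presynaptic spike (a.u.)
    c_post   = 1.2      # calcium jump per postsynaptic spike (a.u.)
    theta_d  = 1.0      # depression threshold on calcium
    theta_p  = 1.5      # potentiation threshold on calcium
    gamma_d  = 200.0    # depression drive when calcium exceeds theta_d
    gamma_p  = 300.0    # potentiation drive when calcium exceeds theta_p
    tau_rho  = 150.0    # efficacy time constant (s)
    sigma    = 2.0      # noise amplitude on the efficacy variable
    bistable = True     # cubic "double-well" term giving two stable states

    n     = int(T / dt)
    c     = 0.0         # calcium trace
    rho   = 1.0         # synaptic efficacy, starting in the potentiated state
    trace = np.empty(n)

    for i in range(n):
        # independent Poisson background spikes in this time step
        pre  = rng.random() < rate * dt
        post = rng.random() < rate * dt
        c += -c * dt / tau_ca + c_pre * pre + c_post * post

        # threshold-gated potentiation / depression drives
        drift = gamma_p * (1.0 - rho) * (c > theta_p) - gamma_d * rho * (c > theta_d)
        if bistable:
            # stable fixed points at rho = 0 and rho = 1, unstable point at 0.5
            drift += -rho * (1.0 - rho) * (0.5 - rho)

        # noise is only active while calcium is above a threshold
        active = (c > theta_p) + (c > theta_d)
        rho += drift * dt / tau_rho + sigma * np.sqrt(active * dt / tau_rho) * rng.standard_normal()
        rho = min(max(rho, 0.0), 1.0)
        trace[i] = rho

    print(f"efficacy after {T:.0f} s of {rate} Hz background activity: {trace[-1]:.2f}")

In this toy setting, larger calcium jumps (a stand-in for higher extracellular calcium) make the thresholds be crossed more often during 1 Hz background activity, so the efficacy drifts away from its initial state sooner, while the bistable term slows that drift; this mirrors, only qualitatively, the two effects discussed in the abstract.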
Affiliations are detected from the signatures of publications identified in scanR. An author can therefore appear to be affiliated with several structures or supervisors according to these signatures. The dates displayed correspond only to the dates of the publications found. For more information, see https://scanr.enseignementsup-recherche.gouv.fr