
Large-Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions

Abstract: Here we unify the field-theoretical approach to neuronal networks with large-deviation theory. For a prototypical random recurrent network model with continuous-valued units, we show that the effective action is identical to the rate function, and we derive the latter using field theory. This rate function takes the form of a Kullback-Leibler divergence, which enables data-driven inference of model parameters and the calculation of fluctuations beyond mean-field theory. Lastly, we expose a regime with fluctuation-induced transitions between mean-field solutions.
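As a brief illustration of the abstract's central statement (a generic Sanov-type result, not the paper's specific derivation), a rate function of Kullback-Leibler form reads

\[
I(Q) = D_{\mathrm{KL}}(Q \,\|\, P) = \int \mathrm{d}Q \, \ln \frac{\mathrm{d}Q}{\mathrm{d}P},
\]

so that the probability of observing an empirical measure close to \(Q\) in a network of \(N\) units scales as \(e^{-N I(Q)}\). Minimizing \(I\), equivalently the KL divergence between model and data, is what makes the data-driven parameter inference mentioned above possible; the reference measure \(P\) plays the role of the mean-field solution.

Below is a minimal simulation sketch of a prototypical random recurrent network with continuous-valued units, assuming the standard dynamics \(\dot{x}_i = -x_i + \sum_j J_{ij} \tanh(x_j)\) with i.i.d. Gaussian couplings \(J_{ij} \sim \mathcal{N}(0, g^2/N)\); all parameter names and values are illustrative choices, not taken from the paper:

```python
import numpy as np

def simulate_network(N=500, g=1.5, T=50.0, dt=0.01, seed=0):
    """Explicit-Euler integration of dx_i/dt = -x_i + sum_j J_ij tanh(x_j).

    N, g, T, dt are hypothetical choices; J_ij ~ N(0, g^2/N) is the
    standard random-coupling ensemble for such networks.
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # Gaussian couplings
    x = rng.normal(0.0, 1.0, size=N)                  # random initial state
    n_steps = int(T / dt)
    traj = np.empty((n_steps, N))
    for t in range(n_steps):
        x = x + dt * (-x + J @ np.tanh(x))            # Euler step
        traj[t] = x
    return traj

traj = simulate_network()
# Population-averaged autocorrelation, a typical mean-field observable:
xc = traj - traj.mean(axis=0)
acf = np.array([(xc[:len(xc) - lag] * xc[lag:]).mean()
                for lag in range(0, 500, 10)])
```

For \(g > 1\), networks of this type are known to exhibit chaotic activity (Sompolinsky, Crisanti, and Sommers, 1988), which is the regime where fluctuations beyond mean-field theory become relevant.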
Document type: Journal articles

https://hal.sorbonne-universite.fr/hal-03407444
Contributor: HAL Sorbonne Université Gestionnaire
Submitted on: Thursday, October 28, 2021 - 2:19:30 PM
Last modification on: Wednesday, November 17, 2021 - 12:34:05 PM

File

PhysRevLett.127.158302(1).pdf
Publication funded by an institution

Citation

Alexander van Meegen, Tobias Kühn, Moritz Helias. Large-Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions. Physical Review Letters, American Physical Society, 2021, 127 (15), 158302. ⟨10.1103/PhysRevLett.127.158302⟩. ⟨hal-03407444⟩
