Large-Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions

Journal article in Physical Review Letters, 2021

Abstract

Here we unify the field-theoretical approach to neuronal networks with large-deviation theory. For a prototypical random recurrent network model with continuous-valued units, we show that the effective action is identical to the rate function and derive the latter using field theory. This rate function takes the form of a Kullback-Leibler divergence, which enables data-driven inference of model parameters and the calculation of fluctuations beyond mean-field theory. Finally, we expose a regime with fluctuation-induced transitions between mean-field solutions.
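As a minimal sketch of why a rate function of Kullback-Leibler form enables parameter inference, stated in generic large-deviation terms rather than as the paper's specific derivation (the symbols p_theta for the model statistics and q_data for the empirical statistics are illustrative, not taken from the record): if an empirical statistic Q of an N-unit network obeys a large-deviation principle with a Kullback-Leibler rate function, then

    P_\theta(Q \approx q) \asymp \exp\!\bigl[-N\, D_{\mathrm{KL}}(q \,\|\, p_\theta)\bigr],
    \qquad
    \hat{\theta} = \operatorname*{arg\,min}_{\theta}\, D_{\mathrm{KL}}(q_{\mathrm{data}} \,\|\, p_\theta),

so the parameters that make the observed data most probable are those minimizing the divergence (a maximum-likelihood-type estimate), and Gaussian fluctuations beyond mean-field theory follow from expanding the rate function to second order around its minimum.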
Main file: PhysRevLett.127.158302(1).pdf (507.02 KB)
Origin: Publication funded by an institution

Dates and versions

hal-03407444, version 1 (28-10-2021)

Identifiers

DOI: 10.1103/PhysRevLett.127.158302

Cite

Alexander van Meegen, Tobias Kühn, Moritz Helias. Large-Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions. Physical Review Letters, 2021, 127 (15), 158302. ⟨10.1103/PhysRevLett.127.158302⟩. ⟨hal-03407444⟩.