Riemannian batch normalization for SPD neural networks
Conference paper, 2019

Covariance matrices have attracted attention in machine learning applications due to their capacity to capture interesting structure in the data. The main challenge is that one needs to take into account the particular geometry of the Riemannian manifold of symmetric positive definite (SPD) matrices they belong to. In the context of deep networks, several architectures for these matrices have recently been proposed. In this article, we introduce a Riemannian batch normalization (batchnorm) algorithm which generalizes the one used in Euclidean nets. This novel layer makes use of geometric operations on the manifold, notably the Riemannian barycenter, parallel transport, and non-linear structured matrix transformations. We derive a new manifold-constrained gradient descent algorithm working in the space of SPD matrices, allowing us to learn the batchnorm layer. We validate our approach with experiments in three different contexts on diverse data types: drone recognition from radar observations, and emotion and action recognition from video and motion-capture data. Experiments show that the Riemannian batchnorm systematically gives better classification performance than leading methods, together with remarkable robustness to scarce data.
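The geometric operations named in the abstract (Riemannian barycenter, parallel transport to and from the identity) can be sketched in NumPy under the affine-invariant metric on SPD matrices. This is a minimal illustration of the idea, not the authors' implementation: the function names are mine, the barycenter is computed by a simple Karcher flow, and the learnable bias is taken as a fixed SPD matrix `G_bias` rather than a trained parameter.

```python
import numpy as np

def sym(A):
    # symmetrize to guard against floating-point asymmetry
    return 0.5 * (A + A.T)

def spd_pow(A, p):
    # matrix power of an SPD matrix via eigendecomposition
    w, V = np.linalg.eigh(sym(A))
    return (V * w**p) @ V.T

def spd_log(A):
    # matrix logarithm of an SPD matrix
    w, V = np.linalg.eigh(sym(A))
    return (V * np.log(w)) @ V.T

def spd_exp(S):
    # matrix exponential of a symmetric matrix
    w, V = np.linalg.eigh(sym(S))
    return (V * np.exp(w)) @ V.T

def karcher_mean(X, iters=10):
    # Riemannian barycenter of a batch of SPD matrices (Karcher flow),
    # initialized at the Euclidean mean
    G = X.mean(axis=0)
    for _ in range(iters):
        G_half, G_inv_half = spd_pow(G, 0.5), spd_pow(G, -0.5)
        # average the batch in the tangent space at G, then step along the geodesic
        T = np.mean([spd_log(G_inv_half @ Xi @ G_inv_half) for Xi in X], axis=0)
        G = G_half @ spd_exp(T) @ G_half
    return G

def riemannian_batchnorm(X, G_bias):
    # center: parallel transport the batch from its barycenter G to the identity;
    # re-bias: transport from the identity to the (hypothetical) SPD parameter G_bias
    G = karcher_mean(X)
    G_inv_half = spd_pow(G, -0.5)
    B_half = spd_pow(G_bias, 0.5)
    return np.stack([sym(B_half @ G_inv_half @ Xi @ G_inv_half @ B_half)
                     for Xi in X])
```

Because the affine-invariant metric is invariant under congruence transformations X ↦ A X Aᵀ, transporting the batch by G^{-1/2} moves its barycenter to the identity, and the subsequent congruence by G_bias^{1/2} moves it to G_bias, mirroring the shift step of Euclidean batchnorm.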
Main file: brooks_nips19.pdf (363.89 KB). Origin: files produced by the author(s).

Dates and versions

HAL Id: hal-02422458, version 1 (22-12-2019)

Daniel A. Brooks, Olivier Schwander, Frédéric Barbaresco, Jean-Yves Schneider, Matthieu Cord. Riemannian batch normalization for SPD neural networks. Thirty-third Annual Conference on Neural Information Processing Systems (NeurIPS), Dec 2019, Vancouver, Canada. ⟨hal-02422458⟩