Deep Neural Networks Under Stress

Abstract: In recent years, deep architectures have been used for transfer learning with state-of-the-art performance on many datasets. The properties of their features remain, however, largely unstudied under the transfer perspective. In this work, we present an extensive analysis of the resiliency of feature vectors extracted from deep models, with special focus on the trade-off between performance and compression rate. By introducing perturbations to image descriptions extracted from a deep convolutional neural network, we change their precision and number of dimensions, measuring how this affects the final score. We show that deep features are more robust to these disturbances than classical approaches, achieving a compression rate of 98.4% while losing only 0.88% of their original score on Pascal VOC 2007.
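The two perturbations described in the abstract, reducing the number of dimensions and reducing numeric precision, can be illustrated with a short sketch. This is not the paper's implementation; it is a minimal example, assuming dimensions are dropped by simple truncation and precision is reduced by uniform scalar quantization:

```python
import numpy as np

def compress_features(features, keep_dims, n_bits):
    """Perturb a batch of feature vectors by dropping dimensions
    and reducing numeric precision via uniform quantization."""
    # Keep only the first `keep_dims` dimensions of each vector.
    reduced = features[:, :keep_dims]
    # Uniformly quantize the remaining values to 2**n_bits levels.
    lo, hi = reduced.min(), reduced.max()
    levels = 2 ** n_bits - 1
    quantized = np.round((reduced - lo) / (hi - lo) * levels)
    return quantized.astype(np.int32), lo, hi

def decompress(quantized, lo, hi, n_bits):
    """Map quantized integer codes back to approximate float values."""
    levels = 2 ** n_bits - 1
    return quantized.astype(np.float32) / levels * (hi - lo) + lo

# Hypothetical example: 4096-D float32 features kept at 256 dims, 4 bits.
rng = np.random.default_rng(0)
feats = rng.standard_normal((10, 4096)).astype(np.float32)
codes, lo, hi = compress_features(feats, keep_dims=256, n_bits=4)
approx = decompress(codes, lo, hi, n_bits=4)

# Storage drops from 4096 * 32 bits to 256 * 4 bits per vector.
rate = 1 - (256 * 4) / (4096 * 32)
print(f"compression rate: {rate:.1%}")  # → 99.2%
```

The downstream classifier would then be evaluated on `approx` instead of `feats` to measure the score lost at a given compression rate.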
Document type: Conference paper
IEEE International Conference on Image Processing (ICIP 2016), Sep 2016, Phoenix, AZ, United States. 〈http://2016.ieeeicip.org/〉. 〈10.1109/ICIP.2016.7533200〉

Cited literature: 21 references

https://hal.sorbonne-universite.fr/hal-01340298
Contributor: Micael Carvalho <>
Submitted on: Thursday, June 30, 2016 - 17:43:45
Last modified on: Thursday, November 22, 2018 - 14:43:54
Long-term archiving on: Saturday, October 1, 2016 - 13:15:28


Citation

Micael Carvalho, Matthieu Cord, Sandra Avila, Nicolas Thome, Eduardo Valle. Deep Neural Networks Under Stress. IEEE International Conference on Image Processing (ICIP 2016), Sep 2016, Phoenix, AZ, United States. 〈http://2016.ieeeicip.org/〉. 〈10.1109/ICIP.2016.7533200〉. 〈hal-01340298〉
