Journal article, IEEE Transactions on Neural Networks and Learning Systems, 2018

SyMIL: MinMax Latent SVM for Weakly Labeled Data

Abstract

Designing powerful models able to handle weakly labeled data is a crucial problem in machine learning. In this paper, we propose a new Multiple Instance Learning (MIL) framework. Examples are represented as bags of instances, but we depart from standard MIL assumptions by introducing a symmetric strategy (SyMIL) that seeks discriminative instances in both positive and negative bags. The idea is to classify a bag using the instance most distant from the hyperplane. We provide a theoretical analysis featuring the generalization properties of our model. We derive a large-margin formulation of our problem, which is cast as a difference of convex functions and optimized using CCCP. We provide a primal version optimized with stochastic sub-gradient descent and a dual version optimized with the one-slack cutting-plane method. Successful experimental results are reported on standard MIL and weakly-supervised object detection datasets: SyMIL significantly outperforms competitive methods (mi/MI/Latent-SVM) and gives very competitive performance compared to state-of-the-art works. We also analyze the instances selected by symmetric and asymmetric approaches on weakly-supervised object detection and text classification tasks. Finally, we show the complementarity of SyMIL with recent works on learning with label proportions on standard MIL datasets.
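The symmetric decision rule described above — score every instance in a bag and classify the bag by the single instance farthest from the separating hyperplane, whichever side it lies on — can be sketched as follows. This is a minimal illustration of that rule only, not the authors' implementation; the function name `symil_predict` and the linear parameters `w`, `b` are hypothetical.

```python
import numpy as np

def symil_predict(bag, w, b=0.0):
    """Classify a bag of instances with a symmetric MIL rule:
    pick the instance whose linear score w.x + b has the largest
    absolute value (i.e. farthest from the hyperplane), and return
    the sign of that score as the bag label."""
    scores = bag @ w + b                 # one linear score per instance
    best = np.argmax(np.abs(scores))     # most distant instance, either side
    return 1 if scores[best] >= 0 else -1
```

In contrast to the standard asymmetric MIL rule, which takes the maximum (signed) score, this rule lets a strongly negative instance decide the label of a bag, which is what "seeking discriminative instances in positive and negative bags" amounts to at prediction time.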
Main file: 2018TNNLScord_sans marque.pdf (3.35 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01972163, version 1 (07-01-2019)

Identifiers

Cite

Thibaut Durand, Nicolas Thome, Matthieu Cord. SyMIL: MinMax Latent SVM for Weakly Labeled Data. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29 (12), pp.6099-6112. ⟨10.1109/TNNLS.2018.2820055⟩. ⟨hal-01972163⟩