Conference paper · Year: 2022

Continual Learning of Long Topic Sequences in Neural Information Retrieval

Thomas Gerald
Laure Soulier

Abstract

In information retrieval (IR) systems, trends and users' interests may change over time, altering either the distribution of requests or the content to be recommended. Since neural ranking approaches depend heavily on training data, it is crucial to understand how well recent IR approaches transfer to new domains in the long term. In this paper, we first propose a dataset based on the MSMarco corpus that models a long stream of topics as well as controlled settings driven by IR properties. We then analyze in depth the ability of recent neural IR models to learn continually from these streams. Our empirical study highlights the particular cases in which catastrophic forgetting occurs (e.g., the level of similarity between tasks, peculiarities of text length, and the way models are learned), providing future directions for model design.
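As a concrete illustration of the evaluation angle the abstract describes, the sketch below computes a forgetting measure commonly used in the continual-learning literature: after sequentially fine-tuning a ranker on a stream of topics, the drop between the best score a past topic ever reached and its score at the end of the stream. The `forgetting` helper and the toy score matrix are illustrative assumptions, not the paper's exact protocol or numbers.

```python
import numpy as np

def forgetting(scores: np.ndarray) -> np.ndarray:
    """Per-topic forgetting after training on a stream of T topics.

    scores[t, i] is the retrieval score (e.g. MRR) on topic i's test set,
    measured after training on the t-th topic of the stream, so scores is
    a (T, T) matrix whose entries with i <= t are meaningful.
    """
    T = scores.shape[0]
    # For each past topic i, forgetting is the gap between the best score
    # it reached at any point in the stream and its score at the very end.
    return np.array([scores[:-1, i].max() - scores[-1, i] for i in range(T - 1)])

# Toy 3-topic stream: topic 0 degrades as topics 1 and 2 are learned.
scores = np.array([
    [0.40, 0.00, 0.00],
    [0.31, 0.42, 0.00],
    [0.27, 0.38, 0.45],
])
print(forgetting(scores))  # [0.13 0.04]: drop on topics 0 and 1
```

A positive entry signals catastrophic forgetting on that topic; values near zero (or negative, when later topics help) indicate successful transfer, which is exactly the contrast the paper's controlled settings are designed to expose.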
Main file: ecir2022_ContinualRanking_cr (1).pdf (435.55 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03563308, version 1 (09-02-2022)

Identifiers

  • HAL Id: hal-03563308, version 1

Cite

Thomas Gerald, Laure Soulier. Continual Learning of Long Topic Sequences in Neural Information Retrieval. 44th European Conference on Information Retrieval (ECIR 2022), Apr 2022, Stavanger, Norway. ⟨hal-03563308⟩
