Continual Learning of Long Topic Sequences in Neural Information Retrieval
Conference paper, 2022

Continual Learning of Long Topic Sequences in Neural Information Retrieval

Thomas Gerald
Laure Soulier

Abstract

In information retrieval (IR) systems, trends and users' interests may change over time, altering either the distribution of requests or the contents to be recommended. Since neural ranking approaches depend heavily on their training data, it is crucial to understand how well recent IR approaches transfer to new domains in the long term. In this paper, we first propose a dataset, based on the MSMarco corpus, that models a long stream of topics as well as controlled settings driven by IR properties. We then analyze in depth the ability of recent neural IR models to continually learn over those streams. Our empirical study highlights the particular cases in which catastrophic forgetting occurs (e.g., the level of similarity between tasks, peculiarities of text length, and ways of learning models) and provides future directions for model design.
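The evaluation protocol the abstract alludes to — fine-tuning a ranker on one topic after another and re-measuring performance on earlier topics to quantify catastrophic forgetting — can be sketched as below. This is a minimal illustration, not the authors' code: the bilinear scorer, the synthetic per-topic data, the hinge loss, and the mean-score proxy metric are all placeholder assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
DIM, N_TASKS, N_DOCS = 16, 5, 64

# Synthetic stand-in for per-topic data: (queries, relevant docs) per task.
tasks = [(torch.randn(N_DOCS, DIM), torch.randn(N_DOCS, DIM))
         for _ in range(N_TASKS)]

scorer = nn.Bilinear(DIM, DIM, 1)            # toy relevance scorer s(q, d)
opt = torch.optim.Adam(scorer.parameters(), lr=1e-2)

def evaluate(task):
    """Placeholder metric: mean score assigned to the relevant (q, d) pairs."""
    q, d = task
    with torch.no_grad():
        return scorer(q, d).mean().item()

history = []  # history[i][j] = performance on task j after training on task i
for i, (q, d) in enumerate(tasks):           # the continual stream of topics
    for _ in range(100):                     # fine-tune on the current topic only
        neg = d[torch.randperm(N_DOCS)]      # shuffled in-batch negatives
        loss = torch.relu(1.0 - scorer(q, d) + scorer(q, neg)).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    history.append([evaluate(t) for t in tasks[:i + 1]])

# Forgetting of task j: best performance it ever reached minus its final one.
final = history[-1]
for j in range(N_TASKS - 1):
    best = max(history[i][j] for i in range(j, N_TASKS))
    print(f"task {j}: forgetting = {best - final[j]:.3f}")
```

In the actual study one would substitute a neural ranking model and a standard IR metric (e.g., MRR over MSMarco relevance judgments) for these placeholders; the sequential-training loop and the best-minus-final forgetting measure are the part the sketch is meant to convey.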
Main file: ecir2022_ContinualRanking_cr (1).pdf (435.55 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03563308, version 1 (09-02-2022)

Identifiers

  • HAL Id: hal-03563308, version 1

Cite

Thomas Gerald, Laure Soulier. Continual Learning of Long Topic Sequences in Neural Information Retrieval. 44th European Conference on Information Retrieval (ECIR 2022), Apr 2022, Stavanger, Norway. ⟨hal-03563308⟩