
On the Study of Transformers for Query Suggestion

Agnès Mustar ¹, Sylvain Lamprier ¹, Benjamin Piwowarski ¹
¹ MLIA - Machine Learning and Information Access, ISIR - Institut des Systèmes Intelligents et de Robotique
Abstract: When conducting a search task, users may find it difficult to articulate their need, even more so when the task is complex. To help them complete their search, search engines usually provide query suggestions. A good query suggestion system requires modeling user behavior during the search session. In this article, we study multiple Transformer architectures applied to the query suggestion task and compare them with recurrent neural network (RNN)-based models. We experiment with Transformer models using different tokenizers, different encoders (large pretrained models or fully trained ones), and two kinds of architectures (flat or hierarchical). We study the performance and behavior of these various models, and observe that Transformer-based models outperform RNN-based ones. We show that while the hierarchical architectures exhibit very good performance for query suggestion, the flat models are more suitable for complex and long search tasks. Finally, we investigate the flat models' behavior and demonstrate that they indeed learn to recover the hierarchy of a search session.
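The flat/hierarchical distinction in the abstract can be illustrated with a minimal sketch (not the authors' code; the `[SEP]` separator token and the helper names are assumptions for illustration): a flat model consumes the whole session as one token sequence with separators between queries, while a hierarchical model keeps one token sequence per query and leaves session-level aggregation to a second encoder.

```python
# Hypothetical sketch of the two ways a search session can be presented to a
# model. A session is a list of query strings.

SEP = "[SEP]"  # assumed separator token between queries in the flat setting

def flat_input(session):
    """Flatten all queries of a session into a single token sequence,
    with a separator token after each query (flat architecture)."""
    tokens = []
    for query in session:
        tokens.extend(query.split())
        tokens.append(SEP)
    return tokens

def hierarchical_input(session):
    """Keep one token sequence per query; a session-level encoder would
    then consume the per-query representations (hierarchical architecture)."""
    return [query.split() for query in session]

session = ["cheap flights paris", "cheap flights paris june"]
print(flat_input(session))
# ['cheap', 'flights', 'paris', '[SEP]', 'cheap', 'flights', 'paris', 'june', '[SEP]']
print(hierarchical_input(session))
# [['cheap', 'flights', 'paris'], ['cheap', 'flights', 'paris', 'june']]
```

In the flat case, self-attention can relate tokens across query boundaries directly, which is consistent with the paper's observation that flat models recover the session hierarchy on their own.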
Complete list of metadata
Contributor: Benjamin Piwowarski
Submitted on: Thursday, February 10, 2022 - 1:02:49 PM
Last modification on: Saturday, February 12, 2022 - 3:46:20 AM
Long-term archiving on: Wednesday, May 11, 2022 - 6:03:54 PM
Files produced by the author(s)
Agnès Mustar, Sylvain Lamprier, Benjamin Piwowarski. On the Study of Transformers for Query Suggestion. ACM Transactions on Information Systems, Association for Computing Machinery, 2022, 40 (1), pp.18. ⟨10.1145/3470562⟩. ⟨hal-03541893⟩


