Capturing Entity Hierarchy in Data-to-Text Generative Models
Abstract
We aim at generating summaries from structured data (e.g., tables, entity-relation triplets, etc.). Most previous approaches rely on an encoder-decoder architecture in which the data are linearized into a sequence of elements. In contrast, we propose a hierarchical model that takes into account the entities forming the data structure. Moreover, we introduce the Transformer encoder in data-to-text models to ensure a robust encoding of each element/entity with respect to all the others, regardless of their initial positioning. Our model is evaluated on the RotoWire benchmark (statistical tables of NBA basketball games). This paper has been accepted at ECIR 2020.
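To make the idea of a hierarchical, entity-aware Transformer encoding concrete, the following is a minimal sketch of one plausible two-level design: records belonging to the same entity attend to each other at a low level, and pooled entity representations attend to each other at a high level. The layer sizes, mean pooling, and record-embedding shape are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class HierarchicalEntityEncoder(nn.Module):
    """Two-level Transformer encoder over entity records (illustrative sketch)."""

    def __init__(self, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        record_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        entity_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Low level: self-attention among the records of a single entity.
        self.record_encoder = nn.TransformerEncoder(record_layer, num_layers)
        # High level: self-attention among entity representations.
        self.entity_encoder = nn.TransformerEncoder(entity_layer, num_layers)

    def forward(self, records):
        # records: (num_entities, num_records_per_entity, d_model)
        record_states = self.record_encoder(records)
        # Pool each entity's records into a single vector (mean pooling as a simple choice).
        entity_vecs = record_states.mean(dim=1).unsqueeze(0)   # (1, num_entities, d_model)
        entity_states = self.entity_encoder(entity_vecs)
        return record_states, entity_states

# Example: 10 entities (e.g. players/teams), 8 record embeddings each.
encoder = HierarchicalEntityEncoder()
records = torch.randn(10, 8, 128)
record_states, entity_states = encoder(records)
print(record_states.shape, entity_states.shape)
# torch.Size([10, 8, 128]) torch.Size([1, 10, 128])
```

Because self-attention is permutation-invariant (up to positional information), each record is compared to all others regardless of where it appears in the linearized input, which is the property the abstract highlights.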