Minimization with respect to divergences and applications
Abstract
We use divergences to project a prior discrete probability law on p×q elements onto the subspace defined by fixed margin constraints µ and ν on p and q elements, respectively. We justify why the Kullback-Leibler and the chi-square divergences are two canonical choices, building on a 1991 work of Imre Csiszár. We then interpret the so-called indetermination coupling resulting from the second divergence as a construction that reduces couple matchings. Finally, we show how both resulting probabilities arise in two information-theoretic applications, the guessing problem and task partitioning, where part of the optimization reduces to minimizing a divergence projection.
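A minimal numerical sketch of the two projections described above, assuming a uniform prior on the p×q grid; the function names (`kl_projection`, `chi2_projection_uniform`) and the toy margins are illustrative, not taken from the paper. The KL projection is computed by iterative proportional fitting (Sinkhorn scaling), while the chi-square projection of the uniform prior admits the closed form π_ij = µ_i/q + ν_j/p − 1/(pq), which is the indetermination coupling mentioned in the abstract.

```python
import numpy as np

def kl_projection(prior, mu, nu, n_iter=200):
    """Project `prior` onto the p x q joint laws with margins (mu, nu)
    by minimizing the Kullback-Leibler divergence (iterative proportional
    fitting, a.k.a. Sinkhorn scaling)."""
    P = prior.copy()
    for _ in range(n_iter):
        P *= (mu / P.sum(axis=1))[:, None]   # rescale rows to match mu
        P *= (nu / P.sum(axis=0))[None, :]   # rescale columns to match nu
    return P

def chi2_projection_uniform(mu, nu):
    """Chi-square projection of the uniform prior onto the same margin
    constraints: the indetermination coupling
    pi_ij = mu_i/q + nu_j/p - 1/(p*q)."""
    p, q = len(mu), len(nu)
    return mu[:, None] / q + nu[None, :] / p - 1.0 / (p * q)

# toy margins on p = 3 and q = 4 elements
mu = np.array([0.5, 0.3, 0.2])
nu = np.array([0.4, 0.3, 0.2, 0.1])
uniform = np.full((3, 4), 1.0 / 12)

P_kl = kl_projection(uniform, mu, nu)      # converges to the product coupling mu x nu
P_chi2 = chi2_projection_uniform(mu, nu)   # indetermination coupling

print(np.allclose(P_kl, np.outer(mu, nu)))        # True: KL projection of the uniform prior
print(P_chi2.sum(axis=1), P_chi2.sum(axis=0))     # margins recover mu and nu
```

With a uniform prior the KL projection recovers the independence coupling µ⊗ν, while the chi-square projection yields the additive indetermination structure; note that the latter can produce negative entries when the margins are far from uniform, a caveat to keep in mind for this sketch.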