Measuring and Calibrating Trust in Artificial Intelligence - Equipe COgnition, Models and Machines for Engaging Digital Interactive Applications
Preprints, Working Papers, ... Year: 2024

Measuring and Calibrating Trust in Artificial Intelligence


Abstract

Interactive systems based on Artificial Intelligence (AI) algorithms raise new challenges, including establishing a bond of trust between users and AI. This trust must be calibrated to match the AI's actual reliability in order to avoid both over-trust and under-trust. Trust, however, is a subjective characteristic that is difficult to assess, as it varies from one person to another. This paper explores how users' trust can be estimated, in particular through behavioral and physiological sensing, and explains how trust assessment in turn makes it possible to develop techniques for calibrating trust.
Main file

Trust_Calibration_in_Artificial_Intelligence-1.pdf (159.42 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04493669, version 1 (07-03-2024)

Identifiers

  • HAL Id: hal-04493669, version 1

Cite

Mathias Bollaert, Olivier Augereau, Gilles Coppin. Measuring and Calibrating Trust in Artificial Intelligence. 2024. ⟨hal-04493669⟩