Conference paper, 2018

Automatic Nonverbal Behavior Generation from Image Schemas

Abstract

One of the main challenges when developing Embodied Conversational Agents is giving them the ability to autonomously produce meaningful and coordinated verbal and nonverbal behaviors. The relation between these means of communication is more complex than the direct mapping often applied in previous models. In this paper, we propose an intermediate-mapping approach that we first apply to metaphoric gestures but that could be extended to other representational gestures. Leveraging previous work in text analysis, embodied cognition and co-verbal behavior production, we introduce a framework articulating speech and metaphoric gesture invariants around a common mental representation: Image Schemas. We describe the components of our framework, detailing the different steps leading to the production of the metaphoric gestures, and we present some preliminary results and demonstrations. We conclude by outlining perspectives for integrating, evaluating and improving our model.
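To make the intermediate-mapping idea concrete, here is a minimal sketch (not the authors' implementation) of a pipeline that first maps words to Image Schemas and then maps Image Schemas to metaphoric-gesture invariants. All lexicon entries, schema names and gesture parameters below are hypothetical placeholders, chosen only to illustrate the two-step mapping.

```python
# Illustrative sketch of an intermediate mapping: words -> Image Schemas -> gestures.
# The lexicons and gesture parameters are hypothetical, not from the paper.

from dataclasses import dataclass

# Hypothetical lexicon linking words to Image Schemas.
WORD_TO_SCHEMA = {
    "inside": "CONTAINER",
    "grow": "SCALE",
    "forward": "PATH",
}

# Hypothetical mapping from Image Schemas to gesture invariants
# (hand shape, movement), standing in for the behavior-realization step.
SCHEMA_TO_GESTURE = {
    "CONTAINER": {"hand_shape": "cup", "movement": "enclose"},
    "SCALE": {"hand_shape": "open_palm", "movement": "expand"},
    "PATH": {"hand_shape": "flat", "movement": "trace_line"},
}

@dataclass
class GestureCue:
    word: str
    schema: str
    gesture: dict

def generate_gesture_cues(utterance: str) -> list:
    """Map each word to an Image Schema, then to a gesture invariant."""
    cues = []
    for word in utterance.lower().split():
        schema = WORD_TO_SCHEMA.get(word)
        if schema is not None:
            cues.append(GestureCue(word, schema, SCHEMA_TO_GESTURE[schema]))
    return cues

if __name__ == "__main__":
    for cue in generate_gesture_cues("the idea keeps moving forward inside the team"):
        print(cue)
```

In such a design, the Image Schema acts as the shared mental representation between the speech-analysis side and the gesture-production side, rather than mapping words directly to gestures.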

No file deposited

Dates and versions

hal-02287759, version 1 (13-09-2019)

Identifiers

  • HAL Id: hal-02287759, version 1

Cite

Brian Ravenet, Chloé Clavel, Catherine Pelachaud. Automatic Nonverbal Behavior Generation from Image Schemas. International Conference on Autonomous Agents and Multiagent Systems, Jul 2018, Stockholm, Sweden. ⟨hal-02287759⟩