Modeling Emotional Valence Integration From Voice and Touch - Laboratoire Interdisciplinaire des Sciences du Numérique
Journal article in Frontiers in Psychology, 2018


Abstract

In the context of designing multimodal social interactions for Human–Computer Interaction and Computer-Mediated Communication, we conducted an experimental study to investigate how participants combine voice expressions with tactile stimulation to evaluate emotional valence (EV). In this study, audio and tactile stimuli were first presented separately, then presented together. Audio stimuli comprised positive and negative voice expressions, and tactile stimuli consisted of different levels of air-jet tactile stimulation applied to the participants' arms. Participants were asked to evaluate the communicated EV on a continuous scale. Information Integration Theory was used to model the multimodal valence perception process. Analyses showed that participants generally integrated both sources of information to evaluate EV. The main integration rule was the averaging rule. The predominance of one modality over the other was specific to each individual.
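The averaging rule from Information Integration Theory can be sketched as a weighted mean of the unimodal scale values. The function name, weights, and valence values below are purely illustrative (they are not the study's fitted parameters); this is a minimal sketch of the rule's form, assuming each modality contributes a scale value weighted by its importance for a given individual.

```python
def averaged_valence(s_audio, s_touch, w_audio=1.0, w_touch=1.0):
    """Predict bimodal emotional valence under the IIT averaging rule.

    s_audio, s_touch: unimodal valence scale values (e.g. on [-1, 1]).
    w_audio, w_touch: per-individual modality weights (illustrative).
    """
    return (w_audio * s_audio + w_touch * s_touch) / (w_audio + w_touch)

# Example: a positive voice expression (+0.6) paired with a mildly
# negative air-jet stimulus (-0.2), with the voice weighted twice
# as heavily — a pattern of audio predominance for this individual.
predicted = averaged_valence(0.6, -0.2, w_audio=2.0, w_touch=1.0)
```

Because the weights are normalized by their sum, the predicted bimodal response always lies between the two unimodal values, which is the signature of averaging (as opposed to adding) integration.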
Main file: fpsyg-09-01966.pdf (1.9 MB)
Origin: Publication funded by an institution

Dates and versions

hal-04275538, version 1 (11-01-2024)

Identifiers

Cite

Mohamed Yacine Tsalamlal, Michel-Ange Amorim, Jean-Claude Martin, Mehdi Ammi. Modeling Emotional Valence Integration From Voice and Touch. Frontiers in Psychology, 2018, 9, pp.1966. ⟨10.3389/fpsyg.2018.01966⟩. ⟨hal-04275538⟩