
Multimodal Communication in Aphasia: Perception and Production of Co-speech Gestures During Face-to-Face Conversation


Preisig, Basil C; Eggenberger, Noëmi; Cazzoli, Dario; Nyffeler, Thomas; Gutbrod, Klemens; Annoni, Jean-Marie; Meichtry, Jurka R; Nef, Tobias; Müri, René M (2018). Multimodal Communication in Aphasia: Perception and Production of Co-speech Gestures During Face-to-Face Conversation. Frontiers in Human Neuroscience, 12:200.

Abstract

The role of nonverbal communication in patients with post-stroke language impairment (aphasia) is not yet fully understood. This study investigated how aphasic patients perceive and produce co-speech gestures during face-to-face interaction, and whether distinct brain lesions would predict the frequency of spontaneous co-speech gesturing. For this purpose, we recorded samples of conversations in patients with aphasia and healthy participants. Gesture perception was assessed by means of a head-mounted eye-tracking system, and the produced co-speech gestures were coded according to a linguistic classification system. The main results are that meaning-laden gestures (e.g., iconic gestures representing object shapes) are more likely to attract visual attention than meaningless hand movements, and that patients with aphasia are more likely to fixate co-speech gestures overall than healthy participants. This implies that patients with aphasia may benefit from the multimodal information provided by co-speech gestures. On the level of co-speech gesture production, we found that patients with damage to the anterior part of the arcuate fasciculus showed a higher frequency of meaning-laden gestures. This area lies in close vicinity to the premotor cortex and is considered to be important for speech production. This may suggest that the use of meaning-laden gestures depends on the integrity of patients' speech production abilities.


Statistics

Citations

Dimensions.ai Metrics
1 citation in Web of Science®
2 citations in Scopus®
Google Scholar™

Altmetrics

Downloads

17 downloads since deposited on 31 Oct 2019
17 downloads in the last 12 months

Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 06 Faculty of Arts > Institute of Psychology
Dewey Decimal Classification: 150 Psychology
Language: English
Date: 2018
Deposited On: 31 Oct 2019 13:52
Last Modified: 31 Oct 2019 13:53
Publisher: Frontiers Research Foundation
ISSN: 1662-5161
OA Status: Gold
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.3389/fnhum.2018.00200
PubMed ID: 29962942
Project Information:
  • Funder: SNSF
  • Grant ID: 320030_138532
  • Project Title: Aphasia and co-speech gestures

Download

Gold Open Access

Download PDF: 'Multimodal Communication in Aphasia: Perception and Production of Co-speech Gestures During Face-to-Face Conversation'
Content: Published Version
Language: English
Filetype: PDF
Size: 1MB
Licence: Creative Commons: Attribution 4.0 International (CC BY 4.0)