In every conversation you have, there is an unspoken code – a set of social rules that guide you. When to stop talking, where to look, when to listen and when to talk…
While scientists have long understood this turn-taking behavior, less is known about what affects this ability in people with aphasia, a disorder that impairs language. A new study finds that the more complex the sentence structure, the harder it is for patients with aphasia to pick up on cues and take turns in a conversation.
“Using eye tracking in patients with aphasia, we aimed at gaining new insights into real-time processing of ongoing turns by analyzing the frequency and the timing of turn-transition-related gaze shifts during video observation,” says René Müri of the University Hospital Bern, a co-author of the new paper published in the Journal of Cognitive Neuroscience.
In the study, patients with aphasia and healthy controls observed videos of dialogues between two people. In the videos, two actors stood at a table and discussed a theme from daily life, such as cooking, sports, or shopping. While the participants watched the videos, the researchers recorded their eye movements. The researchers then analyzed the eye movements of the observers in relation to turn-taking in the videos, correlating the complexity of the speech in the dialogues – the “lexico-syntactic complexity” – with the eye movements of the observers.
“This is a quite new way to analyze the observer’s behavior,” Müri says. “The analysis of the eye movements of the observer allows us to evaluate whether the observer is able to project the end of a conversation turn or not.”
CNS spoke with Müri about this new method, the results, and their implications for both patient and non-patient populations.
CNS: How did you become personally interested in this research area?
Müri: As a clinical researcher working in the field of neurorehabilitation, I have always been interested in the consequences of aphasia in patients with acquired brain lesions. Aphasia – the loss of language – has severe consequences for patients’ quality of life and social interaction. Therefore, more research is needed to understand how to improve patients’ quality of life.
Communication is a complex interplay involving speech production and comprehension, but also other non-verbal components such as gesture, facial expression, or gaze. Having received a grant from the Swiss National Science Foundation, our team had the chance to study the interaction of verbal and non-verbal behavior during conversation in healthy subjects and patients with aphasia, using eye tracking as an additional method. The analysis of eye movements and fixations is a well-established technique for studying the real-time processing of ongoing conversation turns between speaker and listener in non-involved healthy observers.
CNS: How do you define lexico-syntactic complexity?
Müri: The lexico-syntactic complexity index was calculated from both the number of words and the median lexical frequency of the words in each turn of the dialog. The number of words served as a measure of syntactic load, and the median lexical frequency was used as an indicator of lexical complexity, since more common words are usually perceived correctly at much lower speech-to-noise ratios than less common words.
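For readers who want a concrete picture, the two ingredients Müri describes can be sketched in a few lines of Python. The paper’s exact formula is not given here, so the frequency table, the function name, and the way the word count and the median frequency are combined are all illustrative assumptions, not the study’s actual computation:

```python
from statistics import median

# Hypothetical word-frequency norms (occurrences per million words);
# real studies use corpus-derived frequency tables.
FREQ = {"i": 25000.0, "cook": 80.0, "pasta": 15.0, "often": 300.0}

def complexity_index(turn_words, freq=FREQ, default_freq=1.0):
    """Illustrative lexico-syntactic complexity score for one turn.

    Combines the word count (a proxy for syntactic load) with the
    median lexical frequency (a proxy for lexical complexity):
    longer turns built from rarer words score higher. The weighting
    is an assumption made for this sketch only.
    """
    n_words = len(turn_words)  # syntactic load
    med_freq = median(freq.get(w.lower(), default_freq) for w in turn_words)
    # Rarer words -> lower median frequency -> higher complexity,
    # so the length term is divided by the median frequency.
    return n_words / med_freq

print(complexity_index(["I", "cook", "pasta", "often"]))
```

Under this sketch, a turn gets a higher score either by containing more words or by using less common vocabulary, matching the two dimensions the index is meant to capture.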
CNS: What have we known previously about the role of eye gaze in conversations?
Müri: The concept of the conversation turn-taking system has been known for many years, and it may be referred to as a speech exchange system, which organizes the opportunities to speak during social interaction. For example, a speaker has either the option to actively pass the turn to the next speaker (speaker’s selection), or the turn can be taken by the listener at the next possible completion (self-selection). Following these rules ensures that in a dialog there is only one speaker at a time. Obviously, self-selection requires that the listener is able to project the end of a conversation turn. This ability relies on the knowledge of the structure of the linguistic units, which enables us to project the ending in advance.
CNS: What were you most excited to find? Were any findings surprising?
Müri: An important finding was that patients with aphasia showed a lower probability of shifting their gaze at turn transitions than healthy participants. Whether a gaze shift occurred depended on the lexico-syntactic complexity of the video content preceding a particular turn transition. Even more exciting was that in healthy subjects, higher lexico-syntactic complexity led to higher gaze-shift probabilities, whereas in patients with aphasia we found the opposite: gaze-shift probability decreased as lexico-syntactic complexity increased.
CNS: What is the significance of your findings for aphasia patients?
Müri: The important result for patients with aphasia is that increasing complexity of speech content reduces their chance of taking over the dialog at the right time.
CNS: What do you most want people in the non-patient population to understand about this work?
Müri: For the non-patient population, the results are important for maintaining appropriate behavior in conversation with people who have aphasia. It is important to adapt one’s conversation style, using less complex sentences and high-frequency words; otherwise, the dialog partner with aphasia has no chance to take over the conversation.
CNS: What’s next for this line of work?
Müri: In the next step – and we have already started these experiments – we will examine conversational turn-taking and gaze behavior directly during dialogs between patients with aphasia and an examiner. Patients and controls will no longer observe videos but will interact directly in the dialog: participants wear a helmet-mounted eye-tracking system that measures gaze behavior during the conversation. Beyond the previous research, we will also be able to evaluate the role of direct gaze between the two participants, i.e., how they explore the face of their conversation partner. The face and facial expressions play another important role in human conversation, and we hope to understand whether patients with aphasia process such visual information differently.
-Lisa M.P. Munoz