
Comprehension of natural language is a complex capacity that depends on several cognitive and neural systems. In recent years, knowledge of the brain processes underlying single-word and sentence processing has grown through the examination of phonological, semantic, and syntactic processing networks. Speech, however, is not the only communicative source: tone of voice, facial expression, body posture, and gestures also convey meaning that is decoded and supports comprehension.
Fig. 2: The fMRI-compatible EEG system
In healthy subjects, the brain does not appear to be distracted by simultaneously incoming auditory (verbal) and visual (gestural) communicative information. Instead, it integrates speech and gesture information and takes advantage of both channels.

The gesture-language projects of the TNM-Lab (EEG-CON, Gestik3) are mainly concerned with the questions of
- how speech and gesture meaning interact during comprehension,
- how this interaction depends on the abstractness of the communicated information, and
- whether patients with schizophrenia show aberrant integration of speech and gesture.
We use behavioral rating studies, EEG, and fMRI to shed light on these issues.
With regard to speech-gesture integration, we have shown that activity in the left posterior temporal lobe is related to the integration of concrete speech and gesture information (iconic co-verbal gestures) (Green et al., 2009; Straube, Green, Bromberger, & Kircher, 2011a). In a different set of experiments, we found that processing abstract speech and gesture information (metaphoric co-verbal gestures) leads to additional activity in the left inferior frontal gyrus (Kircher et al., 2009; Straube, Green, et al., 2011a).
In patients with schizophrenia, we found that the left posterior superior/middle temporal gyrus was activated to the same extent as in healthy control subjects when iconic co-verbal gestures were contrasted with unimodal control conditions (speech alone, gesture alone) (Straube, Green, Sass, Kirner-Veselinovic, & Kircher, 2013). For the integration of metaphoric gestures, however, we found aberrant processing in the patient group. This type of co-verbal gesture requires building an abstract relation between concrete visual and abstract verbal information. Building this relation appears to rely on additional online integration or unification processes in distantly located brain regions (Kircher et al., 2009; Straube, Green, Bromberger, & Kircher, 2011b; Straube et al., 2013), which are likely disturbed in patients.
Further research addresses contextual factors such as body orientation. Here, we are interested in how contextual factors influence the neural processing of speech and gesture information (Straube, Green, Chatterjee, & Kircher, 2011; Straube, Green, Jansen, Chatterjee, & Kircher, 2010; Nagels, Kircher, Steines, & Straube, 2015) and how they interact with, for example, social as opposed to object-related communication content (Straube et al., 2010).
Finally, we are interested in episodic memory processes that contribute to the integration and comprehension of speech and gesture (Straube, Green, Chatterjee, et al., 2011; Straube, Green, Weis, Chatterjee, & Kircher, 2009; Straube, Meyer, Green, & Kircher, 2014).
Current research focuses on
- temporal aspects of neural speech and gesture integration using EEG and fMRI (He et al., 2015)
- functional connectivity during integration of auditory and visual information (Straube et al., 2018)
- predictive mechanisms during comprehension
- social aspects of speech and gesture processing and related dysfunctions in schizophrenia
- gesture processing in a natural language context (Cuevas et al., 2019)
Please participate: Short online questionnaire (German)