Pazen, M., Uhlmann, L., van Kemenade, B.M., Steinsträter, O., Straube, B., & Kircher, T. (accepted). Predictive perception of self-generated movements: Commonalities and differences in the neural processing of tool and hand actions. NeuroImage (IF: 5.426). DOI: 10.1016/j.neuroimage.2019.116309
Tool use is one of the most remarkable skills of the human species, enabling complex interactions with the environment. To establish such interactions, we predict the sensory consequences of our actions based on a copy of the motor command (efference copy), leading to an attenuated perception and neural suppression of the sensory input. Here, we investigated whether and how tools can be incorporated into these predictions. We hypothesized that similar predictive mechanisms are used for both hand and tool use actions, but that additional resources are needed to integrate the tool.
During fMRI data acquisition, 19 healthy participants used either their right hand or a tool to hold the handle of a movement device. To manipulate the effect of the efference copy, the handle was moved either actively by the participants or passively by the movement device. The sensory outcome, consisting of a real-time video of the hand or tool movement shown on a screen, was presented with varying delays (0–417 ms). Participants reported whether they perceived a delay.
The processing of hand and tool movements yielded largely similar results when comparing active against passive conditions: In both cases, active movements were associated with worse delay detection performance. Moreover, during both hand and tool use actions, active movements led to a downregulation of sensory (somatosensory, visual) areas as well as the right cerebellum and right posterior parietal cortex, as assessed by a conjunction analysis. By contrast, an interaction analysis indicated differential processing of active vs. passive movements in hand vs. tool conditions in the left postcentral gyrus, right middle temporal gyrus (MTG), and bilateral caudate nuclei.
Our findings provide behavioral and neural evidence that hand and tool actions share similar mechanisms for sensory predictions. We propose that the MTG and (sensori)motor areas (postcentral gyrus, caudate nuclei) contribute to these predictions by tuning them to the physics of the end effector. Collectively, these results suggest that the brain dynamically adjusts sensorimotor predictive models to anticipate the dynamics of the end effector, be it a hand or a tool.