Speaker: Prof. Marc Ernst, Applied Cognitive Psychology, Ulm University

Time: 2018-09-25 15:00

Venue: Room 1113, Wang Kezhen Building

Abstract: Vision is by far the best-studied sensory modality. By comparison, our knowledge about the sense of touch is very limited. On the one hand, this is due to the complexity involved in generating stimuli for the sense of touch in a controlled and largely automatic fashion; that is, rendering better haptic stimuli would require a significant advance in haptic display technology. On the other hand, it is due to the complexity integral to the sense of touch itself. For example, the sense of touch is inherently multisensory: to form a coherent representation of the external world, the brain has to constantly combine tactile information with proprioceptive and kinesthetic information. Further, the sense of touch not only receives information about the world passively but also gathers information actively. And finally, compared to the eye, the primary organ of the human sense of touch, the hand, has many more degrees of freedom (28 for each hand, versus 3 for the eye). Despite these challenges, the human brain has to continuously integrate and perceptually organize the incoming information in order to form a robust and stable representation of the external world with which we interact. In this talk I will briefly review some of our recent studies on the integration of information in the sense of touch. Furthermore, I will draw analogies between information processing in vision and in touch, and demonstrate that many of the well-known visual illusions caused by eye movements have analogs in touch when we actively explore and scan objects with our hands. These results demonstrate the existence of common mechanisms in visual and haptic motion perception and in achieving spatial constancy. I will end my talk by outlining some of our recent findings on exploiting the redundancy of the motor system during manual interactions.