
Visuo-Haptic Perception

All of our senses provide simultaneous information about the environment, and this information must be combined into a single percept. Our activity focuses on studying how unisensory and multisensory perceptual capabilities change during development in children with and without sensory disabilities. The goal is to exploit this knowledge to understand the brain, to create new rehabilitation programs, and to develop new mechatronic devices that improve the sensory-motor and interaction abilities of children with sensory disabilities.

Research Topics:

EXTERNAL PROJECTS:

ABBI

The Audio Bracelet for Blind Interaction (ABBI) project aims at developing and testing a new rehabilitation approach for visually impaired children. The project is based on a new understanding of neural processes involved in building the sense of space. To that end, we are developing ABBI, a wearable wireless device which provides audio and possibly tactile feedback about the movements of the body.

CODEFROR

CODEFROR (COgnitive DEvelopment for Friendly RObots and Rehabilitation) is a joint exchange project investigating aspects of human cognitive development, with the double goal of developing robots able to interact with humans in a friendly way and of designing and testing protocols and devices for the sensory and motor rehabilitation of children with disabilities.

SELECTED PUBLICATIONS:


  • *Gori M., Tinelli F., *Sandini G., Cioni G. and Burr D. C. (2012)
    Impaired visual size-discrimination in children with movement disorders
    Neuropsychologia, vol. 50, (no. 8), pp. 1838-1843
  • Gori M., Sandini G., Martinoli C. and Burr D. C. (2010)
    Poor Haptic Orientation Discrimination in Nonsighted Children May Reflect Disruption of Cross-Sensory Calibration
    Current Biology, vol. 20, (no. 3), pp. 223-225, 0960-9822
  • Gori M., Del Viva M., Sandini G. and Burr D. C. (2008)
    Young children do not integrate visual and haptic form information
    Current Biology, vol. 18, (no. 9), pp. 694-698, 0960-9822
  • *Gori M., *Sciutti A., Burr D. C. and *Sandini G. (2011)
    Direct and indirect haptic calibration of visual size judgments
    PLoS ONE, vol. 6, (no. 10), pp. e25599, 1932-6203
  • *Tomassini A., *Gori M., *Burr D. C., *Sandini G. and *Morrone C. (2012)
    Active movement restores veridical event-timing after tactile adaptation
    Journal of Neurophysiology, vol. 108, (no. 8), pp. 2092-2100, 0022-3077
  • *Sciutti A., *Bisio A., *Nori F., *Metta G., *Fadiga L., *Pozzo T. and *Sandini G. (2012)
    Measuring human-robot interaction through motor resonance
    International Journal of Social Robotics, vol. 4, (no. 3), pp. 223-234, 1875-4791
 
Sensory Rehabilitation
  • Vision
  • Touch
  • Hearing
  • Space Cognition
  • Multisensory integration
  • Development
  • Motor Control
  • Training
  • Psychophysics
  • Assistive device
People involved:

Monica Gori Giulio Sandini David Burr Tiziana Vercillo Elena Cocchi

We recently investigated how unisensory and multisensory perceptual capacities change during development in children with and without sensory and motor impairments. The main achievements are new results on multisensory development, which have informed the design of new devices for the rehabilitation of children and adults with visual disability. In recent years we have developed a new device for audio spatial-orientation rehabilitation in the blind. We have already applied this device to rehabilitation in blindfolded sighted individuals, and the results suggest that the training improves spatial audio orientation under different conditions. We are now applying this instrument to the rehabilitation of adults with visual disabilities.

Burr D. C. and Gori M. (2011)
Multisensory integration develops late in humans
Frontiers in the Neural Bases of Multisensory Processes

Burr D. C., Binda P. and Gori M. (2011)
Combining information from different senses: dynamic adjustment of combination weights, and the development of cross-modal integration in children.
Book of Sensory Cue Integration, vol. In Press, 9780195387247

Cross-modal integration in children
  • Multisensory integration
  • Vision
  • Haptics
  • Hearing
  • Psychophysics
  • Proprioception
  • Motor Control
  • Development
  • Action
  • Action-Perception
  • Perception
People involved:

Monica Gori David Burr Tiziana Vercillo Giulio Sandini

Our past studies on cross-modal integration in children showed that, prior to eight years of age, integration of visual and haptic spatial information is far from optimal, with either vision or touch dominating totally, even in conditions where the dominant sense is far less precise than the other. By eight to ten years, integration becomes statistically optimal, as in adults (Gori et al., Current Biology, 2008). In recent years we extended these studies to investigate how visual and auditory information are integrated during development in children with and without auditory disabilities (Gori et al. 2012). We studied visual and auditory integration in the spatial and temporal domains and, in agreement with our cross-sensory calibration hypothesis, found a dominance of vision in the spatial domain and an auditory dominance in the temporal domain. This auditory dominance is compromised in deaf children with cochlear implants, for whom the integration mechanism works differently from the normally hearing control group (Gori et al. 2012).
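"Statistically optimal" refers here to the standard maximum-likelihood (MLE) model of cue combination, in which each sense is weighted by its reliability and the fused estimate is less variable than either cue alone. The sketch below illustrates the model with made-up numbers; it is not analysis code from these studies.

    # Maximum-likelihood (MLE) cue combination: the benchmark against which
    # "statistically optimal" visuo-haptic integration is tested.
    # All numbers below are illustrative, not experimental data.

    def mle_combine(est_v, sigma_v, est_h, sigma_h):
        """Fuse a visual and a haptic size estimate, weighting each by its reliability."""
        w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_h**2)  # visual weight
        w_h = 1 - w_v                                               # haptic weight
        fused = w_v * est_v + w_h * est_h
        # Predicted noise of the fused estimate is lower than that of either cue alone.
        sigma_vh = (sigma_v**2 * sigma_h**2 / (sigma_v**2 + sigma_h**2)) ** 0.5
        return fused, sigma_vh

    # Example: vision is twice as precise as touch, so it receives ~80% of the weight.
    size, sigma = mle_combine(est_v=10.0, sigma_v=1.0, est_h=12.0, sigma_h=2.0)
    print(size, sigma)  # ~10.4, ~0.89 (lower than the best unimodal sigma of 1.0)

The operational signature of optimal integration is this predicted reduction in bimodal threshold, which children below eight years of age do not show.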

Gori M., Del Viva M., Sandini G. and Burr D. C. (2008)
Young children do not integrate visual and haptic form information
Current Biology, vol. 18, (no. 9), pp. 694-698, 0960-9822

Burr D. C., Binda P. and Gori M. (2011)
Combining information from different senses: dynamic adjustment of combination weights, and the development of cross-modal integration in children.
Book of Sensory Cue Integration, vol. In Press, 9780195387247

Gori M., Del Viva M. M., Sandini G. and Burr D. C. (2007)
Five and six-year-old children do not integrate visual-haptic information optimally
IMRF, Sydney, Australia, 2007

Gori M., Del Viva M., Sandini G., Beccani L. and Burr D. C. (2007)
Integration of information between senses develops late in humans
Perception, vol. 36, (no. ECVP Abstract Supplement), 0301-0066

*Gori M., Del Viva M., *Sandini G. and Burr D. C. (2007)
Six-year-old children do not integrate visual-haptic information optimally
Journal of Vision, vol. 7, (no. 9)

*Gori M., *Sandini G. and Burr D. C. (2012)
Development of visuo-auditory integration in space and time
Frontiers in Integrative Neuroscience, vol. 6, (no. 77), 1662-5145

Haptic and visual perception in children with and without visual and motor disabilities
  • Assistive device
  • Vision
  • Haptics
  • Hearing
  • Touch
  • Multisensory integration
  • Social Interaction
  • Action
  • Action-Perception
  • Development
  • Psychophysics
  • Proprioception
  • Space Cognition
People involved:

Giulio Sandini David Burr Giovanni Cioni Francesca Tinelli Cristina Martinoli Elena Cocchi

Our previous studies on haptic and visual perception in children with and without visual and motor disabilities showed that integration develops late and that the absence of one sense in blind and low-vision children impacts the haptic system, which has to be calibrated by the visual one. More recently we extended this approach to study the impact that the absence of motor skills has on the development of correct visual size perception, by testing children with motor disabilities (Gori et al. 2012). In support of our "cross-modal calibration theory", we found that children with motor disabilities have problems in visual size discrimination.

Gori M., Sandini G., Martinoli C. and Burr D. C. (2010)
Poor Haptic Orientation Discrimination in Nonsighted Children May Reflect Disruption of Cross-Sensory Calibration
Current Biology, vol. 20, (no. 3), pp. 223-225, 0960-9822

*Gori M., Tinelli F., *Sandini G., Cioni G. and Burr D. C. (2012)
Impaired visual size-discrimination in children with movement disorders
Neuropsychologia, vol. 50, (no. 8), pp. 1838-1843

Gori M., Sandini G., Martinoli C. and Burr D. C. (2009)
Cross-Modal calibration occurs before optimal multimodal integration: results in blind and low-vision children confirm this hypothesis
ESF Conference Gene Expression to Neurobiology and Behaviour: human brain development and developmental disorders, Sant Feliu de Guixols, Spain, September 20-25, 2009

Gori M., Sandini G., Martinoli C. and Burr D. C. (2009)
Haptic orientation discrimination is severely impaired in blind and low-vision children
32nd European Conference on Visual Perception (ECVP '09), vol. 38, Regensburg, Germany, August 24 - 28, 2009

Visuo-haptic multimodal integration and calibration by observing actions
  • Multisensory integration
  • Motor Control
  • Vision
  • Touch
  • Haptics
  • Action-Perception
People involved:

Monica Gori Alessandra Sciutti David Burr Giulio Sandini Luana Giuliana

To test our cross-sensory calibration theory, we studied visuo-haptic multimodal integration and calibration during action observation, in particular how size perception of objects in depth develops within and outside the haptic workspace (Gori et al. 2012) and how the observation of an action modifies the perceived visual size of an object outside the haptic workspace (Gori et al. 2012). These studies demonstrated that visual perception is biased outside the haptic workspace and that this specific illusion can be corrected by observing an action performed on the object. We also collaborated with different groups within RBCS, testing the development of force and position cue integration in haptic shape perception (with the Baud-Bovy group), the development of active and passive haptic perception (with the Masia group), and space representation after virtual haptic exploration (with the Brayda group).

*Gori M., Giuliana L., *Sandini G. and Burr D. C. (2012)
Visual size perception and haptic calibration during development
Developmental Science, vol. 15, (no. 6), pp. 854-62, 1467-7687

*Gori M., *Sciutti A., Burr D. C. and *Sandini G. (2011)
Direct and indirect haptic calibration of visual size judgments
PLoS ONE, vol. 6, (no. 10), pp. e25599, 1932-6203

Gori M., Sciutti A., Sandini G. and Burr D. C. (2009)
Multimodal combination of visual information about object size with observation of an actor: cue integration by the mirror neuron system?
32nd European Conference on Visual Perception (ECVP '09), vol. 38, pp. 141, Regensburg, Germany, August 24 - 28, 2009

Gori M., Sciutti A., Sandini G. and Burr D. C. (2009)
Multimodal combination of visual information about object size with observation of an actor: cue integration by the mirror neuron system?
10th International Multisensory Research Forum, New York City, (USA), June 29 - July 2

*Gori M., Giuliana L., *Sciutti A., *Sandini G. and Burr D. C. (2010)
Calibration of the visual by the haptic system during development
Journal of Vision, vol. 10, (no. 7), Naples, Florida

Visuo-tactile multisensory integration and interaction
  • Vision
  • Touch
  • Haptics
  • Action
  • Action-Perception
  • Multisensory integration
  • Space Cognition
  • Development
  • Social Interaction
People involved:

Monica Gori David Burr Alessandra Sciutti Maria Morrone Giulio Sandini Marco Jacono

We use psychophysical techniques to study unimodal and bimodal velocity-discrimination thresholds for flow perception, with real wheels etched with sinewave profiles of different spatial frequencies. Our system allows us to drive each wheel at a specific velocity and to create conflicting stimulation between the two modalities (Gori et al. 2008). With this setup we can address many questions about unimodal visual, unimodal tactile and multimodal flow perception, and we can investigate whether a supramodal system of analysis is present. We also investigate visual, tactile and bimodal perception of transient motion with accelerating and decelerating stimuli.
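A minimal sketch of the cue-conflict logic described above, under the common assumption that bimodal speed perception behaves as a weighted average of the two unimodal signals; the speeds, weights and function names are illustrative only, not the lab's stimulation code.

    # Cue-conflict logic: the visual and tactile wheels are driven at slightly
    # different speeds, and the perceived speed reveals how the two modalities
    # are weighted. All values below are illustrative.

    def perceived_speed(v_visual, v_tactile, w_visual):
        """Weighted-average model of bimodal speed perception under cue conflict."""
        return w_visual * v_visual + (1 - w_visual) * v_tactile

    # Conflict trial: visual wheel at 11 cm/s, tactile wheel at 9 cm/s.
    # A reported speed of ~10.4 cm/s would imply a visual weight of ~0.7.
    for w in (0.3, 0.5, 0.7):
        print(w, perceived_speed(11.0, 9.0, w))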

 

*Gori M., Mazzilli G., *Sandini G. and Burr D. C. (2011)
Cross-sensory facilitation reveals neural interactions between visual and tactile motion in humans
Frontiers in Psychology, vol. 2, (no. 55), 1664-1078

Gori M., Sandini G. and Burr D. C. (2008)
A characteristic dipper function for bimodal and unimodal visual and tactile motion discrimination and facilitation between modalities
Perception, vol. 37, (no. ECVP Abstract Supplement), pp. 6, 0301-0066

*Gori M., *Sciutti A., *Jacono M., *Sandini G., Morrone C. and Burr D. C. (2013)
Long integration time for accelerating and decelerating visual, tactile and visuo-tactile stimuli
Multisensory Research, vol. 26, (no. 1-2), pp. 53-68, 2213-4794

Jacono M., Gori M., Sciutti A., Sandini G. and Burr D. C. (2008)
Perception of acceleration and deceleration in visual, tactile and visuo-tactile stimuli
Perception, vol. 37, pp. 49, Utrecht, The Netherlands, August 24 - 28, 2008, 0301-0066

Time Perception
  • Time Cognition
  • Touch
  • Haptics
  • Action-Perception
  • Psychophysics
  • Action
People involved:

Alice Tomassini Monica Gori David Burr Giulio Sandini Maria Morrone

Recent evidence suggests that time perception does not rely on a unique, specialized brain mechanism, as traditionally postulated, but is instead deeply integrated in local sensory and motor processes. Psychophysical studies have shown temporal distortions that depend on low-level sensory features (e.g., Kanai et al., 2006) and that are highly selective for either the modality of the stimulus (Morrone et al., 2005) or its location in space (Johnston et al., 2006). A growing body of data also shows that action influences perceived time in many different ways (Yarrow et al., 2001; Haggard et al., 2002; Morrone et al., 2005; Tomassini et al., 2012). The tight interaction between time perception and the action system is further supported by neurophysiological evidence showing a major involvement of motor areas in time-keeping functions.

Specific projects:

-Time Perception across Senses: we use psychophysical methods to investigate event-timing mechanisms across different sensory systems and their role in multisensory processing (Tomassini et al., 2011).

-Time Perception in Action: we investigate the dynamic changes in time perception at the moment of action and their role in sensory-motor functions (Tomassini et al., 2012; Tomassini et al., submitted); a sketch of the typical analysis is shown below.
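In both projects the psychophysical analysis typically reduces to fitting a psychometric function to "which stimulus lasted longer?" responses: the point of subjective equality (PSE) measures perceived duration and the slope measures temporal precision. A minimal sketch with simulated responses, assuming NumPy and SciPy are available; this is not the lab's analysis code.

    # Fit a cumulative Gaussian to two-interval duration judgments to recover the
    # point of subjective equality (PSE) and the discrimination threshold (sigma).
    # The response data below are simulated.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def psychometric(duration, pse, sigma):
        """Probability of judging the comparison stimulus as longer than the standard."""
        return norm.cdf(duration, loc=pse, scale=sigma)

    comparison_ms = np.array([400, 500, 600, 700, 800, 900, 1000], dtype=float)
    p_longer = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.92, 0.98])  # simulated

    (pse, sigma), _ = curve_fit(psychometric, comparison_ms, p_longer, p0=[700, 100])
    print(f"PSE ~ {pse:.0f} ms, threshold (sigma) ~ {sigma:.0f} ms")
    # A PSE below the duration of the standard stimulus would indicate apparent
    # time compression, e.g. after tactile adaptation or around the time of a movement.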

Johnston A., Arnold D. H. and Nishida S. (2006)
Spatially localized distortions of event time
Current Biology, vol. 16, pp. 472-479

Haggard P., Clark S. and Kalogeras J. (2002)
Voluntary action and conscious awareness
Nature Neuroscience, vol. 5, pp. 382-385

Yarrow K., Haggard P., Heal R., Brown P. and Rothwell J. C. (2001)
Illusory perceptions of space and time preserve cross-saccadic perceptual continuity
Nature, vol. 414, pp. 302-305

Kanai R., Paffen C. L., Hogendoorn H. and Verstraten F. A. (2006)
Time dilation in dynamic visual display
Journal of Vision, vol. 6, pp. 1421-1430

*Tomassini A., *Gori M., Burr D. C., *Sandini G. and *Morrone C. (2011)
Perceived duration of visual and tactile stimuli depends on perceived speed
Frontiers in Integrative Neuroscience, vol. 5, (no. 51), 1662-5145

Morrone M. C., Ross J. and Burr D. (2005)
Saccadic eye movements cause compression of time as well as space
Nature Neuroscience, vol. 8, pp. 950-954

*Tomassini A., *Gori M., *Burr D. C., *Sandini G. and *Morrone C. (2012)
Active movement restores veridical event-timing after tactile adaptation
Journal of Neurophysiology, vol. 108, (no. 8), pp. 2092-2100, 0022-3077

Tomassini A., Gori M., Baud-Bovy G., Sandini G. and Morrone M. C.
Motor commands induce time compression for tactile stimuli
Submitted

Perception and Interaction
  • Human-Machine Interaction
  • Social Interaction
  • Space Cognition
  • Intention
  • Action-Perception
  • Gaze
  • Robotics
People involved:

Alessandra Sciutti Giulio Sandini Ambra Bisio Francesco Nori Giorgio Metta Luciano Fadiga Andrea Del Prete Lorenzo Natale David Burr Monica Gori

Humans are naturally social agents. From birth they are instinctively motivated to interact with other people, developing very early the ability to be helped and to help (Warneken and Tomasello 2006). With age this skill progressively improves, leading to an astonishing proficiency in interaction and collaboration. The aim of our research is to investigate the sensory and motor mechanisms underlying such proficiency, also by using a humanoid robot as a controllable simulator of interactive cues.

In particular, we have evaluated whether a humanoid robot can be perceived as a goal-oriented agent, by investigating whether subjects observing robot actions exhibit the same tendency to gaze anticipatorily at the action target as they do when observing human actions. Our results show that the observation of robotic goal-directed actions elicits proactivity in subjects' gaze as much as human actions do, indicating that the basic mechanisms of social behavior are active also in the presence of robotic agents (Sciutti et al. 2012).
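Gaze proactivity in this kind of paradigm is usually quantified as the time by which the observer's gaze reaches the action target before the acting hand (or the robot's end effector) does. A minimal sketch with hypothetical timestamps, not data from the study:

    # Gaze proactivity: difference between the time the hand (or robot end effector)
    # reaches the target and the time the observer's gaze lands on it.
    # Positive values = anticipatory (proactive) gaze. Timestamps are hypothetical.

    def gaze_anticipation(t_gaze_on_target, t_hand_on_target):
        return t_hand_on_target - t_gaze_on_target  # seconds

    trials = [(1.10, 1.45), (1.60, 1.50), (0.95, 1.30)]  # (gaze, hand) arrival times
    anticipations = [gaze_anticipation(g, h) for g, h in trials]
    proactive_fraction = sum(a > 0 for a in anticipations) / len(anticipations)
    print(anticipations, proactive_fraction)  # roughly [0.35, -0.10, 0.35]; 2 of 3 proactive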

In a different set of studies we have investigated whether interacting with a humanoid robot changes subjects' strategy in analyzing the properties of the environment. Perceptual judgments are not based only on what is currently in front of our eyes, but also on our previous experiences, i.e. on the statistics of the world. We evaluated whether the relevance of this statistical context differed between interactive and non-interactive scenarios. In two conditions the task was exactly the same (reproducing a length), while only task-independent parameters of the robot's behavior (its autonomy and its gazing) were modified to manipulate the interactive nature of the task. We found that during interactive tasks subjects rely less on the statistical context and more on the current stimulation. Moreover, autonomous behavior and human-like eye-gaze motion seemed to be prerequisites for an interaction to occur (Sciutti et al. 2013).
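The reliance on the statistics of the world can be read as a Bayesian compromise between the current stimulus and a prior centred on the mean of previously seen lengths; the weight given to the prior is estimated from how strongly responses are pulled toward that mean (central tendency). A sketch with illustrative numbers only, not data from the study:

    # Bayesian reading of the length-reproduction result: responses are pulled
    # toward the mean of previously seen lengths, and the size of that pull
    # estimates the weight given to the statistical context. Illustrative values.

    def reproduce(length, prior_mean, w_prior):
        """Compromise between the current stimulus and the prior over past lengths."""
        return (1 - w_prior) * length + w_prior * prior_mean

    prior_mean = 15.0  # cm, mean of the lengths shown so far
    for label, w in [("non-interactive (stronger prior)", 0.4),
                     ("interactive robot (weaker prior)", 0.1)]:
        print(label, [round(reproduce(l, prior_mean, w), 1) for l in (10.0, 15.0, 20.0)])

In this reading, the interactive condition corresponds to a smaller prior weight, i.e. reproductions that track the current stimulus more closely.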

Other studies on human-robot interaction have also been conducted in collaboration with the Cognitive Humanoids Laboratory.

*Sciutti A., *Bisio A., *Nori F., *Metta G., *Fadiga L., *Pozzo T. and *Sandini G. (2012)
Measuring human-robot interaction through motor resonance
International Journal of Social Robotics, vol. 4, (no. 3), pp. 223-234, 1875-4791

*Sciutti A., *Nori F., *Jacono M., *Metta G., *Sandini G. and *Fadiga L. (2011)
Proactive Gaze Behavior: Which Observed Action Features Do Influence The Way We Move Our Eyes?
Journal of Vision, vol. 11, (no. 11), Naples, Florida
