Multisensory object representation: Insights from studies of vision and touch

Research output: Chapter in Book/Report/Conference proceeding › Chapter

31 Scopus citations

Abstract

Behavioral studies show that the unisensory representations underlying within-modal visual and haptic object recognition are strikingly similar in terms of view- and size-sensitivity, and integration of structural and surface properties. However, the basis for these attributes differs in each modality, indicating that while these representations are functionally similar, they are not identical. Imaging studies reveal bisensory, visuo-haptic object selectivity, notably in the lateral occipital complex and the intraparietal sulcus, suggesting a shared representation of objects. Such a multisensory representation could underlie visuo-haptic cross-modal object recognition. In this chapter, we compare visual and haptic within-modal object recognition and trace a progression from functionally similar but separate unisensory representations to a shared multisensory representation underlying cross-modal object recognition as well as view-independence, regardless of modality. We outline, and provide evidence for, a model of multisensory object recognition in which representations are flexibly accessible via top-down or bottom-up processing, the choice of route being influenced by object familiarity and individual preference along the object-spatial continuum of mental imagery.

Original language: English (US)
Title of host publication: Progress in Brain Research
Publisher: Elsevier B.V.
Pages: 165-176
Number of pages: 12
DOIs
State: Published - 2011

Publication series

Name: Progress in Brain Research
Volume: 191
ISSN (Print): 0079-6123
ISSN (Electronic): 1875-7855

All Science Journal Classification (ASJC) codes

  • Neuroscience (all)
