Crossmodal Space and Crossmodal Attention
Contributor(s): Spence, Charles (Author)
ISBN: 0198524862     ISBN-13: 9780198524861
Publisher: Oxford University Press, USA
OUR PRICE: $96.80
Product Type: Paperback
Published: June 2004
* Not available - Not in print at this time *
Additional Information
BISAC Categories:
- Psychology | Cognitive Psychology & Cognition
- Psychology | Neuropsychology
- Psychology | Physiological Psychology
Dewey: 152.1
LCCN: 2004302311
Physical Information: 0.77" H x 6.58" W x 9.46" (1.27 lbs) 344 pages
 
Descriptions, Reviews, Etc.
Publisher Description:
Many organisms possess multiple sensory systems, such as vision, hearing, touch, smell, and taste. Possessing multiple ways of sensing the world offers many benefits. These benefits arise not only because each modality can sense different aspects of the environment, but also because different senses can respond jointly to the same external object or event, thus enriching the overall experience; for example, looking at an individual while listening to them speak. However, combining information from different senses also poses many challenges for the nervous system. In recent years there has been dramatic progress in understanding how information from different sensory modalities is integrated to construct useful representations of external space, and how such multimodal representations constrain spatial attention. This progress has involved numerous disciplines, including neurophysiology, experimental psychology, neurological work with brain-damaged patients, neuroimaging, and computational modelling. This volume brings together leading researchers from all these approaches to present the first integrative overview of this central topic in cognitive neuroscience.