

Poster

Visual object categorization with conflicting auditory information

MPS-Authors

Adam, R
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Noppeney, U
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Citation

Adam, R., & Noppeney, U. (2009). Visual object categorization with conflicting auditory information. Poster presented at 15th Annual Meeting of the Organisation for Human Brain Mapping (HBM 2009), San Francisco, CA, USA.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-C42D-5
Abstract
Introduction

Spatio-temporally coincident sensory signals usually emanate from the same event or object. Ideally, the human brain should integrate sensory information derived from a common source while avoiding the merging of information from different sources. Indeed, it has long been recognized that multisensory integration breaks down when sensory estimates are in large conflict. Nevertheless, conflicting task-irrelevant sensory information has been shown to interfere with decisions about task-relevant sensory input in selective crossmodal attention paradigms. Combining psychophysics and fMRI, this study investigates how the human brain forms decisions about multisensory object categories when the senses disagree.
Methods

Eighteen subjects participated in this fMRI study (Siemens Trio 3T scanner, GE-EPI, TE=40 ms, 38 axial slices, TR=3.08 s). The 2×2 factorial design manipulated (i) Visual category: Animal vs. Landmark, and (ii) Auditory category: Animal vocalization vs. Sound associated with a landmark. In a visual selective attention paradigm, subjects categorized degraded pictures while ignoring the accompanying intact sounds, which could be either semantically congruent or incongruent. These particular stimulus categories were used because they have been associated with selective activations in the fusiform face area (FFA) vs. the parahippocampal place area (PPA). To localize FFA and PPA, subjects were also presented with pictures of faces and landmarks in a target detection task. The activation trials were interleaved with 6 s fixation blocks. To allow for a random-effects analysis (SPM5), contrast images for each subject were entered into a second-level ANOVA. In additional regression analyses, subjects' category-selective responses (localizer) were regressed on their reaction time (RT) increases for incongruent relative to congruent trials (separately for landmarks and animals). We tested for (1) Incongruent vs. Congruent, (2) Landmark vs. Animal, (3) the Incongruency × Category interaction, and (4) Landmark vs. Animal predicted by RT incongruency effects. Within the object processing system, results are reported at p<0.05, whole-brain cluster-level corrected, with a voxel threshold of p<0.001.
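The across-subject regression described above can be sketched as follows. This is a minimal illustration with synthetic numbers standing in for the real behavioural logs and SPM contrast estimates; all values, effect sizes, and the ROI extraction step are assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject data (18 subjects, as in the study design);
# in the real analysis these would come from the behavioural logs and
# ROI-averaged contrast images.
rng = np.random.default_rng(0)
n_subjects = 18

rt_congruent = rng.normal(650, 40, n_subjects)                   # mean RT (ms), congruent trials
rt_incongruent = rt_congruent + rng.normal(30, 15, n_subjects)   # slowed on incongruent trials

# Behavioural incongruency effect per subject: RT increase for
# incongruent relative to congruent trials
rt_effect = rt_incongruent - rt_congruent

# Hypothetical Landmark > Animal contrast estimates from a
# category-selective region (illustrative linear relation plus noise)
beta_landmark_vs_animal = 0.02 * rt_effect + rng.normal(0, 0.3, n_subjects)

# Across-subject regression: does the RT incongruency effect predict
# category-selective activation?
res = stats.linregress(rt_effect, beta_landmark_vs_animal)
print(f"slope={res.slope:.3f}, r={res.rvalue:.2f}, p={res.pvalue:.3g}")
```

In SPM this corresponds to entering the per-subject RT incongruency effect as a covariate in a second-level model; the sketch above only shows the underlying across-subject regression.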
Results

Behaviour: A two-way ANOVA of RTs revealed a main effect of incongruency, in the absence of a main effect of category or an interaction. No significant effects were found for performance accuracy.
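For a fully within-subject 2×2 design like this one, each ANOVA effect (two main effects and the interaction) reduces to a one-sample t-test on a per-subject contrast. A minimal sketch with synthetic RTs; all numbers are hypothetical and only the structure of the test mirrors the design:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 18  # subjects

# Hypothetical per-subject mean RTs (ms) for the 2x2 design.
# Columns: congruent-animal, congruent-landmark,
#          incongruent-animal, incongruent-landmark
base = rng.normal(650, 40, (n, 1))
rt = np.hstack([
    base + rng.normal(0, 10, (n, 1)),        # congruent, animal
    base + rng.normal(0, 10, (n, 1)),        # congruent, landmark
    base + 30 + rng.normal(0, 10, (n, 1)),   # incongruent, animal
    base + 30 + rng.normal(0, 10, (n, 1)),   # incongruent, landmark
])

# Per-subject contrasts for the two main effects and the interaction
incongruency = rt[:, 2:].mean(1) - rt[:, :2].mean(1)         # congruency main effect
category = rt[:, [1, 3]].mean(1) - rt[:, [0, 2]].mean(1)     # category main effect
interaction = (rt[:, 3] - rt[:, 1]) - (rt[:, 2] - rt[:, 0])  # incongruency x category

for name, eff in [("incongruency", incongruency),
                  ("category", category),
                  ("interaction", interaction)]:
    t, p = stats.ttest_1samp(eff, 0.0)
    print(f"{name}: t({n - 1})={t:.2f}, p={p:.3g}")
```

With the simulated 30 ms slowing on incongruent trials and no category difference, this pattern (a congruency main effect, no category effect, no interaction) matches the behavioural result reported above, though the numbers themselves are invented.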

fMRI:

1) Incongruency effects: left inferior precentral sulcus, bilateral insula (only at uncorrected threshold)
2) Landmark-selective: bilateral PPA; Animal-selective: bilateral FFA
3) Incongruent > Congruent for Landmark > Animal: bilateral auditory cortex (due to stimulus confounds)
4) Landmark > Animal predicted by RT incongruency effects for landmarks: left fusiform

Conclusions

At the behavioural level, conflicting task-irrelevant auditory information interfered with subjects' categorization, as indicated by longer reaction times for incongruent relative to congruent stimuli. At the neural level, this behavioural interference was mediated by increased activation in the bilateral insula and left inferior precentral sulcus. No incongruency effects common to all subjects were observed in category-selective areas. However, in the left fusiform (FFA), activation for Landmark > Animal was increased in subjects who showed strong behavioural interference when categorizing visual landmarks accompanied by animal sounds. Thus, less modular and more distributed object representations in the occipitotemporal cortex may render subjects more susceptible to interfering object information from the auditory modality.