
Released

Poster

Visual Influences On Neurons In Voice-Sensitive Cortex

MPG Authors
Perrodin, C
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Research Group Physiology of Sensory Integration, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Kayser, C
Research Group Physiology of Sensory Integration, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Logothetis, NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Petkov, CI
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
No freely accessible full texts are available in PuRe
Supplementary material (freely accessible)
No freely accessible supplementary materials are available
Citation

Perrodin, C., Kayser, C., Logothetis, N., & Petkov, C. (2012). Visual Influences On Neurons In Voice-Sensitive Cortex. Poster presented at 8th Forum of European Neuroscience (FENS 2012), Barcelona, Spain.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-B6E2-B
Abstract
Many animals use cross-sensory information during communication, but it remains unclear how the brain integrates face and voice information. Functional imaging evidence suggests that the brains of human and nonhuman primates contain voice- and face-sensitive regions, and some of the human studies have suggested that multisensory interactions occur in these regions. Yet, to date, neurons in monkey voice/face regions have been studied exclusively with unisensory stimuli. We targeted neurons in a recently identified voice-sensitive cluster in the right hemisphere on the supratemporal plane to investigate how neurons in the monkey brain combine auditory voice and visual face information. Extracellular recordings were conducted in two rhesus macaques performing a visual fixation task. Dynamic face and voice stimuli (movies of vocalizing monkeys and of humans imitating monkey “coo” calls) were presented in auditory-only, visual-only and audiovisual stimulation conditions, including congruent and incongruent audiovisual pairs. In this region, we identified spiking activity driven by the presence of auditory stimuli (n = 130 single- and multi-units), 42 of which showed visual influences. Most of this visual modulation (36 of the responsive units) consisted of nonadditive multisensory effects, in which the audiovisual response deviated significantly from the sum of the two unimodal responses. The magnitude of the visual influences was differentially sensitive to stimulus features such as call type, speaker identity and familiarity. Human voices elicited auditory and audiovisual responses qualitatively similar to those elicited by monkey voices. Finally, incongruent stimuli elicited a larger proportion of sublinear audiovisual interactions than congruent audiovisual pairs. Our results identify visual influences at the neuronal level in a primate auditory 'voice' region. Together with results from functional imaging studies in humans, these findings extend our understanding of multisensory influences in voice regions, which may also be evident in neurons in face-sensitive regions.
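The nonadditivity criterion mentioned in the abstract (an audiovisual response that deviates from the sum of the two unimodal responses) can be made concrete with a small numerical sketch. This is not the authors' analysis code: the spike counts, the bootstrap procedure and the function names (additivity_index, additivity_bootstrap) are hypothetical, written in Python assuming NumPy, and shown only to illustrate the sub-/superadditive distinction.

import numpy as np

rng = np.random.default_rng(0)

def additivity_index(av, a, v):
    # Positive: AV exceeds the sum of the unimodal means (superadditive).
    # Negative: AV falls short of that sum (sublinear / subadditive).
    return np.mean(av) - (np.mean(a) + np.mean(v))

def additivity_bootstrap(av, a, v, n_boot=10_000):
    # Null hypothesis: the AV response equals the sum of the A and V responses.
    # Build surrogate "additive" AV trials by summing randomly resampled
    # A and V trials, and compare the observed index to that null distribution.
    observed = additivity_index(av, a, v)
    null = np.empty(n_boot)
    for i in range(n_boot):
        a_boot = rng.choice(a, size=a.size, replace=True)
        v_boot = rng.choice(v, size=v.size, replace=True)
        null[i] = additivity_index(a_boot + v_boot, a, v)
    p = np.mean(np.abs(null) >= np.abs(observed))  # two-sided p-value
    return observed, p

# Hypothetical spike counts (spikes per trial) for one illustrative unit
a  = rng.poisson(8, size=40)   # auditory-only trials
v  = rng.poisson(3, size=40)   # visual-only trials
av = rng.poisson(9, size=40)   # audiovisual trials (subadditive in this toy example)

idx, p = additivity_bootstrap(av, a, v)
print(f"additivity index = {idx:+.2f} spikes/trial, p = {p:.3f}")

In this toy example the audiovisual rate (about 9 spikes/trial) is well below the unimodal sum (about 11 spikes/trial), so the index is negative and the test flags a sublinear interaction, the pattern the abstract reports as more common for incongruent face-voice pairs.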