University of Tübingen
Biomedical Magnetic Resonance
Where sound location is encoded in the human cortex is still a matter of debate. It is unclear whether there are cortical areas that are specifically activated depending on the location of a sound. Are identical or distinct cortical areas in one hemisphere involved in the processing of sounds from the left and right? Also, the possibility that distinct areas have a preference for processing central versus eccentric sound locations has not been investigated so far. The present study focussed on these issues using functional magnetic resonance imaging (fMRI). Activations evoked by left, right and central sounds were analysed separately, and contrasts were computed between these conditions. We did not find areas involved exclusively in the processing of left, right or central sound positions. Rather, large overlapping areas were observed for the three sound stimuli, located in the temporal, parietal and frontal cortices of both hemispheres. This result argues for the idea of a widely distributed bilateral network accessing an internal representation of the body to encode stimulus position in relation to the body median plane. However, two areas (right BA 40 and left BA 37) were also found to have preferences for sound position. In particular, BA 40 was significantly more activated by central positions than by eccentric stimuli. In line with previous findings on visual perception, the latter observation supports the assumption that the right inferior parietal cortex may be preferentially involved in the perception of central stimulus positions in relation to the body.
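The abstract above mentions computing contrasts between the left, right and central sound conditions. A minimal sketch of the kind of condition contrast involved, using toy per-voxel beta estimates (the values and the +1/-0.5 weighting are illustrative assumptions, not the study's actual data or design):

```python
import numpy as np

# Hypothetical per-voxel beta estimates for the three sound conditions
# (left, right, centre). Toy numbers, not study data.
betas = np.array([
    [0.8, 0.7, 1.5],   # voxel preferring central sounds
    [1.2, 1.1, 1.0],   # voxel with no clear preference
])

# A "centre > eccentric" contrast: weight centre +1 and left/right -0.5
# each, mirroring the comparison the abstract describes for BA 40.
contrast = np.array([-0.5, -0.5, 1.0])

# Contrast effect per voxel: positive values indicate a preference for
# central over eccentric sound positions.
effect = betas @ contrast
```

In an actual analysis these effects would then be tested for significance across subjects; here the first voxel shows a positive centre-preference effect and the second does not.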
- by Ulrike Zimmer and +1
The integration of auditory and visual spatial information is an important prerequisite for accurate orientation in the environment. However, while visual spatial information is based on retinal coordinates, the auditory system receives information on sound location in relation to the head. Thus, any deviation of the eyes from a central position results in a divergence between the retinal visual and the head-centred auditory coordinates. It has been suggested that this divergence is compensated for by a neural coordinate transformation, using a signal of eye-in-head position. Using functional magnetic resonance imaging, we investigated which cortical areas of the human brain participate in such auditory-visual coordinate transformations. Sounds were produced with different interaural level differences, leading to left, right or central intracranial percepts, while subjects directed their gaze to visual targets presented to the left, to the right or straight ahead. When gaze was to the left or right, we found the primary visual cortex (V1/V2) activated in both hemispheres. The occipital activation did not occur with sound lateralization per se, but was found exclusively in combination with eccentric eye positions. This result suggests a relation of neural processing in the visual cortex and the transformation of auditory spatial coordinates responsible for maintaining the perceptual alignment of audition and vision with changes in gaze direction.
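The study above lateralized sounds by manipulating interaural level differences (ILDs) to produce left, right or central intracranial percepts. A minimal sketch of how such a stereo stimulus can be generated; the tone frequency, duration and ILD values are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

def ild_stimulus(freq_hz=500.0, dur_s=0.5, fs=44100, ild_db=10.0):
    """Generate a stereo tone whose interaural level difference (ILD)
    shifts the intracranial percept: negative ILD -> left, zero ->
    central, positive -> right. The level difference is split
    symmetrically across the two channels."""
    t = np.arange(int(dur_s * fs)) / fs
    tone = np.sin(2 * np.pi * freq_hz * t)
    # Convert half the ILD (in dB) to a linear gain for each ear.
    left = tone * 10 ** (-ild_db / 2 / 20)
    right = tone * 10 ** (ild_db / 2 / 20)
    return np.column_stack([left, right])

centre_tone = ild_stimulus(ild_db=0.0)   # equal levels: central percept
right_tone = ild_stimulus(ild_db=10.0)   # right channel louder: right percept
```

Presenting such stimuli over headphones keeps the physical sound source fixed while the perceived location varies, which is what allows the auditory coordinates to be dissociated from eye position.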
- by Ulrike Zimmer and +1
Functional magnetic resonance imaging was used to determine the activation of the amygdala while seven social phobics and five healthy controls were exposed to slides of neutral faces as well as aversive odor stimuli. The amygdala was selectively activated in the social phobics during presentation of the face stimuli. The data show for the first time that the amygdala is active in human phobics when they are exposed to potentially fear-relevant stimuli. Further research is needed to determine the extent to which overactivation of the amygdala precedes or is a consequence of phobia.
We used fMRI to map foot, elbow, fist, thumb, index finger, and lip movements in 30 healthy subjects. For each movement type, confidence intervals of representational sites in the primary motor cortex (M1) were evaluated. In order to improve the precision of their anatomical localization and to optimize the mapping of cortical activation sites, we used both the assessment of locations in the conventional 3D system and a 2D projection method. In addition to computing the activation maxima of activation clusters within the precentral gyrus, centers of gravity were determined. Both methods showed a high overlap of their representational confidence intervals. The 2D-projection method revealed statistically significant distinct intralimb locations, e.g., elbow versus index finger movements and index finger versus thumb movements. An increased degree of complexity of finger movements resulted in a spread of the somatotopic location toward the arm representation. The fMRI evaluation of limb movements based on the 2D-projection method showed high precision and was able to reveal differences in intralimb movement comparisons. fMRI activation revealed a clear somatotopic order of movement representation in M1 and also reflected different degrees of complexity of movement.
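The abstract above localizes representational sites both by activation maxima and by cluster centers of gravity. A minimal sketch of an activation-weighted center-of-gravity computation over a toy voxel cluster; the coordinates, t-values, and the exact weighting scheme are illustrative assumptions, not taken from the study:

```python
import numpy as np

def center_of_gravity(coords, t_values):
    """Activation-weighted center of gravity of a cluster: each voxel
    coordinate is weighted by its statistical value (e.g. t-score) and
    the weighted mean position is returned."""
    coords = np.asarray(coords, dtype=float)
    w = np.asarray(t_values, dtype=float)
    return (coords * w[:, None]).sum(axis=0) / w.sum()

# Toy cluster of three voxels (x, y, z in mm) with t-values:
cog = center_of_gravity([[10, 20, 50], [12, 20, 50], [14, 20, 50]],
                        [1.0, 2.0, 1.0])
# cog -> [12., 20., 50.]
```

Unlike the single-voxel activation maximum, the center of gravity integrates over the whole cluster and is therefore less sensitive to noise at individual voxels, which is one reason both measures are often reported together.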
During acoustic communication among human beings, emotional information can be expressed both by the propositional content of verbal utterances and by the modulation of speech melody (affective prosody). It is well established that linguistic processing is bound predominantly to the left hemisphere of the brain. By contrast, the encoding of emotional intonation has been assumed to depend specifically upon right-sided cerebral structures. However, prior clinical and functional imaging studies yielded discrepant data with respect to interhemispheric lateralization and intrahemispheric localization of brain regions contributing to processing of affective prosody. In order to delineate the cerebral network engaged in the perception of emotional tone, functional magnetic resonance imaging (fMRI) was performed during recognition of prosodic expressions of five different basic emotions (happy, sad, angry, fearful, and disgusted) and during phonetic monitoring of the same stimuli. As compared to baseline at rest, both tasks yielded widespread bilateral hemodynamic responses within frontal, temporal, and parietal areas, the thalamus, and the cerebellum. A comparison of the respective activation maps, however, revealed comprehension of affective prosody to be bound to a distinct right-hemisphere pattern of activation, encompassing posterior superior temporal sulcus (Brodmann Area [BA] 22), dorsolateral (BA 44/45), and orbitobasal (BA 47) frontal areas. Activation within left-sided speech areas, in contrast, was observed during the phonetic task. These findings indicate that partially distinct cerebral networks subserve processing of phonetic and intonational information during speech perception.