Hearing Research
journal homepage: www.elsevier.com/locate/heares
Research paper
Article history:
Received 22 April 2009
Received in revised form 28 July 2009
Accepted 28 July 2009
Available online 3 August 2009

Keywords:
Blind
Auditory perception
Multisensory
Neuronal plasticity

Abstract

Studying blind humans is an excellent opportunity to investigate how experience might shape auditory processing. In everyday life, blind humans rely more on auditory information than sighted humans to recognize people, localize events, or process language. A growing number of studies have provided evidence that the increased use of the auditory system results in compensatory behavior in the blind. Blind humans perform better in perceptual auditory tasks, like pitch or duration discrimination, and in auditory language and memory tasks. Neural plasticity at different levels of the auditory processing stream has been linked to these behavioral benefits. In everyday life, many events stimulate more than one sensory system. Multisensory research has accumulated evidence that the integration of information across modalities facilitates perception and action control. Neurophysiological correlates of multisensory interactions have been described for various subcortical and cortical areas. There is evidence that vision plays a pivotal role in setting up multisensory functions during ontogeny. This article summarizes evidence for a reorganization of multisensory brain areas and reduced crossmodal interactions on the behavioral level following congenital visual deprivation.

© 2009 Elsevier B.V. All rights reserved.
0378-5955/$ - see front matter © 2009 Elsevier B.V. All rights reserved.
doi:10.1016/j.heares.2009.07.012
166 K. Hötting, B. Röder / Hearing Research 258 (2009) 165–174
2. Auditory processing

When talking about auditory processing in humans, it is useful to distinguish between different levels of processing, from basic sensory analyses to higher cognitive auditory functions. Data on auditory perception in the blind will be presented first, with a special section on spatial perception. A summary of data on higher cognitive functions like language and memory will follow. These different levels of processing, however, are not independent of each other. Most likely, plasticity on the perceptual level contributes to performance in other cognitive domains like language and memory.

2.1. Auditory perception

In blind humans, sensory thresholds as revealed by standard audiometry have not been found to differ from those of sighted humans (Collignon et al., 2006; Niemeyer and Starlinger, 1981; Starlinger and Niemeyer, 1981). Early blind humans, however, have been found to outperform sighted controls in pitch discrimination tasks (Gougoux et al., 2004). Moreover, a higher prevalence of "absolute pitch"¹ has been reported in a sample of blind musicians than in a sample of sighted musicians (Hamilton et al., 2004).

¹ Absolute pitch: the ability to identify a particular pitch of the musical scale without any reference tone.

Blind humans seem to have a better temporal resolution for auditory stimuli than sighted humans as well. They showed better discrimination abilities for auditorily defined time intervals² (Rammsayer, 1992), they were able to identify a gap between two noise bursts at shorter intervals than sighted humans (Muchnik et al., 1991), and they outperformed sighted participants in auditory temporal order judgment tasks (Stevens and Weaver, 2005). Spectral and temporal discrimination are known as central auditory skills and therefore, it has been hypothesized that better auditory-perceptual skills in the blind are due to faster or more efficient processing in cortical auditory areas (Muchnik et al., 1991). Event-related brain potentials (ERPs) allow a more direct measurement of central processing of auditory stimuli. ERPs are extracted from the ongoing electroencephalogram (EEG) by averaging the signal time-locked to a perceptual, cognitive, or motor event. The measured scalp potentials, arising from the synchronous activity of neuronal populations, allow researchers to disentangle information processing stages with a temporal resolution in the order of milliseconds (Hillyard and Picton, 1987; Luck, 2005). Röder et al. (1996) measured the latency of the auditory N1 in an auditory stimulus discrimination task in congenitally blind and sighted participants. The N1 is a negative ERP deflection seen approximately 100 ms after an auditory stimulus, with generators in the auditory cortex, temporal and parietal association cortices, and motor/premotor areas (Näätänen and Picton, 1987). Blind participants had shorter latencies of the N1 and they detected auditory stimuli faster than sighted participants. Shorter latencies of the auditory N1 in early blind humans have been reported in other studies as well (Elbert et al., 2002; Niemeyer and Starlinger, 1981). Thus, these results support the assumption that a reorganization of auditory cortex in the blind results in superior auditory skills.

² Participants had to judge which of two sequentially presented tones (condition 1) or silent intervals marked by an onset and offset tone (condition 2) were longer. Time discrimination thresholds were computed from their responses.

Another ERP study from our lab has suggested that the excitability of the auditory cortex is higher in the blind than in sighted humans. It is known that fast stimulus repetitions within one sensory modality cause an amplitude reduction of ERPs ("relative refractoriness" or "recovery cycle"). The time needed for the recovery of the full amplitude has been used as an index for the excitability of neural circuits. Röder et al. (1999a) varied the interstimulus interval for auditory stimuli and measured the amplitude of the auditory N1 in congenitally blind and sighted participants. They found that the N1 recovered faster in the blind than in the sighted and, again, blind participants showed shorter reaction times to auditory targets than the sighted group. As mentioned above, activity in primary and secondary auditory cortex seems to contribute to the auditory N1 (Näätänen and Picton, 1987) and therefore, an enhanced excitability of the auditory cortex might contribute to enhanced perceptual skills in the blind.

Studies using complementary neuroscientific methods support the idea of a reorganization of auditory cortex after visual deprivation. Elbert et al. (2002), for example, showed an expansion of tonotopic maps in the auditory cortex of blind humans by measuring the magnetoencephalographic (MEG) response to auditory stimuli. Moreover, in a recent functional magnetic resonance imaging (fMRI) study, Stevens and Weaver (2009) found a reduced hemodynamic response to auditory stimulation in the superior and middle temporal lobe in early blind humans as compared to late blind and sighted participants. The authors concluded that the auditory cortex of the blind processes stimuli more efficiently, at least under low-demanding task conditions.

Thus, the behavioral and neurophysiological data reviewed so far support the hypothesis of enhanced discrimination abilities for basic auditory features like the pitch, timing and duration of sounds in congenitally and early blind humans, probably due to a reorganization of auditory cortices.

2.2. Spatial processing

Vision is the sense with the highest spatial resolution. Studies in ferrets and barn owls have shown that vision dominates the functional organization of spatial auditory maps (Knudsen and Brainard, 1991). Therefore, it could be argued that vision is necessary to build up spatial representations. When visual input is lacking, as in congenitally blind humans, this could lead to impaired spatial skills. On the other hand, blind humans rely more on auditory cues to localize events in their environment than sighted humans. This might result in use-dependent plasticity and better spatial skills in the intact modalities. In favor of the latter idea are results in cats showing enhanced localization abilities for sounds in visually deprived animals (Rauschecker and Kniepert, 1994) that were accompanied by sharper spatial tuning of auditory neurons in multimodal cortex (Korte and Rauschecker, 1993). Blind humans have shown similar performance as compared to sighted participants when localizing auditory targets in the frontal field (Röder et al., 1999b; Voss et al., 2004; Zwiers et al., 2001a,b) and enhanced performance for peripheral stimuli (Després et al., 2005; Röder et al., 1999b; Voss et al., 2004). In an ERP experiment, Röder et al. (1999b) tested auditory spatial tuning of the neuronal representations for stimuli in the central vs. peripheral auditory field. They presented auditory stimuli from eight loudspeakers positioned in front of (speakers 1–4) or lateral (speakers 5–8) to the participants. Adjacent speakers were separated by six degrees in azimuth. In different experimental blocks, congenitally blind and sighted participants were asked to attend to the most frontal or the most lateral speaker, respectively, and to react to deviant stimuli (which differed in pitch from frequent standard stimuli) presented at the attended speaker only. Responses to adjacent speakers were counted as errors. For the central speakers, blind and sighted participants were highly accurate at detecting the target sounds at the attended speaker and they committed only a few false alarms to adjacent speakers. The error rates increased for the lateral speakers: both groups made more false alarms to sounds next to the attended speaker. The error rates of the blind participants, however, were significantly lower than those of the sighted, suggesting more
precise auditory localization abilities for lateral positions. This conclusion was supported by the ERP data recorded during the task. The gradient of auditory spatial attention, indicated by an amplitude modulation of the auditory response with a latency of 100 ms after stimulus presentation, was sharper in the blind than in the sighted. These results suggest that improved auditory spatial processing might follow as a consequence of lacking visual spatial information. Interestingly, Münte et al. (2001) reported similarly enhanced auditory spatial attention mechanisms in conductors as compared to other musicians and non-musicians. Conductors are highly trained to monitor sounds particularly in the lateral space and thus, use-dependent plasticity might account for their enhanced performance.

There are differences between plasticity in the developing and adult brain. In a recent study, Fieger et al. (2006) used the same paradigm as Röder et al. (1999b) to study auditory spatial tuning in late blind adults. Late blind participants lost sight after the age of 9 years. Like the congenitally blind group, they were more accurate than sighted participants in localizing sounds in the periphery (see Voss et al., 2004, for similar results in late blind humans). The ERP data indicate, however, that the compensatory behavior was mediated by different neural mechanisms in the two blind groups: whereas congenitally blind humans showed a more sharply tuned processing at 100 ms after stimulus onset, the behavioral advantage of late blind individuals correlated with the spatial tuning of ERPs indicating later processing stages of target discrimination and recognition (approximately 300 ms post-stimulus).

Moreover, at least a subgroup of blind humans has been shown to outperform sighted humans when localizing sounds monaurally (Gougoux et al., 2005; Lessard et al., 1998). However, under some specific conditions blind participants have been found to perform worse than sighted controls, including localizing sounds in the vertical plane (Lewald, 2002), especially under low signal-to-noise conditions (Zwiers et al., 2001a), or in distance judgment tasks (Wanet and Veraart, 1985).

Taken together, vision seems to play a critical role during development in shaping some auditory spatial representations. However, compensatory mechanisms, for example a higher excitability and spatial tuning of auditory neural assemblies, might contribute to similar or even enhanced auditory spatial skills in the blind.

2.3. Language and memory

A reorganization of perceptual systems in blind humans may account for improvements in "higher" cognitive functions as well. It is reasonable to assume that, for example, auditory language processing gains from faster processing of auditory stimuli (Röder et al., 1996, 1999a) and enhanced pitch discrimination skills (Gougoux et al., 2004). In the following paragraphs, we will review studies on language and memory in the blind providing evidence that both reorganization in perceptual systems as well as specific alterations in language and memory systems might contribute to compensatory behavior in the blind.

2.3.1. Language processing in the blind

Blind humans rely more on auditory language in everyday life than sighted people; for example, they use talking books and the speech output of computers to "read" texts. Some blind participants speed up the speech output to accelerate reading. To disentangle processes of auditory language comprehension more precisely, Röder et al. (2000) measured ERPs while blind and sighted participants listened to short sentences. These sentences terminated either with a semantically congruent word (e.g. "We sleep in a tent when we go camping") or an incongruent word (e.g. "*Tomorrow Bobby will be 10 years hill"). Participants had to decide whether or not the sentence made sense. As expected from previous ERP studies with similar paradigms (summarized in Kutas and Federmeier, 2000, for example), incongruent final words elicited an enhanced negativity starting 400 ms after word onset as compared to congruent words (N400 effect). The onset of the N400 was earlier in the blind than in the sighted, suggesting faster word recognition in this group. Moreover, the topography of the N400 effect differed between blind and sighted participants: while the auditory N400 was left lateralized in the sighted group, the N400 had a much broader bilateral and posterior distribution in the blind. The finding of a more bilateral language representation in the blind was later confirmed by an fMRI study of our lab. In this study, auditory speech processing in congenitally blind humans activated not only classical left hemispheric perisylvian language areas as seen in a sighted control group, but additionally activated homologue right-hemispheric structures and extrastriate and striate cortices (Röder et al., 2002). The less pronounced lateralization of language functions in the blind was related to their use of the Braille letter system for reading. It has been suggested that Braille reading relies more upon spatial processing associated with the right hemisphere than reading printed language (Karavatos et al., 1984).

Improved auditory speech discrimination abilities have been reported in the blind, especially in the context of a noisy background (Muchnik et al., 1991; Niemeyer and Starlinger, 1981). Since there was no difference in absolute thresholds for simple auditory stimuli in these studies, the authors attributed the advantage of the blind to more efficient language processing. Röder et al. (2003) directly tested semantic and syntactic processing in the blind. They measured lexical decision times in a priming paradigm. In each trial, an adjective preceded a noun or a pseudo-word. Participants had to decide as fast as possible whether or not the second word was a legal German word. The adjective was or was not semantically related to the subsequent noun. Moreover, in half of the trials, the adjective was either correctly or incorrectly inflected for gender with respect to the following noun. Both blind and sighted participants gained similarly from semantic and syntactic priming. The blind, however, had shorter reaction times than sighted participants for both words and pseudo-words. Thus, it was concluded that the advantage of the blind was most likely due to more efficient processing of the speech signal, based on more effective auditory-perceptual skills, rather than a more extensive use of semantic or morpho-syntactic information.

Language processing has been closely linked to working memory functions (Just and Carpenter, 1992) and therefore, faster speech processing in the blind might to some extent be due to higher working memory capacities as well. Indeed, higher working memory capacities for auditorily presented words and digits have been observed in congenitally blind as compared to sighted individuals (Hull and Mason, 1995; Röder and Neville, 2003).

2.3.2. Memory for auditory stimuli

Recognizing people is crucial in an everyday social context. As the blind do not have access to facial information, they have to rely on voices to distinguish other people. Indeed, memory for voices has been found to be enhanced in blind humans as compared to sighted controls (Bull et al., 1983; Röder and Neville, 2003). Their recognition performance was, however, lower than in sighted individuals who had access to facial information. Moreover, enhanced memory scores for environmental sounds have been reported for congenitally blind and age-matched late blind humans (Röder and Rösler, 2003). In this study, participants were asked to process the sounds either semantically or based on their physical properties. Both blind and sighted participants showed better recognition scores for semantically encoded sounds than for physically encoded sounds, suggesting that differences between groups were not due to
different memory strategies. From these behavioral data alone, however, it is not possible to differentiate whether blind humans were more efficient in memory encoding and/or retrieval than sighted control groups. Therefore, Röder et al. (2001) measured ERPs during the encoding and retrieval of auditory verbal material in congenitally blind and sighted participants. In the encoding phase, participants had to judge whether or not the presented sentences were semantically correct. They were not informed about the subsequent recognition phase ("incidental memory paradigm"). During retrieval, blind participants recognized more words presented in the study phase than sighted participants. Moreover, they showed enlarged amplitudes of memory-related ERPs as compared to the sighted both during encoding and during retrieval. These results suggest that blind participants encode auditory material more efficiently than sighted participants, which in turn facilitates retrieval. The ERP differences between blind and sighted participants in this task were seen no earlier than 200 ms after stimulus onset, suggesting that differences in early perceptual processing of auditory stimuli alone do not account for the group difference. The better memory for auditory stimuli in the blind (see also Amedi et al., 2003 and Raz et al., 2005) might therefore, at least in part, be due to more efficient memory mechanisms.

In sum, some auditory functions including language processing and auditory short-term and long-term memory have been found to be more efficient in blind humans as compared to sighted controls. The behavioral and ERP data suggest that both plasticity in auditory areas as well as specific adaptations in language and memory systems contribute to the compensatory behavior. Moreover, crossmodal plasticity in the occipital lobe has been discussed as an underlying neuronal correlate of compensatory performance. The analyses of the ERP topographies in the studies by Röder et al. (1999a, 2000, 2001) (Fig. 1), for example, showed a more posterior distribution in the blind, probably due to additional activations in occipital areas. The better spatial resolution of MRI and PET techniques allows a more precise localization of brain activations in sighted and blind humans. Studies using these methods have demonstrated occipital activity in the blind in many different tasks including word generation (Amedi et al., 2003), auditory speech perception (Röder et al., 2002), verbal memory (Amedi et al., 2003; Raz et al., 2005), Braille reading and tactile discrimination (Sadato et al., 1996, 1998), mental imagery (De Volder et al., 2001), auditory motion perception (Poirier et al., 2006), auditory localization (De Volder et al., 1999; Voss et al., 2008; Weeks et al., 2000) and auditory pattern discrimination (Arno et al., 2001). The amount of activation in occipital areas partially correlated with participants' memory scores (Amedi et al., 2003; Raz et al., 2005) and sound localization accuracy (Gougoux et al., 2005). Moreover, transcranial magnetic stimulation (TMS) over occipital areas interfered with word generation (Amedi et al., 2004), auditory spatial localization (Collignon et al., 2007), and Braille reading and tactile discrimination (Cohen et al., 1997) in blind, but not in sighted humans. The latter findings suggest a causal link between brain areas traditionally associated with the visual modality and the blind participants' better performance in perceptual-cognitive tasks. However, the precise role of occipital cortex recruitment in blind people still remains a matter of debate. Some authors point to a functional dissociation and specialization of distinct areas within the occipital cortex in the blind (Amedi et al., 2003; Collignon et al., 2009). Given the variety of tasks during which these activations have been observed, researchers have suggested that controlled or attention-demanding processes might be a prerequisite for visual cortex activation in the blind (Sadato et al., 1996; Röder et al., 1996, 1997; Weaver and Stevens, 2007).

Moreover, the onset of blindness has an impact on crossmodal plasticity. The studies cited so far investigated early blind humans. There are some studies showing similar results in late blind participants (e.g. Büchel et al., 1998; Rösler et al., 1993), and even blindfolding sighted humans for some days has been found to induce short-term plasticity in occipital areas (e.g. Merabet et al., 2008; Pascual-Leone and Hamilton, 2001). Nevertheless, other results suggest that crossmodal plasticity is age-dependent (Cohen et al., 1999; Sadato et al., 2002) and point to the fact that different neuronal mechanisms underlie crossmodal plasticity during development versus adulthood (for more extensive reviews on crossmodal plasticity see Bavelier and Neville, 2002; Collignon et al., 2009).

3. Multisensory processing

Most events in the world are multisensory, that is, they provide input to more than one sensory modality. For example, if we are standing at a street corner and a car drives past, we do not only hear the sound of its motor increasing and then decreasing again in loudness, we also see the form and color of the car and possibly perceive the air draft of the fast-moving vehicle when it passes by closely. These different perceptual qualities are encoded by separate sensory organs (ears, eyes and skin) which project to specialized sensory cortices. Many behavioral studies in humans and animals have shown that the combination of input from different sensory modalities enhances perception and facilitates action control. For example, reaction times are faster to multisensory stimuli than to unisensory stimuli (Miller, 1991), seeing lip movements enhances auditory speech perception (Sumby and Pollack, 1954), and both multisensory object recognition (Amedi et al., 2005) and localization (Stein et al., 1989) are more precise than unisensory judgments.

A question of current research is whether multisensory processes are innate or whether they are shaped by experience. Visual deprivation is one model to study if, and if yes, how the absence of one type of sensory input affects multisensory processing within the remaining senses.
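The precision gain from combining modalities follows from the maximum-likelihood model of cue combination (Ernst and Banks, 2002), in which each sensory estimate is weighted by its reliability (inverse variance), so that the combined estimate is never less reliable than the best unisensory one. The following is a minimal numerical sketch of that model; the `combine` helper and the variance values are illustrative assumptions, not data from any study cited here:

```python
# Maximum-likelihood (reliability-weighted) fusion of two Gaussian cues
# about the same event, e.g. an auditory and a visual location estimate.
# Variance values below are hypothetical, chosen only for illustration.

def combine(mu_a, var_a, mu_b, var_b):
    """Fuse two Gaussian estimates; weights are proportional to 1/variance."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    w_b = 1.0 - w_a
    mu = w_a * mu_a + w_b * mu_b            # combined (weighted) estimate
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)  # combined variance
    return mu, var

# Auditory cue: 10 deg azimuth, high variance (low reliability).
# Visual cue: 8 deg azimuth, low variance (high reliability).
mu, var = combine(10.0, 4.0, 8.0, 1.0)
print(mu, var)   # → 8.4 0.8
```

The combined variance, 1/(1/4 + 1/1) = 0.8, is smaller than that of either single cue, which illustrates why bimodal judgments are more precise than unisensory ones. Conversely, if one modality becomes much more reliable, its weight approaches one and the other modality's influence shrinks; this is the same reliability-based logic invoked later in this article to explain reduced audio-tactile interactions in the blind.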
cent review, see Driver and Noesselt, 2008, for example). The superior colliculus is one of the best-known subcortical convergence sites of the auditory, visual, and somatosensory systems (reviewed in Stein and Stanford, 2008). Many neurons of this midbrain structure respond to input of more than one sensory modality, and the receptive fields for different modalities are spatially aligned. The firing rate of these neurons to multisensory input has been found to be higher than the response to the most effective unisensory stimulus, provided that the multisensory stimuli are presented in close spatial and temporal proximity.

Wallace et al. (2004) reared cats in total darkness and recorded neuronal responses in the superior colliculus to unisensory visual, auditory, and somatosensory stimuli as well as to the combination of these stimuli. Although the neurons were responsive to all types of unisensory stimuli, evidence for multisensory integration was not found after dark rearing: the well-known response enhancement to multisensory stimuli as compared to unisensory stimuli was not observed, even for the combination of auditory and somatosensory stimuli. These results suggest that vision during development is necessary to shape multisensory functions.

At the cortical level, several regions in the temporal, parietal or frontal lobe have been described as multisensory³, for example the superior temporal sulcus (Beauchamp et al., 2004), the superior parietal lobule (Molholm et al., 2006), or the ventrolateral prefrontal cortex (Sugihara et al., 2006). The absence of response enhancement to multisensory as compared to unisensory stimuli after visual deprivation has recently been shown for the anterior ectosylvian sulcus (AES) as well, a well-defined multisensory area in the cat's temporo-parietal cortex (Carriere et al., 2007). Moreover, regions of the AES that normally represent visual activity start to respond to auditory or somatosensory input after visual deprivation (Rauschecker and Korte, 1993). It has repeatedly been reported that ERPs related to stimulus classification processes ("N2b effect") had a more posterior topography in blind as compared to sighted humans (Hötting et al., 2004; Kujala et al., 1992, 1995, 1997; Liotti et al., 1998; Röder et al., 1996). Since the neural generators of the N2b have been localized in the multisensory cortex of the temporo-parietal junction (Halgren et al., 1995) and the parietal lobe (Knight, 1990), reorganizations in parietal multisensory areas have been suggested in blind humans as well (Röder et al., 1999b).

In sum, reorganization of cortical and subcortical brain areas known to be involved in multisensory interactions has been observed after visual deprivation both in animals and humans. Except for the animal studies of Wallace et al. (2004) and Carriere et al. (2007), however, none of these studies directly tested the processing of multisensory stimuli after visual deprivation. Therefore, we ran studies on auditory-tactile interactions in congenitally blind humans.

domain (Hötting and Röder, 2004): One to four tactile stimuli were presented in a rapid sequence to the right index finger and participants were asked to judge the number of tactile stimuli. In most of the trials, the tactile stimuli were accompanied by task-irrelevant tones. The number of these tones was either congruent or incongruent with the number of touches. Sighted participants, irrespective of whether or not they were blindfolded, were influenced by the tones although they were explicitly told to ignore them. The mean perceived number of tactile stimuli was significantly enhanced when one tactile stimulus was presented together with two, three or four tones (Fig. 2); thus, the tones often evoked the illusory perception of a second touch. Congenitally blind participants, however, showed a markedly reduced illusion as compared to sighted participants, especially when the discrepancy between the number of tones and the number of tactile stimuli was large, as in the one-tactile/four-tones condition (Fig. 2). A comparison between groups for tactile-only trials revealed that the blind were more precise in judging the number of tactile stimuli than the sighted participants. This result pattern fits well with the inverse efficiency principle of multisensory integration, stating that the likelihood of multisensory interactions is higher when the input of the single modalities is weak or of low reliability (Ernst and Banks, 2002; Stein and Meredith, 1993). From this point of view, the likelihood of multisensory integration might be lower in the blind than in the sighted because of their enhanced perceptual skills within the tactile and auditory modalities.

In a recent ERP experiment we studied the neuronal correlates of this auditory-tactile illusion in sighted participants (Hötting et al., in press). In an oddball paradigm, tactile double stimuli together with two tones were presented as frequent standard stimuli, and single tactile stimuli with two tones as rare deviant stimuli. Participants' task was to press a button whenever they perceived a single tactile stimulus and to ignore the tones. Most of the time, however, participants did not respond to single tactile stimuli accompanied by two tones, suggesting that they perceived these stimuli as double touches. The ERPs showed reduced tactile deviant processing when participants did not detect this single tactile deviant stimulus and thus perceived the auditory-tactile illusion. Interestingly, the amplitude of the tactile N2b was modulated by participants' subjective percept: the N2b was most pronounced when the actual number of touches was indeed perceived; it was significantly reduced when tones successfully altered the tactile percept and was lowest for standards that did not require an overt
response. Previous studies using unisensory stimuli reported altered topographies of the N2b effect in blind humans (e.g. Kujala et al., 1995; Röder et al., 1996). Whether these reorganizations might account for the blinds' lower susceptibility to the illusion, however, has not been tested yet.

3.2.2. Crossmodal spatial attention

Studying spatial attention can provide insight into how space is used to integrate information across sensory modalities. It is a well-established finding that paying attention to a position in space facilitates the processing of stimuli presented at this position (e.g. Posner, 1980). Behavioral and ERP studies have shown that attending to stimuli of one modality at a certain location in space facilitates not only the processing of stimuli of the attended modality, but also that of stimuli of other modalities at that position (reviewed in Eimer, 2001). In crossmodal cuing studies, for example, a tone is presented a few hundred milliseconds prior to a visual target (Spence and Driver, 1997; Spence et al., 1998). In half of the trials the tone is presented at the same position as the upcoming visual target, in the other half of the trials at a different position. Thus, the auditory stimulus does not predict the position of the target stimulus. Nevertheless, reaction times to visual stimuli have been found to be faster when they were presented at the cued location, suggesting that the auditory cue automatically attracts attention and that allocating attention to a sound facilitates visual processing as well (Spence and Driver, 1997; Spence et al., 1998). Moreover, when participants expected targets of one modality at one spatial position, the processing of stimuli of a second modality at that location has repeatedly been observed to be faster, even when the probability for stimuli of the second modality was highest at the opposite position (Spence and Driver, 1996; Spence et al., 2000). These crossmodal effects of spatial attention imply that the spatial representations of the auditory and visual stimuli were matched. ERPs inform about the processing stages at which these multisensory interactions take place. It is well known that spatial attention within a sensory modality modulates early, modality-specific ERPs (for a review see Woods, 1990). For example, when participants were asked to attend to tones in one ear only, the ERPs to these stimuli showed an enhanced negativity at around 100 ms after stimulus onset as compared to a condition in which participants attended to tones in the other ear (Hillyard et al., 1973). The time course and topography of this early negativity suggest that attention enhances processing in primary and/or secondary auditory cortices. Similar results have been observed for the visual (e.g. Mangun and Hillyard, 1990) and the tactile modality (e.g. Michie et al., 1987). ERP effects of spatial attention can thus be used to test at which processing stages crossmodal links exist. Moreover, comparing blind and sighted participants in an auditory-tactile version of this paradigm allows us to assess changes in multisensory interactions as a consequence of visual deprivation.

We presented tactile stimuli and tones in a random sequence from the left and right side with respect to participants' body midline (Hötting et al., 2003). The probabilities for stimuli of each modality and each spatial position were the same. In different experimental blocks, participants were asked to attend to stimuli of one sensory modality and one spatial position only, in order to detect rare deviant stimuli within that modality and at that position, for example to attend to tones on the right side only and respond to rare double tones presented from the right side. All other stimuli had to be ignored. As seen in Fig. 3a (thick lines), ERPs to tones at the attended side were enhanced as compared to ERPs to stimuli at the unattended side. This spatial attention effect started at about 100 ms after stimulus onset and lasted for more than 200 ms. To test for any crossmodal attention effect from touch to audition, we compared ERPs to the same tones when tactile stimuli were attended (thin lines in Fig. 3a). Early auditory ERPs (100–170 ms after stimulus onset) showed an enhanced negativity to tones when they were presented at the attended position in touch as compared to tones presented at the unattended position in touch (crossmodal spatial attention effect). A similar pattern of results was observed for somatosensory ERPs when tones were attended (see Hötting et al., 2003, for details). At first glance these results confirm the assumption of a common spatial attention system for audition and touch: whenever attention is directed to a position in space within one modality, the spotlight of attention in the other sensory modality follows this attention shift. However, the spatial attention effect was more pronounced for the task-relevant modality than for the task-irrelevant modality, and later processing stages (>200 ms) were modulated by spatial attention for the task-relevant modality only. Therefore, the results are better compatible with a "separable but linked" view of crossmodal spatial attention, as proposed by Spence and Driver (1996). These authors suggested that there are separable spatial attention mechanisms for each modality, but with strong spatial synergies between modalities.

Fig. 3. Grand average ERPs at contralateral central electrode clusters to auditory stimuli for sighted, blindfolded participants (a) and for congenitally blind participants (b) in an auditory-tactile spatial attention experiment. Negativity is up. In different experimental blocks, participants were asked to attend to tones at a specified position (thick lines) or to touches at corresponding positions (thin lines). Solid lines show ERPs to tones presented at the attended position in space, while dashed lines show ERPs to physically identical stimuli when participants attended the opposite position. Only ERPs to standard stimuli were included in the average; thus, the results were not affected by any overt response. (Adapted from Hötting et al., 2004, with kind permission from Springer Science+Business Media, Fig. 1.)

In congenitally blind participants, however, a different pattern of results emerged (Hötting et al., 2004). While spatial attention effects within modalities were very similar for blind and sighted participants, the blind did not show early crossmodal spatial attention effects, neither for auditory ERPs (Fig. 3b) nor for somatosensory ERPs. Moreover, ERPs to stimuli of the unattended modality, when presented at the attended location, were more positive than ERPs to the same stimuli at the unattended location in a later time epoch (around 200 ms). These results support the assumption that visual deprivation reduces and alters the interaction between the remaining modalities. At early processing stages, probably within traditionally unisensory cortical areas, no spatial synergies between the auditory and tactile spotlights of attention seem to exist in the blind. The later crossmodal spatial attention effect (the enhanced positivity to stimuli at the attended location) may reflect a suppression of task-irrelevant stimuli at the attended location.

ERP studies in sighted humans on crossmodal spatial attention have provided evidence that spatial attention modulates the processing in both modality-specific and multisensory brain regions (Eimer, 2001; McDonald and Ward, 2000). Feedback projections from multisensory to modality-specific areas have been discussed as one underlying mechanism (Driver and Spence, 2000). Moreover, recent research has found evidence for multisensory processing in regions traditionally considered as sensory-specific, probably mediated by direct projections between the somatosensory and auditory cortices (reviewed in Ghazanfar and Schroeder, 2006). It can only be speculated that alterations within these connections after visual deprivation contribute to the pattern of crossmodal attention effects observed in the blind.

In sighted humans, however, there are results showing that shifts in spatial attention within touch do not always follow auditory attention (Eimer et al., 2002; Lloyd et al., 2003). Lloyd et al. (2003) instructed their participants to expect auditory stimuli at one position in space and tactile stimuli at the opposite position, and compared spatial attention effects in this condition to trials in which participants expected stimuli of both modalities at the same position. Although participants found it easier to attend to both modalities at the same position, spatial attention effects were observed in the "attend different sides" condition as well. We reasoned that, given that blind people do not automatically shift spatial attention across modalities, they should be better able than sighted individuals to split spatial attention across modalities. Therefore, we conducted an experiment similar to that of Lloyd et al. (2003) in congenitally blind and sighted participants (unpublished data). Auditory and tactile stimuli were presented in a random sequence from the left and right side. The probability for the two spatial positions was varied across experimental blocks: in half of the blocks, 83% of all stimuli were presented from one spatial location and the other 17% from the opposite position, regardless of modality (condition "attend same side"). In the other half of the blocks, 83% of all tactile stimuli were presented from one side, but 83% of the auditory stimuli were presented from the opposite side (condition "attend different sides"). Participants were informed about the different probabilities at the beginning of each block and were asked to direct their spatial attention to the side with the higher stimulus probability. Thus, faster reaction times to stimuli at the attended position as compared to stimuli at the unattended position would indicate a gain due to spatial attention. A similar gain for the "attend same side" and the "attend different sides" conditions would support the hypothesis of separable spatial attention systems for touch and audition. A reduced or even absent attentional benefit in the "attend different sides" condition would support the hypothesis of strong, spatially defined links between audition and touch that cannot be voluntarily overruled.

Results for a group of congenitally blind participants and a group of age-matched, sighted but blindfolded participants are shown in Fig. 4. Both groups displayed more efficient processing of stimuli presented at the attended position as compared to stimuli presented at the unattended position. This was true for both the "attend same side" and the "attend different sides" conditions. There was no reduction of the attentional gain in the "attend different sides" condition, neither in the sighted nor in the blind. Thus, these results support our assumption that blind participants are able to split their attention for touch and audition to different spatial positions. Surprisingly, even sighted participants were able to split their spatial attention for tactile and auditory stimuli to different positions. In the ERP experiment reported above (Hötting et al., 2004), participants had to respond to stimuli of one modality only, and thus there was no need to shift their spatial attention to a specific position in space in the second modality. It could be speculated that a link between spatial attention systems across modalities is a default setting in sighted but not in blind humans. However, the link between audition and touch seems to be optional and not obligatory in sighted individuals as well (Lloyd et al., 2003).

Fig. 4. Results of an auditory-tactile divided attention experiment for sighted (top) and congenitally blind participants (bottom). Participants' task was to indicate for each stimulus whether it was a single or a double stimulus, regardless of its modality and spatial position. Results are reported in terms of inverse efficiency (response time divided by the proportion of correct trials per condition; see Spence et al., 2001, for more details). The inverse efficiency allows a comparison between groups uncontaminated by a possible speed-accuracy trade-off. Like response times, lower values represent more efficient processing. Both groups showed a main effect of attention: they responded faster and made fewer errors when stimuli were presented at the attended location compared to stimuli presented at the unattended location. This effect of spatial attention was similar for both the "attend same side" and the "attend different sides" conditions and for both sensory modalities. Overall, the inverse efficiency was lower for auditory stimuli than for tactile stimuli. There was no significant difference between groups.

3.3. Development of multisensory interactions

Developmental studies in animals (Wallace and Stein, 1997) and humans (Gori et al., 2008; Neil et al., 2006) have shown that multisensory integration is not adult-like at birth but develops gradually during the first months and years of life. It has been suggested that cross-sensory calibrations during development and experience with multisensory stimuli are necessary to develop multisensory functions (Gori et al., 2008).

Recent results in cataract patients from our lab have suggested that vision is necessary during an early sensitive phase for multisensory functions to emerge (Putzar et al., 2007). These patients were born with dense binocular cataracts that deprived them of any patterned visual input. After the cataractous lenses were removed between the 5th and 24th month of life, the patients showed a recovery of basic visual functions. However, auditory–visual interactions were reduced or absent even after recovery periods of at least 14 years: cataract patients showed less interference in an audio–visual capture paradigm and were not able to benefit from lip-reading in audio–visual speech perception (Putzar et al., 2007). Moreover, they showed a reduced McGurk effect as compared to participants with no history of visual impairments, that is, less fusion of incongruent auditory and visual signals in speech perception (Putzar et al., in press). Analogous results have been reported for congenitally deaf children after cochlear implantation (Schorr et al., 2005). Although these children correctly perceived the unimodal auditory speech stimuli, the McGurk illusion was significantly reduced. The likelihood of auditory–visual fusion decreased with age at cochlear implantation, suggesting that there is a sensitive phase for the development of multisensory interactions.

Studies in visually deprived animals, in blind humans, in cataract patients and in cochlear-implanted children have provided converging evidence for the hypothesis that a large number of multisensory processes are not innate but depend on multisensory experience during early life. Research in congenitally blind humans has suggested that vision might play a special or even essential role for the emergence of multisensory functions.

Acknowledgments

The studies of the authors were supported by grants of the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG, Ro1226/4-1,4-2,4-3) and the BMBF (OIGW056I) to B.R.

References

Amedi, A., Flöel, A., Knecht, S., Zohary, E., Cohen, L.G., 2004. Transcranial magnetic stimulation of the occipital pole interferes with verbal processing in blind subjects. Nat. Neurosci. 7, 1266–1270.
Amedi, A., Raz, N., Pianka, P., Malach, R., Zohary, E., 2003. Early 'visual' cortex activation correlates with superior verbal memory performance in the blind. Nat. Neurosci. 6, 758–766.
Amedi, A., von Kriegstein, K., van Atteveldt, N.M., Beauchamp, M.S., Naumer, M.J., 2005. Functional imaging of human crossmodal identification and object recognition. Exp. Brain Res. 166, 559–571.
Arno, P., De Volder, A.G., Vanlierde, A., Wanet-Defalque, M.C., Streel, E., Robert, A., Sanabria-Bohorquez, S., Veraart, C., 2001. Occipital activation by pattern recognition in the early blind using auditory substitution for vision. Neuroimage 13, 632–645.
Bavelier, D., Neville, H.J., 2002. Cross-modal plasticity: where and how? Nat. Rev. Neurosci. 3, 443–452.
Beauchamp, M.S., Argall, B.D., Bodurka, J., Duyn, J.H., Martin, A., 2004. Unraveling multisensory integration: patchy organization within human STS multisensory cortex. Nat. Neurosci. 7, 1190–1192.
Büchel, C., Price, C., Frackowiak, R.S., Friston, K., 1998. Different activation patterns in the visual cortex of late and congenitally blind subjects. Brain 121, 409–419.
Bull, R., Rathborn, H., Clifford, B.R., 1983. The voice-recognition accuracy of blind listeners. Perception 12, 223–226.
Carriere, B.N., Royal, D.W., Perrault, T.J., Morrison, S.P., Vaughan, J.W., Stein, B.E., Wallace, M.T., 2007. Visual deprivation alters the development of cortical multisensory integration. J. Neurophysiol. 98, 2858–2867.
Collignon, O., Lassonde, M., Lepore, F., Bastien, D., Veraart, C., 2007. Functional cerebral reorganization for auditory spatial processing and auditory substitution of vision in early blind subjects. Cereb. Cortex 17, 457–465.
Collignon, O., Renier, L., Bruyer, R., Tranduy, D., Veraart, C., 2006. Improved selective and divided spatial attention in early blind subjects. Brain Res. 1075, 175–182.
Collignon, O., Voss, P., Lassonde, M., Lepore, F., 2009. Cross-modal plasticity for the spatial processing of sounds in visually deprived subjects. Exp. Brain Res. 192, 343–358.
Cohen, L.G., Celnik, P., Pascual-Leone, A., Corwell, B., Faiz, L., Dambrosia, J., Honda, M., Sadato, N., Gerloff, C., Catala, M.D., Hallett, M., 1997. Functional relevance of cross-modal plasticity in blind humans. Nature 389, 180–183.
Cohen, L.G., Weeks, R.A., Sadato, N., Celnik, P., Ishii, K., Hallett, M., 1999. Period of susceptibility for cross-modal plasticity in the blind. Ann. Neurol. 45, 451–460.
Després, O., Candas, V., Dufour, A., 2005. Spatial auditory compensation in early-blind humans: involvement of eye movements and/or attention orienting? Neuropsychologia 43, 1955–1962.
De Volder, A.G., Catalan-Ahumada, M., Robert, A., Bol, A., Labar, D., Coppens, A., Michel, C., Veraart, C., 1999. Changes in occipital cortex activity in early blind humans using a sensory substitution device. Brain Res. 826, 128–134.
De Volder, A.G., Toyama, H., Kimura, Y., Kiyosawa, M., Nakano, H., Vanlierde, A., Wanet-Defalque, M.C., Mishina, M., Oda, K., Ishiwata, K., Senda, M., 2001. Auditory triggered mental imagery of shape involves visual association areas in early blind humans. Neuroimage 14, 129–139.
Driver, J., Noesselt, T., 2008. Multisensory interplay reveals crossmodal influences on 'sensory-specific' brain regions, neural responses, and judgments. Neuron 57, 11–23.
Driver, J., Spence, C., 2000. Multisensory perception: beyond modularity and convergence. Curr. Biol. 10, R731–R735.
Eimer, M., 2001. Crossmodal links in spatial attention between vision, audition, and touch: evidence from event-related brain potentials. Neuropsychologia 39, 1292–1303.
Eimer, M., van Velzen, J., Driver, J., 2002. Cross-modal interactions between audition, touch, and vision in endogenous spatial attention: ERP evidence on preparatory states and sensory modulations. J. Cogn. Neurosci. 14, 254–271.
Elbert, T., Sterr, A., Rockstroh, B., Pantev, C., Muller, M.M., Taub, E., 2002. Expansion of the tonotopic area in the auditory cortex of the blind. J. Neurosci. 22, 9941–9944.
Ernst, M.O., Banks, M.S., 2002. Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415, 429–433.
Fieger, A., Röder, B., Teder-Salejarvi, W., Hillyard, S.A., Neville, H.J., 2006. Auditory spatial tuning in late-onset blindness in humans. J. Cogn. Neurosci. 18, 149–157.
Ghazanfar, A.A., Schroeder, C.E., 2006. Is neocortex essentially multisensory? Trends Cogn. Sci. 10, 278–285.
Gori, M., Del Viva, M., Sandini, G., Burr, D.C., 2008. Young children do not integrate visual and haptic form information. Curr. Biol. 18, 694–698.
Gougoux, F., Lepore, F., Lassonde, M., Voss, P., Zatorre, R.J., Belin, P., 2004. Neuropsychology: pitch discrimination in the early blind. Nature 430, 309.
Gougoux, F., Zatorre, R.J., Lassonde, M., Voss, P., Lepore, F., 2005. A functional neuroimaging study of sound localization: visual cortex activity predicts performance in early-blind individuals. PLoS Biol. 3, e27.
Halgren, E., Baudena, P., Clarke, J.M., Heit, G., Liegeois, C., Chauvel, P., Musolino, A., 1995. Intracerebral potentials to rare target and distractor auditory and visual stimuli. I. Superior temporal plane and parietal lobe. Electroencephalogr. Clin. Neurophysiol. 94, 191–220.
Hamilton, R.H., Pascual-Leone, A., Schlaug, G., 2004. Absolute pitch in blind musicians. Neuroreport 15, 803–806.
Hillyard, S.A., Hink, R.F., Schwent, V.L., Picton, T.W., 1973. Electrical signs of selective attention in the human brain. Science 182, 177–180.
Hillyard, S.A., Picton, T.W., 1987. Electrophysiology of cognition. In: Plum, E. (Ed.), Handbook of Physiology: Sec. 1 The Nervous System. V. Higher Functions of the Brain: Part 2. American Physiology Society, Bethesda, pp. 519–584.
Hötting, K., Rösler, F., Röder, B., 2003. Crossmodal and intermodal attention modulate event-related brain potentials to tactile and auditory stimuli. Exp. Brain Res. 148, 26–37.
Hötting, K., Friedrich, C.K., Röder, B., in press. Neural correlates of crossmodally induced changes in tactile awareness. J. Cogn. Neurosci. doi:10.1162/jocn.2008.21177.
Hötting, K., Röder, B., 2004. Hearing cheats touch, but less in congenitally blind than in sighted individuals. Psychol. Sci. 15, 60–64.
Hötting, K., Rösler, F., Röder, B., 2004. Altered auditory-tactile interactions in congenitally blind humans: an event-related potential study. Exp. Brain Res. 159, 370–381.
Hull, T., Mason, H., 1995. Performance of blind children on digit-span tests. JVIB 89, 166–169.
James, W., 1890. The principles of psychology, vol. 2. Holt, New York, pp. 203–211.
Just, M.A., Carpenter, P.A., 1992. A capacity theory of comprehension: individual differences in working memory. Psychol. Rev. 99, 122–149.
Karavatos, A., Kaprinis, G., Tzavaras, A., 1984. Hemispheric specialization for language in the congenitally blind: the influence of the Braille system. Neuropsychologia 22, 521–525.
Knight, R.T., 1990. Neural mechanisms of event-related potentials: evidence from human lesion studies. In: Rohrbaugh, J.W., Parasuraman, R., Johnson, R.J. (Eds.), Event-Related Brain Potentials: Basic Issues and Applications. Oxford University Press, New York, pp. 3–18.
Knudsen, E.I., Brainard, M.S., 1991. Visual instruction of the neural map of auditory space in the developing optic tectum. Science 253, 85–87.
Knudsen, E.I., Brainard, M.S., 1995. Creating a unified representation of visual and auditory space in the brain. Annu. Rev. Neurosci. 18, 19–43.
Korte, M., Rauschecker, J.P., 1993. Auditory spatial tuning of cortical neurons is sharpened in cats with early blindness. J. Neurophysiol. 70, 1717–1721.
Kujala, T., Alho, K., Huotilainen, M., Ilmoniemi, R.J., Lehtokoski, A., Leinonen, A., Rinne, T., Salonen, O., Sinkkonen, J., Standertskjold-Nordenstam, C.G., Näätänen, R., 1997. Electrophysiological evidence for cross-modal plasticity in humans with early- and late-onset blindness. Psychophysiology 34, 213–216.
Kujala, T., Alho, K., Kekoni, J., Hamalainen, H., Reinikainen, K., Salonen, O., Standertskjold-Nordenstam, C.G., Näätänen, R., 1995. Auditory and somatosensory event-related brain potentials in early blind humans. Exp. Brain Res. 104, 519–526.
Kujala, T., Alho, K., Paavilainen, P., Summala, H., Näätänen, R., 1992. Neural plasticity in processing of sound location by the early blind: an event-related potential study. Electroencephalogr. Clin. Neurophysiol. 84, 469–472.
Kutas, M., Federmeier, K.D., 2000. Electrophysiology reveals semantic memory use in language comprehension. Trends Cogn. Sci. 4, 463–470.
Lessard, N., Pare, M., Lepore, F., Lassonde, M., 1998. Early-blind human subjects localize sound sources better than sighted subjects. Nature 395, 278–280.
Lewald, J., 2002. Vertical sound localization in blind humans. Neuropsychologia 40, 1868–1872.
Liotti, M., Ryder, K., Woldorff, M.G., 1998. Auditory attention in the congenitally blind: where, when and what gets reorganized? Neuroreport 9, 1007–1012.
Lloyd, D.M., Merat, N., McGlone, F., Spence, C., 2003. Crossmodal links between audition and touch in covert endogenous spatial attention. Percept. Psychophys. 65, 901–924.
Luck, S.J., 2005. An Introduction to the Event-Related Potential Technique. MIT Press, Cambridge.
Mangun, G.R., Hillyard, S.A., 1990. Allocation of visual attention to spatial locations: tradeoff functions for event-related brain potentials and detection performance. Percept. Psychophys. 47, 532–550.
McDonald, J.J., Ward, L.M., 2000. Involuntary listening aids seeing: evidence from human electrophysiology. Psychol. Sci. 11, 167–171.
Merabet, L.B., Hamilton, R., Schlaug, G., Swisher, J.D., Kiriakopoulos, E.T., Pitskel, N.B., Kauffman, T., Pascual-Leone, A., 2008. Rapid and reversible recruitment of early visual cortex for touch. PLoS One 3, e3046.
Michie, P.T., Bearpark, H.M., Crawford, J.M., Glue, L.C., 1987. The effects of spatial selective attention on the somatosensory event-related potential. Psychophysiology 24, 449–463.
Miller, J., 1991. Channel interaction and the redundant-targets effect in bimodal divided attention. J. Exp. Psychol. Hum. Percept. Perform. 17, 160–169.
Molholm, S., Sehatpour, P., Mehta, A.D., Shpaner, M., Gomez-Ramirez, M., Ortigue, S., Dyke, J.P., Schwartz, T.H., Foxe, J.J., 2006. Audio-visual multisensory integration in superior parietal lobule revealed by human intracranial recordings. J. Neurophysiol. 96, 721–729.
Muchnik, C., Efrati, M., Nemeth, E., Malin, M., Hildesheimer, M., 1991. Central auditory skills in blind and sighted subjects. Scand. Audiol. 20, 19–23.
Münte, T.F., Kohlmetz, C., Nager, W., Altenmüller, E., 2001. Neuroperception. Superior auditory spatial tuning in conductors. Nature 409, 580.
Näätänen, R., Picton, T., 1987. The N1 wave of the human electric and magnetic response to sound: a review and an analysis of the component structure. Psychophysiology 24, 375–425.
Neil, P.A., Chee-Ruiter, C., Scheier, C., Lewkowicz, D.J., Shimojo, S., 2006. Development of multisensory spatial integration and perception in humans. Dev. Sci. 9, 454–464.
Niemeyer, W., Starlinger, I., 1981. Do the blind hear better? Investigations on auditory processing in congenital or early acquired blindness. II. Central functions. Audiology 20, 510–515.
Pascual-Leone, A., Hamilton, R., 2001. The metamodal organization of the brain. Prog. Brain Res. 134, 427–445.
Poirier, C., Collignon, O., Scheiber, C., Renier, L., Vanlierde, A., Tranduy, D., Veraart, C., De Volder, A.G., 2006. Auditory motion perception activates visual motion areas in early blind subjects. Neuroimage 31, 279–285.
Posner, M.I., 1980. Orienting of attention. Q. J. Exp. Psychol. 32, 3–25.
Putzar, L., Goerendt, I., Lange, K., Rösler, F., Röder, B., 2007. Early visual deprivation impairs multisensory interactions in humans. Nat. Neurosci. 10, 1243–1245.
Putzar, L., Hötting, K., Röder, B., in press. Early visual deprivation affects the development of face recognition and of audio-visual speech perception. Restor. Neurol. Neurosci.
Rammsayer, T., 1992. Zeitdauerdiskriminationsleistung bei Blinden und Nicht-Blinden: Evidenz für einen biologischen Zeitmechanismus. Kognitionswissenschaft 2, 180–188.
Rauschecker, J.P., Kniepert, U., 1994. Auditory localization behaviour in visually deprived cats. Eur. J. Neurosci. 6, 149–160.
Rauschecker, J.P., Korte, M., 1993. Auditory compensation for early blindness in cat cerebral cortex. J. Neurosci. 13, 4538–4548.
Raz, N., Amedi, A., Zohary, E., 2005. V1 activation in congenitally blind humans is associated with episodic retrieval. Cereb. Cortex 15, 1459–1468.
Röder, B., Demuth, L., Streb, J., Rösler, F., 2003. Semantic and morpho-syntactic priming in auditory word recognition in congenitally blind adults. Lang. Cogn. Process. 18, 1–20.
Röder, B., Neville, H.J., 2003. Developmental functional plasticity. In: Grafman, J., Robertson, I.H. (Eds.), Plasticity and Rehabilitation, second ed. Elsevier, Amsterdam, pp. 231–270.
Röder, B., Rösler, F., 2003. Memory for environmental sounds in sighted, congenitally blind and late blind adults: evidence for cross-modal compensation. Int. J. Psychophysiol. 50, 27–39.
Röder, B., Rösler, F., Hennighausen, E., Näcker, F., 1996. Event-related potentials during auditory and somatosensory discrimination in sighted and blind human subjects. Brain Res. Cogn. Brain Res. 4, 77–93.
Röder, B., Rösler, F., Hennighausen, E., 1997. Different cortical activation patterns in blind and sighted humans during encoding and transformation of haptic images. Psychophysiology 34, 292–307.
Röder, B., Rösler, F., Neville, H.J., 1999a. Effects of interstimulus interval on auditory event-related potentials in congenitally blind and normally sighted humans. Neurosci. Lett. 264, 53–56.
Röder, B., Rösler, F., Neville, H.J., 2000. Event-related potentials during auditory language processing in congenitally blind and sighted people. Neuropsychologia 38, 1482–1502.
Röder, B., Rösler, F., Neville, H.J., 2001. Auditory memory in congenitally blind adults: a behavioral-electrophysiological investigation. Brain Res. Cogn. Brain Res. 11, 289–303.
Röder, B., Stock, O., Bien, S., Neville, H., Rösler, F., 2002. Speech processing activates visual cortex in congenitally blind humans. Eur. J. Neurosci. 16, 930–936.
Röder, B., Teder-Sälejärvi, W., Sterr, A., Rösler, F., Hillyard, S.A., Neville, H.J., 1999b. Improved auditory spatial tuning in blind humans. Nature 400, 162–166.
Rösler, F., Röder, B., Heil, M., Hennighausen, E., 1993. Topographic differences of slow event-related brain potentials in blind and sighted adult human subjects during haptic mental rotation. Brain Res. Cogn. Brain Res. 1, 145–159.
Sadato, N., Okada, T., Honda, M., Yonekura, Y., 2002. Critical period for cross-modal plasticity in blind humans: a functional MRI study. Neuroimage 16, 389–400.
Sadato, N., Pascual-Leone, A., Grafman, J., Deiber, M.P., Ibanez, V., Hallett, M., 1998. Neural networks for Braille reading by the blind. Brain 121, 1213–1229.
Sadato, N., Pascual-Leone, A., Grafman, J., Ibanez, V., Deiber, M.-P., Dold, G., Hallett, M., 1996. Activation of the primary visual cortex by Braille reading in blind subjects. Nature 380, 526–528.
Schorr, E.A., Fox, N.A., van Wassenhove, V., Knudsen, E.I., 2005. Auditory-visual fusion in speech perception in children with cochlear implants. Proc. Natl. Acad. Sci. USA 102, 18748–18750.
Shams, L., Kamitani, Y., Shimojo, S., 2000. What you see is what you hear. Nature 408, 788.
Spence, C., Driver, J., 1996. Audiovisual links in endogenous covert spatial attention. J. Exp. Psychol. Hum. Percept. Perform. 22, 1005–1030.
Spence, C., Driver, J., 1997. Audiovisual links in exogenous covert spatial orienting. Percept. Psychophys. 59, 1–22.
Spence, C., Nicholls, M.E., Gillespie, N., Driver, J., 1998. Cross-modal links in exogenous covert spatial orienting between touch, audition, and vision. Percept. Psychophys. 60, 544–557.
Spence, C., Pavani, F., Driver, J., 2000. Crossmodal links between vision and touch in covert endogenous spatial attention. J. Exp. Psychol. Hum. Percept. Perform. 26, 1298–1319.
Spence, C., Shore, D.I., Gazzaniga, M.S., Soto-Faraco, S., Kingstone, A., 2001. Failure to remap visuotactile space across the midline in the split-brain. Can. J. Exp. Psychol. 55, 133–140.
Starlinger, I., Niemeyer, W., 1981. Do the blind hear better? Investigations on auditory processing in congenital or early acquired blindness. I. Peripheral functions. Audiology 20, 503–509.
Stein, B.E., Meredith, M.A., 1993. The Merging of the Senses. The MIT Press, Cambridge.
Stein, B.E., Meredith, M.A., Huneycutt, W.S., McDade, L., 1989. Behavioral indices of multisensory integration: orientation to visual cues is affected by auditory stimuli. J. Cogn. Neurosci. 1, 12–24.
Stein, B.E., Stanford, T.R., 2008. Multisensory integration: current issues from the perspective of the single neuron. Nat. Rev. Neurosci. 9, 255–266.
Stevens, A.A., Weaver, K.E., 2009. Functional characteristics of auditory cortex in the blind. Behav. Brain Res. 196, 134–138.
Stevens, A.A., Weaver, K., 2005. Auditory perceptual consolidation in early-onset blindness. Neuropsychologia 43, 1901–1910.
Sugihara, T., Diltz, M.D., Averbeck, B.B., Romanski, L.M., 2006. Integration of auditory and visual communication information in the primate ventrolateral prefrontal cortex. J. Neurosci. 26, 11138–11147.
Sumby, W.H., Pollack, I., 1954. Visual contribution to speech intelligibility in noise. J. Acoust. Soc. Am. 26, 212–215.
Voss, P., Gougoux, F., Zatorre, R.J., Lassonde, M., Lepore, F., 2008. Differential occipital responses in early- and late-blind individuals during a sound-source discrimination task. Neuroimage 40, 746–758.
Voss, P., Lassonde, M., Gougoux, F., Fortin, M., Guillemot, J.P., Lepore, F., 2004. Early- and late-onset blind individuals show supra-normal auditory abilities in far-space. Curr. Biol. 14, 1734–1738.
Wallace, M.T., Perrault Jr., T.J., Hairston, W.D., Stein, B.E., 2004. Visual experience is necessary for the development of multisensory integration. J. Neurosci. 24, 9580–9584.
Wallace, M.T., Stein, B.E., 1997. Development of multisensory neurons and multisensory integration in cat superior colliculus. J. Neurosci. 17, 2429–2444.
Wanet, M.C., Veraart, C., 1985. Processing of auditory information by the blind in spatial localization tasks. Percept. Psychophys. 38, 91–96.
Weaver, K.E., Stevens, A.A., 2007. Attention and sensory interactions within the occipital cortex in the early blind: an fMRI study. J. Cogn. Neurosci. 19, 315–330.
Weeks, R., Horwitz, B., Aziz-Sultan, A., Tian, B., Wessinger, C.M., Cohen, L.G., Hallett, M., Rauschecker, J.P., 2000. A positron emission tomographic study of auditory localization in the congenitally blind. J. Neurosci. 20, 2664–2672.
Woods, D.L., 1990. The physiological basis of selective attention: implications of event-related potential studies. In: Rohrbaugh, J.W., Parasuraman, R., Johnson, J. (Eds.), Event-Related Brain Potentials: Basic Issues and Applications. Oxford University Press, New York, pp. 178–209.
Zwiers, M.P., Van Opstal, A.J., Cruysberg, J.R., 2001a. A spatial hearing deficit in early-blind humans. J. Neurosci. 21 (RC142), 1–5.
Zwiers, M.P., Van Opstal, A.J., Cruysberg, J.R., 2001b. Two-dimensional sound-localization behavior of early-blind humans. Exp. Brain Res. 140, 206–222.