The neural substrates of musical memory revealed by fMRI and two semantic tasks
Research highlights
- This fMRI study directly compares the musical and verbal memory of 20 nonmusicians.
- We find that musical and verbal semantic memories are subserved by two distinct networks.
- Verbal and musical semantic retrieval share the same executive mechanisms.
- Semantic concepts are organized along a dorsoventral axis in the temporal cortex.
Introduction
Semantic memory refers to memory for general knowledge, unrelated to specific experiences or to the type of material used (e.g. words, faces or music). Clinical studies have revealed that patients sometimes retain musical abilities despite severe cognitive impairments such as aphasia or amnesia (Signoret et al., 1987, Cuddy & Duffin, 2005, Samson et al., 2009). Based on the neuropsychological dissociations reported in these studies, Peretz and colleagues (Peretz & Coltheart, 2003, Peretz et al., 2009) developed a cognitive model of the cortical organization of music recognition. This model views the musical semantic memory system as a purely musical lexicon that interacts with the verbal lexicon. In the present study, musical semantic memory is defined as the long-term storage of familiar melodies or musical excerpts. It is musical semantic memory that gives rise to a strong feeling of knowing when we listen to music (reflecting familiarity processes) and allows us to hum or whistle the subsequent notes of a melody or, in some cases, to retrieve the title, composer or performer of a particular excerpt (corresponding to identification) (Platel & Eustache, 2000).

Whereas numerous clinical studies have supported the idea that musical and verbal knowledge are cognitively autonomous (for a review, see Peretz, 2008), few authors have investigated this issue with neuroimaging methods. The neural substrates of semantic memory have been explored using a variety of experimental paradigms in neuroimaging studies (Cabeza & Nyberg, 2000, Binder et al., 2009). Semantic memory retrieval engages a large neural network, mainly located in the temporal and frontal cortices of the left hemisphere. When verbal material is used, semantic memory relies mainly upon the middle and inferior temporal gyri and the inferior frontal gyrus of this hemisphere (for a review, see Binder et al., 2009). The situation appears to be less clear-cut for musical material, however.
Neuroimaging studies featuring musical material have reported the involvement of the anterior part of the temporal lobes, either in the left hemisphere (Platel et al., 2003) or bilaterally (Satoh et al., 2006), with activation of the middle part of the left superior temporal gyrus and the medial frontal cortices for recognition tasks (Satoh et al., 2006), and mainly of the left inferior frontal gyrus for familiarity tasks (Plailly et al., 2007). However, these studies did not allow direct comparisons between music and language, and some of them used stimuli, such as nursery songs, that may also have elicited verbal processing.
In a previous H₂¹⁵O PET study, we found that the verbal and musical materials used in a semantic congruence task drew on two close but partially distinct networks, located mainly in the left temporal cortex (Groussard et al., 2010). In that study, participants had to decide whether the second part of a familiar melody (musical congruence condition) or of a French proverb (verbal congruence condition) was the correct ending. They therefore had to access their semantic store and retrieve musical or verbal representations of the melodies or expressions they heard in order to judge whether the ending was correct. The post-experiment debriefing suggested, however, that the congruence task also involved syntactic processes (i.e. the detection of irregularities in the harmonic, melodic, rhythmic or metric structure), particularly for incongruent items.
To clarify this issue and to highlight the brain regions involved in as pure a semantic retrieval process as possible, we administered the same semantic tasks in the present study using fMRI. An event-related analysis makes it possible to exclude the incongruent items, which could involve syntactic processes.
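The logic of excluding incongruent trials from the contrast of interest can be sketched as a design matrix in which congruent and incongruent events are modeled as separate regressors, with the contrast weighting only the congruent one. This is a minimal conceptual sketch with hypothetical onsets and no HRF convolution, not the study's actual design or analysis pipeline.

```python
import numpy as np

# Hypothetical scan parameters and trial onsets (seconds) -- illustrative
# values only, not taken from the study.
tr = 2.0                      # repetition time
n_scans = 100
frame_times = np.arange(n_scans) * tr
congruent_onsets = [10, 40, 70, 110, 150]
incongruent_onsets = [25, 55, 95, 130, 170]

def stick_regressor(onsets, frame_times, tr):
    """Nearest-scan stick function (HRF convolution omitted for brevity)."""
    reg = np.zeros(len(frame_times))
    for t in onsets:
        reg[int(round(t / tr))] = 1.0
    return reg

# Modeling incongruent trials as their own regressor lets them be
# accounted for in the GLM while being excluded from the semantic contrast.
X = np.column_stack([
    stick_regressor(congruent_onsets, frame_times, tr),
    stick_regressor(incongruent_onsets, frame_times, tr),
    np.ones(len(frame_times)),          # constant term
])
contrast = np.array([1.0, 0.0, 0.0])    # weight congruent trials only
```

In a real event-related analysis the stick functions would be convolved with a hemodynamic response function, but the exclusion principle is the same: incongruent events receive zero weight in the contrast.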
It is now well established that both the difficulty and the nature of a semantic task affect the pattern of activation (Mummery et al., 1996, Muller et al., 1997, Cabeza & Nyberg, 2000). Thus, to take our direct comparison of musical and verbal semantic tasks a stage further, we also administered a "semantic familiarity task", which required a more thorough semantic memory search: participants rated the familiarity of musical excerpts (musical familiarity condition) and French expressions (verbal familiarity condition) on a 4-point scale. In all the musical conditions, the familiar musical stimuli were strictly selected (no excerpts with lyrics, and no excerpts likely to evoke autobiographical memories spontaneously) in order to limit labeling and verbalization. The two tasks were highly complementary: the congruence task allowed us to explore musical semantic processing with as few verbal associations as possible but was limited to an automatic search, whereas the familiarity task elicited more thorough semantic retrieval but opened up the possibility of verbal labeling when participants knew a melody extremely well, corresponding to identification (retrieving the title, composer or performer of the excerpt). Administering these two complementary tasks, probing automatic and more thorough semantic retrieval in the same participants, should improve our understanding of the functional organization of musical semantic memory and highlight the neural networks activated by verbal material, musical material or both.
Previous studies have shown that semantic memory tasks activate prefrontal and temporal areas, mainly in the left hemisphere for verbal material (Binder et al., 2009) and in both hemispheres for musical material (Platel et al., 1997, Platel et al., 2003, Satoh et al., 2006, Plailly et al., 2007). In addition, we recently proposed an anteroposterior organization within the left middle and superior temporal gyri (Groussard et al., 2010), with predominantly anterior activation during the musical semantic task and predominantly posterior activation during the verbal one. Comparing the performance of the same group of nonmusician participants on congruence and familiarity semantic tasks featuring verbal and musical materials thus allows us to examine these networks in greater depth and to clarify the cognitive contribution of each cortical area.
Participants
Twenty healthy right-handed volunteers (mean ± SD: 24.55 ± 3.80 years) were selected from a population of university students (mean education level ± SD: 16.35 ± 2.03 years) to take part in this study. All were nonmusicians (10 women and 10 men) with normal hearing and no history of neurological disease. Participants were selected according to stringent criteria: (1) none had taken music lessons or participated in musical performances (except for compulsory music classes at secondary school (1 hr per
Congruence task
Mean accuracy was 83.71% (SD = 8.72) for the musical semantic task, 98.71% (SD = 2.85) for the verbal semantic task, and 94.06% (SD = 5.26) and 93.42% (SD = 3.94) for the musical and verbal reference tasks, respectively. Accuracy was significantly lower for the musical semantic task than for the verbal semantic task or either reference task (p < 0.001). These performances did not differ significantly from those of the subjects in our pre-experimental population.
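The within-subject comparison above can be illustrated with a paired t-test on per-participant accuracy scores. The data below are synthetic values loosely patterned on the reported group means (83.71% vs 98.71%), and the scipy call is a generic sketch, not the study's actual analysis.

```python
import numpy as np
from scipy import stats

# Synthetic per-participant accuracies (%) -- illustrative values only.
musical = np.array([83, 80, 85, 78, 90, 82, 86, 79, 88, 84])
verbal = np.array([98, 99, 97, 100, 96, 99, 98, 97, 100, 99])

# Paired (within-subject) test: the same participants performed both tasks,
# so a related-samples t-test is appropriate rather than an independent one.
result = stats.ttest_rel(musical, verbal)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```

With clearly separated scores like these, the negative t-statistic reflects lower musical than verbal accuracy, mirroring the direction of the reported effect.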
Familiarity task
The statistical
Discussion
The possibility of there being a neural distinction between verbal and musical material has been raised by several clinical studies (e.g. Signoret et al., 1987, Eustache et al., 1990, Peretz, 2002) but has seldom been investigated in neuroimaging studies. Direct comparisons between language and music have rarely been performed so far and focused mainly on perceptual, production and syntactic processing (Patel, 2003, Koelsch et al., 2005, Brown et al., 2006, Ozdemir et al., 2006, Steinbeis &
Acknowledgments
This study was supported by a “Music and Memory” French National Research Agency (ANR) grant (NT05-3_45987) and by the French Ministry of Research. We thank N. Villain, P. Gagnepain and G. Chételat for their valuable contribution, and C. Mauger, J. Dayan, C. Schupp and the neuroimaging staff of the Cyceron center for their help with data acquisition.
References (52)
- et al. (2005). Unified segmentation. Neuroimage.
- et al. (2007). Left ventrolateral prefrontal cortex and the cognitive control of memory. Neuropsychologia.
- et al. (2010). A common functional brain network for autobiographical, episodic, and semantic memory retrieval. Neuroimage.
- et al. (2005). Music, memory, and Alzheimer's disease: is music recognition spared in dementia, and how can it be assessed? Med. Hypotheses.
- et al. (2007). Imaging recollection and familiarity in the medial temporal lobe: a three-component model. Trends Cogn. Sci.
- et al. (1990). Identification and discrimination disorders in auditory perception: a report on two cases. Neuropsychologia.
- et al. (1999). Stochastic designs in event-related fMRI. Neuroimage.
- et al. (2010). Musical and verbal semantic memory: two distinct neural networks? Neuroimage.
- et al. (2009). Right dorsolateral prefrontal cortex is engaged during post-retrieval processing of both episodic and semantic information. Neuropsychologia.
- (2006). Significance of Broca's area and ventral premotor cortex for music-syntactic processing. Cortex.
- Strategic control of memory.
- Valid conjunction inference with the minimum statistic. Neuroimage.
- Shared and distinct neural correlates of singing and speaking. Neuroimage.
- Semantic and episodic memory of music are subserved by distinct neural networks. Neuroimage.
- Optimization of experimental design in fMRI: a general framework using a genetic algorithm. Neuroimage.
- Memory of music: roles of right hippocampus and left inferior frontal gyrus. Neuroimage.
- A method for measuring the absolute sensitivity of positron emission tomographic scanners. Eur. J. Nucl. Med.
- Where is the semantic system? A critical review and meta-analysis of 120 functional neuroimaging studies. Cereb. Cortex.
- Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion. Proc. Natl Acad. Sci. USA.
- Passive music listening spontaneously engages limbic and paralimbic systems. NeuroReport.
- Music and language side by side in the brain: a PET study of the generation of melodies and sentences. Eur. J. Neurosci.
- Imaging cognition II: an empirical review of 275 PET and fMRI studies. J. Cogn. Neurosci.
- Listening to musical rhythms recruits motor regions of the brain. Cereb. Cortex.
- Time course of melody recognition: a gating paradigm study. Percept. Psychophys.
- Triple dissociation in the medial temporal lobes: recollection, familiarity, and novelty. J. Neurophysiol.
- Relatively preserved knowledge of music in semantic dementia. J. Neurol. Neurosurg. Psychiatry.