Listening to an audio drama activates two processing networks, one for all sounds, another exclusively for speech
Robert Boldt et al. PLoS One. 2013.
Abstract
Earlier studies have shown considerable intersubject synchronization of brain activity when subjects watch the same movie or listen to the same story. Here we investigated the across-subjects similarity of brain responses to speech and non-speech sounds in a continuous audio drama designed for blind people. Thirteen healthy adults listened for ∼19 min to the audio drama while their brain activity was measured with 3 T functional magnetic resonance imaging (fMRI). An intersubject-correlation (ISC) map, computed across the whole experiment to assess the stimulus-driven extrinsic brain network, indicated statistically significant ISC in temporal, frontal and parietal cortices, cingulate cortex, and amygdala. Group-level independent component (IC) analysis was used to parcel out the brain signals into functionally coupled networks, and the dependence of the ICs on the external stimuli was tested by comparing them with the ISC map. This procedure revealed four extrinsic ICs, of which two, covering non-overlapping areas of the auditory cortex, were modulated by both speech and non-speech sounds. The two other extrinsic ICs, one left-hemisphere-lateralized and the other right-hemisphere-lateralized, were speech-related and comprised the superior and middle temporal gyri, temporal poles, and the left angular and inferior orbital gyri. In areas of low ISC, four ICs defined as intrinsic fluctuated similarly to the time-courses of either the speech-sound-related or the all-sounds-related extrinsic ICs. These ICs included the superior temporal gyrus, the anterior insula, and the frontal, parietal and midline occipital cortices. Taken together, substantial intersubject synchronization of cortical activity was observed in subjects listening to an audio drama, with the results suggesting that speech is processed in two separate networks: one dedicated to the processing of speech sounds and the other responding to both speech and non-speech sounds.
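To make the ISC approach concrete, the sketch below shows a minimal leave-one-out intersubject-correlation computation in NumPy. It illustrates the general technique only, assuming preprocessed and spatially normalized data in a (subjects x timepoints x voxels) array; the array layout, the function name, and the omission of statistical thresholding are assumptions for illustration, not details taken from the paper.

# Minimal leave-one-out intersubject-correlation (ISC) sketch (illustrative only).
import numpy as np

def isc_map(data):
    """data: (n_subjects, n_timepoints, n_voxels) -> mean leave-one-out ISC per voxel."""
    n_sub = data.shape[0]
    # z-score each subject's voxel time-courses over time
    z = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)
    iscs = []
    for s in range(n_sub):
        others = np.delete(z, s, axis=0).mean(axis=0)        # average of the remaining subjects
        others = (others - others.mean(axis=0)) / others.std(axis=0)  # re-standardize the average
        r = (z[s] * others).mean(axis=0)                      # Pearson r per voxel
        iscs.append(r)
    return np.mean(iscs, axis=0)                              # average ISC across subjects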
Conflict of interest statement
Competing Interests: The authors have declared that no competing interests exist.
Figures

The intersubject correlation (ISC) map (top panels) and general linear model (GLM) results overlaid on an MNI305 template brain, viewed from the lateral and medial aspects of both hemispheres. The GLM maps show brain areas correlating with the speech sounds (middle panels) and the non-speech sounds (bottom panels). The GLM maps are thresholded at FWE-corrected p<0.05, cluster size >10 voxels; the ISC map at FWE-corrected p<0.01, cluster size >20 voxels. STG = superior temporal gyrus, MTG = middle temporal gyrus, IPL = inferior parietal lobule, SMG = supramarginal gyrus, TP = temporal pole, MFG = middle frontal gyrus, IFG = inferior frontal gyrus, Cun = cuneus, PreCun = precuneus, CC = cingulate cortex, Amyg = amygdala, L = left hemisphere, R = right hemisphere.
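The GLM results in this figure come from modeling the BOLD signal with speech and non-speech regressors. As a hedged illustration of that general procedure (not the authors' actual software or parameters), the following Python sketch builds boxcar regressors, convolves them with a simple double-gamma hemodynamic response function, and fits voxelwise betas by ordinary least squares; the TR, onsets, and HRF shape are placeholder assumptions.

# Illustrative GLM step: boxcar regressors convolved with a canonical-style HRF,
# fitted voxelwise by ordinary least squares (parameters are placeholders).
import numpy as np
from scipy.stats import gamma

def hrf(tr, duration=32.0):
    t = np.arange(0, duration, tr)
    return gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0   # peak near 6 s, undershoot near 16 s

def make_regressor(onsets, durations, n_scans, tr):
    box = np.zeros(n_scans)
    for on, dur in zip(onsets, durations):
        box[int(on / tr):int((on + dur) / tr)] = 1.0   # mark scans covered by the event
    return np.convolve(box, hrf(tr))[:n_scans]

def fit_glm(bold, regressors):
    """bold: (n_scans, n_voxels); regressors: list of (n_scans,) arrays -> betas per voxel."""
    X = np.column_stack(regressors + [np.ones(bold.shape[0])])  # design matrix with intercept
    betas, *_ = np.linalg.lstsq(X, bold, rcond=None)
    return betas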

The bar graph shows the independent components (ICs) organized in descending order of their spatial correlation with the intersubject correlation (ISC) map. The colors of the four most stimulus-related, extrinsic ICs (IC1–IC4) in the bar graph correspond to the colors in the independent component analysis (ICA) map illustrating their spatial distribution. The four ICs (IC5–IC8) that correlated temporally with two of the extrinsic ICs are also marked in the bar graph. The ISC and IC maps are thresholded at FWE-corrected p<0.01, cluster size >20 voxels. L = left hemisphere, R = right hemisphere.
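The ordering criterion described here, the spatial correlation between each IC map and the ISC map, could be computed along the lines of the sketch below; the array shapes and function name are illustrative assumptions, and the FWE thresholding is not reproduced.

# Rank group-ICA spatial maps by their spatial correlation with the ISC map (sketch).
import numpy as np

def rank_ics_by_isc(ic_maps, isc_map):
    """ic_maps: (n_ics, n_voxels); isc_map: (n_voxels,) -> indices and r values, descending."""
    r = np.array([np.corrcoef(ic, isc_map)[0, 1] for ic in ic_maps])
    order = np.argsort(-r)            # most ISC-like components first
    return order, r[order]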

The extrinsic independent components (ICs), IC1–IC4, are shown on the left and the intrinsic ICs, IC5–IC8, on the right; the two groups are framed with green boxes. Mean intersubject correlation values of the ICs' time-courses are given below the IC numbers, and correlation values between the extrinsic and intrinsic ICs are displayed in the middle. The group-level t-maps are thresholded at FWE-corrected p<0.01, cluster size >20 voxels, and overlaid on an average of the subjects' normalized anatomical images. MNI z-coordinates are given beside the axial slices. L = left, R = right, r = mean Pearson correlation strength.
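The coupling reported here between extrinsic and intrinsic IC time-courses is a mean Pearson correlation across subjects. A minimal sketch of that computation, assuming subject-wise IC time-courses stacked in (subjects x timepoints) arrays, is given below; it is illustrative only.

# Mean across-subjects correlation between two IC time-courses (sketch).
import numpy as np

def mean_timecourse_correlation(tc_a, tc_b):
    """tc_a, tc_b: (n_subjects, n_timepoints) subject-specific IC time-courses."""
    rs = [np.corrcoef(a, b)[0, 1] for a, b in zip(tc_a, tc_b)]  # per-subject Pearson r
    return np.mean(rs)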

The correlation matrix for the time-courses of IC1–IC8 is presented on the left; the color of each square represents the subjects' mean correlation strength, with the color key to the right of the matrix. The bar graph on the right presents the mean beta values from the analysis in which the time-courses of the spatial ICs were subjected to multiple regression against the speech (brown) and non-speech (blue) regressors. *p<0.05, **p<0.005.
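The beta values in the bar graph come from regressing each IC time-course on the speech and non-speech regressors together. The sketch below shows one hedged way to obtain such betas for a single IC; the group-level statistics behind the reported p-values are not reproduced.

# Betas for one IC time-course regressed on the speech and non-speech regressors (sketch).
import numpy as np

def ic_betas(ic_timecourse, speech_reg, nonspeech_reg):
    X = np.column_stack([speech_reg, nonspeech_reg, np.ones_like(speech_reg)])  # with intercept
    betas, *_ = np.linalg.lstsq(X, ic_timecourse, rcond=None)
    return {"speech": betas[0], "non_speech": betas[1]}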

The extrinsic areas are labeled with blue letters, the intrinsic areas with red letters. Areas of the speech network are marked by solid boxes, and areas of the non-speech-sound network by dotted boxes. The summary is based on the characteristics of independent components (ICs) IC1–IC8: extrinsic areas on the correlation and overlap of the ICs with the intersubject correlation (ISC) map, and speech versus non-speech areas on the similarity of the IC time-courses to the speech and non-speech regressors and on the functional connectivity analysis. Abbreviations: MTG = middle temporal gyrus, IN = insula, SFG = superior frontal gyrus, SFGmed = superior frontal gyrus, medial part, MFG = middle frontal gyrus, IFG = inferior frontal gyrus, TP = temporal pole, IPL = inferior parietal lobule, SMG = supramarginal gyrus, CC = cingulate cortex, Cun = cuneus, PreCun = precuneus, ant = anterior, post = posterior, sup = superior, inf = inferior, trans = transitional.
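Purely as a reading aid, the toy function below restates the labeling logic summarized in this caption: spatial agreement with the ISC map separates extrinsic from intrinsic components, and the relative fit of the speech and non-speech regressors assigns a network. The threshold value and the omission of the functional-connectivity criterion are placeholder simplifications, not values from the paper.

# Toy restatement of the labeling logic in this summary figure (thresholds are placeholders).
def label_ic(spatial_r_with_isc, beta_speech, beta_nonspeech, extrinsic_threshold=0.2):
    origin = "extrinsic" if spatial_r_with_isc > extrinsic_threshold else "intrinsic"
    network = "speech" if beta_speech > beta_nonspeech else "non-speech"
    return origin, network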
Similar articles
- Segmental processing in the human auditory dorsal stream. Zaehle T, Geiser E, Alter K, Jancke L, Meyer M. Brain Res. 2008 Jul 18;1220:179-90. doi: 10.1016/j.brainres.2007.11.013. Epub 2007 Nov 17. PMID: 18096139. Review.
- Data-based functional template for sorting independent components of fMRI activity. Malinen S, Hari R. Neurosci Res. 2011 Dec;71(4):369-76. doi: 10.1016/j.neures.2011.08.014. Epub 2011 Sep 10. PMID: 21925216.
- Osnes B, Hugdahl K, Hjelmervik H, Specht K. Brain Lang. 2012 Apr;121(1):65-9. doi: 10.1016/j.bandl.2012.02.002. Epub 2012 Feb 28. PMID: 22377261.
- Neural Tuning to Low-Level Features of Speech throughout the Perisylvian Cortex. Berezutskaya J, Freudenburg ZV, Güçlü U, van Gerven MAJ, Ramsey NF. J Neurosci. 2017 Aug 16;37(33):7906-7920. doi: 10.1523/JNEUROSCI.0238-17.2017. Epub 2017 Jul 17. PMID: 28716965. Free PMC article.
- Alho K, Rinne T, Herron TJ, Woods DL. Hear Res. 2014 Jan;307:29-41. doi: 10.1016/j.heares.2013.08.001. Epub 2013 Aug 11. PMID: 23938208. Review.
Cited by
- Congenital blindness is associated with large-scale reorganization of anatomical networks. Hasson U, Andric M, Atilgan H, Collignon O. Neuroimage. 2016 Mar;128:362-372. doi: 10.1016/j.neuroimage.2015.12.048. Epub 2016 Jan 5. PMID: 26767944. Free PMC article.
- Engaged listeners: shared neural processing of powerful political speeches. Schmälzle R, Häcker FE, Honey CJ, Hasson U. Soc Cogn Affect Neurosci. 2015 Aug;10(8):1137-43. doi: 10.1093/scan/nsu168. Epub 2015 Feb 3. PMID: 25653012. Free PMC article.
- Emotions amplify speaker-listener neural alignment. Smirnov D, Saarimäki H, Glerean E, Hari R, Sams M, Nummenmaa L. Hum Brain Mapp. 2019 Nov 1;40(16):4777-4788. doi: 10.1002/hbm.24736. Epub 2019 Aug 9. PMID: 31400052. Free PMC article.
- Kauttonen J, Paekivi S, Kauramäki J, Tikka P. Front Psychol. 2023 Oct 19;14:1153968. doi: 10.3389/fpsyg.2023.1153968. eCollection 2023. PMID: 37928563. Free PMC article.
- Synchronous brain activity across individuals underlies shared psychological perspectives. Lahnakoski JM, Glerean E, Jääskeläinen IP, Hyönä J, Hari R, Sams M, Nummenmaa L. Neuroimage. 2014 Oct 15;100(100):316-24. doi: 10.1016/j.neuroimage.2014.06.022. Epub 2014 Jun 14. PMID: 24936687. Free PMC article.
Grants and funding
The study was supported by the Academy of Finland (National Centers of Excellence Program 2006–2011; grant #259752; grant #263800), European Research Council Advanced Grant #232946 (to R.H.), the aivoAALTO project of the Aalto University, and Päivikki and Sakari Sohlberg Foundation. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.