Categorical representation of facial expressions in the infant brain
Abstract
Categorical perception, demonstrated as reduced discrimination of within-category relative to between-category differences in stimuli, has been found in a variety of perceptual domains in adults. To examine the development of categorical perception in the domain of facial expression processing, we used behavioral and event-related potential (ERP) methods to assess discrimination of within-category (happy-happy) and between-category (happy-sad) differences in facial expressions in 7-month-old infants. Data from a visual paired-comparison test and recordings of attention-sensitive ERPs showed no discrimination of facial expressions in the within-category condition whereas reliable discrimination was observed in the between-category condition. The results also showed that face-sensitive ERPs over occipital-temporal scalp (P400) were attenuated in the within-category condition relative to the between-category condition, suggesting a potential neural basis for the reduced within-category sensitivity. Together, these results suggest that the neural systems underlying categorical representation of facial expressions emerge during the early stages of postnatal development, before acquisition of language.
Keywords: Brain Development, Categorization, Electrophysiology, Facial Expressions
Categorical representations are fundamental to human perception and cognition (Harnad, 1987). In the perception of colors, for example, the natural continuum of light wavelength is divided by our perceptual system into discrete regions of colors with relatively sharp boundaries (Bornstein & Korda, 1984). Similar categorical perception effects occur in other domains of perception, including perception of speech sounds (Eimas, Siqueland, Jusczyk, & Vigorito, 1971) and multidimensional stimuli such as facial expressions (Etcoff & Magee, 1992; Young et al., 1997).
Categorical perception of facial expressions has been demonstrated by asking adult observers to identify morphed facial expressions that change from one facial expression to a second in equal-interval steps. Although images on such a morphed continuum show a linear transition from one facial expression to another, identification of the expressions follows a non-linear pattern, with an abrupt change in the reported emotion at a certain point along the continuum (Etcoff & Magee, 1992; Young et al., 1997). It has also been found that discrimination of two expressions within a single emotion category is more difficult than discrimination of two expressions from adjacent emotion categories, even when the physical difference between the expressions (i.e., the distance separating them along the continuum) is kept identical. Reduced sensitivity to within-category differences, together with relatively enhanced detection of between-category differences, is a hallmark of categorical perception (Harnad, 1987). Categorical perception reflects an adaptive feature of human cognition that allows an observer to ignore meaningless variations in facial expressions (e.g., small differences between two happy expressions) and detect behaviorally relevant differences (e.g., a change from a happy to a fearful expression).
Behavioral results demonstrating categorical perception of facial expressions have been complemented by findings from electrophysiological studies. Campanella et al. (2002) recorded event-related potentials (ERPs) in adult observers while they were viewing successive presentations of morphed facial expressions from the same emotion category (i.e., two happy or two fearful expressions) or morphed expressions from different emotion categories (i.e., a happy expression followed by a fearful expression or vice versa). The results showed that posterior face-sensitive (N170) and central attention-sensitive (P300) ERP components differed in response to within-category as compared to between-category changes in facial expressions. The amplitude of the N170 component over occipital-temporal scalp was reduced when the first and second facial expressions belonged to the same emotion category, whereas there was no such reduction when the expressions belonged to different emotion categories. The amplitude reduction to repeated presentations of facial expressions from the same emotion category was similar to that observed when two physically identical expressions were presented successively, suggesting that the higher-level visual systems of the brain may respond to two physically different expressions within a particular emotion category as if they were the same. The P300 component was also smaller in the within-category as compared to the between-category condition, suggesting that observers allocated fewer processing resources to within-category changes in facial expressions.
An important question concerns the development of categorical representations of facial expressions. Evidence for universals in the judgment of facial expressions (Ekman et al., 1987) suggests that categories of facial expressions are acquired either independently of experience or on the basis of a limited amount of exposure to species-typical facial expressions. Conversely, it has also been proposed that experience with language and specific emotion words plays a crucial role in the categorization of facial expressions (Barrett, Lindquist, & Gendron, 2007; Davidoff, 2001). Studies of developmental populations and, in particular, preverbal infants can provide important information for examining the tenability of these alternative views. Previous behavioral studies have shown that infants are able to discriminate a variety of facial expressions soon after birth. Newborns discriminate happy and sad, happy and surprised, and sad and surprised facial expressions (Field, Woodson, Greenberg, & Cohen, 1982). Three- to 5-month-olds discriminate among happy, angry, fearful, and sad expressions (de Haan & Nelson, 1998; Schwartz, Izard, & Ansul, 1985; Young-Browne, Rosenfield, & Horowitz, 1977). By the age of 5 to 7 months, infants also recognize some expressions (e.g., happy expressions) despite variations in the identity of the face posing the expression or in the intensity of the expression (for review, see Leppänen & Nelson, 2006).
There is also evidence that infants perceive facial expressions categorically within the first year of life. Kotsoni et al. (2001) used a combination of familiarization and visual paired comparison techniques to show that 7-month-old infants fail to make a within-category discrimination of either happy or fearful expressions (i.e., to tell two happy faces or two fearful faces apart). In contrast, infants discriminated between two expressions that straddled the category boundary between happy and fearful expressions, although this between-category discrimination was observed only when infants were familiarized to happy expressions but not when infants were familiarized to fearful facial expressions. This asymmetry may, however, reflect infants’ persistent attentional bias towards fearful expressions (Leppänen, Moulson, Vogel-Farley, & Nelson, 2007; Peltola, Leppänen, Palokangas, & Hietanen, 2008) rather than an inability to perceive the category boundary between happy and fearful expressions.
Although the existing behavioral work suggests that preverbal infants perceive facial expressions categorically, the neural correlates of categorical perception in infants have not been investigated. In the present study, we used a visual paired-comparison (VPC) test and ERP recordings to examine behavioral and electrophysiological correlates of within-category and between-category discrimination of facial expressions in 7-month-old infants. Previous behavioral studies have shown stable categorization of happy facial expressions in 7-month-old infants, whereas the evidence for categorization of other facial expressions is less consistent (see Leppänen & Nelson, 2006). For these reasons, the present ERP study was designed to examine infants’ ability to discriminate variations within the happy expression category as well as variations that crossed the category boundary between happy and sad expressions. Based on previous behavioral and ERP studies in adults, we hypothesized that categorical perception effects would be manifested as reduced behavioral discrimination (i.e., weaker novelty preference) of within-category relative to between-category differences in facial expression. We also predicted that the amplitude of the attention-sensitive ERPs over frontocentral scalp regions (Negative Central or NC component) would be less sensitive to within-category as compared to between-category differences in facial expressions. No specific predictions were made regarding the sensitivity of posterior face-sensitive ERP components (N290 and P400) to within-category versus between-category changes in facial expressions. Studies in adults suggest, however, that posterior ERPs may be generally attenuated in the within-category condition due to repeated presentation of stimuli from the same emotion category (see Campanella et al., 2002).
Method
Participants
The participants were 25 seven-month-old infants, randomly assigned to within-category (n = 13, 8 females, Mean age = 211.2 days, SD = 5.0 days) and between-category (n = 12, 7 females, Mean age = 211.3 days, SD = 4.9 days) experimental conditions. All infants were born full-term (i.e., between 38 and 42 weeks gestational age) and had no history of visual or neurological abnormalities. Seventeen additional infants were tested but excluded for excessive EEG artifact (n = 11), fussiness (n = 4), pre-term birth (n = 1), or medical history (n = 1).
Stimuli
A morphing algorithm (Morpher) was used to create a continuum of images showing a transition from a happy to a sad facial expression in increments of 10% for two female models1 (see Young, Perrett, Calder, Sprengelmeyer, & Ekman, 2002 for details about the morphing procedure). Based on adult judgments of the morphed images (see Figure 1), two pairs of facial expressions were selected from each continuum, one pair consisting of facial expressions within the happy category (within-category stimulus pair) and the other pair of facial expressions from different sides of the happy-sad category boundary (between-category stimulus pair). Critically, the two expressions in each pair were separated by an equal physical distance on the continuum (30%). The stimuli subtended approximately 12° × 16° when viewed from a distance of 60 cm.
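For illustration, the construction logic can be sketched in Python under a strong simplification: actual morphing software such as Morpher warps facial landmarks before blending, whereas the sketch below keeps only the linear cross-fade that defines the 10% steps, and the images and pair positions are placeholders rather than the actual stimuli.

```python
import numpy as np

def morph_frame(happy_img: np.ndarray, sad_img: np.ndarray, pct_sad: float) -> np.ndarray:
    """Cross-fade between two aligned face images.

    Real morphing software (such as the Morpher program used here) also
    warps facial landmarks before blending; this sketch keeps only the
    linear intensity interpolation that defines the 10% steps.
    """
    alpha = pct_sad / 100.0
    return (1.0 - alpha) * happy_img + alpha * sad_img

# An 11-image continuum from 0% sad (pure happy) to 100% sad, in 10% steps.
happy = np.zeros((256, 256))   # placeholder images, not the actual stimuli
sad = np.ones((256, 256))
continuum = [morph_frame(happy, sad, p) for p in range(0, 101, 10)]

# Each stimulus pair was separated by 30% on the continuum; hypothetically,
# indices (i, i + 3) with both members on the happy side of the boundary
# form a within-category pair, while a pair straddling the boundary forms
# the between-category pair.
assert len(continuum) == 11
```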
Figure 1.
Mean percentages of “happy” and “sad” responses for each face on a morphed continuum from happy to sad (in 10% steps) in a judgment study with 19 adult observers. The vertical dashed line indicates the category boundary, determined as the intersection point of the two identification curves.
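For illustration, the boundary computation described in the caption can be sketched as follows; the identification percentages are invented placeholders, not the actual judgment data from the 19 observers.

```python
import numpy as np

def category_boundary(levels, pct_happy, pct_sad):
    """Morph level at which the "happy" and "sad" identification curves
    cross, found by linear interpolation between the two sampled levels
    that bracket the sign change of their difference."""
    diff = np.asarray(pct_happy, float) - np.asarray(pct_sad, float)
    i = np.flatnonzero(np.sign(diff[:-1]) != np.sign(diff[1:]))[0]
    x0, x1 = levels[i], levels[i + 1]
    return x0 + (x1 - x0) * diff[i] / (diff[i] - diff[i + 1])

# Illustrative identification data (not the actual adult judgments):
levels = np.arange(0, 101, 10)                 # % sad, in 10% steps
happy = np.array([98, 97, 95, 92, 85, 70, 40, 15, 6, 3, 2], float)
print(category_boundary(levels, happy, 100 - happy))  # ~56.7% in this example
```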
Experimental Design and Procedure
The experiment consisted of a habituation procedure, a behavioral visual paired-comparison (VPC) test, and an ERP recording, presented in this order for all participants. Infants in the within-category condition were habituated to a happy facial expression of one of the two female models and tested with the same stimulus and a novel happy facial expression from the same model during the VPC and ERP procedures (the model used differed across participants). Infants in the between-category condition were habituated to a sad facial expression and tested with the same stimulus and a (novel) happy facial expression during the VPC and ERP procedures. This experimental design allowed us to examine infants’ responses to the same novel stimulus after they were habituated to a within- or between-category stimulus. Successful discrimination of facial expressions was expected to be demonstrated as longer looking toward the novel than the familiar facial expression during the VPC test (i.e., novelty preference) and as a differential response of attention-sensitive ERP components to the novel and familiar facial expressions.
Infants were tested while sitting in their parent’s lap. The stimuli were presented on a monitor surrounded by black panels that blocked the participant’s view of the room behind the screen and to his/her sides.
Habituation and Visual Paired Comparison Task
During habituation, the infants saw repeated presentations of a happy (within-category condition) or a sad (between-category condition) facial expression of the same female model. Before each stimulus presentation, the infant’s attention was drawn to the center of the screen by a red circle that expanded from 0.4° to 4.3° in a continuous fashion. The experimenter initiated each stimulus presentation with a key-press when the infant was attending to the circle in the center of the screen. A trial terminated when the infant looked away from the stimulus. Stimulus presentation continued until the infant reached a habituation criterion of three consecutive looks that summed to less than 50% of the total looking time on the first three trials. Calculation of the looking times and the habituation criterion was based on the experimenter’s button presses and controlled by E-Prime software.
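For illustration, the habituation criterion can be expressed as a simple check applied after each trial; in the actual experiment this computation was performed by E-Prime from the experimenter’s button presses, and the function and example values below are only a sketch.

```python
def habituated(look_times: list) -> bool:
    """Habituation criterion used here: three consecutive looks that sum
    to less than 50% of the total looking time on the first three trials.
    Intended to be called after every trial, so the three-trial window
    effectively slides across the session."""
    if len(look_times) < 6:          # earliest possible window is trials 4-6
        return False
    baseline = sum(look_times[:3])   # total looking time on trials 1-3
    return sum(look_times[-3:]) < 0.5 * baseline

# Example: looking times (s) per trial for a hypothetical infant.
looks = [20.1, 17.5, 16.0, 9.8, 7.2, 6.5]
print(habituated(looks))  # True: 9.8 + 7.2 + 6.5 < 0.5 * 53.6
```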
Following habituation, infants were tested in a VPC task in which a pair of stimuli (the familiar and novel facial expressions) was presented in two 10-s trials. The participants were monitored through a video camera prior to presentation of the pictures, and when they were looking directly at a fixation cross in the middle of the screen, the test stimuli were presented. The left-right arrangement of the two images was counterbalanced across participants during the first trial and reversed for the second trial. Videotaped recordings of the VPC task were analyzed off-line by an observer who was blind to the left/right positioning of the familiar and novel stimuli to determine the time each participant spent looking at the stimulus on the left and the stimulus on the right in each trial. Another independent observer recorded the looking times of 7 of the 22 participants (~30%). Pearson correlations between the two coders’ measurements of the looking times averaged .97 (SD = .04).
ERP Recording
After the visual paired-comparison test, the infant was fitted with a 64-channel Geodesic Sensor Net (Electrical Geodesics, Inc.). ERP testing commenced within ~5 minutes of the habituation and VPC procedures. During the ERP recording, the familiar and novel facial stimuli were presented in random order for 500 ms, followed by a 1000-ms post-stimulus recording period. The experimenter monitored participants’ eye movements through a hidden video camera and initiated stimulus presentation when the infant was attending to the screen. Trials during which the infant turned his/her attention away from the screen or showed other types of movement were rejected on-line by the experimenter controlling the stimulus presentation. Stimulus presentation continued until the infant became too fussy or inattentive to continue or had seen the maximum number of trials (100). Infants saw an average of 58.8 trials (SD = 10.9) in the within-category condition and 57.4 trials (SD = 11.3) in the between-category condition; the difference was not significant, p > .10.
Continuous EEG was recorded against a vertex reference (Cz). The electrical signal was amplified with a 0.1- to 100-Hz band-pass filter, digitized at 250 Hz, and stored on a computer disk. Off-line, the continuous EEG signal was segmented into 1100-ms epochs starting 100 ms prior to stimulus presentation. Digitally filtered (30-Hz low-pass elliptical filter) and baseline-corrected segments were visually inspected for off-scale activity, eye movements, body movements, high-frequency noise, and other visible artifacts. If more than 15% of the channels (≥ 10 channels) were contaminated by artifact, the whole segment was excluded from further analysis. If fewer than 10 channels were contaminated by artifact, the bad channels were replaced using spherical spline interpolation. Participants with fewer than 10 good trials per stimulus were excluded from further analysis. For the remaining participants, average waveforms for each experimental condition were calculated and re-referenced to the average reference. In the within-category condition, the average number of good trials was 15.7 (SD = 4.3) for the familiar and 17.3 (SD = 6.2) for the novel stimulus (the difference was not significant, p > .05). In the between-category condition, the average number of good trials was 15.8 (SD = 4.4) for the familiar and 16.9 (SD = 4.8) for the novel stimulus, p > .05.
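For illustration, the segment-rejection and averaging rules can be sketched as follows, assuming artifact flags obtained from the visual inspection described above; channel-mean replacement stands in for the spherical spline interpolation that was actually used, and all names are illustrative.

```python
import numpy as np

def clean_and_average(epochs, bad_channels):
    """epochs: (n_trials, 64, n_samples) filtered, baseline-corrected
    segments; bad_channels: boolean (n_trials, 64) artifact flags, here
    assumed to come from the visual inspection described above.

    Segments with >= 10 contaminated channels are rejected outright; in
    the remaining segments, bad channels are repaired (channel-mean
    replacement stands in for spherical spline interpolation). A
    participant contributes an average only with >= 10 good trials."""
    kept = []
    for trial, bad in zip(epochs, bad_channels):
        if bad.sum() >= 10:
            continue                           # drop the whole segment
        trial = trial.copy()
        trial[bad] = trial[~bad].mean(axis=0)  # repair bad channels
        kept.append(trial)
    if len(kept) < 10:
        return None                            # participant excluded
    avg = np.mean(kept, axis=0)                # per-condition average
    return avg - avg.mean(axis=0, keepdims=True)  # average reference
```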
Results
Analyses of infants’ looking times during the habituation trials confirmed that infants habituated to the stimuli and that the habituation rate did not differ between conditions. Specifically, looking time decreased significantly from the first three habituation trials (M = 17.9 s, SD = 7.2 s) to the last three habituation trials (M = 6.9 s, SD = 2.3 s), t(22) = 9.7, p < .001. The average number of trials required to reach the habituation criterion was 9.5 (SD = 3.5). Looking times during the first three habituation trials, looking times during the last three habituation trials, and the average number of trials required to reach the habituation criterion did not differ between the within- and between-category conditions, all ps > .10.
VPC
Looking-time data from one infant in the within-category group and two in the between-category group were excluded due to a side bias (n = 1) or an error in data storage (n = 2). In the remaining infants, the mean total looking time (averaged across the familiar and novel stimulus) was 6.1 s (SD = 1.3) in the within-category condition and 6.2 s (SD = 1.4) in the between-category condition. To examine potential differences in the looking times for the familiar and novel expression stimuli, the percentage of looking time toward the novel facial expression out of the total looking time during the test trials was calculated for each participant. A score above 50% indicates a novelty preference. There was no significant novelty preference in the within-category condition (M = 53%, SD = 8.7), p > .20, but there was a significant novelty preference in the between-category condition (M = 57%, SD = 6.9), t(9) = 3.4, p < .01. Eight out of ten infants in the between-category group exhibited a novelty preference. This pattern of findings was unaltered when the sample size was increased by including those infants who provided analyzable behavioral data but were excluded from the final sample due to ERP artifact.
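For illustration, the novelty preference score and its comparison against the 50% chance level can be sketched as follows; the looking times are invented placeholders, not the reported data.

```python
import numpy as np
from scipy import stats

def novelty_preference(novel_look, familiar_look):
    """Per-infant percentage of looking directed at the novel expression,
    tested against the 50% chance level with a one-sample t-test."""
    pct = 100.0 * novel_look / (novel_look + familiar_look)
    t, p = stats.ttest_1samp(pct, 50.0)
    return pct.mean(), t, p

# Illustrative looking times (s); these are not the reported data.
novel = np.array([6.8, 7.1, 5.9, 6.4, 7.3, 6.0, 6.9, 7.2, 6.5, 5.4])
familiar = np.array([5.0, 5.3, 4.9, 5.1, 5.2, 5.5, 4.8, 5.0, 5.3, 5.8])
print(novelty_preference(novel, familiar))
```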
Electrophysiological Data
The results of the VPC test showed that infants in the within-category condition did not discriminate between familiar and novel facial expressions, whereas infants in the between-category condition demonstrated discrimination. We next examined the neural correlates of these behavioral findings. The first set of analyses of the electrophysiological data examined attention-sensitive ERPs over frontocentral scalp regions (Negative Central or NC). Previous studies using a combination of habituation and ERP measurement have shown that the NC is larger for novel compared to familiar stimuli (Quinn, Westerlund, & Nelson, 2006; Reynolds & Richards, 2005) and during periods of attention compared to periods of inattention (Richards, 2003). Here the NC was analyzed to determine whether its amplitude differed in response to familiar and novel facial expressions in the within-category and between-category conditions. Although infants had seen the novel expressions briefly during the two VPC test trials, they had had far greater opportunity to look at the familiar stimulus than at the novel stimulus overall; thus it was expected that infants would still distinguish between the familiar and the novel stimulus during ERP testing.
To quantify the NC component, the mean amplitude of the ERP activity was determined in a temporal window from 300 to 600 ms post-stimulus for electrodes over the frontocentral scalp region.2 A 2 × 2 × 2 × 3 ANOVA with Condition (within vs. between) as a between-subject factor and Stimulus (familiar vs. novel), Hemisphere (left vs. right), and Time (three 100-ms time bins) as within-subject factors yielded a significant Condition × Stimulus interaction, F(1, 23) = 4.3, p < .05, η2 = .16. As shown in Figure 2, there was no significant difference in the amplitude of the NC in response to familiar (M = −8.3 μV, S.E.M. = 1.2) and novel (M = −7.7 μV, S.E.M. = 1.1) facial expressions in the within-category condition (Figure 2a), p > .10, whereas the novel facial expression (M = −11.2 μV, S.E.M. = 1.2) elicited a reliably larger negativity than the familiar facial expression (M = −8.2 μV, S.E.M. = 1.2) in the between-category condition (Figure 2b), F(1, 11) = 6.6, p < .05, η2 = .37. The ANOVA showed no other significant effects involving the Condition or Stimulus factors, ps > .05.
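For illustration, the mean-amplitude quantification (see also Footnote 2) can be sketched as follows, assuming a channels-by-samples averaged waveform sampled at 250 Hz with a 100-ms pre-stimulus baseline; the electrode numbers are taken from the Figure 2 caption, but their 0-based mapping is an assumption, and the same function covers the N290 and P400 windows analyzed below.

```python
import numpy as np

SRATE, BASELINE_MS = 250, 100  # sampling rate (Hz) and pre-stimulus baseline

def mean_amplitude(avg, electrodes, start_ms, stop_ms, bin_ms=100):
    """Mean amplitude of an averaged ERP (64 channels x samples) over a
    group of electrodes, split into consecutive time bins, e.g., the
    300-600 ms NC window in three 100-ms bins."""
    def idx(ms):  # post-stimulus latency (ms) -> sample index
        return int((ms + BASELINE_MS) * SRATE / 1000)
    roi = avg[electrodes]  # restrict to the scalp region of interest
    return [float(roi[:, idx(t):idx(t + bin_ms)].mean())
            for t in range(start_ms, stop_ms, bin_ms)]

# E.g., left frontocentral electrodes 9, 16, 20 (0-based indices 8, 15, 19);
# the same call with (200, 300) or (350, 450) covers the N290 and P400.
avg = np.random.randn(64, 275)  # placeholder 1100-ms averaged waveform
print(mean_amplitude(avg, [8, 15, 19], 300, 600))
```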
Figure 2.
Graphs A and B show grand average waveforms for familiar and novel faces in the within-category (A) and between-category (B) conditions. The waveforms represent averaged activity of electrodes over the left (9, 16, 20) and right (56, 57, 58) frontocentral regions. Graphs C and D show the scalp distribution of the difference potential between novel and familiar stimulus in the within-category (C) and between-category (D) conditions at 490 ms post-stimulus. The electrodes that were used for producing the waveforms and for statistical analyses are marked on the scalp maps.
The second set of analyses examined ERPs associated with the early stages of face processing over occipital-temporal scalp regions (N290 and P400; see de Haan, Johnson, & Halit, 2003). Studies in adults have shown that the amplitude of face-sensitive ERPs is reduced in response to successive presentations of facial expressions from the same emotion category (Campanella et al., 2002). To determine whether a similar phenomenon occurs in infants, it was of particular interest to examine whether the amplitude of the N290 and P400 components differed between the two stimulus conditions, that is, in response to two expressions from a single emotion category (within-category condition) versus two expressions from different emotion categories (between-category condition).
The mean amplitudes of the N290 (200–300 ms post-stimulus) and P400 (350–450 ms post-stimulus) components were calculated for electrodes over medial, semi-medial, and lateral occipital-temporal regions and submitted to 2 × 2 × 2 × 3 ANOVAs with Condition (within vs. between) as a between-subject factor and Stimulus (familiar vs. novel), Hemisphere (left vs. right), and Region (medial vs. semi-medial vs. lateral) as within-subject factors. The amplitude of the N290 did not differ between conditions or between familiar and novel stimuli, ps > .10. However, the amplitude of the P400 component was significantly smaller in the within-category as compared to the between-category condition, F(1, 23) = 7.0, p < .02, η2 = .23. This main effect was qualified by a marginal Condition × Region interaction, F(2, 46) = 3.1, p = .06, η2 = .12, reflecting the fact that the P400 amplitude difference was more evident over the medial and semi-medial than over the lateral occipital-temporal scalp (see Figure 3). There were no other significant main or interaction effects involving Condition on the P400 amplitude. Also, the P400 did not differentiate between the familiar and novel facial expressions in either of the two experimental conditions, ps > .10.
Figure 3.
ERP waveforms recorded over occipital-temporal scalp regions in the within-category and between-category condition (collapsed across familiar and novel facial expressions). The waveforms represent averaged activity over groups of electrodes at medial, semi-medial, and lateral recording sites over the left and right hemispheres.
Correlations between Behavioral and ERP Measures
To examine the potential relationship between behavioral and ERP measures of facial expression discrimination in the between-category condition, a Pearson correlation between the behavioral novelty preference score and the magnitude of the NC amplitude effect (i.e., novel minus familiar) was calculated. There was no significant correlation between these two indices of discrimination, p > .80.
Discussion
The present results suggest that facial expression processing in preverbal infants meets some of the criteria for categorical perception (reduced sensitivity to within-category relative to between-category differences). Specifically, measurement of looking times during a visual paired-comparison test revealed that infants who were habituated to a happy facial expression and tested with the familiar happy expression paired with a novel happy expression did not discriminate between the familiar and novel stimuli. In contrast, infants habituated to a sad expression and tested with a sad and a happy expression demonstrated discrimination (i.e., preferential looking toward the novel facial expression). Given that the familiar and novel facial expressions were separated by the same physical distance in the within-category and between-category conditions, the differential pattern of discrimination performance cannot be attributed to simple physical characteristics of the stimuli. The behavioral findings were reinforced by the ERP data, which showed that attention-sensitive ERPs over frontocentral scalp regions (NC) did not differentiate between two within-category expressions, whereas clear discrimination between between-category expressions was observed. The larger NC for the novel compared to the familiar facial expression in the between-category condition likely reflects a neural manifestation of infants’ increased allocation of attention to the novel stimulus category, similar to that observed in a recent study that explicitly examined the neural correlates of category learning in infants (Quinn et al., 2006). More generally, these ERP findings indicate that differential sensitivity to within-category relative to between-category differences in facial expressions is also observed at the neural level.
The amplitude of the N290 and P400 components over posterior scalp did not differ for familiar and novel facial expressions in either of the two experimental conditions. This finding suggests that, unlike the amplitude of the frontocentral NC, the amplitude of the posterior face-related components does not differentiate between familiar and unfamiliar face stimuli in infants at this age (cf. Scott & Nelson, 2006). The present findings are also consistent with studies showing that the posterior face-sensitive ERPs are not sensitive to the happy versus sad distinction in adults or in 4- to 15-year-old children (Batty & Taylor, 2003, 2006). While there was no differentiation between familiar and novel facial expressions, an overall group difference in the P400 amplitude was observed, showing a markedly reduced P400 in the within-category as compared to the between-category condition. It is possible that this amplitude reduction reflects a higher-level adaptation effect, occurring when neurons underlying exemplar-invariant representations of stimuli adapt to repeated presentations of stimuli from the same perceptual category (Campanella et al., 2002; see also Schweinberger, Pfütze, & Sommer, 1995). This finding also raises the intriguing possibility that infants’ visual systems responded to the two separate examples of happy expressions as if they were the same. That is, despite their physical differences, two expressions within a particular category may give rise to similar representations in higher-level visual systems (Campanella et al., 2002). Consequently, reduced discrimination of within-category differences may reflect a flow-on effect of similar structural encoding of facial expressions assigned to the same category.
The finding that infants did not discriminate between two happy expressions in the present study appears to stand in contrast to previous studies suggesting that infants are able to discriminate between varying degrees of happy facial expressions (Kuchuck, Vibbert, & Bornstein, 1986; Striano, Brennan, & Vanman, 2002). Specifically, these studies have shown that infants look longer at a happy expression than at a neutral expression, and that this bias towards happy expressions becomes stronger as the intensity of the happy expression increases (Kuchuck et al., 1986). However, discrimination of two happy expressions was not directly examined in these studies (i.e., one happy expression was not compared to another happy expression). For this reason, the results of these studies are not directly comparable to the present results. In one earlier study, successful discrimination of two happy expressions was demonstrated in 7-month-old infants (Ludemann & Nelson, 1988). However, the stimuli in that study were two different intensities of happy expressions, and these within-category variations were clearly more salient than the subtle variations in the morphed images presented here. It is likely that these differences in the stimuli explain the contrasting findings.
A potential alternative interpretation of the present data is that rather than reduced sensitivity to within-category changes in facial expressions, the lack of discrimination of happy expressions reflected a failure of infants to attend to expression information in this particular experimental condition. That is, when presented with two different happy expressions, infants may not attend to expression information and, consequently, exhibit no discrimination of the familiar and novel facial expression. In contrast, infants may attend to expression information when they are habituated to sad expressions, which would explain why they demonstrated successful discrimination in the between-category condition. We consider this possibility unlikely, however. There is substantial previous evidence showing that infants habituated to happy expressions discriminate the familiar happy expression from other facial expressions (e.g., Kotsoni et al., 2001; Young-Browne et al., 1977). As noted above, there is also evidence that infants are able to discriminate between different intensities of happy expressions (Ludemann & Nelson, 1988). These results clearly show that infants do attend to expression information when presented with happy facial expressions.
The result showing no evidence for any within-category discrimination in infants differs from findings showing that adults typically retain some sensitivity to within-category differences. The interpretation of this difference is complicated, however, because the tasks used in adult studies are arguably more sensitive than those commonly used in infant studies (Franklin, Pilling, & Davies, 2005). An inherent limitation of the methodology used in infant studies is that, although the employed tasks allow one to demonstrate differential sensitivity in one experimental condition vs. another (such as detection of within vs. between-category variations), the lack of discrimination in these tasks does not necessarily imply absolute inability to discriminate.
No significant correlations were found between behavioral and ERP measures of discrimination in the present study (between-category condition). Given the small sample size in the present study, the lack of significant correlations must be interpreted with caution. The present findings are, however, consistent with several previous studies showing that behavioral and ERP (NC) measures of attention are not correlated (see de Haan, 2007).
Finally, there are important limitations to this study that need to be considered. Most importantly, the present investigation of within-category discrimination was limited to one emotion category (happiness), and only two within-category instances and two between-category instances of facial expressions were used. Like others (e.g., Bornstein & Arterberry, 2003), we decided to focus on happy expressions as a first step in examining the neural correlates of categorical perception. It is clear, however, that further research is required to establish whether the categorical perception effects observed here for stimuli on the happy to sad continuum generalize to other emotion continua. Further research is also required to test whether the present results hold when more than one within-category or between-category expression pair is used. This would be important not only for extending the generalizability of the findings but also for testing whether stimuli with more similar physical characteristics are responded to like stimuli with greater physical differences within a specific expression category.

These caveats notwithstanding, the present results provide one of the first pieces of evidence to suggest that the neural systems that support categorical representations of facial expressions may become functional by the age of 7 months, before the development of linguistic systems and the acquisition of specific emotion words. The fact that categories of facial expressions can emerge early in development does not necessarily imply, however, that they are acquired independently of experience. An alternative possibility is that the acquisition of facial expression categories relies on experiential input and may occur via similar category-learning mechanisms as category formation in other domains of perception. Specifically, formation of categories may be supported by a mechanism that allows an infant to acquire summary representations (prototypes) or other types of representations of the defining features of the different facial expressions that they encounter in their natural environment (see, e.g., de Haan, Johnson, Maurer, & Perrett, 2001; Kuhl, 2004; Quinn, 2002; Sherman, 1985). At the neural level, this developmental process may involve a gradual tuning of populations of cells in occipital-temporal cortex to those invariant features of faces that are diagnostic for a particular category of facial expression (cf. Sigala & Logothetis, 2002).
Acknowledgments
This study was supported in part by grants from the Academy of Finland and Finnish Cultural Foundation to the first author and NIH (NS32976) and Richard and Mary Scott (through an endowment to Children’s Hospital) to the fifth author. We thank Joseph McCleery and Kristin Shutts for their comments on a draft of this article.
Footnotes
1. The original stimuli were taken from the MacBrain Face Stimulus Set. Development of the MacBrain Face Stimulus Set was overseen by Nim Tottenham and supported by the John D. and Catherine T. MacArthur Foundation Research Network on Early Experience and Brain Development.
2. Mean amplitudes over windows centered on the peaks of the components of interest, rather than a peak-within-a-window algorithm, were used to quantify the ERPs because mean amplitude provides a more stable measure of amplitude differences, especially when no clear peaks can be identified in waveforms from individual subjects or electrode sites (Picton et al., 2000).
References
Barrett LF, Lindquist KA, Gendron M. Language as context for the perception of emotion. Trends in Cognitive Sciences. 2007;11:327–332. doi: 10.1016/j.tics.2007.06.003.
Batty M, Taylor MJ. Early processing of the six basic facial emotional expressions. Cognitive Brain Research. 2003;17:613–620. doi: 10.1016/s0926-6410(03)00174-5.
Batty M, Taylor MJ. The development of emotional face processing during childhood. Developmental Science. 2006;9:207–220. doi: 10.1111/j.1467-7687.2006.00480.x.
Bornstein MH, Arterberry ME. Recognition, discrimination and categorization of smiling by 5-month-old infants. Developmental Science. 2003;6:585–599.
Bornstein MH, Korda NO. Discrimination and matching between hues measured by reaction times: Some implications for categorical perception and levels of information processing. Psychological Research. 1984;46:207–222. doi: 10.1007/BF00308884.
Campanella S, Quinet P, Bruyer R, Crommelinck M, Guerit JM. Categorical perception of happiness and fear facial expressions: an ERP study. Journal of Cognitive Neuroscience. 2002;14:210–227. doi: 10.1162/089892902317236858.
Davidoff J. Language and perceptual categorization. Trends in Cognitive Sciences. 2001;5:382–387. doi: 10.1016/s1364-6613(00)01726-5.
de Haan M. Visual attention and recognition memory in infancy. In: de Haan M, editor. Infant EEG and event-related potentials. Hove: Psychology Press; 2007. pp. 101–144.
de Haan M, Johnson MH, Halit H. Development of face-sensitive event-related potentials during infancy: a review. International Journal of Psychophysiology. 2003;51:45–58. doi: 10.1016/s0167-8760(03)00152-1.
de Haan M, Johnson MH, Maurer D, Perrett DI. Recognition of individual faces and average face prototypes by 1- and 3-month-old infants. Cognitive Development. 2001;16:659–678.
de Haan M, Nelson CA. Discrimination and categorization of facial expressions of emotion during infancy. In: Slater AM, editor. Perceptual development: visual, auditory, and language perception in infancy. London: University College London Press; 1998. pp. 287–309.
Eimas PD, Siqueland ER, Jusczyk P, Vigorito J. Speech perception in infants. Science. 1971;171:303–306. doi: 10.1126/science.171.3968.303.
Ekman P, Friesen WV, O’Sullivan M, Chan A, Diacoyanni-Tarlatzis I, Heider K, et al. Universals and cultural differences in the judgments of facial expression of emotion. Journal of Personality and Social Psychology. 1987;53:712–717. doi: 10.1037//0022-3514.53.4.712.
Etcoff NL, Magee JJ. Categorical perception of facial expressions. Cognition. 1992;44:227–240. doi: 10.1016/0010-0277(92)90002-y.
Field TM, Woodson RW, Greenberg R, Cohen C. Discrimination and imitation of facial expressions by neonates. Science. 1982;218:179–181. doi: 10.1126/science.7123230.
Franklin A, Pilling M, Davies I. The nature of infant color categorization: evidence from eye movements on a target detection task. Journal of Experimental Child Psychology. 2005;91:227–248. doi: 10.1016/j.jecp.2005.03.003.
Harnad S. Categorical perception: The groundwork of cognition. New York: Cambridge University Press; 1987.
Kotsoni E, de Haan M, Johnson MH. Categorical perception of facial expressions by 7-month-old infants. Perception. 2001;30:1115–1125. doi: 10.1068/p3155.
Kuchuck A, Vibbert M, Bornstein MH. The perception of smiling and its experiential correlates in three-month-old infants. Child Development. 1986;57:1054–1061.
Kuhl PK. Early language acquisition: cracking the speech code. Nature Reviews Neuroscience. 2004;5:831–843. doi: 10.1038/nrn1533.
Leppänen JM, Moulson MC, Vogel-Farley VK, Nelson CA. An ERP study of emotional face processing in the adult and infant brain. Child Development. 2007;78:232–245. doi: 10.1111/j.1467-8624.2007.00994.x.
Leppänen JM, Nelson CA. The development and neural bases of facial emotion recognition. In: Kail RJ, editor. Advances in Child Development and Behavior. Vol. 34. San Diego: Academic Press; 2006. pp. 207–245.
Ludemann PM, Nelson CA. Categorical representation of facial expressions by 7-month-old infants. Developmental Psychology. 1988;24:492–501.
Peltola MJ, Leppänen JM, Palokangas T, Hietanen JK. Fearful faces modulate looking duration and attention disengagement in 7-month-old infants. Developmental Science. 2008;11:60–68. doi: 10.1111/j.1467-7687.2007.00659.x.
Picton TW, Bentin S, Berg P, Donchin E, Hillyard SA, Johnson JR, et al. Guidelines for using human event-related potentials to study cognition: Recording standards and publication criteria. Psychophysiology. 2000;37:127–152.
Quinn PC. Category representation in young infants. Current Directions in Psychological Science. 2002;11:66–70.
Quinn PC, Westerlund A, Nelson CA. Neural markers of categorization in 6-month-old infants. Psychological Science. 2006;17:59–66. doi: 10.1111/j.1467-9280.2005.01665.x.
Reynolds GD, Richards JE. Familiarization, attention, and recognition memory in infancy: an event-related-potential and cortical source localization study. Developmental Psychology. 2005;41:598–615. doi: 10.1037/0012-1649.41.4.598.
Richards JE. Attention affects the recognition of briefly presented visual stimuli in infants: an ERP study. Developmental Science. 2003;6:312–328. doi: 10.1111/1467-7687.00287.
Schwartz GM, Izard CE, Ansul SE. The 5-month-old’s ability to discriminate facial expressions of emotion. Infant Behavior and Development. 1985;8:65–77.
Schweinberger SR, Pfütze EM, Sommer W. Repetition priming and associative priming of face recognition: evidence from event-related potentials. Journal of Experimental Psychology: Learning, Memory, and Cognition. 1995;21:722–736.
Scott LS, Nelson CA. Featural and configural face processing in adults and infants: A behavioral and electrophysiological investigation. Perception. 2006;35:1107–1128. doi: 10.1068/p5493.
Sherman T. Categorization skills in infants. Child Development. 1985;56:1561–1573.
Sigala N, Logothetis NK. Visual categorization shapes feature selectivity in the primate temporal cortex. Nature. 2002;415:318–320. doi: 10.1038/415318a.
Striano T, Brennan PA, Vanman EJ. Maternal depressive symptoms and 6-month-old infants’ sensitivity to facial expressions. Infancy. 2002;3:115–126.
Young-Browne G, Rosenfield HM, Horowitz FD. Infant discrimination of facial expressions. Child Development. 1977;48:555–562.
Young AW, Perrett D, Calder AJ, Sprengelmeyer R, Ekman P. Facial Expressions of Emotion: Stimuli and Tests (FEEST). Bury St. Edmunds, Suffolk, England: Thames Valley Test Company; 2002.
Young AW, Rowland D, Calder AJ, Etcoff NL, Seth A, Perrett DI. Facial Expression Megamix: Tests of dimensional and category accounts of emotion recognition. Cognition. 1997;63:271–313. doi: 10.1016/s0010-0277(97)00003-6.