Facial processing is an important aspect of interpersonal communication and a significant modulator of social behavior. Facial expressions of emotion can provide information about another person's emotional state and enable one to predict another person's probable actions.1 Facial expressions that humans can reliably identify include anger, fear, sadness, happiness, surprise, and disgust.2,3 Deficits in the recognition of some or all facial expressions of emotion might be a contributing factor in the significant social and behavioral impairment observed in patients with Alzheimer's disease (AD).4
Earlier studies of the recognition of emotional and nonemotional facial features in Alzheimer's disease have produced variable results. Roudier et al.5 found that patients with AD were significantly impaired in discriminating facial identities but not in discriminating facial emotions. Albert et al.6 found significant impairments in several tests of facial emotion recognition in AD. They suggested that these impairments were due to the deficits in recognizing nonemotional facial features and in verbal processing that they also observed in their AD patients. Cadieux and Greve7 demonstrated impairment in some measures of facial emotion recognition in AD, but no differences in facial identity recognition. However, they attributed the facial emotion deficits to impairment in verbal and spatial processing in the AD patients. Allender and Kaszniak8 found evidence for independent deficits in both nonemotional and emotional facial recognition tasks in AD. Lavenu et al.9 found that patients with AD were generally unimpaired in both detecting and naming facial emotions. Although several studies have found at least some evidence of impairment in the recognition of facial expressions of emotion in AD, only Allender and Kaszniak8 interpreted their findings as evidence for a specific impairment in emotional processing, rather than an indirect result of deficits in verbal, spatial, or other nonemotional skills.
The possibility that there may be deficits in the recognition of specific facial expressions of emotion in AD has not been systematically investigated. However, Lavenu et al.9 reported preliminary evidence suggesting selective impairment in the labeling of facial expressions of fear and contempt in AD. Atrophy and neuropathological changes in the amygdala may occur early in the course of AD.10 Because lesions of the amygdala have been associated with a specific impairment in processing fearful faces, it is possible that patients with mild or moderate AD would demonstrate disproportionate impairment in their ability to recognize facial expressions of fear.11
The goal of our study was to evaluate the ability of a group of patients with AD, compared with elderly control subjects, to accurately recognize and discriminate facial expressions of six different emotions. We also evaluated facial identity recognition ability as a measure of nonemotional deficits in facial processing. We included a group of elderly psychiatric outpatients with mood and anxiety disorders as a control group in addition to a group of elderly normal volunteers; several studies have shown that anxious and depressed patients may have deficits in recognizing facial expressions of emotion.12—14 Although increased rates of anxiety and depression symptoms have been observed in AD patients, previous studies of facial emotion processing in AD have not used psychiatric control subjects.4,15 On the basis of prior studies, we expected to observe deficits in nonemotional facial processing in the AD patients relative to the control groups.
This study was designed to answer the following three questions: 1) Do AD patients have deficits in recognition of facial expressions of emotion? 2) Is there evidence that deficits in processing emotional faces in AD patients are independent of their deficits in nonemotional facial processing? 3) Do AD patients have selective deficits in recognizing any specific facial expressions of emotion? Clarifying the nature and extent of facial emotion processing deficits in AD and their relationship to cognitive decline may lead to new insights into the neurobiology of AD and the social and behavioral disturbances that often accompany this illness.
Normal control subjects (NC) and psychiatric control subjects (PC) were recruited from the Martinez Veterans Affairs (VA) Outpatient Clinic. AD patients were recruited from the Alzheimer's Disease Diagnostic and Treatment Centers located at Martinez VA Outpatient Clinic, University of California—Davis and Stanford/VA Palo Alto Health Care System. All subjects signed a standard consent form approved by the Investigational Review Boards of UC-Davis and the VA Northern California Systems of Clinics.
All AD patients underwent structured diagnostic interviews, formal neurological evaluation, and neuropsychological testing to assess demographic characteristics, symptoms of dementia, and impairment in cognitive functioning and activities of daily living. Diagnostic evaluations were performed by a team of neurologists, physicians, nurses, and neuropsychologists. The diagnosis of probable or possible AD was assigned to patients according to the criteria of the National Institute of Neurological and Communicative Disorders and Stroke and the Alzheimer's Disease and Related Disorders Associations (NINCDS/ADRDA).16 AD patients were excluded if they were judged to be unable to comprehend task instructions. None of the AD patients were reported to have prosopagnosia. All subjects were right-handed. Subjects with history of head trauma, profound visual or hearing deficits, alcoholism, or serious neurological disorders were excluded.
Assessment of Facial Processing
Four facial processing tasks and three other neurocognitive tasks were administered to all subjects. The facial processing test battery consisted of the Benton Facial Recognition Test and three facial emotion recognition tasks: Facial Emotion Matching, Facial Emotion Labeling, and Same—Different Emotion Differentiation.17 All tests were administered by either a board-certified geriatric psychiatrist (R.H.) or a trained research assistant.
The Benton Facial Recognition Test measures the subject's ability to recognize the identity of neutral/nonemotional faces and requires the subject to select, from a set of six aligned 2×3-inch black-and-white photographs, the face with the same identity as the reference face. Six items require one identity recognition response and 16 items require three identity recognition responses; the 22 items thus yield a maximum score of 54 correct choices.
The remainder of the test battery used standardized photographs from the Japanese and Caucasian Facial Expressions of Emotion.18,19 This is a series of color photographs of faces depicting six different emotions by seven Caucasian men, seven Caucasian women, seven Japanese men and seven Japanese women. The six emotional expressions depicted were anger, sadness, happiness, fear, surprise, and disgust. Data from cross-cultural studies have validated that the photographs accurately depict the intended emotions.18,19 Biehl et al.20 and Matsumoto et al.21 have demonstrated the reliability and validity of using these facial stimuli to assess the ability of subjects to recognize facial expressions of emotion.
For Facial Emotion Matching, subjects were shown a photograph of the reference face and asked to match the emotion displayed on the reference face with one of six simultaneously presented alternatives (another view of the reference emotion and five distracters). All seven photographs (reference and six alternatives) were faces of different people of the same gender and ethnicity. The reference face was mounted on an 8½×11-inch cardboard mat. The six alternative photographs were horizontally mounted on an 8½×14-inch cardboard mat in a 2×3 alignment. The task was repeated with four male and four female reference faces for each of six separate emotions, yielding a maximum score of 48 correct choices. Among the facial emotion recognition tests, the nonemotional cognitive skills required for Facial Emotion Matching were considered to be the most similar to the skills required for the Benton Facial Recognition Test. A comparison of performance on these two tasks was planned as a primary test of whether AD patients have deficits in recognition of facial emotion independent of their deficits in nonemotional facial processing.
For Facial Emotion Labeling, the subject was shown a photograph of the reference face depicting one of the six possible emotions. The names of the six emotions were printed below the photograph in a 2×3 horizontal array, with the order of these names randomized across trials. The task was repeated with four male and four female reference faces for each of six separate emotions, yielding a maximum score of 48 correct choices.
For Same—Different Emotion Differentiation, the subject was presented with a pair of photographs of different people of the same sex and ethnicity mounted on an 8½×11-inch cardboard mat. Subjects were asked to state whether the two photographs in the pair depicted the same or different emotions. The task was repeated with five pairs of male and five pairs of female target faces for each of six separate emotions, yielding a maximum score of 60 correct choices. These tasks were intended to provide additional information about facial emotion processing performance in AD.
The 22-item Hamilton Rating Scale for Depression (Ham-D),22 the State-Trait Anxiety Inventory (STAI)23 and the Mini-Mental State Examination (MMSE)24 were also administered to all subjects.
Sociodemographic and clinical variables (age, gender, ethnicity, education, MMSE, Ham-D, and STAI scores) were compared in NC, PC, and AD patients by using analysis of variance (ANOVA) followed by post hoc testing with Fisher's protected least significant difference (PLSD) test for continuous variables and Fisher's exact test for nominal data. Performance on the facial processing tasks was quantified as the total number of correct responses and compared in NC, PC, and AD patients using a MANOVA model followed by individual ANOVAs. Subsequent post hoc testing was done with Fisher's PLSD test. A multivariate analysis of covariance (MANCOVA) model was used to compare the performance on the facial emotion tasks in NC, PC, and AD patients using each subject's score on the facial identity matching task as a covariate. Subsequent individual ANCOVAs were followed by post hoc t-tests on the covariate adjusted scores.
The interaction between diagnosis and the ability to recognize specific facial expressions of emotion (fearful, happy, etc.) was analyzed by using ANOVA with repeated measures. The relationships between MMSE scores and performance on facial processing tasks were analyzed by using correlation coefficients (Pearson's r). Significant differences are reported for two-tailed P<0.05 for all analyses except the ANCOVAs, for which trends at higher P-levels are also reported.
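The core of the covariance analysis described above is the computation of covariate-adjusted group means: each group's mean emotion score is shifted to the grand mean of the covariate along the pooled within-group regression slope. The sketch below illustrates that adjustment on made-up numbers; it is a generic ANCOVA adjusted-means calculation, not the study's actual MANCOVA model or data.

```python
import numpy as np

def ancova_adjusted_means(scores, covariate, groups):
    """Covariate-adjusted group means, as used in an ANCOVA.

    Estimates the pooled within-group slope of score on the
    covariate, then shifts each group's mean score to the grand
    covariate mean along that slope.
    """
    scores = np.asarray(scores, dtype=float)
    covariate = np.asarray(covariate, dtype=float)
    groups = np.asarray(groups)

    # Pooled within-group slope: sum of within-group cross-products
    # divided by sum of within-group covariate sums of squares.
    sxy = sxx = 0.0
    for g in np.unique(groups):
        m = groups == g
        x, y = covariate[m], scores[m]
        sxy += np.sum((x - x.mean()) * (y - y.mean()))
        sxx += np.sum((x - x.mean()) ** 2)
    slope = sxy / sxx

    grand_x = covariate.mean()
    return {str(g): scores[groups == g].mean()
                    - slope * (covariate[groups == g].mean() - grand_x)
            for g in np.unique(groups)}
```

With this adjustment, a group that scores lower on the emotion task simply because it also scores lower on the covariate (here, nonemotional facial recognition) has its mean pulled back up; any deficit that survives the adjustment is independent of the covariate, which is the logic behind the study's second question.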
Subjects were 22 Alzheimer's disease patients, 14 normal control subjects, and 10 psychiatric control subjects. All were right-handed. Table 1 shows the demographic and clinical characteristics of AD patients and control subjects. Subjects were 37 Caucasians, 7 African Americans, 1 Hispanic, and 2 Asians. Half of the PC subjects received diagnoses of generalized anxiety disorder and half received diagnoses of bipolar mood disorder. MMSE scores in the AD patients ranged from 9 to 26 and were significantly lower than those of the PC and NC subjects. There were no significant differences between AD patients, PC subjects, and NC subjects in age, education, ethnicity, Ham-D scores, or STAI scores. The PC group had a significantly higher proportion of male subjects. No significant correlations were found between performance on the facial emotion tasks and age, education, Ham-D score, or STAI scores. Performance on Benton Facial Recognition correlated significantly with performance on Facial Emotion Matching (r=0.68; P<0.001), Same—Different Emotion Differentiation (r=0.50; P<0.001), and Facial Emotion Labeling (r=0.39; P<0.01). There was no significant effect of gender on performance on any of the facial processing tasks.
Table 2 shows the results of individual ANOVAs for the four facial processing tasks across the three groups. Based on the overall MANOVA, the number of correct responses on the facial processing tasks varied significantly across the groups (F=22.5, df=4,41, P<0.0001). The individual ANOVAs showed that the effect of group was significant for each of the four facial processing tasks. Post hoc testing with Fisher's PLSD test showed that AD patients made fewer correct responses than NC subjects and PC patients on Facial Emotion Matching, Same—Different Emotion Differentiation, Facial Emotion Labeling, and Benton Facial Recognition. There were no significant differences between the NC and PC groups. The performance of the AD patients on two of the four facial processing tasks was significantly correlated with the severity of dementia as measured by the MMSE. Correlations between the MMSE and each of the four facial processing tasks in the AD patients were r=0.61 (P=0.003) for Facial Emotion Matching; r=0.59 (P=0.005) for Benton Facial Recognition; r=0.42 (P=0.06) for Same—Different Emotion Differentiation; and r=0.09 (not significant) for Facial Emotion Labeling.
Table 3 shows the results of individual ANCOVAs and the covariate adjusted means for the three facial emotion processing tasks across the three groups when performance on the Benton Facial Recognition Test was used as a covariate. The overall MANCOVA showed that the covariate adjusted performance on the facial emotion processing tasks varied significantly across the groups (F=4.27, df=3,41, P=0.01). The individual ANCOVAs showed that the effect of group was significant for Same—Different Emotion and Emotion Labeling. Post hoc t-tests of the covariate adjusted scores showed that AD patients made fewer correct responses than the NC subjects on Same—Different Emotion and fewer correct responses than the NC subjects and the PC patients on Emotion Labeling. ANCOVA of the Emotion Matching task showed a trend toward an effect of group. Post hoc t-tests of the covariate adjusted scores revealed a strong trend suggesting that AD patients had lower covariate adjusted scores than the NC subjects. There were no significant differences between the NC and PC groups.
A mixed-design ANOVA, with one between-subjects variable (diagnostic group) and one within-subjects variable (the six different facial expressions of emotion), was performed on both the Emotion Labeling and Emotion Matching tasks to determine if AD patients had a disproportionate deficit in processing any specific facial expressions of emotion. Analysis of the Emotion Labeling data revealed significant main effects for group (F=9.5, df=2,43, P=0.0004) and emotion (F=27.3, df=5,43, P=0.0004), and a significant interaction between group and emotion (F=7.8, df=10,215, P=0.006). Post hoc univariate ANOVAs showed a significant effect of group for sad (F=10.0, df=2,43, P=0.0003), surprised (F=4.3, df=2,43, P=0.020), and disgusted faces (F=6.5, df=2,43, P=0.003). Fisher's PLSD test showed that AD patients performed significantly worse than NC subjects in the labeling of sad (P=0.0002), surprised (P=0.02), and disgusted faces (P=0.002), and significantly worse than PC subjects in labeling sad (P=0.004) and surprised faces (P=0.02). The interaction effect was further analyzed by calculating the relative performance scores for each emotion as the difference between the subject's raw score on that emotion and the subject's average score on the other five emotions. The relative performance score can identify deficits in processing a specific emotion that are disproportionate to the overall deficits in processing facial emotions in AD. Univariate ANOVAs of the relative performance scores for each emotion showed a significant effect of group for sad (F=5.8, df=2,43, P=0.006), happy (F=4.5, df=2,43, P=0.017), and disgusted faces (F=4.5, df=2,43, P=0.017). Fisher's PLSD test showed that AD patients had significantly worse relative performance with sad faces than NC (P=0.004) and PC (P=0.02) subjects, but significantly better relative performance with happy faces than NC subjects (P=0.005).
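The relative performance score defined above is a simple computation: each emotion's raw score minus the mean raw score on the remaining emotions. A minimal sketch, using hypothetical raw labeling scores (0-8 per emotion, as in this task), not the study's data:

```python
def relative_performance(scores):
    """Map each emotion to its raw score minus the mean raw
    score of the remaining emotions (here, the other five)."""
    n_others = len(scores) - 1
    return {emotion: raw - (sum(scores.values()) - raw) / n_others
            for emotion, raw in scores.items()}

# Hypothetical subject: strong on happy faces, weak on sad faces.
raw = {'happy': 8, 'sad': 2, 'angry': 5,
       'fearful': 5, 'surprised': 5, 'disgusted': 5}
rel = relative_performance(raw)
# rel['sad'] is 2 - (8+5+5+5+5)/5 = -3.6; rel['happy'] is 8 - 22/5 = 3.6
```

Because each score is expressed relative to the subject's own average on the other emotions, an overall deficit across all six emotions cancels out, leaving only disproportionate, emotion-specific impairment, which is why this is the stricter test for selective deficits.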
The main effect for emotion across groups was due to three patterns of significant differences. Performance on labeling of happy faces was significantly better than all other emotions. Performance was significantly better for labeling surprised faces than all other facial emotions except happy faces. Performance was significantly worse for labeling fearful faces than all other facial emotions.
Analysis of the Facial Emotion Matching task revealed the expected significant main effect for group (F=16.0, df=2,43, P<0.0001) and a significant main effect for emotion (F=16.2, df=5,43, P<0.0001), but no significant interaction effect between group and emotion (F=1.1, df=10,215, not significant). The lack of an interaction effect indicated that no disproportionate deficit in processing a specific facial expression of emotion in AD was evident in the Emotion Matching task data. Post hoc univariate ANOVAs confirmed that AD patients performed significantly worse than NC subjects in matching faces for all six emotions. The main effect for emotion across groups was due to a pattern of significant differences that was generally similar to that observed in the analysis of the Facial Emotion Labeling task. Performance on happy faces was significantly better than all other emotions. Performance was significantly worse for faces expressing fear and disgust than the other four facial emotions.
Our study demonstrates significant impairment in the ability to recognize facial expressions of emotion in AD patients compared with both normal elderly volunteers and elderly, nondemented psychiatric outpatients with mood or anxiety disorders. This impairment is evident in a Facial Emotion Matching task, a Same—Different Facial Emotion Differentiation task, and a Facial Emotion Labeling task. Our finding of impairment on Facial Emotion Matching is consistent with the results of Cadieux and Greve,7 who were the only previous investigators to use a test of this design. Our finding of impairment in the Same—Different Emotion task is consistent with the results of Albert et al.,6 and Cadieux and Greve,7 but contrasts with the negative findings of Roudier et al.5 However, the latter authors tested their subjects by using only four trials each with four expressions (happy, sad, angry, and indifferent). This more limited test may account for their discrepant findings. Our observation of impairment on Facial Emotion Labeling was similar to the results of Albert et al.,6 Cadieux and Greve,7 and Roudier et al.,5 but contrasts with the negative results of Lavenu et al.9 However, Lavenu et al.9 did observe a trend toward impairment in their AD patients on this test. Thus, our findings are generally consistent with earlier studies demonstrating that AD patients are less able to recognize facial expressions of emotion than elderly nondemented control subjects. This deficit may contribute to the social and behavioral impairment observed in patients with AD.4,15
Our study also showed that AD patients were significantly impaired in recognizing nonemotional facial features, as tested by the Benton Facial Recognition Test. Several investigators have suggested that the deficits in recognition of facial expressions of emotion in AD are due to underlying deficits in verbal, spatial, and facial processing and not the result of a primary impairment in emotional processing.5—7 In contrast, Allender and Kaszniak8 found evidence for an independent deficit in emotional processing in AD. In our study, the AD patients had significantly poorer performance on the Same—Different Emotion and Emotion Labeling tasks even when their Benton Facial Recognition Test scores were used as a covariate to adjust for nonemotional facial processing skills. A similar trend was observed for the Emotion Matching task. Therefore, the results are consistent with a specific impairment in emotional processing in AD and cannot be explained simply by a deficit in facial processing.
It is difficult to isolate emotional processing skills in tests measuring the recognition of facial expressions of emotions in AD patients. The Emotion Labeling task requires verbal skills that are not required for the Benton Facial Recognition Test, and thus lower covariate-adjusted Labeling scores may reflect the impaired verbal abilities often observed in AD. However, the Emotion Matching and the Facial Recognition tasks were very similar, differing primarily in the requirement for recognition of facial emotion in the former and identity in the latter. Nonetheless, the Benton Facial Recognition Test is significantly more difficult, and the performance on the Emotion Matching test may benefit from verbally mediated strategies in some subjects. Performance on the Same—Different Emotion task may also benefit from verbally mediated strategies and is sensitive to variations in response bias. Our use of Benton Facial Recognition scores as a covariate is a statistically conservative approach that increases the risk of failing to demonstrate a group difference when one does exist. Because Benton Facial Recognition scores are significantly lower in the AD group, they are inherently confounded with the grouping variable. Thus, using these scores as a covariate will not only remove some variance that is due to differences in nonemotional facial processing abilities, it will also remove some variance that is due to the effects of AD.25 Given these caveats, the results of our covariance analysis are consistent with the hypothesis that AD patients have specific emotional processing deficits that contribute to their impairment in recognizing facial expressions of emotion.
Our analysis of performance with specific facial emotions showed a significant difference among the groups in their relative abilities to label the six emotions. Although AD patients scored lower than NC subjects for all six emotions, the differences were nonsignificant for happy, fearful, and angry faces, and significant for sad, surprised, and disgusted faces. This pattern of statistical results may be due to variations in the power of our design to demonstrate significant effects for each emotion, rather than to selective impairment for specific emotions. Our comparison of the relative scores on each emotion is a stricter test for selective impairment. This analysis showed that relative to their average score with the other five emotions, AD patients scored significantly worse than both control groups in labeling sad faces. This finding is suggestive of a selective impairment in labeling sad faces in AD. Replication of this finding will be necessary prior to interpretation of its significance. Our finding that AD patients had significantly better relative performance labeling happy faces is probably due to happy faces being the easiest of the six emotions to label correctly.
Evidence for specific involvement of the amygdala in the recognition of fearful faces11,24 and evidence of early damage to this structure in AD10 suggested there might be a specific deficit in recognizing fearful faces in AD patients. In contrast to Lavenu et al.,9 we did not find any support for the hypothesis that AD patients have selective impairment in processing facial expressions of fear. The preliminary evidence reported by Lavenu et al.9 in support of this hypothesis was a small effect based on only 4 trials in each subject. Their reported difference in performance in AD patients and control subjects did not reach statistical significance when the P threshold was corrected for multiple comparisons.
Numerous neural structures are involved with the complex tasks of facial processing. Studies of unilateral lesion patients and functional neuroimaging experiments in normal subjects suggest that recognition of facial identity and facial emotions are mediated by separate neuronal systems.26—30 Bilateral temporo-occipital lesions have been associated with deficits in the ability to identify faces. Positron emission tomography studies suggest that the cortical areas activated by facial identification tasks include right lingual gyrus, right parahippocampal gyrus, right anterior temporal lobe, and middle lateral cortex of the left temporal lobe.28 Orbitofrontal cortex and fusiform gyrus of both hemispheres are also activated during facial identification tasks.28
Neurophysiological and neuropsychological studies suggest that recognition and interpretation of facial emotions involves the amygdala, orbitofrontal cortex and hypothalamus.31 In functional neuroimaging studies, the amygdala and the posterior cingulate cortex are frequently activated by the evaluation of emotionally salient stimuli, including facial expressions of emotion.26,32 Neuropathological changes in the amygdala have been observed in early AD.10 Functional neuroimaging studies have shown the posterior cingulate cortex to be one of the most metabolically abnormal brain regions in early AD.33,34 Anomia for facial emotions has been reported in patients with lesions in the right middle temporal gyrus.35 Temporal and parietal cortices are also among the most metabolically abnormal regions in AD.33,36 Animal and human studies suggest that the right cerebral hemisphere is superior to the left hemisphere in the perception and interpretation of facial expressions of emotion.27,31,37—41
The generalizability of our results is limited by several factors. The AD patients in this study were recruited from the Alzheimer's Disease Diagnostic and Treatment Centers and may not be clinically representative of a community sample of patients with dementia. Because our study lacked neuropathologically confirmed diagnoses, some misclassification of dementia etiology cannot be ruled out. The study is cross-sectional in design, and longitudinal studies may reveal a different clinical picture of facial processing deficits in AD. Longitudinal studies with larger patient samples that examine facial emotion recognition along with other aspects of emotional processing (e.g., recognition of emotion through vocal prosody and the production of facial emotions) will be important to further characterize the deficits in emotional processing in AD.
In the natural environment, facial expressions are not static like photographs but represent ever-changing events. Some investigators suggest that facial photographs lack important dynamic information the AD patient needs to more accurately interpret facial expressions.7 Future studies that make use of videotaped representations of facial expressions of emotions could address this issue. In addition, AD patients may be more accurate in recognizing facial affect when they can integrate information about emotional cues from several modalities (e.g., facial expressions, vocal intonations, and language) within the context of environmental cues.8 AD patients may show greater impairment in perception of facial affect when they are required to base their assessment on minimal or unstructured information.8 Studies of emotion recognition in more naturalistic contexts would be of value in this regard.
The human face conveys nonverbal information about a person's identity and emotional state that is critical for the initiation of appropriate social behavior. Misreading emotions may be an aspect of the AD patient's inability to employ external and internal emotional cues to modify behavior and adjust self-perception. Impairments in facial processing may lead to poor judgment in social interactions and behavior disturbances. Future studies are needed to elucidate the relationship between facial affect recognition deficits, behavioral disturbances, and inappropriate social interactions in AD patients. Such studies would benefit from larger samples of AD patients and inclusion of a standardized instrument for rating social and behavioral impairment, such as the Neurobehavioral Rating Scale.