Early neural activation during facial affect processing in adolescents with Autism Spectrum Disorder
Abstract
Impaired social interaction is one of the hallmarks of Autism Spectrum Disorder (ASD). Emotional faces are arguably the most critical visual social stimuli and the ability to perceive, recognize, and interpret emotions is central to social interaction and communication and, subsequently, to healthy social development. However, our understanding of the neural and cognitive mechanisms underlying emotional face processing in adolescents with ASD is limited. We recruited 48 adolescents, 24 with high functioning ASD and 24 typically developing controls. Participants completed an implicit emotional face processing task in the MEG. We examined spatiotemporal differences in neural activation between the groups during implicit angry and happy face processing. While there were no differences in response latencies between groups across emotions, adolescents with ASD had lower accuracy on the implicit emotional face processing task when the trials included angry faces. MEG data showed atypical neural activity in adolescents with ASD during angry and happy face processing, including activity in the insula, the anterior and posterior cingulate, and temporal and orbitofrontal regions. Our findings demonstrate differences in neural activity during happy and angry face processing between adolescents with and without ASD. These differences in activation in social cognitive regions may index the difficulties in face processing and in comprehension of social reward and punishment in the ASD group. Thus, our results suggest that atypical neural activation contributes to impaired affect processing, and thus social cognition, in adolescents with ASD.
Highlights
• The ability to recognize and interpret emotions is central to social interaction.
• Deficits in social interactions are hallmarks of autism spectrum disorder (ASD).
• Adolescents with and without ASD completed an emotional face task in MEG.
• MEG data showed atypical neural activity in ASD to both angry and happy faces.
• Insula, cingulate, temporal and orbitofrontal activities were particularly affected in the ASD group.
1 Introduction
Emotional face processing is an innate and universal ability that is integral to the acquisition of social skills (Ekman & Friesen, 1971; Meltzoff & Moore, 1977). The human face is the most important visual stimulus for human social interactions. The ability to extract the significance of expressive faces is critical for successful social interactions, as it facilitates the understanding of another's mental states and intentions and is important in guiding appropriate reciprocal behaviour. Impaired social functioning is one of the diagnostic hallmarks of Autism Spectrum Disorder (ASD). While it is generally understood that individuals with ASD experience difficulties with social cues, the current literature on emotional face processing in ASD has yielded inconsistent results with some studies finding deficits in emotional processing (e.g., Celani et al., 1999; Eack et al., 2014; García-Villamisar et al., 2010; Golan et al., 2008), with impairment in fear (Ashwin et al., 2007; Howard et al., 2000; Pelphrey et al., 2002), surprise (Baron-Cohen et al., 1993) and anger processing (Kuusikko et al., 2009), while others have noted no deficits (Adolphs et al., 2001; Balconi & Carrera, 2007; Buitelaar et al., 1999; Castelli, 2005; Tracy et al., 2011).
In typical development, emotional face processing is associated with activation in a widespread neural network, encompassing the visual, limbic, temporal, temporoparietal and prefrontal regions (Blair et al., 1999; Fusar-Poli et al., 2009; Phillips et al., 1999; Vuilleumier & Pourtois, 2007). The processing of happy facial expressions implicates a number of structures including the amygdalae, insulae, cingulate, inferior, medial and middle frontal, fusiform and middle temporal areas (Breiter et al., 1996; Devinsky et al., 1995; Fusar-Poli et al., 2009; García-Villamisar et al., 2010; Hoehl et al., 2010; Kesler-West et al., 2001; Phillips et al., 1998; Thomas et al., 2001). Angry faces elicit cingulate, bilateral fusiform, inferior frontal, superior temporal, middle and medial superior frontal, and orbitofrontal activity (Blair et al., 1999; Devinsky et al., 1995; Fusar-Poli et al., 2009; García-Villamisar et al., 2010; Kesler-West et al., 2001); moreover, decreased activation to angry, relative to neutral, faces has been found in the caudate nucleus, superior temporal, anterior cingulate (ACC), and medial frontal regions (Phillips et al., 1999). While the amygdalae have been widely reported to be involved in threat processing, amygdala activation to angry faces has not been found consistently (e.g., Phillips et al., 1999; Luo et al., 2007; Whalen et al., 2001; but see Fusar-Poli et al., 2009).
Atypical activation of the social brain networks using fMRI during emotional face processing has been found in adults with ASD, including reduced left amygdala and orbitofrontal activation (Ashwin et al., 2007), greater activity in the left superior temporal gyrus and right peristriate visual cortex (Critchley et al., 2000), and reduced fusiform and extrastriate activity, while also showing activation comparable to controls in the anterior cingulate, superior temporal, medial frontal and insula regions (Deeley et al., 2007). Children and adolescents with ASD have shown reduced fusiform but greater precuneus activity during an emotion-matching task, but not during a simpler emotion labelling task (Wang et al., 2004). Collectively, these studies suggest that task demands modulate neural activity during emotional processing to a greater extent in individuals with ASD, and have likely contributed to the discrepant findings in existing literature (Wang et al., 2004).
Of the two major subsystems involved in social cognition, the ventral orbitofrontal-amygdala circuit is of particular relevance to ASD as it is implicated in socio-emotional regulation and behaviour through the processing of others' emotional states, responses and intentions (see Bachevalier & Loveland, 2006, for a review). The orbitofrontal cortex contributes to this subsystem by mediating emotional behaviour, social inhibition, reversal learning and altering of unsuitable behaviour (Blair et al., 1999; Dias et al., 1996; Elliott et al., 2000; Rolls, 2004; Van Honk et al., 2005) as well deriving positive social joy and reward (Britton et al., 2006; Morris & Dolan, 2001). Sharing a close functional and anatomical reciprocal interconnection with the orbitofrontal cortex, atypical activation in the left amygdala during affect processing has been noted in ASD (e.g., Corbett et al., 2009; Critchley et al., 2000). Hence, focusing on the orbitofrontal-amygdala circuit dysfunction may aid in delineating the profile of facial affect processing in ASD.
1.1 The spatiotemporal profile of neural activity during affect processing in ASD
There are few studies on the timing of the neural processing of emotional faces in ASD. ERP studies have variously noted delayed, reduced, or absent activity in children and adults with ASD, relative to controls (Batty et al., 2011; Dawson et al., 2004; McPartland et al., 2004; O'Connor et al., 2005; Wong et al., 2008). A recent ERP study found that adolescents with ASD failed to show emotion-specific responses that were observed in the typically developing group (Wagner et al., 2013). Further, while face scanning was significantly associated with patterns of neural activation during face processing in typically developing adolescents, no such association was found in ASD (Wagner et al., 2013). Significant impairments in emotion recognition in adult, but not child, faces have also been noted in adolescents with ASD (Lerner et al., 2013). The same study also reported a significant association between the latency of the face-sensitive N170 component, localized to the fusiform gyri and inferior temporal areas, and deficits in emotion recognition in adolescents with ASD.
One study that investigated affective processing in adolescents with ASD using magnetoencephalography (MEG), which provides both timing and more accurate spatial information than EEG studies (Hari et al., 2010), showed reduced early emotion-specific gamma-band power, relative to controls, as well as later decreased power across all emotions (Wright et al., 2012). There do not appear to be other studies that have taken advantage of the spatio-temporal precision of MEG to determine the timing and/or brain regional activation differences in developmental ASD populations with emotional faces.
1.2 Anger processing
While emotional facial expressions of fear and anger both constitute threat-relevant signals, angry faces are more appropriate than fearful faces for investigating emotional face processing in ASD. Processing of anger requires the understanding of social norms and context, topics with which individuals with ASD struggle (Berkowitz, 1999; Zeman & Garber, 1996). Angry facial expressions are also more likely to be produced in response to repeated aggravations rather than first-time behavioural transgressions (Averill, 1982). Given that individuals with ASD are often poor at recognizing others' mental states or social norms, they have likely encountered displays of anger without understanding the implications (Bal et al., 2010; Baron-Cohen et al., 1999; Begeer et al., 2006). Past studies have specifically noted atypical angry facial processing in individuals with ASD (Ashwin et al., 2006; Kuusikko et al., 2009; Rieffe et al., 2007; Rump et al., 2009). Age effects in anger processing have also been noted, as older children and adolescents are able to correctly identify angry faces more often than younger children, regardless of diagnosis (Lindner & Rosén, 2006). There are also sharp increases in anger-specific sensitivity from adolescence to adulthood, providing support for a later maturation of anger processing (Thomas et al., 2007).
An fMRI study contrasting activations elicited by fear and anger in typical individuals showed that while both emotions activated a network of regions similarly, including the amygdalae and insulae, anger specifically elicited neural activation in a wider set of regions including the ventromedial prefrontal cortex and the posterior orbitofrontal cortex (Pichon et al., 2009). These areas have been implicated in behavioural regulation, supporting the idea that anger processing requires more resources and contextual information to adjust behaviour accordingly (Pichon et al., 2009).
1.3 Happiness processing in ASD
Happiness is the only one of the six basic emotions that is unambiguously positive, and it is the first emotion to be accurately identified in early development (Markham & Adams, 1992). Seemingly typical processing of happy affect in individuals with ASD has been observed, which may be due to the greater frequency of encountering and hence greater familiarity with happy faces (Critchley et al., 2000; Farran et al., 2011). However, while happy faces are socially rewarding for typically developing individuals (Phillips et al., 1998), insensitivity towards social reward derived from happy faces has been shown in individuals with ASD (Sepeta et al., 2012). Hence, it is important to investigate the underlying neural processes involved in the processing of happy faces in ASD to determine whether social reward systems are appropriately activated.
1.4 Adolescence
In typically developing individuals, recognition of emotional face expressions follows a protracted developmental trajectory that extends into late adolescence (e.g., Batty & Taylor, 2006; Kolb et al., 1992). Asynchronous development of emotion recognition is well established (De Sonneville et al., 2002; Pichon et al., 2009) and neuroimaging findings have demonstrated the shifting involvement of different neural networks over the course of development (Hung et al., 2012; Killgore & Yurgelun-Todd, 2007; Monk et al., 2003; Thomas et al., 2001). At around 11 years of age emotional processing abilities undergo marked improvement, which would implicate greater demands on neural processes involved in emotional processing in early adolescence (Tonks et al., 2007).
There are few studies on emotional face processing in adolescents with ASD, representing a serious gap in the literature, as adolescence is a period of vulnerability, volatility and increased stress (Spear, 2000) during which the prevalence of negative emotions peaks and shows greater variability (Compas et al., 1995; Hare et al., 2008). Behavioural studies investigating emotional face processing in adolescents with ASD have yielded mixed results (Grossman et al., 2000; Rieffe et al., 2007). Adolescents with ASD in these studies showed deficits in decoding emotions from static and dynamic facial affect overall; although they readily identify specific emotions such as happiness and sadness (Rump et al., 2009), they have difficulty with more complex emotions such as embarrassment (Capps et al., 1992) and with identifying emotions from videos (Koning & Magill-Evans, 2001). Individuals with Asperger Syndrome, but not high-functioning autism, have shown intact emotion perception (Mazefsky & Oswald, 2007).
Adults with ASD continue to experience difficulties when presented with brief or subtle emotions, and may never reach the same level of competency in emotional processing as typically developing adults (Begeer et al., 2006; Rieffe et al., 2007). The impairment in emotional processing in adults with ASD is consistent with the notion that individuals with ASD require compensatory strategies to achieve average performance. Although competency with emotional recognition in ASD improves with age, this skill often plateaus at a performance level below that of typically developing peers, or is associated with atypical brain activations to achieve comparable performance levels. These results underscore the need to examine the adolescent period of emotional processing in ASD to obtain a clearer understanding of the emotional development in this population.
Thus, the present study explored the neural substrates of implicit emotional face processing in adolescents with ASD using MEG, providing both temporal and spatial measures of brain processes, focusing on the neural areas implicated in emotional processing. We hypothesized that adolescents with ASD would show (1) shorter response latencies to emotional faces as they are not distracted by the emotions and (2) reduced and delayed patterns of neural activation in the frontal, limbic and temporal areas, key constituents of the social brain.
2 Materials and methods
2.1 Participants
Twenty-four adolescents (age range = 12–15 years) diagnosed with ASD (20 males, 14.03 ± 1.20 years, 23 right-handed, 7 medicated, IQ = 90.79 ± 23.76) and 24 healthy controls (19 males, 14.27 ± 1.12 years, 23 right-handed, IQ = 110.04 ± 12.21) were recruited. Participants in the clinical group had a diagnosis of ASD, informed by the Autism Diagnostic Observation Schedule-Generic (ADOS-G; Lord et al., 2000), or Autism Diagnostic Observation Schedule-2 (ADOS-2; Rutter et al., 2012), and confirmed by expert clinical judgment. Exclusion criteria for both groups included a history of neurological or neurodevelopmental disorders (other than ASD for participants in the clinical group), acquired brain injury, uncorrected vision, colour blindness, IQ ≤ 65 and standard contraindications to MEG and MRI. Use of psychotropic medications was an exclusion criterion for control participants only, due to difficulty in recruiting medication-naïve adolescents with ASD. Seven participants with ASD were on medication, which included Biphentin, Celexa, Cipralex, Concerta, Gabapentin, Ritalin, Seroquel, Strattera and Zeldox. The study was approved by The Hospital for Sick Children Research Ethics Board and written informed consent was obtained from all participants and their legal guardians.
2.2 Characterization measures
2.2.1 Autism Diagnostic Observation Schedule
The ADOS-G (Lord et al., 2000) and ADOS-2 (Rutter et al., 2012) are semi-structured clinical assessments for diagnosing and assessing ASD. Research-reliable team members administered module 3 or module 4 of the ADOS-G or ADOS-2 to all eligible participants with ASD. The mean total score (ADOS-G and ADOS-2 pooled) was 11.30 ± 3.30, well above the clinical threshold.
2.2.2 Wechsler Abbreviated Scales of Intelligence (WASI)
IQ was estimated using two subtests (vocabulary and matrix reasoning) of the Wechsler Abbreviated Scale of Intelligence (WASI-2; Wechsler, 2002). The two-subtest WASI-2 is quicker to administer and has validity comparable to the full four-subtest WASI-2.
2.3 MEG task
The task stimuli consisted of a face (happy, angry or neutral) presented concurrently with a scrambled pattern, each on either side of a central fixation cross (Fig. 1). Twenty-five colour photographs of different faces (13 males, 12 females) for each of the three expressions were selected from the NimStim Set of Facial Expressions; only happy and angry faces correctly categorized at a minimum of 80% accuracy were selected (Tottenham et al., 2009). To create unique scrambled patterns corresponding to each face, each of the selected faces from the NimStim set was divided into 64 cells and randomized. A mosaic was applied to the image (15 cells per square) after which a Gaussian blur was applied (10.0°) using Adobe Photoshop. Face–pattern pairs were matched for luminosity and colour.
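The scrambling procedure can be sketched in Python as follows. This is an illustrative reconstruction, not the authors' pipeline: the study used Adobe Photoshop, the mosaic step is folded into the cell shuffle here, and `scipy.ndimage.gaussian_filter` stands in for Photoshop's Gaussian blur. A square grayscale image is assumed.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def scramble_face(img, n_cells=64, blur_sigma=10.0, seed=0):
    """Divide a square image into n_cells cells, shuffle the cells,
    and apply a Gaussian blur (sketch of the stimulus scrambling)."""
    rng = np.random.default_rng(seed)
    side = int(np.sqrt(n_cells))               # 8 x 8 grid for 64 cells
    h, w = img.shape
    ch, cw = h // side, w // side
    # cut the image into cells in row-major order
    cells = [img[r * ch:(r + 1) * ch, c * cw:(c + 1) * cw]
             for r in range(side) for c in range(side)]
    order = rng.permutation(len(cells))        # randomize cell positions
    cells = [cells[i] for i in order]
    # reassemble the shuffled cells into a new image, then blur
    rows = [np.hstack(cells[r * side:(r + 1) * side]) for r in range(side)]
    scrambled = np.vstack(rows).astype(float)
    return gaussian_filter(scrambled, sigma=blur_sigma)

face = np.arange(64 * 64, dtype=float).reshape(64, 64)  # toy stand-in image
pattern = scramble_face(face)
```

Because shuffling only rearranges pixels and blurring averages them, the scrambled pattern preserves the low-level luminance statistics of the source face, which is what makes it a matched control stimulus.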
Fifty trials of each of the three expressions in the left and right hemifields (each face was presented twice in each hemifield) were shown in randomized order such that the task included 300 trials in total. Emotions were irrelevant to the task as participants were instructed to fixate on the central cross and respond to the location of the scrambled pattern by pressing left or right buttons on a button box. Stimuli were presented using Presentation (http://www.neurobs.com/). Stimuli in each trial were presented for 80 ms with an ISI varying from 1300 to 1500 ms. Images were back-projected through a set of mirrors onto a screen positioned at a viewing distance of 79 cm. The visual angle of the stimuli was 6.9° and fell within the parafoveal region of view. Response latency was recorded for each trial. All participants performed the task in the MEG, following a practice session outside the MEG such that they were familiar with and understood the task.
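The trial structure above can be sketched as a simple generator. This is a minimal reconstruction (the study used Presentation; all names here are illustrative): 50 trials per emotion and hemifield, the pattern opposite the face, ISI jittered between 1300 and 1500 ms.

```python
import random

EMOTIONS = ("happy", "angry", "neutral")

def build_trial_list(n_per_cell=50, seed=0):
    """Build the 300-trial sequence: n_per_cell trials per emotion x
    hemifield, shuffled into random order, with a jittered ISI."""
    rng = random.Random(seed)
    trials = [{"emotion": emotion, "face_side": side,
               "pattern_side": "right" if side == "left" else "left"}
              for emotion in EMOTIONS
              for side in ("left", "right")
              for _ in range(n_per_cell)]
    rng.shuffle(trials)
    for trial in trials:
        trial["isi_ms"] = rng.uniform(1300, 1500)  # jittered ISI
    return trials

trials = build_trial_list()
```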
2.4 Neuroimaging data acquisition
MEG data were recorded using a 151-channel CTF MEG system (MISL, Coquitlam, BC, Canada) at a 600 Hz sampling rate in a magnetically shielded room at the Hospital for Sick Children. A third order spatial gradient was used to improve signal quality with a recording bandpass of 0–150 Hz. All participants lay in a supine position with their head in the MEG dewar while they completed the experimental paradigm. Fiducial coils were placed on the left and right pre-auricular points and the nasion to monitor head position and movement within the dewar. These were replaced by radio-opaque markers for MRI co-registration. Each adolescent had a T1-weighted MR image (3D SAG MPRAGE: PAT, GRAPPA = 2, TR/TE/FA = 2300 ms/2.96 ms/90°, FOV = 28.8 × 19.2 cm, 256 × 256 matrix, 192 slices, slice thickness = 1.0 mm isotropic voxels) obtained from a 3T MR scanner (MAGNETOM Tim Trio, Siemens AG, Erlangen, Germany), with a 12-channel head coil. Unique inner skull surfaces were derived using FSL (http://fsl.fmrib.ox.ac.uk/fsl/fslwiki/) from each MR image and multi-sphere head models were fit to this surface (Lalancette et al., 2011).
2.5 Behavioural analyses
Group effects in IQ, response accuracy and response latencies across emotions were assessed using SPSS 20.0 software (SPSS Inc., Chicago, IL). Repeated measures ANCOVAs, with IQ as a covariate, were conducted to examine group (ASD vs. control) and emotion (angry vs. happy) effects.
2.6 Neuroimaging analyses
MEG activity was filtered off-line with a fourth order Butterworth filter with a bandpass of 1–30 Hz. MEG trials were epoched into 650 ms time windows with a 150 ms pre-stimulus baseline. Data were time-locked to trial onset and averaged by emotion type across subjects. Continuous head localization recording allowed us to manually exclude trials with head movement exceeding 10 mm (relative to median head position) using software developed in-house. Artefacts in MEG data were rejected using independent component analysis (ICA; EEGlab, http://www.sccn.ucsd.edu/eeglab), which allowed for the removal of eye and heart artefacts from the data while preserving neural activity of interest. Thirty components were examined for artefacts; there was no between-group difference in the number of components removed (ASD: 2.1 ± 1.2, Controls = 1.5 ± 1.3), t(23) = 1.55, p = 0.13.
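The filtering and epoching steps can be sketched with SciPy, using the parameters stated above (fourth-order Butterworth, 1–30 Hz band-pass, 600 Hz sampling, 650 ms epochs with a 150 ms pre-stimulus baseline). This is a simplified single-channel stand-in for the 151-channel pipeline, with zero-phase filtering assumed.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 600.0  # Hz, sampling rate from the text

def preprocess(data, onsets, fs=FS):
    """Band-pass filter (1-30 Hz, 4th-order Butterworth, zero-phase)
    and epoch a continuous channel into -150..+500 ms windows."""
    b, a = butter(4, [1.0, 30.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, data)
    pre = int(0.150 * fs)    # 150 ms baseline -> 90 samples
    post = int(0.500 * fs)   # 500 ms post-stimulus -> 300 samples
    epochs = np.stack([filtered[t - pre:t + post] for t in onsets])
    # baseline-correct each epoch by its pre-stimulus mean
    epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)
    return epochs

sig = np.random.default_rng(1).standard_normal(6000)  # toy recording
ep = preprocess(sig, onsets=[1000, 2500, 4000])
```

At 600 Hz the 650 ms epoch is 390 samples, of which the first 90 are the baseline; each epoch's baseline mean is zero after correction.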
Activation sources for each emotion were estimated using an event-related vector beamforming (ERB) method (Quraan et al., 2011) developed in-house from 50 to 400 ms, using sliding time windows of 50 ms in duration, overlapping by 25 ms (e.g., 50–100, 75–125 ms) for a total of 13 time windows. To beamform the ICA-cleaned data, the covariance matrix for each subject was regularized by the smallest non-zero eigenvalue (Fatima et al., 2013; Lancaster et al., 2000). Beamformer images with a spatial resolution of 5 mm were normalized to a template using Advanced Normalization Tools (ANTS; http://picsl.upenn.edu/software/ants/).
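The sliding-window scheme (50 ms windows stepping by 25 ms across 50–400 ms) can be written out explicitly; the 13 windows stated above fall out of the arithmetic. The function name is illustrative.

```python
def sliding_windows(start=50, stop=400, width=50, step=25):
    """Overlapping analysis windows in ms: 50-ms windows stepping
    by 25 ms, spanning start..stop (e.g. 50-100, 75-125, ...)."""
    return [(t, t + width) for t in range(start, stop - width + step, step)]

wins = sliding_windows()
```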
While neutral faces were originally intended as emotional baselines, the ‘emotional neutrality’ of neutral faces has been subject to debate given their ambiguity and tendency to be misinterpreted, especially in children (Carrera-Levillain & Fernandez-Dols, 1994; Lobaugh et al., 2006; Thomas et al., 2001). In light of similar findings in individuals with ASD, who have recently been shown to misinterpret neutral faces and assign negative valence to emotionally neutral faces (Eack et al., 2014), we chose not to use neutral faces as an emotional baseline. Instead, two-sample unpaired non-parametric random permutation tests (p < 0.05, Sidak-corrected for multiple comparisons, 10,000 permutations) were conducted on beamformer images to determine significant differences between group activity to happy and angry emotions. Non-parametric random permutation tests determine whether the sample means from two groups, in this case, ASD and controls, are from the same distribution, with the null hypothesis being that the two groups come from the same distribution. The distribution of the test statistic under the null hypothesis is obtained by recomputing the difference in sample means over many re-shufflings of group membership of participants' voxel values. The set of these recalculated differences forms the null distribution, and the null hypothesis is rejected if the observed difference falls beyond the alpha level (α = 0.05) in this distribution.
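The permutation procedure described above can be sketched as follows. This is illustrative only: the group sizes match the study (24 per group), but the values are synthetic, and the real analysis operates on whole beamformer images rather than single values per participant.

```python
import numpy as np

def sidak_alpha(alpha, m):
    """Per-comparison alpha under a Sidak correction for m tests."""
    return 1.0 - (1.0 - alpha) ** (1.0 / m)

def permutation_test(x, y, n_perm=10_000, seed=0):
    """Two-sample permutation test on the difference in means.
    Returns a two-tailed p-value under the null hypothesis that
    x and y come from the same distribution."""
    rng = np.random.default_rng(seed)
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    n_x = len(x)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # re-shuffle group membership
        diff = abs(pooled[:n_x].mean() - pooled[n_x:].mean())
        exceed += diff >= observed
    return exceed / n_perm

rng = np.random.default_rng(42)
controls = rng.normal(0.0, 1.0, 24)  # one pseudo-Z value per control
asd = rng.normal(1.5, 1.0, 24)       # shifted values for the ASD group
p = permutation_test(controls, asd, n_perm=2000)
```

With a large synthetic group difference, the observed mean difference exceeds nearly every shuffled difference, so the returned p-value is small; the Sidak helper then gives the per-window threshold when many overlapping time windows are tested.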
Analysis of Functional Neuroimages (AFNI; Cox, 1996) was used to visualize images and in-house developed software was used to identify peak activity. To investigate between-group differences in the timing of emotion-related neural activation, the time courses of significant areas of peak activity identified from the average contrasts were computed. Two-sample non-parametric permutation tests (p < 0.05, Sidak-corrected, 10,000 permutations) were conducted on each normalized time point (pseudo-Z) to identify emotion-relevant between-group differences in neural activity across time. Lastly, MRIcron software (Rorden, 2012) was used to create 3D renderings of between group activations on spatially normalized brain images.
3 Results
3.1 Behavioural results
3.1.1 IQ
The 2-sub-test IQ scores were significantly lower in adolescents with ASD (M = 90.79, SD = 23.76) than in controls (M = 110.04, SD = 12.21), t(46) = –3.53, p = 0.001.
3.1.2 Accuracy
A 2 (emotion: happy, angry) × 2 (group: ASD, controls) repeated measures ANCOVA with IQ as a covariate showed a between-group effect on accuracy, F(1, 42) = 4.32, p = 0.04. Follow-up ANCOVAs revealed lower accuracy in responses (proportion correct) to angry faces (ASD: M = 0.87, SD = 0.10; Controls: M = 0.95, SD = 0.04; F(1, 42) = 5.53, p = 0.02). No other significant effects or interactions were found.
3.1.3 Response latency
A 2 (emotion: happy, angry) × 2 (group: ASD, controls) ANCOVA with IQ as a covariate showed no main or interaction effects on response latency.
3.2 MEG results: time course and source localization of neural activity
Significant between-group activations for happy and angry faces (p < 0.05, two-tailed, Sidak-corrected for multiple comparisons, 10,000 permutations) are listed in Table 1.
3.2.1 Angry
Apart from earlier and larger orbitofrontal activation in controls (75–125 ms; Fig. 2A), relative to adolescents with ASD, the other early activations (50–175 ms) in the presence of angry faces were all larger in the ASD group. These activations were lateralized to the left hemisphere, including the inferior frontal, inferior parietal and particularly the middle temporal areas. These latter two regions also showed greater activation in the ASD group later in the time-course: inferior parietal (275–325 ms) and middle temporal (350–400 ms) gyri. Otherwise, in response to angry faces, there was significantly greater activity in the typically developing adolescents from 150 to 375 ms, involving the bilateral middle and orbital frontal areas, right middle temporal (250–350 ms), left ACC (225–275 ms; Fig. 2B), as well as right inferior temporal and supramarginal gyri (200–250 ms), and right posterior cingulate (325–375 ms; Fig. 2C).
3.2.2 Happy
To happy faces, the adolescents with ASD again showed early left-sided activity in the left middle temporal area (75–175 ms), and left angular (100–175 ms) and supramarginal gyri (150–200 ms). The left inferior temporal region also showed greater activation (275–325 ms) in the ASD group. Otherwise, there was extensive, greater activity in the control adolescents, almost entirely in the right hemisphere. The right inferior temporal gyrus showed sustained greater activity from 150 to 400 ms, with the middle temporal region more active from 225 to 275 ms. To the happy faces, controls also had greater activity than the ASD group in the right ACC (100–150 ms; Fig. 2D), supramarginal (150–200 ms) and angular (225–325 ms) gyri, in the superior, middle and inferior frontal areas between 200 and 400 ms, all in the right hemisphere, and in the right PCC (200–250 ms; 350–400 ms; Fig. 2E, Fig. 2F). The only left hemisphere regions with greater activation in the controls were in the left superior medial (100–150 ms) and orbital frontal cortices (150–200 ms) and in the left insula (350–400 ms; Fig. 2F).
4 Discussion
The present study examined neural activity during emotional face processing in adolescents with ASD compared to typically developing adolescents using MEG. This approach allowed us to determine both the timing and spatial localization of ongoing brain activity. Behavioural analyses showed poorer anger-specific accuracy in ASD, while neuroimaging analyses revealed atypical recruitment of neural regions in ASD during angry and happy face processing.
4.1 Behavioural results
While there were no significant between- or within-group differences in response latency, adolescents with ASD showed significantly lower accuracy, relative to controls, on trials with angry faces only. Despite poorer accuracy, however, the absence of significant between-group differences in response latencies indicates a lack of attentional bias across emotions or groups. A deficit in anger processing corroborates previous findings, as young individuals with ASD have been shown to make significantly more errors in identifying angry faces and also to mislabel faces with ambiguous emotions as angry more frequently than controls (Kuusikko et al., 2009; Philip et al., 2010). The lack of difference in response latency, despite poorer anger-specific accuracy, suggests that the two groups experienced comparable task difficulty in processing the two emotions. Hence, our behavioural findings are consistent with the concept that social difficulties experienced by individuals with ASD may be, in part, attributable to the inaccurate perception or interpretation of other individuals' facial expressions.
4.2