PMC:4300004

Early neural activation during facial affect processing in adolescents with Autism Spectrum Disorder

Abstract

Impaired social interaction is one of the hallmarks of Autism Spectrum Disorder (ASD). Emotional faces are arguably the most critical visual social stimuli, and the ability to perceive, recognize, and interpret emotions is central to social interaction and communication, and subsequently to healthy social development. However, our understanding of the neural and cognitive mechanisms underlying emotional face processing in adolescents with ASD is limited. We recruited 48 adolescents: 24 with high-functioning ASD and 24 typically developing controls. Participants completed an implicit emotional face processing task in the MEG. We examined spatiotemporal differences in neural activation between the groups during implicit angry and happy face processing. While there were no differences in response latencies between groups across emotions, adolescents with ASD had lower accuracy on the implicit emotional face processing task when the trials included angry faces. MEG data showed atypical neural activity in adolescents with ASD during angry and happy face processing, including atypical activity in the insula, anterior and posterior cingulate, and temporal and orbitofrontal regions. Our findings demonstrate differences in neural activity during happy and angry face processing between adolescents with and without ASD. These differences in activation in social cognitive regions may index the difficulties in face processing and in comprehension of social reward and punishment in the ASD group. Thus, our results suggest that atypical neural activation contributes to impaired affect processing, and thus social cognition, in adolescents with ASD.

Highlights

• The ability to recognize and interpret emotions is central to social interaction.
• Deficits in social interactions are hallmarks of autism spectrum disorder (ASD).
• Adolescents with and without ASD completed an emotional face task in MEG.
• MEG data showed atypical neural activity in ASD to both angry and happy faces.
• Insula, cingulate, temporal and orbitofrontal activities were particularly affected in the ASD group.

1 Introduction

Emotional face processing is an innate and universal ability that is integral to the acquisition of social skills (Ekman & Friesen, 1971; Meltzoff & Moore, 1977). The human face is the most important visual stimulus for human social interactions. The ability to extract the significance of expressive faces is critical for successful social interactions, as it facilitates the understanding of another's mental states and intentions and guides appropriate reciprocal behaviour. Impaired social functioning is one of the diagnostic hallmarks of Autism Spectrum Disorder (ASD). While it is generally understood that individuals with ASD experience difficulties with social cues, the literature on emotional face processing in ASD is inconsistent: some studies report deficits in emotional processing (e.g., Celani et al., 1999; Eack et al., 2014; García-Villamisar et al., 2010; Golan et al., 2008), including impairments in processing fear (Ashwin et al., 2007; Howard et al., 2000; Pelphrey et al., 2002), surprise (Baron-Cohen et al., 1993) and anger (Kuusikko et al., 2009), while others have noted no deficits (Adolphs et al., 2001; Balconi & Carrera, 2007; Buitelaar et al., 1999; Castelli, 2005; Tracy et al., 2011).

In typical development, emotional face processing is associated with activation in a widespread neural network encompassing the visual, limbic, temporal, temporoparietal and prefrontal regions (Blair et al., 1999; Fusar-Poli et al., 2009; Phillips et al., 1999; Vuilleumier & Pourtois, 2007).
The processing of happy facial expressions implicates a number of structures, including the amygdalae, insulae, cingulate, inferior, medial and middle frontal, fusiform and middle temporal areas (Breiter et al., 1996; Devinsky et al., 1995; Fusar-Poli et al., 2009; García-Villamisar et al., 2010; Hoehl et al., 2010; Kesler-West et al., 2001; Phillips et al., 1998; Thomas et al., 2001). Angry faces elicit cingulate, bilateral fusiform, inferior frontal, superior temporal, middle and medial superior frontal, and orbitofrontal activity (Blair et al., 1999; Devinsky et al., 1995; Fusar-Poli et al., 2009; García-Villamisar et al., 2010; Kesler-West et al., 2001); moreover, decreased activation to angry, relative to neutral, faces has been found in the caudate nucleus, superior temporal, anterior cingulate (ACC), and medial frontal regions (Phillips et al., 1999). While the amygdalae have been widely reported to be involved in threat processing, amygdala activation to angry faces has not been found consistently (e.g., Phillips et al., 1999; Luo et al., 2007; Whalen et al., 2001; but see Fusar-Poli et al., 2009).

fMRI studies have found atypical activation of social brain networks during emotional face processing in adults with ASD, including reduced left amygdala and orbitofrontal activation (Ashwin et al., 2007); greater activity in the left superior temporal gyrus and right peristriate visual cortex (Critchley et al., 2000); and reduced fusiform and extrastriate activity alongside activation comparable to controls in the anterior cingulate, superior temporal, medial frontal and insula regions (Deeley et al., 2007). Children and adolescents with ASD have shown reduced fusiform but greater precuneus activity during an emotion-matching task, but not during a simpler emotion-labelling task (Wang et al., 2004).
Collectively, these studies suggest that task demands modulate neural activity during emotional processing to a greater extent in individuals with ASD, and this has likely contributed to the discrepant findings in the existing literature (Wang et al., 2004). Of the two major subsystems involved in social cognition, the ventral orbitofrontal–amygdala circuit is of particular relevance to ASD, as it is implicated in socio-emotional regulation and behaviour through the processing of others' emotional states, responses and intentions (see Bachevalier & Loveland, 2006, for a review). The orbitofrontal cortex contributes to this subsystem by mediating emotional behaviour, social inhibition, reversal learning and the altering of unsuitable behaviour (Blair et al., 1999; Dias et al., 1996; Elliott et al., 2000; Rolls, 2004; Van Honk et al., 2005), as well as the derivation of positive social joy and reward (Britton et al., 2006; Morris & Dolan, 2001). The amygdala shares close reciprocal functional and anatomical interconnections with the orbitofrontal cortex, and atypical activation of the left amygdala during affect processing has been noted in ASD (e.g., Corbett et al., 2009; Critchley et al., 2000). Hence, focusing on orbitofrontal–amygdala circuit dysfunction may aid in delineating the profile of facial affect processing in ASD.

1.1 The spatiotemporal profile of neural activity during affect processing in ASD

There are few studies on the timing of the neural processing of emotional faces in ASD. ERP studies have variously noted delayed, reduced or absent activity in children and adults with ASD relative to controls (Batty et al., 2011; Dawson et al., 2004; McPartland et al., 2004; O'Connor et al., 2005; Wong et al., 2008). A recent ERP study found that adolescents with ASD failed to show the emotion-specific responses observed in the typically developing group (Wagner et al., 2013).
Further, while face scanning was significantly associated with patterns of neural activation during face processing in typically developing adolescents, no such association was found in ASD (Wagner et al., 2013). Significant impairments in emotion recognition in adult, but not child, faces have also been noted in adolescents with ASD (Lerner et al., 2013). The same study reported a significant association between the latency of the face-sensitive N170 component, localized to the fusiform gyri and inferior temporal areas, and deficits in emotion recognition in adolescents with ASD. One study investigated affective processing in adolescents with ASD using magnetoencephalography (MEG), which provides timing information together with more accurate spatial information than EEG (Hari et al., 2010); it showed reduced early emotion-specific gamma-band power relative to controls, as well as later decreased power across all emotions (Wright et al., 2012). No other studies appear to have taken advantage of the spatiotemporal precision of MEG to determine the timing and localization of activation differences to emotional faces in developmental ASD populations.

1.2 Anger processing

While emotional facial expressions of fear and anger both constitute threat-relevant signals, angry faces are more appropriate than fearful faces for investigating emotional face processing in ASD. Processing of anger requires an understanding of social norms and context, topics with which individuals with ASD struggle (Berkowitz, 1999; Zeman & Garber, 1996). Angry facial expressions are also more likely to be produced in response to repeated aggravations than to first-time behavioural transgressions (Averill, 1982). Given that individuals with ASD are often poor at recognizing others' mental states or social norms, they have likely encountered displays of anger without understanding the implications (Bal et al., 2010; Baron-Cohen et al., 1999; Begeer et al., 2006).
Past studies have specifically noted atypical angry face processing in individuals with ASD (Ashwin et al., 2006; Kuusikko et al., 2009; Rieffe et al., 2007; Rump et al., 2009). Age effects in anger processing have also been noted: older children and adolescents correctly identify angry faces more often than younger children, regardless of diagnosis (Lindner & Rosén, 2006). There are also sharp increases in anger-specific sensitivity from adolescence to adulthood, supporting a later maturation of anger processing (Thomas et al., 2007). An fMRI study contrasting activations elicited by fear and anger in typical individuals showed that while both emotions similarly activated a network of regions, including the amygdalae and insulae, anger specifically elicited activation in a wider set of regions, including the ventromedial prefrontal cortex and the posterior orbitofrontal cortex (Pichon et al., 2009). These areas have been implicated in behavioural regulation, supporting the idea that anger processing requires more resources and contextual information to adjust behaviour accordingly (Pichon et al., 2009).

1.3 Happiness processing in ASD

Happiness is the only one of the six basic emotions that is unambiguously positive, and it is the first emotion to be accurately identified in early development (Markham & Adams, 1992). Seemingly typical processing of happy affect has been observed in individuals with ASD, which may be due to the greater frequency of encountering, and hence greater familiarity with, happy faces (Critchley et al., 2000; Farran et al., 2011). However, while happy faces are socially rewarding for typically developing individuals (Phillips et al., 1998), individuals with ASD have shown insensitivity to the social reward derived from happy faces (Sepeta et al., 2012).
Hence, it is important to investigate the neural processes underlying happy face processing in ASD, to determine whether social reward systems are appropriately activated.

1.4 Adolescence

In typically developing individuals, recognition of emotional facial expressions follows a protracted developmental trajectory that extends into late adolescence (e.g., Batty & Taylor, 2006; Kolb et al., 1992). Asynchronous development of emotion recognition is well established (De Sonneville et al., 2002; Pichon et al., 2009), and neuroimaging findings have demonstrated the shifting involvement of different neural networks over the course of development (Hung et al., 2012; Killgore & Yurgelun-Todd, 2007; Monk et al., 2003; Thomas et al., 2001). At around 11 years of age, emotional processing abilities undergo marked improvement, implying greater demands on the neural processes involved in emotional processing in early adolescence (Tonks et al., 2007). There are few studies on emotional face processing in adolescents with ASD, a serious gap in the literature, as adolescence is a period of vulnerability, volatility and increased stress (Spear, 2000) during which the prevalence of negative emotions peaks and shows greater variability (Compas et al., 1995; Hare et al., 2008). Behavioural studies of emotional face processing in adolescents with ASD have yielded mixed findings (Grossman et al., 2000; Rieffe et al., 2007): adolescents with ASD showed deficits in decoding emotions from static and dynamic facial affect overall and, although they readily identify specific emotions (Rump et al., 2009) such as happiness and sadness, they have difficulties with more complex emotions such as embarrassment (Capps et al., 1992) and with identifying emotions shown in videos (Koning & Magill-Evans, 2001).
Individuals with Asperger Syndrome, but not high-functioning autism, have shown intact emotion perception (Mazefsky & Oswald, 2007). Adults with ASD continue to experience difficulties when presented with brief or subtle emotions, and may never reach the same level of competency in emotional processing as typically developing adults (Begeer et al., 2006; Rieffe et al., 2007). The impairment in emotional processing in adults with ASD is consistent with the notion that individuals with ASD require compensatory strategies to achieve average performance. Although competency in emotion recognition in ASD improves with age, this skill often plateaus at a performance level below that of typically developing peers, or requires atypical brain activations to achieve comparable performance. These results underscore the need to examine the adolescent period of emotional processing in ASD to obtain a clearer understanding of emotional development in this population.

Thus, the present study explored the neural substrates of implicit emotional face processing in adolescents with ASD using MEG, which provides both temporal and spatial measures of brain processes, focusing on the neural areas implicated in emotional processing. We hypothesized that adolescents with ASD would show (1) shorter response latencies to emotional faces, as they would be less distracted by the emotional content, and (2) reduced and delayed patterns of neural activation in the frontal, limbic and temporal areas, key constituents of the social brain.

2 Materials and methods

2.1 Participants

Twenty-four adolescents (age range = 12–15 years) diagnosed with ASD (20 males, 14.03 ± 1.20 years, 23 right-handed, 7 medicated, IQ = 90.79 ± 23.76) and 24 healthy controls (19 males, 14.27 ± 1.12 years, 23 right-handed, IQ = 110.04 ± 12.21) were recruited.
Participants in the clinical group had a diagnosis of ASD, informed by the Autism Diagnostic Observation Schedule-Generic (ADOS-G; Lord et al., 2000) or the Autism Diagnostic Observation Schedule-2 (ADOS-2; Rutter et al., 2012) and confirmed by expert clinical judgment. Exclusion criteria for both groups included a history of neurological or neurodevelopmental disorders (other than ASD for participants in the clinical group), acquired brain injury, uncorrected vision, colour blindness, IQ ≤ 65 and standard contraindications to MEG and MRI. Use of psychotropic medications was an exclusion criterion for control participants only, owing to the difficulty of recruiting medication-naïve adolescents with ASD. Seven participants with ASD were on medication, which included Bicentin, Celexa, Cipralex, Concerta, Gabapentin, Ritalin, Seroquel, Strattera and Zeldox. The study was approved by The Hospital for Sick Children Research Ethics Board, and written informed consent was obtained from all participants and their legal guardians.

2.2 Characterization measures

2.2.1 Autism Diagnostic Observation Schedule

The ADOS-G (Lord et al., 2000) and ADOS-2 (Rutter et al., 2012) are semi-structured clinical assessments for diagnosing and assessing ASD. Research-reliable team members administered either module 3 or module 4 of the ADOS-G or ADOS-2 to all eligible participants with ASD. The mean and standard deviation of total scores (ADOS-G and ADOS-2 pooled) was 11.30 ± 3.30, well above the clinical threshold.

2.2.2 Wechsler Abbreviated Scale of Intelligence (WASI)

IQ was estimated using two subtests (vocabulary and matrix reasoning) of the Wechsler Abbreviated Scale of Intelligence (WASI-2; Wechsler, 2002). The two-subtest WASI-2 is quicker to complete and has validity comparable to the full WASI-2.

2.3 MEG task

The task stimuli consisted of a face (happy, angry or neutral) presented concurrently with a scrambled pattern, each on either side of a central fixation cross (Fig. 1). Twenty-five colour photographs of different faces (13 males, 12 females) for each of the three expressions were selected from the NimStim Set of Facial Expressions; only happy and angry faces correctly categorized at a minimum of 80% accuracy were selected (Tottenham et al., 2009). To create unique scrambled patterns corresponding to each face, each of the selected faces from the NimStim set was divided into 64 cells and randomized; a mosaic was applied to the image (15 cells per square), after which a Gaussian blur (10.0°) was applied using Adobe Photoshop. Face–pattern pairs were matched for luminosity and colour. Fifty trials of each of the three expressions in the left and right hemifields (each face was presented twice in each hemifield) were shown in randomized order, for 300 trials in total. Emotions were irrelevant to the task: participants were instructed to fixate on the central cross and respond to the location of the scrambled pattern by pressing the left or right button on a button box. Stimuli were presented using Presentation (http://www.neurobs.com/). Stimuli in each trial were presented for 80 ms, with an inter-stimulus interval varying from 1300 to 1500 ms. Images were back-projected through a set of mirrors onto a screen positioned at a viewing distance of 79 cm. The stimuli subtended a visual angle of 6.9°, falling within the parafoveal region of view. Response latency was recorded for each trial. All participants performed the task in the MEG, following a practice session outside the MEG to ensure that they were familiar with and understood the task.

2.4 Neuroimaging data acquisition

MEG data were recorded using a 151-channel CTF MEG system (MISL, Coquitlam, BC, Canada) at a 600 Hz sampling rate in a magnetically shielded room at The Hospital for Sick Children. A third-order spatial gradient was used to improve signal quality, with a recording bandpass of 0–150 Hz.
All participants lay supine with their head in the MEG dewar while they completed the experimental paradigm. Fiducial coils were placed on the left and right pre-auricular points and the nasion to monitor head position and movement within the dewar; these were replaced by radio-opaque markers for MRI co-registration. Each adolescent had a T1-weighted MR image (3D SAG MPRAGE: PAT, GRAPPA = 2, TR/TE/FA = 2300 ms/2.96 ms/90°, FOV = 28.8 × 19.2 cm, 256 × 256 matrix, 192 slices, slice thickness = 1.0 mm isotropic voxels) obtained on a 3T MR scanner (MAGNETOM Tim Trio, Siemens AG, Erlangen, Germany) with a 12-channel head coil. Unique inner-skull surfaces were derived from each MR image using FSL (http://fsl.fmrib.ox.ac.uk/fsl/fslwiki/), and multi-sphere head models were fit to these surfaces (Lalancette et al., 2011).

2.5 Behavioural analyses

Group effects in IQ, response accuracy and response latencies across emotions were assessed using SPSS 20.0 (SPSS Inc., Chicago, IL). Repeated-measures ANCOVAs, with IQ as a covariate, were conducted to examine group (ASD vs. control) and emotion (angry vs. happy) effects.

2.6 Neuroimaging analyses

MEG activity was filtered off-line with a fourth-order Butterworth filter (bandpass 1–30 Hz). MEG trials were epoched into 650 ms time windows with a 150 ms pre-stimulus baseline. Data were time-locked to trial onset and averaged by emotion type across subjects. Continuous head-position recording allowed us to manually exclude trials with head movement exceeding 10 mm (relative to the median head position) using software developed in-house. Artefacts in the MEG data were rejected using independent component analysis (ICA; EEGLAB, http://www.sccn.ucsd.edu/eeglab), which removed eye and cardiac artefacts while preserving the neural activity of interest.
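The epoching and baseline-correction step described above (650 ms epochs with a 150 ms pre-stimulus baseline, at a 600 Hz sampling rate) can be sketched in NumPy. This is a minimal illustrative sketch with hypothetical variable and function names, not the in-house pipeline:

```python
import numpy as np

FS = 600                  # MEG sampling rate (Hz), as reported in Section 2.4
PRE, POST = 0.150, 0.500  # 150 ms baseline + 500 ms post-stimulus = 650 ms epoch

def epoch_and_baseline(data, onsets):
    """Cut continuous data (n_channels x n_samples) into epochs around
    stimulus-onset sample indices and subtract the pre-stimulus mean."""
    n_pre, n_post = round(PRE * FS), round(POST * FS)   # 90 and 300 samples
    epochs = []
    for t in onsets:
        ep = data[:, t - n_pre : t + n_post].astype(float)
        ep -= ep[:, :n_pre].mean(axis=1, keepdims=True)  # baseline correction
        epochs.append(ep)
    return np.stack(epochs)  # n_trials x n_channels x 390 samples
```

At 600 Hz a 650 ms epoch spans 390 samples, of which the first 90 form the baseline; averaging the baseline-corrected epochs by emotion type then yields the evoked responses.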
Thirty components were examined for artefacts; there was no between-group difference in the number of components removed (ASD: 2.1 ± 1.2; controls: 1.5 ± 1.3), t(23) = 1.55, p = 0.13.

Activation sources for each emotion were estimated from 50 to 400 ms using an event-related vector beamforming (ERB) method developed in-house (Quraan et al., 2011), with sliding time windows 50 ms in duration overlapping by 25 ms (e.g., 50–100, 75–125 ms), for a total of 13 time windows. To beamform the ICA-cleaned data, the covariance matrix for each subject was regularized by the smallest non-zero eigenvalue (Fatima et al., 2013; Lancaster et al., 2000). Beamformer images with a spatial resolution of 5 mm were normalized to a template using Advanced Normalization Tools (ANTS; http://picsl.upenn.edu/software/ants/).

While neutral faces were originally intended as emotional baselines, the ‘emotional neutrality’ of neutral faces has been debated, given their ambiguity and tendency to be misinterpreted, especially by children (Carrera-Levillain & Fernandez-Dols, 1994; Lobaugh et al., 2006; Thomas et al., 2001). In light of similar findings in individuals with ASD, who have been shown to misinterpret neutral faces and assign them negative valence (Eack et al., 2014), we chose not to use neutral faces as an emotional baseline. Instead, two-sample unpaired non-parametric random permutation tests (p < 0.05, Sidak-corrected for multiple comparisons, 10,000 permutations) were conducted on the beamformer images to determine significant between-group differences in activity to happy and angry faces. Non-parametric random permutation tests determine whether the sample means of two groups (here, ASD and controls) come from the same distribution, the null hypothesis being that they do.
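A two-sample permutation test of this kind can be sketched in a few lines. This is a minimal illustration on a single variable with made-up data, not the in-house voxel-wise implementation; the Sidak correction used to control the family-wise error over the many comparisons is also shown:

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_test(a, b, n_perm=10_000):
    """Two-sample permutation test on the difference in sample means.
    Returns a two-tailed p-value under random relabelling of group membership."""
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random reassignment of group labels
        diff = pooled[:len(a)].mean() - pooled[len(a):].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return count / n_perm

def sidak_alpha(m, alpha=0.05):
    """Sidak-corrected per-test alpha for m comparisons at family-wise alpha."""
    return 1 - (1 - alpha) ** (1 / m)
```

For clearly separated groups the observed mean difference is extreme relative to the shuffled differences and the p-value is small; for groups drawn from the same distribution it is not.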
The distribution of the test statistic under the null hypothesis is obtained by recomputing the statistic over many re-shufflings of the group membership of participants' voxel values: the difference in sample means is calculated and recorded for each re-shuffling, and the set of these differences forms the null distribution. The null hypothesis is rejected if the observed value falls in the tail beyond the alpha level (α = 0.05). Analysis of Functional Neuroimages (AFNI; Cox, 1996) was used to visualize images, and in-house software was used to identify peak activity.

To investigate between-group differences in the timing of emotion-related neural activation, time courses were computed for the significant peak-activity areas identified from the average contrasts. Two-sample non-parametric permutation tests (p < 0.05, Sidak-corrected, 10,000 permutations) were conducted at each normalized time point (pseudo-Z) to identify emotion-relevant between-group differences in neural activity across time. Lastly, MRIcron software (Rorden, 2012) was used to create 3D renderings of between-group activations on spatially normalized brain images.

3 Results

3.1 Behavioural results

3.1.1 IQ

Two-subtest IQ scores were significantly lower in adolescents with ASD (M = 90.79, SD = 23.76) than in controls (M = 110.04, SD = 12.21), t(46) = –3.53, p = 0.001.

3.1.2 Accuracy

A 2 (emotion: happy, angry) × 2 (group: ASD, controls) repeated-measures ANCOVA with IQ as a covariate showed a between-group effect on accuracy, F(1, 42) = 4.32, p = 0.04. Follow-up ANCOVAs revealed lower accuracy (proportion correct) in the ASD group for angry faces (ASD: M = 0.87, SD = 0.10; controls: M = 0.95, SD = 0.04; F(1, 42) = 5.53, p = 0.02). No other significant effects or interactions were found.
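The group IQ difference reported in Section 3.1.1 can be checked from the summary statistics alone: with equal group sizes, the two-sample t statistic is the mean difference divided by the standard error built from the two group variances. An illustrative check:

```python
import math

# Group summary statistics reported in Section 3.1.1
m_asd, sd_asd, n_asd = 90.79, 23.76, 24
m_ctl, sd_ctl, n_ctl = 110.04, 12.21, 24

# With equal n, the pooled and Welch standard errors coincide
se = math.sqrt(sd_asd**2 / n_asd + sd_ctl**2 / n_ctl)
t = (m_asd - m_ctl) / se
df = n_asd + n_ctl - 2

print(f"t({df}) = {t:.2f}")  # matches the reported t(46) = -3.53
```

This recovers the reported value, confirming the internal consistency of the group statistics.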
3.1.3 Response latency

A 2 (emotion: happy, angry) × 2 (group: ASD, controls) ANCOVA with IQ as a covariate showed no main or interaction effects on response latency.

3.2 MEG results: time course and source localization of neural activity

Significant between-group activations for happy and angry faces (p < 0.05, two-tailed, Sidak-corrected for multiple comparisons, 10,000 permutations) are listed in Table 1.

3.2.1 Angry

Apart from earlier and larger orbitofrontal activation in controls (75–125 ms; Fig. 2A) relative to adolescents with ASD, the other early activations (50–175 ms) to angry faces were all larger in the ASD group. These activations were lateralized to the left hemisphere, including the inferior frontal, inferior parietal and, particularly, the middle temporal areas. The latter two regions also showed greater activation in the ASD group later in the time course: inferior parietal (275–325 ms) and middle temporal (350–400 ms) gyri. In contrast, there was significantly greater activity to angry faces in the typically developing adolescents from 150 to 375 ms, involving the bilateral middle and orbital frontal areas, right middle temporal (250–350 ms), left ACC (225–275 ms; Fig. 2B), right inferior temporal and supramarginal gyri (200–250 ms), and right posterior cingulate (325–375 ms; Fig. 2C).

3.2.2 Happy

To happy faces, the adolescents with ASD again showed early left-sided activity, in the left middle temporal area (75–175 ms), left angular gyrus (100–175 ms) and supramarginal gyrus (150–200 ms). The left inferior temporal region also showed greater activation (275–325 ms) in the ASD group. Otherwise, there was extensive greater activity in the control adolescents, almost entirely in the right hemisphere: the right inferior temporal gyrus showed sustained greater activity from 150 to 400 ms, with the middle temporal region more active from 225 to 275 ms.
To the happy faces, controls also had greater activity than the ASD group in the right ACC (100–150 ms; Fig. 2D), supramarginal (150–200 ms) and angular (225–325 ms) gyri, in the superior, middle and inferior frontal areas between 200 and 400 ms, all in the right hemisphere, and in the right PCC (200–250 ms; 350–400 ms; Fig. 2E, Fig. 2F). The only left-hemisphere regions with greater activation in the controls were the left superior medial (100–150 ms) and orbital frontal cortices (150–200 ms) and the left insula (350–400 ms; Fig. 2F).

4 Discussion

The present study used MEG to examine neural activity during emotional face processing in adolescents with ASD compared to typically developing adolescents, allowing us to determine both the timing and the spatial localization of ongoing brain activity. Behavioural analyses showed poorer anger-specific accuracy in ASD, while neuroimaging analyses revealed atypical recruitment of neural regions in ASD during both angry and happy face processing.

4.1 Behavioural results

While there were no significant between- or within-group differences in response latency, adolescents with ASD showed significantly lower accuracy than controls on trials with angry faces only. Despite this poorer accuracy, the absence of significant between-group differences in response latencies indicates a lack of attentional bias across emotions or groups. A deficit in anger processing corroborates previous findings, as young individuals with ASD have been shown to make significantly more errors in identifying angry faces and to mislabel ambiguous emotional faces as angry more frequently than controls (Kuusikko et al., 2009; Philip et al., 2010). The lack of difference in response latency, despite poorer anger-specific accuracy, suggests comparable subjective task difficulty between groups in processing the two emotions.
Hence, our behavioural findings are consistent with the idea that the social difficulties experienced by individuals with ASD may be attributable, in part, to inaccurate perception or interpretation of other individuals' facial expressions.

4.2
T509 16300-16310 PATO_0001411 denotes structured
T513 16478-16490 Patient denotes participants
T510 16523-16532 PATO_0002175 denotes deviation
T511 16627-16636 PATO_0000152 denotes threshold
T512 16882-16886 PATO_0000165 denotes time
T526 17144-17150 PATO_0000014 denotes colour
T527 17186-17191 Patient denotes males
T528 17196-17203 Patient denotes females
T529 17425-17431 PATO_0000430 denotes unique
T530 17617-17623 PATO_0000413 denotes square
T531 17747-17753 PATO_0000014 denotes colour
T532 17808-17812 PATO_0000366 denotes left
T533 17817-17822 PATO_0000367 denotes right
T534 18004-18016 Patient denotes participants
T535 18083-18091 PATO_0000140 denotes location
T536 18129-18133 PATO_0000366 denotes left
T537 18137-18142 PATO_0000367 denotes right
T538 18230-18233 CHEBI_17905 denotes com
T539 18416-18424 PATO_0000040 denotes distance
T540 18439-18445 PATO_0000234 denotes visual
T541 18446-18451 PATO_0002326 denotes angle
T542 18532-18539 PATO_0001005 denotes latency
T543 18573-18585 Patient denotes participants
T544 18672-18676 Patient denotes they
T630 18866-18870 PATO_0000161 denotes rate
T631 18928-18936 Patient denotes Children
T632 18996-19003 PATO_0000001 denotes quality
T633 19047-19059 Patient denotes participants
T634 19076-19084 PATO_0000140 denotes position
T635 19124-19128 Patient denotes they
T636 19200-19204 PATO_0000366 denotes left
T637 19209-19214 PATO_0000367 denotes right
T638 19267-19275 PATO_0000140 denotes position
T639 19330-19335 CHEBI_33325 denotes radio
T640 19336-19342 PATO_0000963 denotes opaque
T641 19381-19391 PATO_0001189 denotes adolescent
T645 19381-19391 Patient denotes adolescent
T642 19435-19438 CHEBI_74926 denotes PAT
T643 19542-19551 PATO_0000915 denotes thickness
T644 19690-19696 PATO_0000430 denotes Unique
T736 20358-20362 PATO_0000165 denotes time
T737 20418-20422 PATO_0000165 denotes time
T738 20481-20489 Patient denotes subjects
T739 20491-20501 PATO_0000689 denotes Continuous
T740 20628-20636 PATO_0000140 denotes position
T741 20747-20750 CHEBI_29202 denotes ICA
T742 20979-20984 CHEBI_24433 denotes group
T743 21003-21012 PATO_0001555 denotes number of
T744 21276-21280 PATO_0000165 denotes time
T745 21301-21309 PATO_0001309 denotes duration
T746 21311-21322 PATO_0002488 denotes overlapping
T747 21376-21380 PATO_0000165 denotes time
T748 21406-21409 CHEBI_29202 denotes ICA
T749 21455-21462 Patient denotes subject
T750 21650-21658 PATO_0000694 denotes Advanced
T751 21848-21855 Patient denotes subject
T752 21892-21900 PATO_0002360 denotes tendency
T753 21937-21945 Patient denotes children
T754 22036-22041 PATO_0000665 denotes light
T755 22036-22041 CHEBI_30212 denotes light
T756 22087-22090 Patient denotes who
T757 22384-22392 PATO_0002118 denotes multiple
T758 22508-22513 CHEBI_24433 denotes group
T759 22694-22706 PATO_0000060 denotes distribution
T760 22800-22812 PATO_0000060 denotes distribution
T772 22959-22971 Patient denotes participants
T761 23071-23076 CHEBI_24433 denotes group
T762 23140-23152 PATO_0000060 denotes distribution
T763 23257-23262 PATO_0000002 denotes value
T764 23280-23285 CHEBI_30216 denotes alpha
T765 23316-23326 PATO_0001510 denotes Functional
T766 23477-23482 CHEBI_24433 denotes group
T767 23551-23555 PATO_0000165 denotes time
T768 23782-23786 PATO_0000165 denotes time
T769 23841-23846 CHEBI_24433 denotes group
T770 23885-23889 PATO_0000165 denotes time
T771 23975-23980 CHEBI_24433 denotes group
T857 24133-24144 Patient denotes adolescents
T851 24297-24302 CHEBI_24433 denotes group
T852 24384-24389 CHEBI_24433 denotes group
T853 24493-24503 PATO_0001470 denotes proportion
T854 24688-24695 PATO_0001005 denotes latency
T855 24729-24734 CHEBI_24433 denotes group
T856 24831-24838 PATO_0001005 denotes latency
T870 24859-24863 PATO_0000165 denotes time
T871 24934-24939 CHEBI_24433 denotes group
T872 25021-25029 PATO_0002118 denotes multiple
T873 25202-25213 Patient denotes adolescents
T874 25322-25327 CHEBI_24433 denotes group
T875 25352-25363 PATO_0000626 denotes lateralized
T876 25371-25375 PATO_0000366 denotes left
T877 25549-25554 CHEBI_24433 denotes group
T878 25568-25572 PATO_0000165 denotes time
T879 25746-25757 Patient denotes adolescents
T880 25792-25801 PATO_0000618 denotes bilateral
T881 25836-25841 PATO_0000367 denotes right
T882 25872-25876 PATO_0000366 denotes left
T883 25915-25920 PATO_0000367 denotes right
T884 25980-25985 PATO_0000367 denotes right
T885 26063-26074 Patient denotes adolescents
T886 26103-26107 PATO_0000366 denotes left
T887 26130-26134 PATO_0000366 denotes left
T888 26151-26155 PATO_0001323 denotes area
T889 26173-26177 PATO_0000366 denotes left
T890 26178-26185 PATO_0001977 denotes angular
T891 26240-26244 PATO_0000366 denotes left
T892 26325-26330 CHEBI_24433 denotes group
T904 26396-26407 Patient denotes adolescents
T893 26432-26437 PATO_0000367 denotes right
T894 26454-26459 PATO_0000367 denotes right
T895 26575-26581 PATO_0002354 denotes active
T896 26670-26675 CHEBI_24433 denotes group
T897 26683-26688 PATO_0000367 denotes right
T898 26747-26754 PATO_0001977 denotes angular
T899 26860-26865 PATO_0000367 denotes right
T900 26889-26894 PATO_0000367 denotes right
T901 26952-26956 PATO_0000366 denotes left
T902 27028-27032 PATO_0000366 denotes left
T903 27115-27119 PATO_0000366 denotes left
T1083 27169-27176 PATO_0000467 denotes present
T1084 27244-27255 Patient denotes adolescents
T1085 27298-27309 Patient denotes adolescents
T1090 27694-27699 CHEBI_24433 denotes group
T1091 27724-27731 PATO_0001005 denotes latency
T1097 27733-27744 Patient denotes adolescents
T1092 27884-27891 PATO_0000462 denotes absence
T1093 27915-27920 CHEBI_24433 denotes group
T1094 28084-28089 PATO_0000309 denotes young
T1095 28357-28364 PATO_0001005 denotes latency
T1096 28548-28558 PATO_0000990 denotes consistent
T1111 28794-28798 PATO_0000366 denotes mons
T1112 28806-28813 PATO_0001504 denotes complex
T1113 28813-28817 PATO_0001323 denotes pro
T1114 28825-28829 CHEBI_50906 denotes of
T1115 28828-28836 PATO_0001227 denotes both ov
T1116 28840-28845 PATO_0000367 denotes and u
T1117 28852-28867 PATO_0001668 denotes tivation in the
T1118 28863-28874 PATO_0000077 denotes the fronta
T1119 28870-28874 PATO_0000366 denotes onta
T1120 28887-28892 CHEBI_32849 denotes tempo
T1121 28895-28906 PATO_0000077 denotes rietal, and
T1122 28929-28939 PATO_0001510 denotes Of particu
T1123 28941-28951 PATO_0000990 denotes r interest
T1124 28960-28968 PATO_0000460 denotes abnormal
T1125 28963-28967 PATO_0000366 denotes orma
T1126 28965-28969 PATO_0000366 denotes mal
T1127 28970-28976 PATO_0000234 denotes eural
T1128 29022-29027 CHEBI_24433 denotes enco
T1129 29050-29054 PATO_0000366 denotes ofro
T1130 29055-29061 PATO_0000234 denotes tal co
T1131 29057-29062 PATO_0000367 denotes l cor
T1132 29090-29095 CHEBI_24433 denotes twork
T1133 29136-29147 PATO_0000077 denotes ion is cons
T1134 29143-29153 PATO_0000990 denotes consistent
T1135 29146-29153 PATO_0001977 denotes sistent
T1136 29174-29178 PATO_0000366 denotes omal
T1137 29182-29186 PATO_0000573 denotes long
T1138 29197-29204 PATO_0001977 denotes ections
T1139 29226-29230 PATO_0000366 denotes crui
T1140 29237-29245 PATO_0000762 denotes f greate
T1141 29240-29246 PATO_0001020 denotes reater
T1142 29263-29268 PATO_0000367 denotes rks i
T1143 29271-29276 PATO_0000367 denotes SD (C
T1144 29302-29306 CHEBI_50906 denotes Just
T1145 29383-29390 CHEBI_33252 denotes and co
T1146 29395-29402 PATO_0001977 denotes ate mul
T1147 29399-29407 PATO_0002118 denotes multiple
T1148 29410-29419 PATO_0000141 denotes gnitive d
T1149 29419-29423 PATO_0000366 denotes omai
T1150 29459-29463 PATO_0000366 denotes ures
T1151 29468-29473 PATO_0000367 denotes data
T1152 29477-29482 PATO_0000367 denotes gest
T1153 29490-29497 PATO_0001977 denotes ividual
T1154 29498-29507 PATO_0000618 denotes with ASD
T1155 29527-29535 PATO_0001501 denotes e recrui
T1156 29532-29539 PATO_0000467 denotes ruitmen
T1157 29536-29541 PATO_0000367 denotes ment
T1158 29573-29581 PATO_0002118 denotes luding t
T1159 29575-29580 PATO_0000367 denotes ding
T1160 29623-29627 PATO_0000165 denotes ions
T1161 29635-29650 PATO_0001668 denotes impact their ab
T1162 29642-29647 PATO_0000367 denotes their
T1163 29658-29663 PATO_0000367 denotes with
T1164 29686-29691 PATO_0000366 denotes g. 4
T1165 29692-29701 PATO_0001977 denotes 2.1 Atyp
T1166 29712-29717 CHEBI_30212 denotes ate a
T1167 29712-29717 PATO_0000665 denotes ate a
T1168 29725-29730 PATO_0000367 denotes n: de
T1169 29728-29735 PATO_0000502 denotes delayed
T1170 29736-29746 PATO_0001510 denotes functional
T1171 29743-29752 PATO_0000516 denotes nal speci
T1172 29799-29803 PATO_0000366 denotes ng i
T1173 29804-29811 PATO_0001977 denotes ASD Ad
T1174 29809-29820 Patient denotes Adolescents
T1175 29845-29849 PATO_0000366 denotes left
T1176 29845-29850 CHEBI_32849 denotes left
T1177 29858-29862 PATO_0000693 denotes late
T1178 29862-29867 PATO_0000367 denotes righ
T1179 29863-29868 PATO_0000367 denotes right
T1180 29919-29923 PATO_0000165 denotes s an
T1181 29926-29933 PATO_0000874 denotes arly ri
T1182 29931-29936 PATO_0000367 denotes right
T1183 29978-29985 PATO_0001504 denotes s. The
T1184 29996-30006 PATO_0000990 denotes yrus is fu
T1185 30016-30023 PATO_0001977 denotes and cy
T1186 30021-30030 PATO_0000618 denotes cytoarchi
T1187 30073-30083 PATO_0000990 denotes and poste
T1188 30112-30116 CHEBI_50906 denotes nect
T1189 30127-30134 PATO_0001977 denotes ferent
T1190 30186-30192 PATO_0001309 denotes view).
T1191 30194-30205 PATO_0000077 denotes lassically,
T1192 30199-30202 CHEBI_26523 denotes cal
T1193 30226-30232 PATO_0000465 denotes is div
T1194 30586-30594 Patient denotes children
T1195 30722-30730 Patient denotes children
T1196 30814-30825 Patient denotes adolescents
T1197 31068-31079 Patient denotes adolescents
T1198 31376-31384 Patient denotes children
T1199 31553-31564 Patient denotes adolescents
T1200 31730-31741 Patient denotes Adolescents
T1201 33315-33326 Patient denotes adolescents
T1202 33511-33522 Patient denotes adolescents
T1203 34165-34176 Patient denotes adolescents
T1204 34664-34675 Patient denotes adolescents
T1205 36297-36305 Patient denotes Children
T1206 37496-37507 Patient denotes adolescents
T1207 37928-37939 Patient denotes adolescents
T1208 38678-38689 Patient denotes adolescents
T1209 38760-38771 Patient denotes adolescents
T1212 39167-39178 Patient denotes adolescents
T1210 39484-39489 Patient denotes child
T1211 39494-39499 Patient denotes adult
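Each row above follows the same five-field layout named in the listing's header (Id, Subject as a character span, Object as an ontology term ID, the fixed predicate `denotes`, and the Lexical cue — the surface text the span covers). As an illustrative sketch only (the `parse_annotation` helper and the returned field names are mine, not part of the dump), rows of this shape can be loaded like so:

```python
import re

# One annotation row has the shape:
#   <Id> <start>-<end> <ObjectId> denotes <lexical cue>
# e.g. "T54 3554-3569 PATO_0001668 denotes associated with"
# Field names follow the "Id Subject Object Predicate Lexical cue" header.
ROW = re.compile(
    r"^(?P<id>T\d+)\s+(?P<start>\d+)-(?P<end>\d+)\s+"
    r"(?P<object>\S+)\s+denotes\s+(?P<cue>.*)$"
)

def parse_annotation(line):
    """Parse one annotation row into a dict, or return None if it doesn't match."""
    m = ROW.match(line.strip())
    if m is None:
        return None
    d = m.groupdict()
    d["start"], d["end"] = int(d["start"]), int(d["end"])
    return d

row = parse_annotation("T54 3554-3569 PATO_0001668 denotes associated with")
# For a well-aligned row, end - start equals len(cue); rows where this fails
# (several appear above) indicate the cue was clipped against stale offsets.
```

A simple sanity check on `end - start == len(cue)` is a practical way to flag the misaligned rows visible in this dump before using the spans against the source text.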