Publications of Striano, Tricia

Eye contact and emotional face processing in 6-month-old infants: advanced statistical methods applied to event-related potentials

Event-related potential (ERP) studies with infants are often limited by a small number of measurements. We introduce a weighted general linear mixed model analysis with a time-varying covariate, which allows for the efficient analysis of all available infant event-related potential data. The method controls for the effect that small and varying numbers of trials have on the signal-to-noise ratio of averaged ERP estimates, and it makes it possible to analyze infant ERP data sets that often could not be analyzed otherwise. We illustrate the method by applying it to an experimental study and discuss its advantages over currently used approaches as well as its potential limitations. In this study, 6-month-old infants saw a face showing a neutral or an angry expression in combination with direct or averted eye gaze. We examined how the infant brain processes facial expressions and whether the direction of eye gaze influences this processing, focusing on the infant negative central (Nc) ERP component. The neutral expression elicited a larger amplitude and peaked earlier than the angry expression. An interaction between emotion and gaze was found for Nc latency, suggesting that emotions are processed in combination with eye gaze in infancy.
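
As an illustration of the kind of model described above, the sketch below fits a linear mixed model to simulated per-condition Nc amplitudes, with a random intercept per infant and the varying number of artifact-free trials entering as a covariate. This is a minimal sketch under stated assumptions, not the analysis reported in the paper: the data are simulated, the column names are invented, the use of Python/statsmodels is an assumption, and the weighting scheme of the published method is reduced here to a simple trial-count covariate.

```python
# Minimal sketch (not the authors' code): linear mixed model on simulated
# per-condition Nc amplitudes, with a random intercept per infant and the
# number of artifact-free trials per cell entering as a fixed-effect covariate.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated long-format data: 20 infants x 4 condition cells (emotion x gaze).
subjects = np.repeat(np.arange(20), 4)
emotion = np.tile(["neutral", "neutral", "angry", "angry"], 20)
gaze = np.tile(["direct", "averted"], 40)
n_trials = rng.integers(5, 16, size=80)              # varying trial counts per cell
subj_effect = np.repeat(rng.normal(0.0, 1.0, 20), 4) # per-infant random intercepts
amplitude = (-7.0
             + 1.0 * (emotion == "angry")            # hypothetical emotion effect
             + 0.3 * (gaze == "averted")             # hypothetical gaze effect
             - 0.05 * n_trials                       # covariate effect of trial count
             + subj_effect
             + rng.normal(0.0, 0.8, 80))             # residual noise

data = pd.DataFrame({"subject": subjects, "emotion": emotion, "gaze": gaze,
                     "n_trials": n_trials, "amplitude": amplitude})

# Fixed effects: emotion, gaze, their interaction, and trial count;
# grouping by infant accounts for the repeated measurements.
model = smf.mixedlm("amplitude ~ emotion * gaze + n_trials",
                    data, groups=data["subject"])
print(model.fit().summary())
```

In the published approach, condition cells averaged over fewer trials would additionally receive lower weight; in this sketch the trial count enters only as a fixed-effect predictor.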

"Did You Call Me?'' 5-Month-Old Infants Own Name Guides Their Attention

An infant's own name is a unique social cue. Infants are sensitive to their own name by 4 months of age, but whether they use their name as a social cue is unknown. Electroencephalographic (EEG) activity was measured while infants heard their own name or a stranger's name and while they looked at novel objects. Event-related brain potentials (ERPs) in response to the names revealed that infants differentiated their own name from stranger names from the first phoneme. The amplitude of the ERPs to the objects indicated that infants attended more to objects after hearing their own name than after hearing another name. Thus, by 5 months of age infants not only detect their name but also use it as a social cue to guide their attention to events and objects in the world.

Processing Faces in Dyadic and Triadic Contexts

In a series of four experiments, we assessed whether functional properties of the human face, such as signaling an object through eye gaze, influence face processing in 3- and 4-month-old infants. Infants viewed canonical and scrambled faces. We found that the ERPs of 4- but not 3-month-old infants showed an enhanced face-sensitive N170 component for the scrambled stimulus. Furthermore, when the canonical and scrambled faces were gazing toward an object, 4-month-olds displayed an enhanced negative central (Nc) component, related to attentional processes, for the scrambled face. Three-month-olds did not display any of these effects. These results point to an important transition in the first months of infancy and show that triadic cues influence the processing of the human face.

The neural correlates of infant and adult goal prediction: evidence for semantic processing systems

The sequential nature of action ensures that an individual can anticipate the conclusion of an observed action by applying semantic rules. The semantic processing of language and action has been linked to the N400 component of the event-related potential (ERP). The authors developed an ERP paradigm in which infants and adults observed simple sequences of actions. In one condition the conclusion of the sequence could be anticipated, whereas in the other condition it could not. Adults and infants at 9 and 7 months were assessed with the same neural measures: the N400 component and analysis of activity in the theta frequency band. Results indicated that adults and 9-month-old infants produced N400-like responses when anticipating action conclusions, whereas 7-month-old infants displayed no N400 component. The theta-band analysis supported the relation between the N400 and semantic processing. This study suggests that infants at 9 months anticipate goals and use cognitive mechanisms similar to those of adults in this task. In addition, this result suggests that language processing may derive from action understanding in early development.

Looking at eye gaze processing and its neural correlates in infancy: implications for social development and autism spectrum disorder

The importance of eye gaze as a means of communication is indisputable. However, there is debate about whether a dedicated neural module functions as an eye gaze detector, and about when infants become able to use eye gaze cues referentially. The application of neuroscience methodologies to developmental psychology has provided new insights into early social cognitive development. This review integrates findings on the development of eye gaze processing with research on the neural mechanisms underlying infant and adult social cognition, showing how a cognitive neuroscience approach can improve our understanding of social development and autism spectrum disorder.

Direct eye contact influences the neural processing of objects in 5-month-old infants

Do 5-month-old infants process objects differently as a function of a prior interaction with an adult? We addressed this question using a live ERP paradigm with a within-subjects design. Infants saw objects during two pretest phases with an adult experimenter, and event-related potentials to the presentation of the objects were recorded after these interactive pretest phases. The experimental conditions differed only in the nature of eye contact between the infant and the experimenter during the pretests: in one condition the experimenter engaged the infant with direct eye contact, whereas in the other the experimenter looked only at the infant's chest. We found that the negative central (Nc) component, related to attentional processes, differed between the experimental conditions at left fronto-central locations. These data show that 5-month-old infants allocate more attention to objects previously seen during an interaction involving direct eye contact. In addition, the results help clarify the functional nature of this component.

Influence of vocal cues on learning about objects in joint attention contexts

An experimenter taught infants about a novel toy in two joint attention conditions, one with and one without vocal cues. In test trials, infants viewed the now-familiar toy alongside a novel toy. Infants in the Joint Attention plus Voice condition looked significantly longer at the novel toy.