Two data-rich observational studies by University of Miami doctoral students are shedding new light on autism spectrum disorder (ASD) and child development—thanks to the Advanced Computing resources of the Frost Institute for Data Science and Computing (IDSC).
“These ground-breaking studies generated a tremendous volume of auditory and visual data,” said Daniel S. Messinger, Ph.D., professor of psychology, pediatrics, electrical and computer engineering, and music engineering; research director, Linda Ray Intervention Center; and director of IDSC Social and Behavioral Data Science. “Automating the analysis of those observations was essential to the research processes.”
Classroom Engagement
In her doctoral research, Regina Fasano, Ph.D., studied the vocal interactions of preschool children with ASD and developmental delays. Her work, “Automated Measures of Vocal Interactions and Engagement in Inclusive Preschool Classrooms,” was published recently in Autism Research.
“One of the challenges of these types of studies is the complexity of child-to-child and child-to-teacher interactions,” said Messinger. As a result, researchers have typically followed one student at a time. For her study, Fasano used UM’s Interactive Behavior in Schools (IBIS) platform, which uses automated tracking and speech recognition to monitor the behaviors of multiple children simultaneously.
“We found that the ability of children to vocalize with social partners is a critical variable in peer engagement,” added Fasano.
Using automated measures of vocal interaction and an observational measure of engagement, Fasano found that children in the ASD group displayed lower engagement with peers, teachers, and tasks than children in the typical development group. However, children’s vocal interactions were positively associated with their engagement with peers and teachers, shedding new light on the behaviors that support engagement in children with ASD.
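To make the logic of this kind of analysis concrete, the sketch below shows one way automated per-child vocalization counts might be related to observer-coded engagement ratings. It is a minimal illustration, not the study’s actual pipeline: the column names, group labels, and data values are invented for demonstration.

```python
# Minimal sketch (illustrative only, not the study's pipeline): relate
# automated counts of peer-directed vocalizations to observer-coded
# engagement scores, per child.
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical per-child records.
records = pd.DataFrame({
    "child_id":           [1, 2, 3, 4, 5, 6],
    "group":              ["ASD", "ASD", "ASD", "TD", "TD", "TD"],
    "peer_vocalizations": [4, 7, 2, 12, 9, 15],   # automated counts
    "peer_engagement":    [1.8, 2.4, 1.5, 3.6, 3.1, 3.9],  # observer ratings
})

# Compare mean peer engagement across groups (ASD vs. typically developing).
print(records.groupby("group")["peer_engagement"].mean())

# Association between vocal interaction and peer engagement across children.
r, p = pearsonr(records["peer_vocalizations"], records["peer_engagement"])
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```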
Smiling Faces
In her doctoral study, Yeojin “Amy” Ahn, Ph.D., a former IDSC Fellow, looked at how infants and mothers smile at each other and the potential effects on social and emotional development. For instance, a mother who “smiles with her eyes” (a Duchenne smile) when face-to-face with her infant expresses a more intense positive emotion than non-Duchenne smiling.
Ahn’s study, “Automated Measurement of Infant and Mother Duchenne Facial Expressions in the Face-to-Face/Still-Face,” was published recently in the journal Infancy. “While still-face effects elicited in the Face-to-Face/Still-Face (FFSF) protocol are well-studied in developmental science, little has been known about the degree to which the FFSF is associated with intense affective displays,” said Ahn.
In her research, Ahn looked at two-minute interactions between 40 infant and mother pairs. Using the Facial Action Coding System (FACS) and automated facial affect recognition (AFAR) computer vision software, she found that Duchenne expressions (intense expressions involving eye constriction) appear to be a sensitive index of intense emotional interactions.
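As a rough illustration of how automated facial coding output can be turned into a Duchenne measure, the sketch below combines per-frame action unit (AU) intensities: in FACS, a Duchenne smile pairs AU12 (lip corner puller) with AU6 (cheek raiser, the muscle action behind eye constriction). The field names, thresholds, and sample values are assumptions for demonstration, not the output format of AFAR or the study’s code.

```python
# Minimal sketch, assuming per-frame AU intensities from an automated
# FACS-style detector. A Duchenne smile combines AU12 (lip corner puller)
# with AU6 (cheek raiser / eye constriction). Thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class FrameAUs:
    au6: float   # cheek raiser intensity (eye constriction)
    au12: float  # lip corner puller intensity (smile)

def is_duchenne_smile(frame: FrameAUs, threshold: float = 1.0) -> bool:
    """Flag a frame as a Duchenne smile when both AU12 and AU6 are present."""
    return frame.au12 >= threshold and frame.au6 >= threshold

# Example: fraction of frames in an interaction showing Duchenne smiling.
frames = [FrameAUs(au6=0.2, au12=2.1), FrameAUs(au6=1.8, au12=2.5),
          FrameAUs(au6=1.2, au12=1.4), FrameAUs(au6=0.0, au12=0.3)]
duchenne_rate = sum(is_duchenne_smile(f) for f in frames) / len(frames)
print(f"Duchenne smile rate: {duchenne_rate:.2f}")
```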
“This study confirms that researchers can measure these intense expressions, both positive and negative, using automated software,” said Messinger. “This opens the door to conducting further studies at scale.”