Congratulations to Prof. Amin Sarafraz, one of the authors of an article that recently appeared in the journal Frontiers in Psychology under the research topic “Empirical Research at a Distance: New Methods for Developmental Science.” With COVID-19 disrupting in-person data collection, the group developed an approach for estimating gaze direction and duration from remotely collected webcam recordings, using a combination of OpenFace and machine learning.
In early 2020, in-person data collection dramatically slowed or was completely halted across the world as many labs were forced to close due to the COVID-19 pandemic. Developmental researchers who assess looking time (especially those who rely heavily on in-lab eye-tracking or live coding techniques) were forced to re-think their methods of data collection. While a variety of remote or online platforms are available for gathering behavioral data outside of the typical lab setting, few are specifically designed for collecting and processing looking-time data in infants and young children. To address these challenges, our lab developed several novel approaches for continuing data collection and coding for a remotely administered audiovisual looking-time protocol. First, we detail a comprehensive approach for successfully administering the Multisensory Attention Assessment Protocol (MAAP), developed by our lab to assess multisensory attention skills (MASks; duration of looking, speed of shifting/disengaging, and accuracy of audiovisual matching). The MAAP is administered from a distance (remotely) using Zoom, Gorilla Experiment Builder, an internet connection, and a home computer. This new data collection approach has the advantage that participants can be tested in their homes. We discuss challenges and successes in implementing our approach for remote testing and data collection during an ongoing longitudinal project. Second, we detail an approach for estimating gaze direction and duration from remotely collected webcam recordings using a post-processing toolkit (OpenFace) and demonstrate its effectiveness and precision. However, because OpenFace derives gaze estimates without translating them to an external frame of reference (i.e., the participant’s screen), we developed a machine learning (ML) approach to overcome this limitation.
Thus, third, we trained an ML algorithm (an artificial neural network [ANN]) to classify gaze estimates from OpenFace with respect to areas of interest (AOIs) on the participant’s screen (i.e., left, right, and center). We then demonstrate reliability between this approach and traditional coding approaches (e.g., coding gaze live). The combination of OpenFace and ML provides a method for automating the coding of looking time in remotely collected data. Finally, we outline a series of best practices for developmental researchers conducting remote data collection for looking-time studies.
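To make the third step concrete, the sketch below shows how a small neural-network classifier can map gaze estimates to screen AOIs. This is not the authors' published code: the network architecture, AOI angle centers, and training data are all illustrative assumptions. The feature names echo OpenFace's gaze-angle outputs (gaze_angle_x, gaze_angle_y), but here we generate synthetic stand-in data rather than real webcam recordings.

```python
# Illustrative sketch (not the authors' implementation): classify gaze
# estimates into left/center/right AOIs with a small neural network.
# Synthetic gaze angles stand in for OpenFace's gaze_angle_x / gaze_angle_y.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def synthetic_gaze(n_per_aoi=300):
    """Simulate horizontal/vertical gaze angles (radians) for three AOIs.

    Labels: 0 = left, 1 = center, 2 = right. The cluster centers and
    spread are arbitrary values chosen for illustration only.
    """
    centers = {0: -0.35, 1: 0.0, 2: 0.35}  # assumed horizontal-angle centers
    X, y = [], []
    for label, cx in centers.items():
        gx = rng.normal(cx, 0.05, n_per_aoi)   # horizontal gaze angle
        gy = rng.normal(0.0, 0.05, n_per_aoi)  # vertical gaze angle
        X.append(np.column_stack([gx, gy]))
        y.append(np.full(n_per_aoi, label))
    return np.vstack(X), np.concatenate(y)

X, y = synthetic_gaze()

# A single small hidden layer is enough for three well-separated clusters.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)

# A gaze angle well to the left should fall in the "left" AOI (label 0).
print(clf.predict([[-0.35, 0.0]])[0])
```

In practice the features would come from OpenFace's per-frame CSV output rather than a simulator, and the trained classifier would be applied frame by frame, yielding an AOI label sequence from which looking durations can be tallied.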
Eschman, Bret; Todd, James Torrence; Sarafraz, Amin; Edgar, Elizabeth V.; Petrulla, Victoria; McNew, Myriah; Gomez, William; Bahrick, Lorraine E. “Remote Data Collection During a Pandemic: A New Approach for Assessing and Coding Multisensory Attention Skills in Infants and Young Children.” Frontiers in Psychology, vol. 12, 21 Jan. 2022. DOI: 10.3389/fpsyg.2021.731618. ISSN 1664-1078.