Mitsunori Ogihara (Director) and Gang Ren (Postdoctoral Associate) of the Big Data Analytics and Data Mining program, and Daniel Messinger, Program Director of Social Systems Informatics, presented an interdisciplinary paper, "Categorical Timeline Allocation and Alignment for Diagnostic Head Movement Tracking Feature Analysis," at the Workshop on Face and Gesture Analysis for Health Informatics (FGAHI) at CVPR 2019 (the IEEE/CVF Conference on Computer Vision and Pattern Recognition), held June 16-21, 2019, in Long Beach, California.
The Face and Gesture Analysis for Health Informatics Workshop discussed the strengths and major challenges of using computer vision and machine learning for automatic face and gesture analysis in clinical research and healthcare applications. Scientists working in related areas of computer vision and machine learning for face and gesture analysis, affective computing, human behavior sensing, and cognitive behavior shared their expertise and achievements in the emerging field of computer-vision- and machine-learning-based face and gesture analysis for health informatics.
Topics of interest included:
- Deep learning based face and gesture analysis for healthcare
- Deep learning based facial expression recognition for healthcare
- Remote physiological sensing for healthcare
- Human-Computer Interaction systems for healthcare
- Deep learning based multi-modal (visual and verbal) fusion for healthcare applications
- Clinical protocols for face and gesture analysis and modeling
Applications included automatic pain intensity measurement, automatic depression severity assessment, and autism screening.
The paper, titled "Categorical Timeline Allocation and Alignment for Diagnostic Head Movement Tracking Feature Analysis," is authored by Mitsunori Ogihara, Zakia Hammal, Katherine B. Martin, Jeffrey F. Cohn, Justine Cassell, Gang Ren, and Daniel S. Messinger.
Characterization of atypical head movement patterns is a potentially important cue for identifying children with autism spectrum disorder. In this paper, we implemented a computational framework for extracting the temporal patterns of head movement and utilizing the imbalance of temporal-pattern distributions between diagnostic categories (e.g., children with or without autism spectrum disorder) as potential diagnostic cues. The timeline analysis results show a large number of temporal patterns with significant imbalances between diagnostic categories. The temporal patterns show strong classification power on discriminative and predictive analysis metrics. Long time-span temporal patterns (e.g., patterns spanning 15-30 seconds) exhibit stronger discriminative capability than temporal patterns with relatively shorter time spans. Temporal patterns with high coverage ratios (those present in a large portion of the video duration) also show high discriminative capacity.
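To illustrate the general idea of categorical timeline analysis described above, the following is a minimal sketch, not the authors' actual method: continuous head-movement measurements are mapped to categorical symbols, contiguous temporal patterns are counted per diagnostic group, and each pattern is scored by its smoothed frequency ratio between the two groups (values far from 1.0 indicate imbalance). All function names, the two-symbol alphabet, the threshold, and the toy data are hypothetical assumptions for illustration only.

```python
from collections import Counter

def categorize(velocities, threshold=0.5):
    """Map continuous head-movement speed samples to categorical symbols:
    'S' (still) or 'M' (moving). Threshold is a hypothetical choice."""
    return ''.join('M' if v > threshold else 'S' for v in velocities)

def pattern_counts(timeline, length):
    """Count all contiguous categorical patterns of a given length."""
    return Counter(timeline[i:i + length]
                   for i in range(len(timeline) - length + 1))

def imbalance(group_a, group_b, length=3, smoothing=1.0):
    """Score each pattern by its smoothed frequency ratio between two
    diagnostic groups; scores far from 1.0 suggest an imbalanced pattern."""
    counts_a, counts_b = Counter(), Counter()
    for timeline in group_a:
        counts_a.update(pattern_counts(timeline, length))
    for timeline in group_b:
        counts_b.update(pattern_counts(timeline, length))
    vocab = set(counts_a) | set(counts_b)
    total_a = sum(counts_a.values()) + smoothing * len(vocab)
    total_b = sum(counts_b.values()) + smoothing * len(vocab)
    return {p: ((counts_a[p] + smoothing) / total_a) /
               ((counts_b[p] + smoothing) / total_b)
            for p in vocab}

# Toy timelines for two hypothetical diagnostic groups
group_a = [categorize([0.1, 0.9, 0.8, 0.1, 0.9, 0.8])]  # -> 'SMMSMM'
group_b = [categorize([0.1, 0.1, 0.1, 0.9, 0.1, 0.1])]  # -> 'SSSMSS'
scores = imbalance(group_a, group_b)
```

In this toy example, the pattern 'SMM' occurs more often in the first group, so its score exceeds 1.0, while 'SSS' is more typical of the second group and scores below 1.0; in the paper's setting, such imbalanced patterns serve as candidate diagnostic cues.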