IDSC Fellows Demonstrate Tech Skills and Initiative

2018-2019 IDSC Fellows Symposium

The 2018-2019 IDSC Fellows and their mentors gathered on Tuesday, April 30, 2019, at the Newman Alumni Center to make the final presentations on their chosen projects.

 

Steven Anderson, 2018-2019 IDSC Fellow

Presenting first was Steven Anderson, a PhD student working under the supervision of Dr. Elizabeth Losin in the Cognitive and Behavioral Neuroscience Division in the Department of Psychology. Steven’s project, titled “Virtual Reality Simulations of Dyadic Medical Interactions,” looked at:

  • Role-playing combined with real-world conditions through VR to examine social factors that influence pain perception and pain report,
  • Correspondence between self-report of pain and pain-related brain activity, and
  • Sociocultural and contextual modulations of pain perception.

He delved into some background on known racial and ethnic biases in laparoscopic surgery (use of a machine), in the use of the new UM Simulation Hospital, and in the prescribing of medications, and then set out to answer the following questions:

  1. Does VR simulating a medical environment enhance the realism and experience of receiving experimental pain stimulations?
    [Study 1: Undergraduates view a procedure while receiving either painful or warm stimuli. Does the VR modulate their perception?]
  2. Does demographic concordance between medical trainees and a VR patient influence pain-treatment-related outcomes?
    [Study 2: Pain-treatment decisions]

Tech:  Steven said at first he was going to write a review paper, but then he realized it would be possible to create something using Unity (a cross-platform, real-time game engine). For the tech aspects, he taught himself how to use Unity (the engine is written in C++; its scripts are written in C#), started with Google Cardboard VR glasses (used with a phone), and then moved up to an untethered Oculus Go device (it comes with a hand controller but cannot track movement through 3D space, so it was used just for viewing).

Possible collaborators interested in studying chronic pain:  Karuna VR, AppliedVR.com, Magic Leap, the UM Simulation Hospital (a simulated setting with real-world impact, looking at VR to escalate the level of simulations), and the Social and Cultural Neuroscience Laboratory (aka the Losin Lab).

 

Stephano Chang, 2018-2019 IDSC Fellow

The second presenter was Jin Yop “Stephano” Chang, with his project “Development of Closed-Loop Neuromodulation of Gait and Balance Control After Spinal Cord Injury.” Stephano is interested in neuromodulation strategies to restore gait function after spinal cord injury (SCI) in a large, translationally relevant animal model. His interest in computational science for this research lay in finding a way to implement balance control within his neuromodulation approach.

Tech:  Stephano used Arduino hardware and code to develop a wireless, portable, and accurate inertial measurement unit (IMU). Using a quaternion-based system, he was able to characterize normal and perturbed states in the animal’s gait cycle. Using a Proportional-Integral-Derivative (PID) control model, he developed an algorithm to correct for perturbations detected by the IMU; the correction will be implemented through a custom-designed electrode.
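The write-up does not include Stephano’s code, but the approach he describes, deriving an orientation angle from quaternion IMU output and feeding the deviation into a PID controller, can be sketched roughly as below. This is a minimal, self-contained C++ illustration only: the gains, setpoint, loop rate, and simulated readings are placeholders rather than values from the project, and the real system runs on Arduino hardware rather than as a desktop program.

```cpp
// Minimal sketch: quaternion-based tilt estimate fed into a PID correction.
// Gains, setpoint, loop rate, and the simulated readings are illustrative placeholders.
#include <cmath>
#include <cstdio>

struct Quaternion { double w, x, y, z; };

// Pitch (forward/backward tilt) in radians from a unit quaternion.
double pitchFromQuaternion(const Quaternion& q) {
    double s = 2.0 * (q.w * q.y - q.z * q.x);
    s = std::fmax(-1.0, std::fmin(1.0, s));  // clamp for numerical safety
    return std::asin(s);
}

struct PID {
    double kp, ki, kd;   // controller gains
    double integral;     // accumulated error
    double prevError;    // error from the previous step

    // One control update: returns the corrective output for this time step.
    double update(double setpoint, double measured, double dt) {
        double error = setpoint - measured;
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

int main() {
    PID controller{2.0, 0.5, 0.1, 0.0, 0.0};  // placeholder gains
    const double setpoint = 0.0;              // target pitch: upright
    const double dt = 0.01;                   // 100 Hz control loop

    // Simulated quaternion readings standing in for the IMU stream.
    Quaternion samples[] = {
        {0.999, 0.0, 0.035, 0.0},  // slight forward tilt
        {0.996, 0.0, 0.087, 0.0},  // larger perturbation
        {0.999, 0.0, 0.017, 0.0},  // recovering
    };

    for (const Quaternion& q : samples) {
        double pitch = pitchFromQuaternion(q);
        double correction = controller.update(setpoint, pitch, dt);
        std::printf("pitch = %+.3f rad  ->  correction signal = %+.3f\n",
                    pitch, correction);
    }
    return 0;
}
```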

Q & A

Stephano’s inertial measurement unit was created with open-source Arduino hardware. He also built a treadmill and speedometer for his experiments using Arduino.

Collaborators/Acknowledgements
The Miami Project to Cure Paralysis; the Neurosurgery Research and Education Foundation; members of James Guest’s lab: Andrea Santamaria and Pedro Pinheiro; members of Brian Noga’s lab: Ioan Opris, Luz Villamil, and Francisco Sanchez; as well as Juan Solano, Yohjan Nuñez, and José Rodriguez

 

Amy Ahn, 2018-2019 IDSC Fellow

The third presenter, Yeo Jin “Amy” Ahn, took on the topic of Autism Spectrum Disorder (ASD) with her project, titled “Automating and Accelerating the Autism Diagnostic Process.”

Opening with the shocking 2018 statistic that 1 in 59 children is diagnosed with autism by age 8, Amy noted that a 2019 report found this rate to be consistent among 4-year-old children. Given the numbers, autism is a common developmental disorder, yet we currently lack objective measures for understanding ASD-related behaviors during ASD assessments. So how are these behaviors measured? In current best practice, autism severity is informed partially by a gold-standard assessment, the ADOS-2. Amy’s project focused on assessing associations between Social Affect scores on the ADOS-2 and objective, quantifiable measures of ASD-related behaviors. She aimed to improve objective understanding of children’s social communicative behaviors during the ADOS-2 assessment.

In her study, there were 25 participants, all age 3.

Tech:  The project used Pivothead glasses, with a camera on the bridge of the nose (worn only by the adults), to capture children’s gaze and smiles. iMotions software was used to detect the social gaze and smiles captured by the Pivothead camera, and audio recordings were processed with LENA software to produce a child-adult vocal turn-taking measure.

Pivothead glasses
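For readers curious what a vocal turn-taking measure involves: LENA-style conversational turn counts are commonly described as counting adult and child vocalizations that alternate within a short time window. The C++ sketch below illustrates that general idea only; it is not the project’s actual pipeline, and the 5-second window, the speaker labels, and the toy timeline are assumptions made for the example.

```cpp
// Illustrative sketch of a child-adult vocal turn-taking count.
// A "turn" here is counted when consecutive vocalizations come from different
// speakers and the response begins within a short window (5 s below, chosen
// as a placeholder) of the previous vocalization ending.
#include <cstdio>
#include <string>
#include <vector>

struct Vocalization {
    std::string speaker;  // "child" or "adult"
    double onset;         // seconds from start of recording
    double offset;        // seconds from start of recording
};

// Assumes segments are sorted by onset time.
int countTurns(const std::vector<Vocalization>& segments, double maxGapSec) {
    int turns = 0;
    for (size_t i = 1; i < segments.size(); ++i) {
        const Vocalization& prev = segments[i - 1];
        const Vocalization& curr = segments[i];
        bool speakerSwitched = prev.speaker != curr.speaker;
        bool withinWindow = (curr.onset - prev.offset) <= maxGapSec;
        if (speakerSwitched && withinWindow) {
            ++turns;
        }
    }
    return turns;
}

int main() {
    // Toy timeline (seconds); real input would come from LENA-processed audio.
    std::vector<Vocalization> segments = {
        {"adult", 0.0, 1.2},
        {"child", 2.0, 2.8},    // responds within the window -> turn
        {"adult", 3.5, 4.4},    // responds within the window -> turn
        {"child", 30.0, 30.9},  // too long after the adult -> no turn
    };
    std::printf("turn count = %d\n", countTurns(segments, 5.0));
    return 0;
}
```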

Collaborators
Miami CTSI (Clinical and Translational Science Institute), ASAC (Autism Spectrum Assessment Clinic), Autism Science Foundation

 

Samantha Mitsven, 2018-2019 IDSC Fellow

The final project presented, “Timing is Everything:  The Relation Between Temporal Structure of Classroom Vocalizations and Language Abilities,” looked at the temporal structure of speech-related vocalizations in children with hearing loss and their typically hearing peers in the context of their preschool classroom. Samantha Mitsven found that children’s vocalizations during interactions with their peers and teachers followed a “bursty” distribution, indicating that vocalizations tended to cluster together rather than being evenly spaced over the course of the school day. The vocalizations of children with hearing loss were more bursty than those of their typically hearing peers, and this was related to lower receptive and expressive language abilities. Overall, higher language abilities were associated with decreased temporal clustering of vocalizations and shorter intervals between vocalization onsets. Capitalizing on automated measurement techniques that yielded big behavioral data allowed specific temporal features of children’s moment-to-moment vocalizations to be identified as promising correlates of developing language capacities.

Tech: Use of LENA audio recorders (placed in a front pocket of colorful vests worn by the children) + processing software
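The “bursty” characterization above refers to temporal clustering of vocalization onsets. One widely used way to quantify this is the Goh-Barabási burstiness coefficient, B = (σ − μ)/(σ + μ), computed over the intervals between consecutive vocalization onsets, where B near 1 indicates strong clustering and B near −1 indicates regular spacing. Whether this exact statistic was used in the project is not stated here; the C++ sketch below simply illustrates the computation on made-up onset times.

```cpp
// Illustrative burstiness computation over inter-vocalization intervals.
// B = (sigma - mu) / (sigma + mu), where mu and sigma are the mean and
// standard deviation of the gaps between consecutive vocalization onsets.
// B -> 1 for highly clustered ("bursty") events, B -> -1 for regular spacing.
#include <cmath>
#include <cstdio>
#include <vector>

double burstiness(const std::vector<double>& onsets) {
    std::vector<double> gaps;
    for (size_t i = 1; i < onsets.size(); ++i) {
        gaps.push_back(onsets[i] - onsets[i - 1]);
    }
    if (gaps.size() < 2) return 0.0;  // not enough intervals to characterize

    double mean = 0.0;
    for (double g : gaps) mean += g;
    mean /= gaps.size();

    double variance = 0.0;
    for (double g : gaps) variance += (g - mean) * (g - mean);
    variance /= gaps.size();
    double sd = std::sqrt(variance);

    return (sd - mean) / (sd + mean);
}

int main() {
    // Made-up onset times (seconds): two tight clusters separated by a long gap.
    std::vector<double> clustered = {0, 1, 2, 3, 120, 121, 122, 123};
    // Made-up onset times with evenly spaced vocalizations.
    std::vector<double> regular = {0, 15, 30, 45, 60, 75, 90, 105};

    std::printf("clustered B = %+.2f\n", burstiness(clustered));  // near +1
    std::printf("regular   B = %+.2f\n", burstiness(regular));    // exactly -1
    return 0;
}
```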

Collaborators:  NSF, Institute of Education Sciences (IES), Interactive Behavior In Schools (IBIS) Project, Lynn Perry, Neil Johnson