Yelena Yesha: Keynote Speaker at Take Action! Bias in Technology 3/17

Take Action! Bias in Technology banner: eight people of diverse cultural and racial backgrounds, four women and four men, seated on the floor with laptops.


Diversity. Accessibility. Inclusion. What are YOU doing to help build an inclusive tech community? This virtual event is part of a series focused on providing actionable advice to leaders and tech talent in the tech community, to aid in building a diverse, accessible, and inclusive environment where everyone can thrive. Actions speak louder than words.

Attend the 4th virtual Take Action South Florida event to learn how you can ‘Take Action!’ IDSC Innovation Officer Dr. Yelena Yesha will be a keynote speaker and will join a panel addressing issues around bias in AI, including:

  • Everyone has biases, and when we don’t recognize that we have them, we unconsciously build them into the technology we create and use
  • Understanding the myriad ways that technology can be biased
  • Creating diverse teams to develop technology helps ensure that the technology is less biased
  • Some cultural biases are so strong that even the people negatively impacted by a bias may not realize that they hold it themselves

 

Thursday, March 17, 6:00-8:00 PM

Read More  |  Register Now

 

About Take Action South Florida

The “Take Action!” series is a partnership initiative among the local South Florida organizations SIM South Florida, TECH HUB South Florida, and ISACA South Florida, focused on providing actionable advice to leaders in the tech community to aid in building a diverse and inclusive environment where everyone can thrive. Take Action! addresses issues such as how even technology that isn’t thought of as artificial intelligence can be affected by biases. Accessibility technologies are also impacted, whether it’s a device that isn’t accessible to all, automated speech recognition that doesn’t recognize accents, or an alternative communication device that doesn’t offer ethnic voice choices. Many examples of bias in technology come down to the fact that the data used to train artificial intelligence is not representative of the diversity of the human race:

  • A game system that was tested only on men ages 18–35 didn’t recognize women or children.
  • Voice recognition tools don’t recognize higher-pitched voices.
  • Facial recognition is most accurate for lighter-skinned men and least accurate for women and, in particular, darker-skinned people.
  • Loan risk-scoring algorithms rated white men as better risks.
  • Algorithms reviewing resumes rejected women’s resumes.

Read more…