The Future Is Female: How Women Are Fighting Bias in AI

By Megan Handley

Fung Fellowship
Apr 24, 2019 · 4 min read
Fung Fellows (pictured left to right) Stella Seo, Megan Handley, Devin Pontious, Rosey Stone, with Jennifer Mangold, Fung Fellowship Innovation Coach and Women in Tech Program Manager (pictured center), at the WITI@UC Women in Tech: the Future of AI event

Artificial intelligence has been deemed the tool of the future, poised to influence every aspect of our lives through innovations such as facial recognition and self-driving cars. It has been creeping into our daily lives, shaping the shows we watch, the products we buy, and the music we listen to; determining whether we’ll get a loan; and even informing criminal risk assessments and hiring decisions. Those currently working on this transformative technology, however, do not resemble the society it is supposed to transform. Estimates show that women comprise only 13.5 percent of those working in machine learning. This well-recognized gender gap continues to deter talent and diminish the field’s potential, with consequences for innumerable aspects of our economy and everyday lives.

On International Women’s Day (March 8, 2019), the Women in Tech Initiative at UC (WITI@UC) created a platform to highlight the experiences of women in AI and discuss where the technology is headed. The initiative was co-founded by Camille Crittenden, Deputy Director of CITRIS and the Banatao Institute, and Tsu-Jae King Liu, Dean of the College of Engineering. The symposium featured the WITI@UC Athena Awards, recognizing the achievements of those who have contributed greatly to creating an inclusive environment for women in technology.

Among the topics discussed were creating mutually beneficial relationships between mentors and mentees, fostering an inclusive work environment, and developing AI for social good.

Panel on Accountability in the Future of AI featuring Brandie Nonnecke, Abigail Jacobs, Chloe Autio, Jamie Lee Williams, and Sabine Gerdon.

Currently, we tend to limit our thinking to one dataset and one problem, and in doing so we create blind spots and biases that AI further amplifies. Diverse teams are necessary to flag the blind spots that may lead to negative social consequences before a product launches. But how do we create an environment that is attractive to women and minorities? How do we ensure that our coding practices are inclusive? What is already being done that we can learn from?

At the symposium, I learned about initiatives such as “Snap the Gap,” a large-scale program that aims to reach 15,000 girls ages 10 to 12 in California, providing mentorship and hands-on learning before the age at which girls typically disengage from STEM. I also learned about “Double Shelix,” a podcast created by two UC Berkeley graduate students featuring more than 20 conversations about women in STEM, grad school, and inclusive science. Even at an individual level, it is important to recognize that everyone has something to offer in closing this gap, whether that is asking companies about their diversity metrics, contributing to open-source software, or sharing our own stories and experiences with technology.

At the corporate level, establishing inclusive coding practices means employing full-spectrum teams to check these blind spots (who codes), using large training sets that reflect a richer portrait of humanity (how we code), and making social change a priority rather than an afterthought (why we code). AI is within our control: we get to teach it the right values and ethics. We are responsible for giving this technology diverse experiences to learn from, recognizing our own biases, and having diverse teams build it.

Wall featuring attendee-written notes about how to support inclusion in AI.

As a Fung Fellow, I have spent the last couple of semesters building skills to ideate, design, and create technologies that promote the health and wellness of youth and older adults. I have learned extensively about human-centered design and the principle that technology should adapt to people, not the other way around. This semester, I have been working on a team partnered with the Smith Group to create a memory care engagement center that offers sensory therapy for patients with dementia. Through this experience, I have had to think about designing universally: for everyone, regardless of sensory or physical ability, prior experience, or language. Having a diverse team on these kinds of projects is essential for sparking new insights and driving innovation; without it, we would miss edge cases and end up creating biased products that are unhelpful, or even harmful, to our users.

At the end of the day, it is up to us to determine the underpinnings of a technology that will be best for everyone, regardless of our demographics or the color of our skin.

Jennifer Mangold, Fung Fellowship Innovation Coach and Program Manager for the Women in Technology Initiative (WITI@UC) presenting Closing Remarks.

Megan Handley is a junior at the University of California, Berkeley, studying Bioengineering. Connect with Megan.

Stay connected with the Fung Fellowship on Facebook | Instagram | Twitter!

