Artificial intelligence-powered voice assistants, many of which default to female-sounding voices, are reinforcing harmful gender stereotypes, according to a new study published by the United Nations.
Titled “I’d blush if I could,” after a response Siri utters when receiving certain sexually explicit commands, the paper explores the effects of bias in AI research and product development and the potential long-term negative implications of conditioning society, particularly children, to treat these digital voice assistants as unquestioning helpers who exist only to serve owners unconditionally.
In an interview, Maya Dusenbery, author of Doing Harm: The Truth About How Bad Medicine and Lazy Science Leave Women Dismissed, Misdiagnosed, and Sick, describes the roots of gender bias in medicine:
"On the most basic level, there is the fact that basically until the 1970s, there were essentially no women involved in medical practice or research. Certainly, I think that is the root problem, but one of my big takeaways from the research is the systemic problem I see, which I describe as a knowledge gap: we just don't have enough information and medical knowledge about women, their bodies, and the conditions that disproportionately affect them. And then there is this trust gap, this tendency to not believe women's reports of their symptoms."
In the sciences, women still have a ways to go before being fully acknowledged by their peers.
"In 2 weeks, 1000 neuroscientists will descend on Vancouver, Canada, for the Third International Brain Stimulation Conference. The first two iterations of the biennial conference were plagued by complaints that few of the featured speakers were women, but this year will be a step in the right direction: Female neuroscientists will deliver six out of 20 of the conference’s featured talks."