Ep 326: Giving Voice To Our Digital Assistants
- I’d Blush If I Could
- We tested bots like Siri and Alexa to see who would stand up to sexual harassment
- Why Siri and Alexa Weren’t Built to Smack Down Harassment
- Hey Siri, stop perpetuating sexist stereotypes, UN says
- Is it time for Alexa and Siri to have a “MeToo moment”?
- Female voice assistants fuel damaging gender stereotypes, says a UN study
The reason digital assistants acquiesce to harassment isn’t just sexism or gender inequality in the tech world, as disturbing and prevalent as those may be. No, the explanation lies elsewhere, I believe. These machines are meant to manipulate their users into staying connected to their devices, and that focus on manipulation must be laser-like. To clearly state that harassment toward digital assistants is unacceptable would mean having some standard, some line that can’t be crossed. And one line leads to another, and soon you’re distracted—the user is distracted—from selling/buying merchandise, collecting/sharing data, and allowing a device to become ensconced in their life.
The moral standard most compatible with engagement is absolute freedom of expression, the standard of having no standards.
– Noam Cohen, “Why Siri and Alexa Weren’t Built to Smack Down Harassment”