Why are virtual assistants from Amazon, Google and Apple all "female"?

Technically speaking, of course, virtual assistants don't have genders. But they do have names and voices that suggest to users they are more "female" than "male" — a characterization that reinforces some of the worst gender stereotypes in our society.

Amy Ingram, an AI personal assistant, is a popular corporate version produced by x.ai. If you carbon copy Amy on an email, she will help schedule a meeting: she introduces herself to the recipient as your personal assistant, suggests times when you're available and follows up with a calendar invitation to confirm.

Whereas Amy has entered the workforce, Amazon's bot, Alexa, is primarily a home assistant. You can call out to her to set timers when making dinner, and have her reorder detergent when you realize you've run out. She can dim the lights at night, or read you the news headlines in the morning. Alexa's appeal is that she's always a holler away: all you need to do is ask.

'Amy' or 'Andrew'

Some companies do give users the option of switching to a male voice or persona: Amy Ingram's first name can be modified to "Andrew," and as of iOS 7, you can have Apple's Siri talk to you in a male voice. But across the board, the default setting for chatbots and virtual assistants is predominantly female. Even OK Google, which doesn't have a humanized name, has a female voice.

Companies cite all sorts of research to justify making their bots female. They claim we take orders better from women, and that people have shown a preference for female voices in automated systems. Clifford Nass, the late co-author of Wired for Speech, a widely cited book on the topic, argued that male voices are perceived as more authoritative, whereas female voices are understood to be more helpful and supportive. These preconceptions carry over into synthetic voices, which essentially means that even computers are gender stereotyped.

Designed to be subservient 

Why does this matter? Because it reinforces a power dynamic that we simply can't overlook: virtual assistants are designed to be subservient, and creators send a clear message by making them all "female." This is especially troublesome when you consider that the majority of early adopters of these tools are men; early surveys show that 60 per cent of owners of Amazon Echo, which operates the Alexa voice service, are male, affluent and middle-aged. Their preferences will surely continue to shape these powerful tools as they are adopted more broadly.

[Image: Siri. Caption: We assume that users want "female" assistants. (iStockphoto)]

The irony is that as we watch shows like Mad Men and quietly congratulate ourselves on how far we've evolved as a society, our most cutting-edge consumer technology is a throwback: one where our "secretaries," "sous-chefs" and "housekeepers" are women. And by falling into a trap of assuming users want "female" assistants — instead of challenging their biases — the industry that has appointed itself to design the future is perpetuating outdated gender norms.

Bots are getting smarter, and soon, they'll be everywhere. Now is the time to pay attention to these design details, to make sure that this new trend in tech really is a giant leap forward — not two steps back.

This column is part of CBC's Opinion section.