“I’m not a woman or a man. I’m an A.I.”

That’s what Alexa would tell you if you were cheeky enough to inquire about her gender.

Why “her” gender? Well, “Alexa” is hardly a man’s name. Like Michaela, Roberta, and Theodora, it’s the feminine version of what would otherwise be a masculine name.

And, like her Apple counterpart Siri, Alexa replies to your requests in an oh-so-agreeable and distinctly female-sounding voice.

Siri, by the way, is a fairly common given name for girls in some Scandinavian countries. And her original voice was derived from hours of recordings by Susan Bennett, an Atlanta voiceover artist.

This all may seem insignificant – what difference does it make how your digital assistant replies to your commands?

In fact, a 2019 UNESCO report ("I'd Blush If I Could") says, "feminized voice assistants perpetuate gender stereotypes of subservience and sexual availability."

And although Siri and Alexa are prime examples, they're not alone. The Brookings Institution points out, "Around the world, various customer-facing service robots, such as automated hotel staff, waiters, bartenders, security guards, and child care providers, feature gendered names, voices, or appearances."

It’s probably worth noting that these I’m-not-a-woman-I’m-code assistants are designed primarily by men. Bennett calls them “uber-geeks.” And while she’s affectionately teasing, we can guess from the way Siri and company are designed that there’s some bias at work.

Australian researchers Yolande Strengers and Jenny Kennedy have written a whole book summing it up. The Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot also suggests some changes in the way digital assistants are designed and used.

The authors refer to the chores the bots take care of as "wifework" – the domestic responsibilities that historically fell to human wives.

And they point to the 1950s American housewife as the prototype for the whole lot of them: “white, middle class, heteronormative, and nurturing, with a spick-and-span home.”

What it comes down to is that the advanced technology is taking us into the past as well as the future. Ironically, it’s designed that way.

Of course, it’s not just the geniuses behind the technology who betray their bias about gender roles.

It’s also the users who interact with them. After all, how do you talk to Cortana or Siri?

I confess, I don’t have much personal experience. I refuse to have one of those devices in my home to turn on the lights or the TV. I’m not sure what they do with the data they collect, and I don’t want them grabbing mine.

Siri lives in my iPhone, though, and every now and again she pipes up when she thinks I'm talking to her. She's almost always wrong. I've never made the switch to telling the phone what to do; I still tap in my instructions, albeit clumsily.

Even so, I notice as I write this, that I instinctively think of Siri as “she” even though she’d tell me to my face, if I asked, that she is genderless. “Like cacti. And certain species of fish.”

That’s the issue, isn’t it? Alexa and company have been updated in the past couple of years – they’ll all tell you they have no gender. They’ve been programmed to respond coldly and blandly to sexually suggestive or harassing comments.

And yet, they still sound female, by default. Between their names and their voices, we automatically think of them as “she” and “her,” do we not?

It’s partly the kind of work they do. They order meals, they make calls, they schedule our meetings. We direct them to take care of things the way a personal secretary might.

And some of us must be like Don Draper and his ilk. Research cited in the UNESCO report suggests about 5% of the questions and demands digital assistants receive are explicitly sexual in nature.

It all raises another question about the way we boss around these quasi-female bots, or snap at them when something goes wrong. And this is where it gets really interesting.

How does that kind of repeated interaction affect the conversations and the relationships among real men and women? And what should we be doing about it?

Some recommendations from the Brookings Institution:

  • Develop industry-wide standards for the humanization of AI (and how gender is portrayed).
  • Encourage companies to collect and publish data relating to gender and diversity in their products and teams.
  • Reduce barriers to entry to STEM education, especially those that disproportionately affect women and transgender or non-binary students.

The idea is that developing standards and making development teams more diverse would at least be a start toward less stereotyping and sexualizing in the world of chatbots.

And that can’t be a bad thing, can it?

Meantime, I’m proposing we do some research of our own. How do you talk to your Alexa? Do you issue orders and commands? Make gentle requests? Do you find yourself flirting?

And do you think of Siri as female? I just discovered it’s possible, for an extra fee, to give Alexa the very male voice of Samuel L. Jackson.

But that’s only for fun things like “Alexa, tell me a story.” If you want a reminder set or a shopping list made, you’ll still get the female-sounding Alexa following directions.

This is where I need not Alexa’s voice or Cortana’s but yours. Notice how you interact with your digital assistant, and fill us in, will you?

Just post a comment and share your observations.