With AI Gender and Chatbot Personalities Skewing Female, Be Careful about Bias

Written by Emily Peck on Mar 8, 2021

Today, March 8, is International Women's Day. It's a time when we reflect on the incredible women who have made lasting impacts on the world. We celebrate the women in our lives – the friends, family and coworkers – who influence and inspire us.

I’m reminded of stories about American suffragists who won voting rights for women and brave veterans who fight for our country. I’m once again entranced by people like Eleanor Roosevelt, who “was a key figure in several of the most important social reform movements of the twentieth century”[1], and environmentalist campaigner Greta Thunberg, who shows such courage at the tender age of 17. At Netomi, I look around at all of the amazing women that I work with across departments – Nadya Pierre in HR, Anmol Bansal in Customer Success, Ridhima Gugnani in Design, Sheena Noori, Maria Springer and Disha Nagra in Sales, among many others – and am inspired daily by their drive, creativity and work ethic.

As all of these incredible women get recognized, I couldn’t help but think about a new age of “females” that we’re interacting with daily: AI. Increasingly, we’re getting our issues with companies resolved by chatbots. We’re setting reminders and listening to music via virtual assistants on our countertops. We’re following driving directions and finding the fastest route with the help of in-dash virtual agents.


More often than not, these virtual assistants have female names and voices: Amazon’s Alexa, Bank of America’s Erica, WestJet’s Juliet, Microsoft’s Cortana, HSBC’s Amy. As these virtual agents become more ingrained in our everyday lives, it’s important that companies have diverse teams to weed out any underlying gender bias in the development and ideation process of these AIs.

Why do AI systems tend to be female?

I’d like to believe AI systems skew female because women convey strength, comfort and gratitude. These assistants are helping with tasks that have traditionally been completed by women – whether it’s scheduling an appointment or setting a reminder. While some argue this could lead to gender bias, I think we need to focus on the positive aspects of modern-day women. Today’s women are uplifting, hard-working, always on and always available. Whether they are CEOs, engineers, board members or stay-at-home moms, we can count on them to get the job done in a respectful, polite manner.

“It’s a well-established phenomenon that the human brain is developed to like female voices.” – Clifford Nass

Women and the Trust Factor 

Companies introducing AI agents to interact directly with their customers and provide customer service resolutions need to appeal to the broadest possible audience.

As a society, we’re generally more trusting of females. We tend to believe that female voices are warmer and less threatening. In a 2012 study, people who used an automated phone system found female voices more “trustworthy.”[2]

Clifford Nass, co-author of Wired for Speech, believes “that people tend to perceive female voices as helping us solve our problems by ourselves, while they view male voices as authority figures who tell us the answers to our problems. We want our technology to help us, but we want to be the bosses of it, so we are more likely to opt for a female interface.”[3]

In customer support, for instance, companies are adopting AI to enable customers to get resolutions via self-service on email, chat, social and voice platforms. Trust is a huge part of customer service: customers believing that they are being heard, that their issue or concern is being taken seriously, and that the company is resolving their issue to the best of its ability. This research would suggest that using a female chatbot personality makes trust more inherent within automated customer service interactions.

Research shows female voices drive a higher CSAT 

In addition to trust, using a female personality can drive higher customer satisfaction, according to numerous studies.

Business Insider cites a Stanford University study that argued that the “preference for female voices stems from the fact that the voice automatically triggers certain stereotypical expectations in our minds.” So, we “perceive computers as ‘helpful’ and ‘caring’ when they’re programmed with the voice of a woman.”[4]

Amazon conducted extensive research when developing Alexa and found that a “woman’s voice is more ‘sympathetic’ and better received. By making Alexa a female, she seems more like a friendly older sister or girlfriend we — apparently — prefer interacting with, shopping with, and asking for help, rather than a computer, making it more likely we’ll make purchases.”[4]

AI is becoming the new interface for consumer and brand interaction. To get customers comfortable with this new norm, making the experience as non-threatening and natural as possible is key. (This is in addition, of course, to highly accurate AI that understands customers and provides meaningful resolutions.)

A call for diverse teams to guard against programming stereotypical behavior

All of this said, as a society, as technologists and as businesspeople, we need to ensure women play key roles in the development of AI.

To address any underlying bias, albeit usually unconscious, the teams dreaming up, creating and bringing your virtual agents to market need to be diverse. In an article that brings together an impressive lineup of women technologists, Bethany Bongiorno, CEO of Humane, Inc., sums it up perfectly: “When a team lacks diversity they are at risk of assuming that their collective experience represents that of all humans. If that team does not include women, by default they are at risk of not building for the human experience of 50% of the world’s population — and, by extension, potentially 50% of their customer base.”

Can we help you nail the AI personality of your brand – female, male or genderless? Let’s chat. We have in-house conversational AI design experts who can bring to life an AI that your customers will trust.


  1. https://edsitement.neh.gov/lesson-plans/lesson-5-eleanor-roosevelt-and-rise-social-reform-1930s
  2. https://www.theatlantic.com/technology/archive/2016/03/why-do-so-many-digital-assistants-have-feminine-names/475884/
  3. https://www.wired.com/2015/10/why-siri-cortana-voice-interfaces-sound-female-sexism/
  4. https://www.businessinsider.com/theres-psychological-reason-why-amazon-gave-alexa-a-female-voice-2018-9?IR=T
  5. https://www.refinery29.com/en-us/2020/02/9461140/digital-voice-assistants-siri-alexa-female-bias