A Surprisingly Simple Fix to Gender Bias in Surveys

What’s wrong with this survey question?

With which of the following do you identify?

    • Male
    • Female
    • Non-Binary
    • Prefer to self-identify _____________

I’ve noticed a trend in the way gender identification questions are typically set up, in both quantitative surveys and qualitative screeners. The focus on not ‘othering’ respondents has become more standardized, which is great to see and so important. However, there’s still a problem with the way these questions are written, and most researchers aren’t even aware of it.

Let’s ask ChatGPT!

I became curious about how ChatGPT would write a gender identification question for a survey. I asked, and its question was set up the exact same way!

So, I asked a follow-up question to see whether ChatGPT would recognize its own gender bias.

“Why did you put male before female in the gender question?”

Here’s the ChatGPT reply:

The order in which options are presented in a survey can sometimes unintentionally influence responses. Placing “Male” before “Female” is a common convention, but it’s essential to note that the order can be switched or randomized to avoid any perception of preference or hierarchy.

For instance, you can alternately present the options as “Female,” “Male,” and “Non-binary” to reduce potential bias. The key is to be conscious of these considerations and aim for neutrality in survey design.

A simple change:

There’s one VERY SIMPLE CHANGE everyone can make to reduce gender bias.

Put the options in alphabetical order or randomize them.
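If you’re building a questionnaire programmatically rather than in a survey platform, the fix takes only a few lines. Here’s a minimal Python sketch; the option list and the choice to pin “Prefer to self-identify” last are illustrative assumptions, not a standard:

```python
import random

# Illustrative option list (an assumption, not a fixed taxonomy).
options = ["Female", "Male", "Non-Binary"]
# Open-ended options are often pinned last; that placement is an assumption here.
open_ended = "Prefer to self-identify"

# Fix 1: alphabetical order, identical for every respondent.
alphabetical = sorted(options) + [open_ended]

# Fix 2: a fresh random order for each respondent.
def randomized():
    shuffled = options.copy()
    random.shuffle(shuffled)  # shuffles in place
    return shuffled + [open_ended]

print(alphabetical)  # ['Female', 'Male', 'Non-Binary', 'Prefer to self-identify']
print(randomized())  # e.g. ['Non-Binary', 'Female', 'Male', 'Prefer to self-identify']
```

Most major survey tools (Qualtrics, SurveyMonkey, Google Forms) also offer answer-order randomization as a built-in setting, so no code is strictly required.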

Let’s raise the bar on this. Every bit helps.