Does your company truly promote a fair and equal culture? Have you explored how to build a fair and equal workplace? Has your team taken the time to understand why, when and how employee promotion decisions are guided by assumptions that are sincerely believed to be fact-based? Unbeknownst to you, biased decisions might be hindering the full potential of your company.
The preconceptions we form about people have long-lasting effects: who gets promoted, hired or put in leadership positions. Who receives a raise. Who gets the important clients. Who gets terminated when costs need to be cut, as during the current COVID-19 crisis. It’s our responsibility to understand what lies behind our decisions, assumptions and errors. This knowledge will help you make more informed decisions; innovate and contribute at a higher level; improve your employee retention and performance; create products better suited to your diverse clientele; and, ultimately, obtain better results.
Although I’m a trained psychologist, I found it hard to grasp the breadth of the term unconscious bias. Reading about the Heidi/Howard experiment at Columbia Business School was eye-opening and helped me understand what unconscious bias means. The first time I read about it was in Lean In: Women, Work and the Will to Lead in 2016. I was blown away by the research in the book and realised I wasn’t the only one struggling at work. Many women are not promoted at the same pace as their male colleagues and often receive different performance reviews. What is this invisible force we are up against? I wondered.
During one of his organizational behavior classes at Columbia Business School, Professor Francis J. Flynn wanted to test gender perception. He asked a group of students to read real-life entrepreneur Heidi Roizen’s case study by Harvard Business School (HBS). The case from HBS explains how Heidi, a successful venture capitalist, former entrepreneur and Apple executive, keeps a ‘broad and deep’ personal and professional network which she leverages to support both herself and others. Meanwhile, another group of students read the exact same case study, with just one change: the name. For this second group, the successful Silicon Valley venture capitalist with a vast and powerful network was called Howard. Finally, Flynn asked the students about their impressions of Heidi or Howard. The students’ contrasting reactions to Heidi and Howard were mind-boggling.
The students rated Howard and Heidi as equally competent; nevertheless, they liked Howard, and thought he was ‘a great guy — you want to hang out with Howard’.
The students who read about Heidi perceived her as more selfish and aggressive and ‘not the type of person you would want to hire or work for’, finding her ‘less humble, more power-hungry and self-promoting’ than Howard. When describing the experiment, Professor Flynn stated: ‘The more aggressive they thought she was, the more they hated her’. And to reiterate: the description of the high-achieving investor was exactly the same, word for word. The only difference was the name. How could one person be perceived so differently based solely on a name?
This experiment is a great example of how bias creeps into our daily lives, in the way we perceive and unconsciously judge and catalog people. Those students were totally unaware that their reactions were based on Heidi’s or Howard’s gender, and even less aware of how deeply the gender stereotypes embedded in them influenced their snap judgements of others. I’m sure these students don’t consider themselves sexist. However, it is clear that in their eyes Heidi broke some unconscious social rules, and she paid the price women pay for doing so: the likability penalty.
The students who participated in this experiment came from affluent families and were considered well-educated; an MBA at Columbia Business School costs $213,240. Still, they fell prey to unconscious bias. And so do we. Money and status don’t protect you from unconscious bias.
Education doesn’t protect you either. Many studies have shown bias in academic settings. For example, a study published in the Journal of Applied Psychology highlighted that university professors are more likely to mentor students with Caucasian male names, like Brad Anderson. Sorry Keisha, Raj, Mei and Juanita!
In another example, the Dean of Science at MIT established a committee to analyse the status of women faculty. In 1999 the examination of data from the previous 30 years revealed that there were ‘differences in salary, space, awards, resources, and response to outside offers between men and women faculty, with women receiving less despite professional accomplishments equal to those of their male colleagues’. You would think that this was something from the last century. However, in 2006 MIT conducted another study and discovered that ‘usual departmental hiring processes do not always identify exceptional female candidates’. As a result, MIT began taking affirmative action, such as ‘specific pressures, policies, and positive initiatives designed to increase hiring of women or minorities’. If MIT falls prey to these inequalities, what do you think might be happening in your company?
Most people would rate their hiring decisions as objective. Unfortunately, data shows that men who believe they are objective in hiring decisions are more likely to hire a male applicant than an identically described female applicant. In general, we all fall into bias traps. People of all genders, ages, races, sexualities and religions can consciously or unconsciously make biased comments or display biased behaviours that disadvantage others. We need to learn and understand how biases work, so we can find ways to mitigate them. Knowing that bias exists isn’t enough.
As we saw before in academic settings, and as we are seeing now more prominently thanks to the Black Lives Matter (BLM) movement, racial bias also affects who we hire. And we have known about it for a long time. In 2003, researchers Bertrand and Mullainathan sent out fictional but otherwise identical resumes in response to job postings. In this case, they changed the perceived race of the applicants by assigning distinctly African American-sounding names or distinctly White-sounding names to the resumes. “White” applicants were called back 50 per cent more often than the identically qualified “African American” applicants. Applicants living in better neighbourhoods received more callbacks. Self-proclaimed “Equal Opportunity Employers” discriminated as much as other employers.
Perhaps now you can better envision how bias appears in different contexts and you may potentially realise that you have fallen subject to unconscious bias.
By unconscious, we mean ‘mental processes that are inaccessible to consciousness but that influence judgments, feelings, or behaviour’.
You will find many other definitions in psychology books. The Cambridge Dictionary defines bias as ‘the action of supporting or opposing a particular person or thing in an unfair way, because of allowing personal opinions to influence your judgment’.
Putting the terms together we can say:
unconscious bias is a quick, automatic and unfair judgment for or against something or someone based on personal opinions and life experiences. It is also referred to as implicit or second-generation gender bias.
It’s almost terrifying to recognise that, regardless of how intelligent and educated we are, our brains are hard-wired to fill in incomplete information inaccurately; just check the images below and experience it for yourself. That’s why ALL of us exhibit unconscious bias towards people who don’t look like us, on a regular basis and without noticing it.
Nonetheless, it’s important to remember that these cognitive shortcuts kept our ancestors alive; ‘nonconscious thinking is an evolutionary adaptation’. Our human brain needs to put things in categories to make sense of the world. Thanks to those shortcuts we knew whether people belonged (or not) to our tribe and whether we could trust them; we could quickly decide to fight them, run away from them or trade with them.
Today it means that members of minority communities (women, Black people, Muslims, LGBTQI people, refugees, etc.) are treated less favourably than people who belong to a majority group with otherwise identical characteristics (education level, skills, experience) in similar circumstances.
But why does this happen? Every second, our brain is bombarded with 11 million pieces of information, yet it is only capable of consciously processing about 40 of them. To cope with this situation — processing vast amounts of information — our brain creates shortcuts and makes assumptions, drawing on our knowledge and experiences, to help us make quick decisions. As a result, we humans constantly make what researchers call implicitly biased decisions, more commonly known as unconsciously biased decisions.
As mentioned, these cognitive shortcuts helped us to evolve and arrive where we are today, but those behaviours no longer serve us. Discriminating against people based on their gender, nationality, ethnicity, disability, age, sexual orientation, or any other characteristic goes against basic human rights.
The only way to stop bias in organizations, institutions and other establishments is to measure and review the data, as MIT did. We are totally oblivious to biases; they sneak into every decision and go unnoticed until you measure and check the data. Once the data is in front of you, the problem becomes crystal clear, and then it’s time to take action. ‘Awareness’ is not enough. Action is required: positive and affirmative. But what action? How are you going to ensure it will yield positive results?