For a more diverse workforce, businesses need to be aware of the role of unconscious bias in recruitment and take the necessary steps to remove it where possible.
The business case for diverse organizations is clear: A more diverse and inclusive business consistently delivers better results. From a legal perspective, with a variety of discrimination legislation long since passed, there should be no barriers to hiring a workforce that is more reflective of society as a whole.
So why, then, do we continually see data and hear of experiences where that is consistently not the case, across all sectors? What is holding back the acceleration of diversity within organizations?
One problem is the role of unconscious bias in recruitment. We all have biases, formed from our individual reactions to a complex mix of cultural, social and economic experiences. The challenge is how these biases inform decision-making, both in our personal and professional lives. They have a direct impact on our actions and, from a business perspective, have significant consequences.
In technology, this can result in our biases being programmed into solutions that are designed to help but end up hindering different parts of society. There have been numerous examples in which human biases have been, whether consciously or unconsciously, programmed into artificial intelligence tools, with those tools then making decisions that disproportionately affect certain sections of society.
This happens when an organization doesn’t have a diverse enough workforce, one that is empowered to challenge decisions and highlight when products might have bias or preference built into them.
This is why getting recruitment right is so important, yet even there, as highlighted before, unconscious bias has an effect. It means that even when presented with two candidates with exactly the same skills, experiences and qualifications, hiring managers are more likely to choose the person who aligns positively with their biases. This affects all professions—a Yale University study found that “both men and women science faculty were more likely to hire the male, ranked him higher in competency, and were willing to pay him $4,000 more than the woman. They were also more willing to provide mentoring to the male than to the female candidate.”
How, therefore, do we root out unconscious bias within recruitment and ensure that it does not hinder the drive to create more inclusive technology workforces?
First, it is important to realize that where you hire from is as important as how you hire. If you only ever recruit from the same schools or through the same websites, then you are leaving yourself at the mercy of the diversity of those institutions. If you only ever offer one route to apply, whether that's relying solely on recruitment consultants or using a website that is only really accessible through a computer, then again you are limiting yourself to candidates with access to those pathways.
Outside of reconsidering talent pools, one way to reduce unconscious bias is to remove identifying characteristics: no photos, no names, no dates of birth, no school or university grades. This can help to an extent, in that decision-makers have to judge candidates on their experience and how they've presented their previous roles, but it does have its restrictions; with entry-level positions, for example, relevant experience might be limited.
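As a rough illustration, blind screening can be as simple as stripping identifying fields from a candidate record before reviewers see it. This is a minimal sketch; the field names below are illustrative assumptions, not a real applicant-tracking-system schema.

```python
# Fields that could reveal who the candidate is rather than what they can do.
# (Illustrative names only — adapt to your own candidate-record schema.)
IDENTIFYING_FIELDS = {"name", "photo_url", "date_of_birth", "school", "grades"}

def anonymize(candidate: dict) -> dict:
    """Return a copy of the record with identifying fields removed,
    leaving only experience-related information for reviewers."""
    return {k: v for k, v in candidate.items() if k not in IDENTIFYING_FIELDS}

candidate = {
    "name": "A. Example",
    "date_of_birth": "1990-01-01",
    "school": "Example University",
    "experience": ["5 years back-end development", "Led a team of 3"],
    "skills": ["Python", "SQL"],
}

blind_record = anonymize(candidate)
# blind_record now contains only "experience" and "skills"
```

In practice, the redaction would happen in the recruitment platform before any human sees the application, so no reviewer discipline is required.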
That’s a quick-fix solution. For a more sustainable, thorough and long-term approach, recruiters—particularly those hiring for technology positions—need to look at how they can accurately judge skill sets. This can be challenging in itself: in larger organizations, those involved in early-stage selection may not have the requisite background understanding to make the right choices.
That is why having an anonymous, objective environment in which to appraise skills could be the answer. Using coding tests that closely resemble what the successful applicant will actually be doing enables organizations to evaluate candidates and provides a clear appraisal of whether they would be suited for the position. Those results can then be combined with other tests that assess aptitude and cultural attributes to score the candidate on overall suitability.
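Combining an anonymized coding-test result with aptitude and culture assessments could be as simple as a weighted blend. A minimal sketch, assuming each test yields a 0–100 score; the weights here are purely illustrative and would need tuning per role.

```python
def suitability_score(coding: float, aptitude: float, culture: float,
                      weights: tuple = (0.5, 0.3, 0.2)) -> float:
    """Blend three 0-100 test scores into a single suitability score.

    The default weights (coding-heavy for a technology role) are an
    assumption for illustration, not a recommended standard.
    """
    w_code, w_apt, w_cul = weights
    return round(coding * w_code + aptitude * w_apt + culture * w_cul, 1)

# Example: a strong coder with solid aptitude and culture-fit scores.
score = suitability_score(coding=80, aptitude=70, culture=90)  # 79.0
```

Because the score is computed from anonymized test results rather than a CV, the ranking reflects demonstrated ability rather than the reviewer's reaction to a name or photo.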
It is this combination of anonymity and focus on skills that helps remove unconscious bias; as one study found, when a woman contributing to an open source software community was unidentifiable, her contributions were accepted more often than men's. When her gender was identifiable, they were rejected more often. The message is clear and applicable to technology recruitment: Take the focus away from who is doing the work, make it about the work itself, and you will see a more diverse workforce emerge (assuming the applicant pool is diverse to begin with).
We need to have diverse employee bases; it is the only way that we can be sure that the tools and technologies being developed will serve to help society rather than to increase inequalities. To do that, businesses need to be aware of the role of unconscious bias in recruitment and take the necessary steps to remove it where possible. That means anonymity—but more importantly, it means focusing on skills and the ability to do the job in question.