Companies are striving for diversity and inclusion and promoting gender equality in the workplace to drive innovation and gain competitive advantage in their market.
But how can you embrace a diverse workforce and achieve gender equality in the workplace if your technical interviewing processes are potentially biased? Hiring teams should look to their own sourcing, tech screening, and shortlisting methods to ensure that they aren't ignoring, or accidentally discriminating against, qualified, diverse candidates.
What causes the performance gap?
We did a deep dive into our data to uncover whether there are differences in performance between genders on Codility's online coding tests across different companies, and whether any factors perpetuate gender bias. Why?
Because we believe it's our responsibility to our customers to provide the best possible online coding tests, ones that help mitigate, rather than add, bias in an already complicated technical interviewing process. Although we did not collect any information about candidates' gender (to avoid introducing further bias), we were able to automatically infer the gender of 53% of candidates with 96% confidence.
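Conceptually, the gender-recognition step works like a confidence-thresholded lookup: a prediction is kept only when the model's confidence clears a bar, and the rest of the candidates are left unclassified (the 47% not covered). The sketch below is our illustration only, not Codility's actual method; `NAME_PROBS` is toy data and the name-based approach is an assumption.

```python
# Hypothetical sketch: infer gender from first names, keeping only
# high-confidence predictions. NAME_PROBS is toy data standing in for a
# real name-frequency dataset; Codility's actual method is not described.
NAME_PROBS = {
    "anna": ("female", 0.99),
    "james": ("male", 0.98),
    "robin": ("female", 0.55),   # ambiguous name: falls below threshold
    "maria": ("female", 0.97),
    "alex": ("male", 0.60),      # ambiguous name: falls below threshold
}

CONFIDENCE_THRESHOLD = 0.96  # keep only predictions at or above 96% confidence

def infer_gender(first_name):
    """Return the predicted gender, or None when confidence is too low."""
    gender, confidence = NAME_PROBS.get(first_name.lower(), (None, 0.0))
    return gender if confidence >= CONFIDENCE_THRESHOLD else None

candidates = ["Anna", "James", "Robin", "Maria", "Alex"]
predictions = {name: infer_gender(name) for name in candidates}
# Coverage = fraction of candidates with a confident prediction
coverage = sum(p is not None for p in predictions.values()) / len(candidates)
```

On this toy data, three of five candidates get a confident prediction, mirroring how a real pipeline can report both a coverage rate (here 53%) and a confidence level (here 96%) as separate numbers.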
We found that while there is a difference in performance between genders, the data suggest that there is no gender bias in Codility tasks. As far as we can tell, Codility can be ruled out as a causal factor in the performance gap, so we focused instead on correlations that could point to underlying causes.
Here’s what we discovered.
Performance is correlated with the seniority gap and the candidate market
The data below show the proportion of female candidates invited by recruiters to take a Codility test: the highest proportion was 58%, while the lowest was 2%. Looking at our larger customers, we can see that, on average, 88% of candidates who received tests were male. The smaller the proportion of women entering the technical interviewing process, the lower the chances that many women advance to later stages of recruitment.
To achieve diversity and inclusion, it's important to understand what may be causing bias.
Interestingly, we observed that female recruiters invited a higher proportion of females than males to take the online coding test.
We also found that the seniority gap between genders may be a contributing factor, because women tend to be more junior: about 81% of female software engineers have six years or less of professional coding experience, and are therefore less likely to be considered for senior software engineering roles. Based on the data, we expect the numerical inequality between male and female software engineers to decrease over time.
Codility task characteristics impact genders similarly
We studied several variables in the Codility tasks to understand performance gaps in a more meaningful way. Overall performance was extremely similar between genders; for example, we observed that adding a picture to a task improved scores for both females and males.
The only minor discrepancy we found was in tasks that contained math formulas: females performed better on them. One possible reason is that they tend to be more comfortable solving tasks that feel familiar and follow patterns they already know.
Make sure your technical interviewing processes help mitigate bias.
*An equals sign indicates no effect on score or time used. An arrow up shows a correlation with higher performance (two arrows up: significantly higher performance). An arrow down shows a correlation with lower performance (two arrows down: significantly lower performance).
We also looked at whether the task contained difficult vocabulary or a lengthy description, but neither factor was statistically significant enough to explain the differences in performance pictured above.
Females and males have different test-taking behaviors
Our data suggest that women were more likely to give up on their session, with an 11% drop-off rate among female candidates versus 9% among males. Giving up is defined as using less than 20% of the allotted time and scoring less than 20% on the online coding test.
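The "give up" definition above is a simple two-condition predicate over per-session data. This is a minimal sketch assuming each session records time used, time allotted, and score percentage; the field names and sample sessions are our own, not Codility's data.

```python
# Sketch of the "give up" metric as defined in the text: a session counts
# as a give-up only when BOTH conditions hold. Field names are assumptions.
def gave_up(time_used_min, time_allotted_min, score_pct):
    """True when <20% of allotted time was used AND the score is <20%."""
    return time_used_min < 0.2 * time_allotted_min and score_pct < 20

# Toy sessions: (minutes used, minutes allotted, score %)
sessions = [
    (10, 90, 5),   # little time, low score -> give-up
    (80, 90, 15),  # low score but used most of the time -> not a give-up
    (12, 90, 45),  # little time but passing score -> not a give-up
    (85, 90, 92),  # normal session
]
drop_off_rate = sum(gave_up(*s) for s in sessions) / len(sessions)
```

Computing this rate separately per gender group is what yields comparisons like the 11% versus 9% figure above.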
Female candidates also edited their solutions more than male candidates and ran their code more often, which could mean that females were more likely to double-check and fine-tune their code.
We did observe some behavioral similarities between males and females:
- The more often candidates pasted code, the lower their scores, and this affected both genders similarly. Frequent pasting suggests a trial-and-error approach.
- Both genders began editing their code about 8 minutes into the session, and started their Codility test about 3 days after being invited.
Companies are fully capable of creating gender equality in the workplace
The biggest shift will happen when you review your existing sourcing strategies, are transparent about discrepancies, and work to establish unbiased tactics. Review your existing technical interviewing tools to determine whether they actually support your diversity and inclusion efforts rather than hinder them. We encourage you to rethink your diversity and inclusion strategy so you can source candidates more effectively. While we can't influence which candidates you invite to take the Codility test, our data suggest a real need for a diversity-oriented strategy in the current hiring landscape.
Avoid a potentially biased technical interviewing process. Learn how to improve diversity and inclusion initiatives and promote gender equality in the workplace with Codility’s Gender Bias Report.