With the sociopolitical landscape changing and scrutiny of corporate DEI efforts increasing, some employers are diminishing their reliance on diversity data. But this is a step in a disappointing and dangerous direction. For TA leaders, diversity analytics offers an opportunity to anticipate equity gaps and invest in closing them for everyone. More importantly, it lets you understand the nuances of your unique recruiting process – and how it varies across your company – so you can make informed, targeted changes quickly.
But the accuracy of your recruiting data and resulting insights is paramount, particularly when it comes to diversity analytics. The key to building a consistently fair, efficient hiring process is tracking and using diversity data responsibly – adhering to both legal regulations and data science best practices (which are not as intimidating as they sound!). Below, we offer five crucial tips for navigating the complexities of diversity analytics with confidence, ensuring a more inclusive and efficient hiring process that delivers for both your candidates and your business.
Implement good data hygiene practices
Anytime you want to glean insights (diversity-related or otherwise) from your recruiting data, it’s imperative that data is clean. You may be familiar with the expression “garbage in, garbage out” – using dirty (i.e., erroneous) data to power reporting and analytics results in inaccurate findings and misinformed decisions.
Unfortunately, it’s easy to tarnish the data in your applicant tracking system (ATS) – especially if you have dozens of busy recruiters doing things their own way, on their own time. Moving a candidate from one stage to the next late (e.g., advancing them from “screening” to “assessment” when the interview is scheduled instead of when they’re first invited), mis-tagging a candidate (e.g., “disqualifying” someone who has actually withdrawn from the process), and neglecting to enter data (such as a last-minute change to the interview plan) are all ways that recruiters can accidentally but meaningfully dirty ATS data and hamper your ability to gain valuable insights from it later on.
Before you attempt diversity analytics, ensure your data is clean and viable by instilling proper data hygiene practices in your organization. These can include using standard conventions for free text fields (e.g., applicant source or rejection reason), closing requisitions immediately after a hire (evergreen jobs wreck recruiting analytics), and updating candidate status in real time, among other things. To cement habits, you may also need to examine your team’s incentive structure and reward recruiters for keeping their data clean. Otherwise, they may be motivated to take actions that boost their individual performance metrics but muddy the data set overall.
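Hygiene conventions like these can be audited automatically. The sketch below is a minimal, hypothetical example – the field names are invented, not from any particular ATS – of flagging records that break one such convention (a “disqualified” candidate with no rejection reason recorded):

```python
# Illustrative rows from an ATS export; field names are hypothetical,
# not taken from any specific applicant tracking system.
candidates = [
    {"id": 1, "status": "disqualified", "rejection_reason": None},
    {"id": 2, "status": "withdrawn",    "rejection_reason": "relocated"},
    {"id": 3, "status": "disqualified", "rejection_reason": "skills mismatch"},
]

def hygiene_issues(rows):
    """Flag records that break a common data-hygiene convention."""
    issues = []
    for row in rows:
        # A disqualified candidate should always carry a rejection reason
        if row["status"] == "disqualified" and not row["rejection_reason"]:
            issues.append((row["id"], "missing rejection reason"))
    return issues

print(hygiene_issues(candidates))  # [(1, 'missing rejection reason')]
```

Running a check like this on a regular schedule lets you catch dirty records while the responsible recruiter still remembers the candidate, rather than months later during an analysis.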
Understand the limitations of self-reported data
To learn the demographic breakdown of your candidate pipeline and analyze how different groups are performing, your inclination is probably to look at self-reported applicant data, such as Equal Employment Opportunity Commission (EEOC) survey data – if you’re located somewhere it’s legal to collect identity data. (If not, then you may already acutely understand the limitations of self-reported data.)
But many job seekers choose not to respond to voluntary self-identification questions, or they’re not comfortable being honest about their identity. This makes self-reported EEOC data concerningly incomplete and unreliable. Our research shows that, for the average company, comparing completed applicant survey responses to total applicants – including those who formally declined to answer the survey (selecting the “Prefer not to say” option) and those who simply didn’t submit anything – yields a realistic “compliance rate” of only 65%. (Technically, EEOC compliance only considers applicants with recorded survey responses and excludes missing data from the calculation entirely, but we find that measure misleading.)
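The difference between the two calculations is easy to see in code. This illustrative sketch (with made-up survey responses) computes the “realistic” compliance rate described above, where both “Prefer not to say” selections and missing submissions count against the rate:

```python
# Illustrative survey responses for ten applicants; None means the
# applicant never submitted the voluntary self-identification survey.
responses = [
    "Female", "Male", None, "Prefer not to say", "Male",
    None, "Female", "Prefer not to say", "Male", "Female",
]

def realistic_compliance_rate(responses):
    """Share of ALL applicants with a substantive survey answer.

    Unlike the narrower calculation that drops missing data entirely,
    this counts "Prefer not to say" and blank submissions against
    the rate."""
    answered = sum(1 for r in responses if r and r != "Prefer not to say")
    return answered / len(responses)

print(f"{realistic_compliance_rate(responses):.0%}")  # 60%
```

With six substantive answers out of ten applicants, the realistic rate is 60% – even though the narrower calculation, which ignores the two blanks, would report 75%.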
This doesn’t mean self-reported demographic data isn’t valuable – or that attempting to collect it, when allowed, is unimportant or futile. (In fact, if an overwhelming number of your applicants aren’t responding, it could indicate a technical problem, like incorrect ATS configuration, or a larger reputational issue, like signaling to job seekers that their answers will harm their chances of getting hired.) What it does mean, though, is that there are blind spots in your self-reported data – and the lower your compliance rate, the bigger the blind spot, and the riskier it is to use that data to drive decision-making.
Consider using a rigorously tested inference model
Regardless of the availability or quality of self-reported data, understanding the demographics of your talent pool is crucial for building a fair and inclusive hiring process and diverse workforce. While it’s smart not to base your diversity analytics or strategy on glaringly incomplete data, it’s irresponsible to neglect this type of analysis entirely. After all, you can’t manage what you don’t measure.
That’s where a highly credible inference model, like Datapeople’s gender inference model, comes in. An inference model is, simply speaking, an algorithm trained on a robust set of historical data to estimate results from new data. (In Datapeople’s case, we leverage a proprietary algorithm trained on vast public data sets of self-reported gender and first names, encompassing a global range of combinations beyond just American and English names. The model analyzes our customers’ actual job applicants’ first names, providing probabilistic estimates of gender for the applicant population.) Using a high-quality inference model to augment the diversity data in your ATS gives you valuable, reliable new insights on your candidate pipeline without infringing on individual privacy.
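To make the mechanics concrete: here is a deliberately tiny sketch of aggregate, name-based probabilistic inference. It is not Datapeople’s proprietary model – the lookup table and its probabilities are invented for illustration – but it shows the key idea of summing per-name probabilities across a pool rather than classifying any individual:

```python
# Toy lookup of P(gender | first name). The numbers are illustrative
# only; a production model is trained on large self-reported datasets
# spanning many languages and cultures.
NAME_PRIORS = {
    "maria": {"female": 0.98, "male": 0.02},
    "james": {"female": 0.01, "male": 0.99},
    "alex":  {"female": 0.35, "male": 0.65},
}

def estimate_pool_composition(first_names):
    """Aggregate probabilistic estimates across an applicant pool.

    Summing per-name probabilities yields expected proportions for
    the pool as a whole, without labeling any single candidate."""
    totals = {"female": 0.0, "male": 0.0}
    known = 0
    for name in first_names:
        priors = NAME_PRIORS.get(name.lower())
        if priors is None:
            continue  # names outside the lookup stay uncounted
        known += 1
        for label, p in priors.items():
            totals[label] += p
    return {label: total / known for label, total in totals.items()}

print(estimate_pool_composition(["Maria", "James", "Alex"]))
```

Note how an ambiguous name like “Alex” contributes fractionally to both estimates instead of being forced into one bucket – that’s what keeps the aggregate honest even when individual names are uncertain.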
Of course, “inferring” applicant gender and other traits isn’t perfect, so you should scrutinize any algorithms you’re considering or already using, just as you would self-reports. A vendor (or data science team, if you’re lucky enough to have dedicated resources in-house) should be able to tell you exactly how their model works, how it’s been tested, and the accuracy of its results. For example, our gender inference model is over 90% accurate when compared to actual self-reported data.
Anyone developing a demographic inference model should also have a strong perspective on – and clearly, proactively communicate – the limits of their model. For instance, because we know the constraints of binary gender classifications and the importance of inclusivity, Datapeople doesn’t report on individual candidate gender, and our aggregate analyses don’t attempt to infer non-binary identities, given the limits of name-based prediction. Our intent is to illuminate overall trends that may highlight potential issues within a company’s hiring pipeline.
Identify variances in pass-through rates by demographic
Once you’ve trained your team on good data hygiene, assessed how confident you can be in your self-reported applicant data, and possibly leveraged algorithms to fill in gaps, it’s finally time to learn how job seekers of different backgrounds perform in your hiring funnel so you can identify potential biases and start building a more fair and inclusive hiring process.
It might be tempting to jump straight to your offers or hires to see if there’s balance among groups or proportions that match the general population. But while it seems direct, this is actually the wrong way to measure how equitable your recruiting efforts are. Ideally, your top-of-funnel makeup will be reflected in your bottom-of-funnel outcomes, but if there are inadvertent biases present in your hiring process – whether in an assessment, your compensation and benefits, or a hiring manager’s decisions – pass-through rates by demographic will alert you to that fact.
When pass-through rates for different groups roughly mirror one another, it suggests your process is free of systemic barriers that might prevent candidates of certain backgrounds from advancing or getting hired. But when you observe clear differences between groups at the same stage, it’s a good indicator that something about that stage is unfair or exclusionary. For example, if 90% of male candidates advance from Offer to Hired but only 70% of female candidates do, it’s possible that the benefits or workplace flexibility your company offers are, by and large, more conducive to men than women.
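The comparison above can be sketched in a few lines. This illustrative example (with the made-up Offer-to-Hired numbers from the scenario) computes per-group pass-through rates and flags any group trailing the best-performing group by more than a chosen threshold; the 10-point threshold here is an arbitrary assumption, not a standard:

```python
# (advanced, entered) per demographic group at one stage, e.g.
# Offer -> Hired; the counts are illustrative.
stage_counts = {
    "male":   (90, 100),
    "female": (70, 100),
}

def pass_through_rates(stage_counts):
    """Share of each group that advanced from this stage."""
    return {group: advanced / entered
            for group, (advanced, entered) in stage_counts.items()}

def flag_variances(rates, threshold=0.10):
    """Flag groups trailing the best-performing group by more than
    the threshold -- a prompt for closer review, not a verdict."""
    best = max(rates.values())
    return [group for group, rate in rates.items() if best - rate > threshold]

rates = pass_through_rates(stage_counts)
print(rates)                  # {'male': 0.9, 'female': 0.7}
print(flag_variances(rates))  # ['female']
```

A flagged variance doesn’t prove bias on its own – small groups produce noisy rates – but it tells you exactly which stage deserves a closer look first.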
By looking for variances in pass-through rates by demographic, you can pinpoint (and then fix) problem areas in your hiring process that may be eroding diversity at the very bottom of your funnel. If you start at the bottom, however, it won’t be clear what changes you need to make – or if you even need to make any changes at all.
Analyze your job posts, not just your candidate pools
Clean, complete applicant demographic data is a remarkably powerful tool for optimizing your candidate experience and the equity, efficiency, and efficacy of your hiring process. But it only tells you about the job seekers who’ve opted into your selection process – not the ones who declined to apply (or never even knew about your available opportunities).
To truly design the most inclusive hiring process that drives the best outcomes, you have to leverage end-to-end recruiting analytics beyond what’s available in your ATS. In particular, your job listings contain a plethora of key data points that usually go ignored because job post creation typically takes place outside of the applicant tracking system. But when you know how your posts perform (or don’t) with qualified job seekers of all backgrounds, you gain a deeper understanding of your pipeline and unlock brand new avenues to improve diversity at the top of your funnel.
Job ads (not to be confused with internal job descriptions) are highly technical documents in which nearly every word – from the title to the salary range to “must-have” vs. “nice-to-have” qualifications – impacts your ability to attract a diverse and qualified applicant pool. If hiring teams across your organization aren’t posting inclusive, compliant, scientifically backed jobs consistently, you’re likely losing out on qualified talent from historically underrepresented groups. And if you’re not monitoring your job posts both day-to-day and over time, you’re undoubtedly losing opportunities to both remedy live jobs actively working against you and optimize your hiring process for fairness and efficiency from the very start.
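One small, automatable piece of that wording analysis is scanning for gender-coded language. The sketch below is a toy illustration – the word lists are tiny subsets drawn from published research on gendered wording in job ads, and a real tool would use far larger lexicons plus statistical models rather than simple set membership:

```python
# Small illustrative subsets of masculine-/feminine-coded terms from
# research on gendered wording in job advertisements; real audits use
# much larger lexicons and weight terms statistically.
MASCULINE_CODED = {"competitive", "dominant", "rockstar", "ninja"}
FEMININE_CODED  = {"collaborative", "supportive", "interpersonal"}

def coded_word_counts(job_post_text):
    """Report which coded terms appear in a job post."""
    words = {w.strip(".,!").lower() for w in job_post_text.split()}
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine":  sorted(words & FEMININE_CODED),
    }

post = "We want a competitive rockstar with strong interpersonal skills."
print(coded_word_counts(post))
# {'masculine': ['competitive', 'rockstar'], 'feminine': ['interpersonal']}
```

Even a crude scan like this, run against every live posting, surfaces language worth a human review before it quietly narrows your applicant pool.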
Of course, if you have dozens – let alone hundreds – of job openings at any one time, manually auditing and analyzing them is almost impossible. Consider utilizing a comprehensive platform like Datapeople Insights, which gives your team real-time answers about what’s working and what’s not working throughout your entire hiring process – starting with your job posts.
Build a fair, inclusive hiring process with diversity analytics (and more)
As the labor market, economy, and sociopolitical climate all continue to shift rapidly, DEI initiatives are increasingly facing external scrutiny while grappling with limited budget, headcount, and support internally. But a fair, inclusive hiring process is efficient, effective, and a powerful driver of an organically diverse workforce in any environment. That’s why hiring equity should be TA teams’ North Star in 2024. And it’s possible to achieve (no matter how limited your resources) through responsible diversity analytics and a number of other best practices.
We’ve packed all our essential tips into one free, comprehensive e-book, The Talent Acquisition Mega Guide to Equitable Hiring in 2024. From optimizing your job posts for inclusivity to exposing biases in your hiring process through data, learn how you can accomplish hiring equity this year by downloading it today!