Math That Keeps Your Company Out of the Courts


With gender and ethnic diversity being such sensitive topics in recent years, it is the shared responsibility of Human Resources, Legal teams, and math experts to proactively provide statistical backing for workforce decisions. One important application of math to workforce decisions is avoiding adverse impact.

Adverse impact refers to workforce decisions that appear neutral but have a biased effect on a protected group. Adverse impact may occur in hiring, voluntary and involuntary RIFs (reductions in force), promotions, transfers, and performance reviews.

These days, it is rare for a company's business decision to intentionally discriminate against a portion of the workforce. However, it is very possible for a decision, such as a RIF, to accidentally impact one group more than others. In other situations, a job requirement, such as possessing a certain degree or passing a qualifying math test, can get you into trouble if you don't have a statistically valid study behind it proving that the requirement is truly needed for someone to be successful in the job in question.

In such cases, a lawsuit can be brought against the company and the resulting financial damages can be substantial. So what do we, the math experts, HR and the Legal team do to ensure we keep our companies out of the courts (and out of the negative side of the media)?

In many companies, it's the manager seeking an employee who writes the job description. In these situations, the job description tends to end up being the manager's wish list for the perfect employee. Managers are often not aware of concepts such as adverse impact, so it's very easy for a job description to include requirements beyond the minimum needed to be successful in the job. It is HR's responsibility to provide guidance and education to managers about the risks involved when they write job descriptions on their own.

The Definition of Adverse Impact

The EEOC guidelines and the Uniform Guidelines on Employee Selection Procedures define adverse impact as "a substantially different rate of selection in hiring, promotion or other employment decision which works to the disadvantage of members of a race, sex or ethnic group."

These agencies consider a selection rate to be "substantially different" when it is less than 4/5ths of the selection rate of the group with the highest rate. Remember the "4/5ths Rule," as it tends to be one of the approaches argued in court.

Calculating Adverse Impact

The 4/5ths rule determines whether the hiring or test-passing rate for any group is substantially different from that of the group with the highest rate. If any comparison group's hiring or passing rate is not equal to or greater than 80% of the highest group's rate, then it is generally accepted that there is evidence of adverse impact in the hiring or testing procedure.
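As a quick illustration, here is a minimal Python sketch of the 4/5ths rule check. The group names and counts are hypothetical, purely for demonstration.

```python
# Minimal sketch of the 4/5ths (80%) rule. Group names and counts
# below are hypothetical examples, not real data.

def impact_ratios(applicants, hires):
    """Return each group's selection rate divided by the highest group's rate."""
    rates = {g: hires[g] / applicants[g] for g in applicants}
    highest = max(rates.values())
    return {g: rate / highest for g, rate in rates.items()}

applicants = {"Group A": 120, "Group B": 80}   # hypothetical applicant counts
hires      = {"Group A": 60,  "Group B": 24}   # hypothetical hires

for group, ratio in impact_ratios(applicants, hires).items():
    flag = "potential adverse impact" if ratio < 0.8 else "passes 4/5ths rule"
    print(f"{group}: impact ratio {ratio:.2f} -> {flag}")
```

In this made-up example, Group A's selection rate is 50% and Group B's is 30%, giving Group B an impact ratio of 0.60, below the 0.80 threshold.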

This is one method; it is fairly old, but it's still accepted in practice. A more mathematically rigorous method would be to use statistical hypothesis testing to compare the proportions hired in each group.
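As a sketch of what that could look like, the example below applies Fisher's exact test (one common choice for comparing two selection rates, particularly with small samples) to the same hypothetical counts used above. A chi-square or two-proportion z-test would be a typical alternative for larger groups.

```python
# Minimal sketch of a hypothesis test on the same hypothetical counts,
# using Fisher's exact test from SciPy.
from scipy.stats import fisher_exact

# 2x2 contingency table: rows = groups, columns = [hired, not hired]
table = [[60, 120 - 60],   # Group A (hypothetical)
         [24, 80 - 24]]    # Group B (hypothetical)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"p-value = {p_value:.4f}")
if p_value < 0.05:
    print("Selection rates differ significantly at the 5% level.")
else:
    print("No statistically significant difference in selection rates.")
```

Note that the two approaches can disagree: a small group may fail the 4/5ths rule while the statistical test finds no significant difference, which is exactly the tension described below.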

The disadvantage of the 4/5ths rule is that it falls victim to sampling error when sample sizes and selection rates are small. Taking ratios of percentages based on small numbers… well, you know how that goes in the math world.

Statistical tests can be used to determine whether one group's rate is "significantly" different from another's, but statistical tests carry a chance of error as well. In court cases, if the 4/5ths rule yields one conclusion and the statistical tests yield another, each side will try to convince the judge or jury that one method is more applicable than the other in terms of accuracy. It is best to run both analyses to know where you stand.

Global and Local Data Challenges

Most countries do not have laws applicable to adverse impact situations. However, in a world of instant global communications, news of discrimination, or the mere accusation of discrimination, in one country can tarnish a company's reputation across the globe. Just ask many of the technology companies that reported their UK gender pay gap data in April. I have clients with large gaps, but a deeper look at the data with statistical testing shows the gap has little to do with gender.

One of the added challenges when trying to proactively avoid adverse impact is that not all countries gather information on gender and ethnicity in the HR workforce systems or applicant tracking systems (ATS). Most ATS systems I’ve seen ask employees to self-identify. From the data I’ve worked with in the past, it seems that missing gender data can be as much as 30%. Guidelines are available on how to deal with this when it comes to adverse impact situations.
