Did you know that until recently, every NFL player had to take a 50-question pre-employment test called the Wonderlic in order to play? A quarterback would take the same cognitive test to land his job that a bus driver or corporate job seeker still takes to land theirs.
For many years, Wonderlic results were confidential, but whenever data has been leaked to the public, the scores have provided a very clear picture of racial bias.
Just how bad is it? The average on the Wonderlic for the general population is 21. In one study, the average among Black NFL draft picks was 19.8, compared to 27.7 for whites. The only players to score below an 18 were also all Black, while the only ones to score above a 30 were all white.
According to Troy Vincent, vice president of operations for the NFL, the results of “an overall audit of all the assessments” caused the league to stop administering this test in 2022 because, “frankly, it’s been an outdated process.”
The legal term for discrimination caused by biased hiring tests is disparate impact. It occurs when an employment practice appears neutral but in fact disadvantages underrepresented groups, such as people of color and women. Employment testing of one kind or another has been used to sustain segregation in personnel, military and college settings for almost 100 years.
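The article does not spell out how a bias audit quantifies disparate impact, but a common screen in U.S. employment law is the EEOC's "four-fifths rule": if one group's selection rate falls below 80% of the highest group's rate, that is treated as evidence of adverse impact. As a rough illustration only (the groups and numbers below are hypothetical, not from any case discussed here):

```python
# Sketch of the EEOC "four-fifths rule" screen for disparate impact.
# Hypothetical data for illustration; not drawn from the article.

def impact_ratios(groups):
    """groups: {name: (hired, applied)} -> {name: selection rate / highest rate}"""
    rates = {g: hired / applied for g, (hired, applied) in groups.items()}
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Hypothetical applicant pool: Group A hired at 60%, Group B at 30%.
pool = {"Group A": (48, 80), "Group B": (12, 40)}

for group, ratio in impact_ratios(pool).items():
    flag = "possible disparate impact" if ratio < 0.8 else "within guideline"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

Here Group B's impact ratio is 0.50, well under the 0.80 threshold, which is exactly the kind of figure a bias audit under the new law would surface.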
For a local example, in 2012, New York City was hit with a $128 million lawsuit because of the Fire Department’s “neutral” hiring practices. Between 1999 and 2006, FDNY screened entry-level firefighters with a written test that disadvantaged Black and Hispanic candidates. Upon investigation, it was clear this test had little to do with a person’s firefighting abilities, meaning the department had likely denied jobs to thousands of qualified minority candidates. It wasn’t the first time FDNY had turned a blind eye to racial bias in hiring tools. The court described the incident as “part of a pattern, practice, and policy of intentional discrimination against black applicants.”
NYC Local Law 144, passed in December 2021, has two main requirements that get at the heart of disparate impact. First, all employers must conduct bias audits on whatever employment tests or tools they use. Second, employers must publicly disclose the audit results. For the first time in history, employers must reveal information about the disparate impact in any tests they use to screen applicants, whether they create them internally or purchase from vendors.
For decades, my research career has been dedicated to reducing the discrimination caused by biased employment tests. I try to help people see how bias in testing happens. My enthusiasm for this law stems from the fact that it will bring transparency to a very broad range of automated tools, including tests like the Wonderlic.
The Adams administration must provide official rules to clarify the law’s implementation before it goes into effect in January. This should be straightforward, especially since disparate impact is an established legal concept.
Unfortunately, the city’s business community wants to create loopholes to avoid transparency. If they can water down the definition of “automated employment decision tools” so that it only applies to sophisticated computer applications, they will be able to continue using some of the most biased paper-and-pencil cognitive tests. This implementation would be like forcing electric cars to report on carbon emissions, but not gas guzzlers. All automated tools must be audited regardless of how high-tech or low-tech they are.
Despite how bewildered some employers are acting, they are not new to bias audits. Since the civil rights era, federal regulations have required many organizations to collect the same data that NYC’s new law sheds light on. But many of these organizations are also very attached to their existing hiring methods and do not want to face pressure to give them up.
In a classic case of victim blaming, some supporters of traditional testing will even ask why test takers of color do not complain about disparate impact.
The answer is obvious: Individual test takers do not have access to relevant information. They do not know how many applicants of different racial groups applied for a given job. There is currently no public information about the extent of bias in hiring practices. You simply don’t know what you don’t know.
Certain automated hiring assessments are quite possibly creating a racial hierarchy in the workplace. The only way we’ll know for sure is by bringing these numbers into the public eye. All this new local law asks is for employers to make existing reports available to the public. Simple as that.
New York City is on the cusp of opening the door to a fair and transparent process for selecting qualified applicant pools. The business community, the creators of the city’s labor ecosystem, owe it to New Yorkers to build workplaces that reflect the customers, clients and audiences they serve. We need to walk through that door, not flinch.
Helms is an Augustus Long professor emeritus in the Department of Counseling, Developmental and Educational Psychology and Director of the Institute for the Study and Promotion of Race and Culture at Boston College.