Algorithm-based hiring can be discriminatory

More employers are using algorithm-based hiring tools for remote candidates. But a new report shows that a reliance on these tools can exacerbate inequities.

BY PAVITHRA MOHAN (Culled from Fast Company)

As businesses have embraced remote work, more and more jobs have become accessible to people who worked from home out of necessity well before the pandemic. For people with disabilities—many of whom have long asked for remote accommodations at work—a lasting acceptance of remote work could be a silver lining of the pandemic.

“My hope is that when we eventually come out of this crisis, the remote work model will be seen under a whole new lens, one that allows companies to confront their own ableism and consider hiring those of us who can and do work remotely,” disability rights activist and journalist Keah Brown wrote for Fast Company. “We have the necessary abilities to be an asset to their companies if they let us.”

But even as workplaces become more flexible—and presumably more inclusive—people with disabilities continue to face discrimination during the hiring process. As highlighted in a new report by the Center for Democracy & Technology (CDT), the widespread adoption of algorithm-based hiring tools has only exacerbated existing inequities and discriminatory practices. Many employers use résumé screeners, while 76% of companies with more than 100 employees reportedly ask job applicants to take personality and aptitude tests; some businesses even use facial and voice recognition technology in the hiring process.

“Many employers who use these kinds of tests do so because they believe that they’re more neutral and more effective—that using these tests will make the hiring process more equitable because it will become more consistent, and it won’t involve potentially biased human recruiters and human HR people,” says Lydia X.Z. Brown, a co-author of the report and policy counsel with CDT’s Privacy and Data Project. “That is, in fact, how many vendors market their tools.”

Academics and advocates have sounded the alarm for years on how artificial intelligence reflects and amplifies human bias, rather than acting as the impartial arbiter that the tech industry once envisioned. The focus has often been on how these technologies have encoded racism and gender discrimination, but the same methods responsible for one kind of bias—training artificial intelligence on flawed data sets, for example—can replicate another.

“The reality is, as we know, anytime we create an algorithm which learns from the environment in which it is deployed, that algorithm will necessarily replicate—and then amplify at scale—the very same structural and systemic discrimination that occurs at the interpersonal level,” Brown says. “When we talk about disability discrimination in particular, we have to understand that ableism is so amorphous because there are so many types of disabled people in the world and each of us experiences our own disabilities in infinitely diverse ways from one another.”
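To see that mechanism concretely, consider a deliberately simplified sketch (the data, features, and model below are hypothetical, not drawn from the CDT report): a screening model trained on a company's past hiring decisions. If those historical decisions penalized résumé gaps, the model learns the same rule and then applies it to every future applicant at scale.

```python
# Hypothetical sketch: a model trained on biased historical hiring
# decisions learns to reproduce that bias. All data here is invented.
from sklearn.linear_model import LogisticRegression

# Each row: [years_of_experience, resume_gap_in_months]
past_applicants = [
    [5, 0], [3, 1], [6, 2], [4, 0],      # historically hired
    [5, 14], [4, 18], [6, 24], [3, 12],  # historically rejected
]
# The labels encode the bias: long gaps meant rejection,
# regardless of experience.
past_decisions = [1, 1, 1, 1, 0, 0, 0, 0]

model = LogisticRegression().fit(past_applicants, past_decisions)

# An experienced candidate with a 16-month gap (say, after a
# temporary job ended or a period of medical leave) is screened
# out: the model learned the gap penalty, not job-relevant skill.
print(model.predict([[9, 16]]))  # [0] -> screened out
```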

Some of the hiring tools in use today may be clear violations of the Americans with Disabilities Act, which explicitly bars inaccessible test formats if employers can’t provide reasonable accommodations. According to Brown’s report, a number of tests on the market may not be accessible to people with physical disabilities—and many of the companies behind these tests don’t offer an adequate alternative. Applicants who are visually impaired or hard of hearing may not be able to complete a personality test or use a tool that employs facial recognition technology.

Even a résumé screening tool—which might seem like a straightforward way to sift through job applications—can mirror existing pitfalls in the hiring process and inadvertently weed out candidates with disabilities. In 2019, just over 19% of people with disabilities were employed, according to the Bureau of Labor Statistics, as compared to an employment rate of 66% among people without disabilities; nearly a third of disabled workers had only part-time employment. “People with disabilities [are] disproportionately less likely to be employed at all,” Brown says. “We’re more likely to be precariously employed, and we’re more likely to have longer gaps in our work history. I know many people in my life who would be considered highly qualified and highly skilled, who could work a regular work schedule but have months to years long gaps on their résumé—because after a temporary job ended, they were unable to be hired because of discrimination.”

A résumé screening tool that eliminates candidates with gaps in their résumé could disproportionately affect people with disabilities (as well as other applicants, from pregnant individuals to the formerly incarcerated). And when companies train hiring tools on data about their own employees, they may select for more homogeneous applicants who, say, went to elite colleges and have similar work experience.
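The same logic applies to similarity-based screening. As a purely illustrative sketch (the traits and names below are invented), a tool that scores applicants by how closely they resemble current staff rewards whatever that staff happens to share, such as an elite degree or an unbroken work history, rather than any ability to do the job.

```python
# Hypothetical sketch: a "similarity to current staff" screener.
# It rewards whatever existing employees have in common, so a
# homogeneous workforce reproduces itself. All data is invented.
current_employees = [
    {"school": "elite_u", "gap_months": 0},
    {"school": "elite_u", "gap_months": 1},
    {"school": "elite_u", "gap_months": 0},
]

def similarity_score(applicant):
    """Average fraction of current employees matched on each trait."""
    n = len(current_employees)
    school_match = sum(e["school"] == applicant["school"]
                       for e in current_employees) / n
    gap_match = sum(abs(e["gap_months"] - applicant["gap_months"]) <= 3
                    for e in current_employees) / n
    return (school_match + gap_match) / 2

# A qualified applicant with a 12-month gap after a temporary job:
print(similarity_score({"school": "state_u", "gap_months": 12}))  # 0.0
# An applicant who mirrors the existing workforce:
print(similarity_score({"school": "elite_u", "gap_months": 0}))   # 1.0
```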

Even when an employer does provide an alternative test, an applicant with a disability has to either reveal their disability in advance—which flies in the face of conventional wisdom, Brown says—or risk scoring poorly on the test. “It’s very common wisdom in the disabled community,” Brown says. “When younger disabled people ask older disabled people, ‘Should I disclose my disability?’ the most common wisdom that we always tell each other is: Never disclose until after you’ve signed a written offer.”

As Brown says, one of the biggest challenges for employers looking to eliminate discriminatory hiring practices—algorithmic or otherwise—is that there is no universal experience of living with a disability. That further complicates auditing efforts, too, which means discrimination can go unreported and undetected. Some hiring tools already undergo auditing, but in many cases only for race and gender bias.

“Even if a company wanted in good faith to try their hardest to make their tool as fair to disabled people as possible, they will never be able to account for all of the different ways that disabled people move through the world,” Brown says. “So for me, at the end of the day, the question comes back to: How can employers reduce the risk that they are going to amplify and exacerbate already devastating disability discrimination, and inequities in employment?”
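One concrete form that risk reduction can take is the kind of disparate-impact audit the article notes is often run only for race and gender. As a rough illustration (all counts below are invented), the EEOC's long-standing "four-fifths rule" (formulated for race, sex, and ethnicity, though the arithmetic carries over) flags a test when one group's selection rate falls below 80% of the highest group's rate; an audit that never groups applicants by disability status never computes the disability row at all.

```python
# Hypothetical disparate-impact audit using the EEOC "four-fifths
# rule": a group's selection rate should be at least 80% of the
# highest group's rate. All counts below are invented.
screener_outcomes = {
    # group: (applicants screened in, total applicants)
    "no_disability":   (120, 400),
    "with_disability": (9, 60),
}

rates = {group: passed / total
         for group, (passed, total) in screener_outcomes.items()}
benchmark = max(rates.values())

for group, rate in rates.items():
    ratio = rate / benchmark
    flag = "possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: rate={rate:.2f}, ratio={ratio:.2f} -> {flag}")

# with_disability: rate=0.15, ratio=0.50 -> possible adverse impact
```

Even this simple check presumes disability-status data that, as Brown notes, applicants are often advised to withhold, and it treats disabled applicants as a single group when their experiences are anything but uniform.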

One recommendation in Brown’s report is to either make automated testing optional, without penalizing applicants for opting out, or adopt universally accessible testing. (Being transparent about what a test is measuring, as well as its role in the hiring process, also gives candidates the ability to decline the test or seek accommodations.) An important consideration for universal testing is to make sure the test is actually measuring an applicant’s ability to perform the essential functions of the job. Beyond that, Brown says, a test should be accessible to as many people as possible and not presume, for example, that all participants can use oral speech. “It is possible to do that,” Brown says. “But it’s tricky, and it takes a lot of imagination [and] careful calibration and hard work.”

Under a new administration, Brown is optimistic that a government agency like the Equal Employment Opportunity Commission (EEOC) might conduct its own investigation into algorithm-driven tools and issue guidance on how they can exacerbate discrimination. “If the EEOC were able to issue that kind of guidance, that would be an enormously important tool not only for advocates,” Brown says, “but also for employers and for vendors to use to inform both their software decisions and their design and development decisions.”

And while this might sound far-fetched at the moment, Brown believes the federal government can—and should—be a model for more equitable hiring practices, without using tests that are more likely to filter out people with disabilities. In the past, the federal government has made a concerted effort to hire disabled workers: The Obama administration hired more than 100,000 people with disabilities, and by the time he left office, disabled employees accounted for more than 14% of the federal workforce. (Obama also issued an executive order eliminating the sub-minimum wage for federal workers with disabilities.)

Though federal guidance is crucial, the ultimate responsibility still lies with employers. And amid the pandemic, concerns about disability discrimination are even more urgent, as remote work is normalized and companies may be seeing an uptick in applicants with disabilities. “More employers are conducting hiring processes and onboarding processes remotely,” Brown says, “which means that employers might be more likely to consider using algorithmic hiring tools than they might have in the past.”

The disabled community has also grown due to the pandemic, as many COVID-19 patients are suffering from lasting symptoms and navigating a difficult, protracted recovery. For many long-haulers, going back to work full-time simply isn’t an option, leaving them to seek out disability leave or remote jobs that offer more flexibility. A recent patient-led survey of 3,800 patients—all of whom fell sick in the first few months of the pandemic—found that 72% of respondents had still not returned to work or were working reduced hours. These long-haulers could now face the same type of discrimination that people in the disabled community have for years, due to gaps in their résumé or a need for accommodations.

“We are going to see more disabled people experiencing more particular and perhaps more targeted forms of disability discrimination,” Brown says. “That’s where employers have to be very careful about their legal obligations and their moral obligations—how they can do right by the people that are trying to work with them, and how they can make sure that they are trying to maximize equity in a meaningful way, and not simply in a prepackaged shiny way that superficially feels progressive.”

Pavithra Mohan is a staff writer for Fast Company.
