Mitigating Bias in Artificial Intelligence Staffing Software
Artificial intelligence has recently been promoted as a time-saving tool capable of delivering results comparable to those of experienced humans. As interest in AI has grown, Abel Personnel
took a deeper dive into what this technology offers the staffing and recruiting process, and how it might still fall short of the advertised panacea of quickly identifying the best fit for an open position. The results suggest that AI is another useful tool, but one whose limitations must be recognized when making hiring decisions:
Predictive technology and artificial intelligence are now involved in multiple steps of the hiring process, particularly when using online job boards. While it may seem that using a computer
program to screen resumes would eliminate human bias, giving equally qualified applicants an equal opportunity to be seen, that isn’t always the case. What hiring managers and applicants
alike should understand is that the algorithms that online job boards and social media use to suggest jobs and candidates are only as objective as the data used to program them.
Historical data on hiring trends reflects the outcomes of inherently biased decisions; hiring algorithms trained on that data therefore learn to replicate the outcomes humans have typically chosen. Predictive technology has the potential to amplify bias both in presenting applicants to hiring managers and in suggesting jobs to job-seekers. In a study conducted by Harvard,
Northwestern University, and USC, 75% of targeted employment ads on Facebook for low-paying positions such as taxi drivers were shown to Black viewers, and 85% of employment ads for grocery store cashiers were targeted to female viewers. According to Harvard Business Review, “[Algorithms] can replicate institutional and historical biases, amplifying disadvantages lurking in data points like university attendance or performance evaluation scores.”
Large companies often develop their own programs that use artificial intelligence to screen resumes. Many of these programs have been scrapped or redeveloped after they were shown to be biased. According to Rutgers, Amazon’s former candidate sourcing algorithm identified “masculine language” and learned that using those specific words on a resume was more desirable to hiring managers than the actual job qualifications. The result was that the program presented a disproportionate number of male candidates to hiring managers, regardless of their qualifications. According to Quartz, an audit of a different company’s resume screening program “…found that the algorithm found two factors to be most indicative of job performance: their name was Jared, and whether they played high school lacrosse.”
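The failure mode behind stories like Amazon’s can be shown in miniature. The sketch below is a hypothetical toy, not any real vendor’s system: all resumes, words, and outcomes are invented. A screener that learns word weights from past hiring outcomes will reward whatever language the previously favored group happened to use, so a thin resume echoing the “right” word outscores a substantive one.

```python
# Toy illustration of bias replication in a resume screener.
# All data here is invented; no real screening tool works exactly this way.
from collections import Counter

# Historical decisions: (resume text, hired=1 / rejected=0).
# The past hiring pool happened to favor resumes using the word "executed".
history = [
    ("executed sales strategy", 1),
    ("executed product launch", 1),
    ("led community outreach", 0),
    ("managed volunteer program", 0),
]

def learn_word_weights(history):
    """Weight each word by how often resumes containing it were hired."""
    hired, total = Counter(), Counter()
    for text, outcome in history:
        for word in set(text.split()):
            total[word] += 1
            hired[word] += outcome
    return {w: hired[w] / total[w] for w in total}

def score(resume, weights):
    """Average the learned weights; unseen words get a neutral 0.5."""
    words = resume.split()
    return sum(weights.get(w, 0.5) for w in words) / len(words)

weights = learn_word_weights(history)
# A resume that merely echoes the favored word beats a substantive one.
print(score("executed spreadsheets", weights) >
      score("managed led outreach program", weights))  # prints: True
```

The model never sees qualifications at all; it only sees which words co-occurred with past hires, which is exactly how historical bias becomes an automated scoring rule.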
While this can seem discouraging to hiring managers and candidates alike, programs can be revised and even rewritten when hiring inequity is detected. According to MIT Technology Review, “…some of the world’s largest job search sites—including CareerBuilder, Zip Recruiter, and Monster—are taking very different approaches to addressing bias on their own platforms.” However, they go on to say “Since these platforms don’t disclose exactly how their systems work, though, it’s hard for job seekers to know how effective any of these measures are at actually preventing discrimination.”
The good news is that there are champions for correcting these flaws. Jamison Rotz, founder of Nearly Human, an AI development company focused on ethical programming, has watched big tech companies misuse personal data and ignore faulty algorithms that foster race, gender, economic, and age biases. According to The Founder Beat, “He’s on a mission to help steer a world ever more immersed in AI tech toward building with integrity and ethical guardrails.” Rotz also notes that the big online job boards use AI bots to apply for jobs on behalf of candidates, which is why hiring managers who rely solely on those platforms for recruiting will spend countless hours flipping through resumes of unqualified “applicants” and reaching out to candidates who aren’t interested in their positions.
AI can also be misused by candidates to develop resumes, in the same way it is employed to write term papers. AI can scan the job description, identify the keywords and qualifications that will attract the resume review algorithm, and perhaps fudge some education and employment facts to better align with the requirements.
At Abel Personnel, we mitigate these issues for our candidates as well as our business clients. Although our recruiters are equipped with tools that contain components of artificial intelligence, personal attention to both our clients and candidates helps us achieve the best matching of people with positions. We personally interview each candidate, taking time to
really understand their goals, interests, and skills. Our business clients are only introduced to interested, qualified candidates.
Sources:
hbr.org/2019/05/all-the-ways-hiring-algorithms-can-introduce-bias
qz.com/1427621/companies-are-on-the-hook-if-their-hiring-algorithms-are-biased
www.technologyreview.com/2021/06/23/1026825/linkedin-ai-bias-ziprecruiter-monster-artificial-intelligence/
thefounderbeat.com/harrisburg-tech-ceo-unlocks-business-potential-with-new-ai-platform/