"Coded Bias in Applicant Tracking Systems" by Prithvi Dhanabal and Emily Hao

Proposal

This Tech Justice Lab project explored potential discrimination against transgender and gender-nonconforming job applicants in the hiring process by evaluating the impact of including pronouns in résumés submitted to an applicant tracking system (ATS) platform. The research team began by collecting background information on existing discrimination against transgender job applicants and on how ATS platforms can reflect and magnify existing biases in hiring. A “sock puppet” audit was then conducted by submitting fake résumés to test the platform’s response. The chosen platform was Jobscan, an applicant-facing service that claims to simulate a résumé review by major corporate ATS platforms. Example résumés were gathered from Kaggle and submitted to Jobscan four times each alongside a corresponding job posting: first with no pronouns, then with he/him, she/her, and they/them pronouns. For each submission, data was collected on Jobscan’s feedback about résumé quality and its match to the job listing. One finding was that adding pronouns caused the résumé to be flagged as too lengthy and as containing “noise,” suggesting that ATS platforms treat pronouns as potentially irrelevant information, which could adversely affect their assessment of a résumé’s relevance to a position. A second finding was a lack of evidence for Jobscan’s claimed ability to tailor feedback to specific companies’ ATS platforms: entering a target company’s name in the appropriate field produced no change in the evaluation results.
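The four-condition audit design described above can be sketched in code. The snippet below is a hypothetical illustration, not the authors' actual tooling: it generates the control (no pronouns) and three pronoun variants of a résumé by appending pronouns to the name line, the convention a pronoun-bearing résumé would typically follow. The `Submission` structure and `make_variants` function are illustrative names invented here.

```python
from dataclasses import dataclass
from typing import Optional

# The four audit conditions: control plus three pronoun sets.
PRONOUN_VARIANTS = [None, "he/him", "she/her", "they/them"]

@dataclass
class Submission:
    """One résumé variant, ready to submit alongside a job posting."""
    resume_id: str
    pronouns: Optional[str]
    resume_text: str

def make_variants(resume_id: str, base_resume: str, name_line: str) -> list[Submission]:
    """Produce the four pronoun conditions for a single résumé.

    Pronouns are appended to the first occurrence of the name line,
    e.g. "Jane Doe" becomes "Jane Doe (she/her)"; the control variant
    is left unchanged.
    """
    variants = []
    for pronouns in PRONOUN_VARIANTS:
        if pronouns is None:
            text = base_resume
        else:
            text = base_resume.replace(name_line, f"{name_line} ({pronouns})", 1)
        variants.append(Submission(resume_id, pronouns, text))
    return variants
```

Each of the four resulting texts would then be submitted to the ATS platform with the same job posting, and the platform's quality and match feedback recorded per condition.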
