For more information please call  800.727.2766


EEOC Gathering Ways To Prevent AI Discrimination Against Job Applicants

The Equal Employment Opportunity Commission (EEOC) held a public hearing with workplace experts to examine the ways artificial intelligence can result in discrimination. At the hearing, entitled "Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier," speakers discussed the benefits and pitfalls of AI, including how discrimination can occur and how AI might affect diversity, equity, inclusion, and accessibility efforts. The expert speakers also suggested ways the EEOC may be able to regulate the technology.

As reported by NPR, 83% of employers, including 99% of Fortune 500 companies, use automated tools as part of their hiring processes. Employers often find efficiency and economic value in these tools. The EEOC found, however, that automated tools evaluating facial expressions and speech patterns may be perpetuating discrimination and bias. In examples provided by NPR, programmed chatbots may reject candidates with gaps in their resumes, gaps that could be caused by disabilities or childbirth.

An AARP advisor at the session said AI may affect older workers in a variety of ways; for example, algorithms that draw on data from social media and professional digital profiles may overlook workers with smaller digital footprints. For older candidates who make it past an initial resume review, chatbots may perceive their interactions during the initial interview as poor. This perception could train the chatbot to rank other, similar candidates lower. A recent lawsuit against an employer illustrates this kind of bias: the female applicant did not realize there was an age cutoff until she re-applied for the same job with a different birth date. A 2022 study conducted by Learning Collider found AI-driven tools selected 50% fewer Black applicants than humans did. Rhonda Moll, EPS Senior Consultant, explored these risks in late 2022.

The session panelists agreed that audits of these automated systems will be required to prevent bias. It remains unclear, however, who would conduct such audits in a way that fosters both trust and innovation.