Algorithmic Jim Crow
65 Pages. Posted: 17 Nov 2017
Date Written: October 25, 2017
This Article contends that current immigration- and security-related vetting protocols risk promulgating an algorithmically driven form of Jim Crow. Under the “separate but equal” discrimination of a historic Jim Crow regime, state laws required mandatory separation and discrimination on the front end, while purportedly establishing equality on the back end. In contrast, an Algorithmic Jim Crow regime allows for “equal but separate” discrimination. Under Algorithmic Jim Crow, equal vetting and database screening of all citizens and noncitizens will make it appear that fairness and equality principles are preserved on the front end. Algorithmic Jim Crow, however, will enable discrimination on the back end in the form of designing, interpreting, and acting upon vetting and screening systems in ways that result in a disparate impact.
Currently, security-related vetting protocols often begin with an algorithm-anchored technique of biometric identification — for example, the collection and database screening of scanned fingerprints and irises, digital photographs for facial recognition technology, and DNA. Immigration reform efforts, however, call for the biometric data collection of the entire citizenry in the United States to enhance border security efforts and to increase the accuracy of the algorithmic screening process. Newly developed big data vetting tools fuse biometric data with biographic data and internet and social media profiling to algorithmically assess risk.
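The mechanism the Article describes can be illustrated with a minimal, purely hypothetical sketch (the scoring rule, feature names, and weights below are invented for illustration and are not drawn from any actual vetting system). It shows how a single screening rule applied identically to everyone, equal on the front end, can still flag groups at very different rates when its inputs correlate with protected characteristics, which is the back-end disparate impact the Article identifies:

```python
# Hypothetical sketch: a facially neutral risk score applied uniformly
# can still produce disparate impact across groups. All weights and
# feature names here are illustrative assumptions, not a real system.

def risk_score(record):
    """Toy additive risk score over vetting signals (weights assumed)."""
    score = 0
    if record["watchlist_name_match"]:
        score += 50
    if record["travel_to_flagged_country"]:
        score += 30
    if record["anomalous_social_media"]:
        score += 20
    return score

def flag_rate(records, threshold=50):
    """Fraction of a group flagged by the identical rule and threshold."""
    flagged = [r for r in records if risk_score(r) >= threshold]
    return len(flagged) / len(records)

clean = {"watchlist_name_match": False,
         "travel_to_flagged_country": False,
         "anomalous_social_media": False}
matched = {"watchlist_name_match": True,
           "travel_to_flagged_country": False,
           "anomalous_social_media": False}

# Two groups screened by the same rule; signals correlate with group
# membership (e.g., name-matching errors concentrate in one group).
group_a = [dict(clean)] * 9 + [dict(matched)]
group_b = [dict(matched)] * 5 + [dict(clean)] * 5

rate_a = flag_rate(group_a)  # 1 of 10 flagged
rate_b = flag_rate(group_b)  # 5 of 10 flagged

# EEOC "four-fifths" rule of thumb: a selection ratio below 0.8 is
# commonly treated as evidence of disparate impact.
impact_ratio = rate_a / rate_b if rate_b else 1.0
print(impact_ratio, impact_ratio < 0.8)
```

The front-end rule is identical for both groups; the disparity arises entirely from which inputs feed the score and how they distribute across groups, matching the Article's "equal but separate" framing.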
This Article concludes that those individuals and groups disparately impacted by mandatory vetting and screening protocols will largely fall within traditional classifications — race, color, ethnicity, national origin, gender, and religion. Disparate-impact consequences may survive judicial review if based upon threat risk assessments, terroristic classifications, data-screening results deemed suspect, and characteristics establishing anomalous data and perceived foreignness or dangerousness data — nonprotected categories that fall outside of the current equal protection framework. Thus, Algorithmic Jim Crow will require an evolution of equality law.
Keywords: Algorithmic Screening Process, Biometric Data Collection, Biometric Identification, Database Screening, Discrimination, Disparate-Impact Consequences, Equality, Jim Crow, Immigration Vetting Protocols, Security-Related Vetting Protocols, Separate but Equal
JEL Classification: K00, K10, K32, K37, K39