Physiognomic Artificial Intelligence

39 Pages · Posted: 24 Sep 2021 · Last revised: 12 Apr 2023

Luke Stark

Western University, Faculty of Information and Media Studies

Jevan Hutson

Hintze Law PLLC

Date Written: September 20, 2021

Abstract

The reanimation of the pseudosciences of physiognomy and phrenology at scale through computer vision and machine learning is a matter of urgent concern. This Article, which contributes to critical data studies, consumer protection law, biometric privacy law, and anti-discrimination law, endeavors to conceptualize and problematize physiognomic artificial intelligence (AI) and offer policy recommendations for state and federal lawmakers to forestall its proliferation.

Physiognomic AI, we contend, is the practice of using computer software and related systems to infer or create hierarchies of an individual’s body composition, protected class status, perceived character, capabilities, and future social outcomes based on their physical or behavioral characteristics. Physiognomic and phrenological logics are intrinsic to the technical mechanism of computer vision applied to humans. In this Article, we observe how computer vision is a central vector for physiognomic AI technologies, unpacking how it reanimates physiognomy in conception, form, and practice, and the dangers this trend presents for civil liberties.

This Article thus argues for legislative action to forestall and roll back the proliferation of physiognomic AI. To that end, we consider a menu of potential safeguards and limitations to significantly curtail the deployment of physiognomic AI systems, which we hope can be used to strengthen local, state, and federal legislation. We foreground our policy discussion by proposing the abolition of physiognomic AI. From there, we posit regimes of U.S. consumer protection law, biometric privacy law, and civil rights law as vehicles for rejecting physiognomy’s digital renaissance in artificial intelligence. First, we argue that physiognomic AI should be categorically rejected as oppressive and unjust. Second, we argue that lawmakers should declare physiognomic AI to be unfair and deceptive per se. Third, we argue that lawmakers should enact or expand biometric privacy laws to prohibit physiognomic AI. Fourth, we argue that lawmakers should prohibit physiognomic AI in places of public accommodation. We also observe the paucity of procedural and managerial regimes of fairness, accountability, and transparency for addressing physiognomic AI, and we attend to potential counterarguments offered in its support.

Keywords: Artificial Intelligence, Computer Vision, Physiognomy, Abolition, Consumer Protection, Biometric Privacy, Antidiscrimination, Public Accommodations, Critical Data Studies

Suggested Citation

Stark, Luke and Hutson, Jevan, Physiognomic Artificial Intelligence (September 20, 2021). Fordham Intellectual Property, Media & Entertainment Law Journal. Available at SSRN: https://ssrn.com/abstract=3927300 or http://dx.doi.org/10.2139/ssrn.3927300

Luke Stark (Contact Author)

Western University, Faculty of Information and Media Studies

FIMS Nursing Building
London, Ontario N6A 0A2
Canada

Jevan Hutson

Hintze Law PLLC

Paper statistics: Downloads 1,862 · Abstract Views 17,147 · Rank 16,675