Auto-essentialization: Gender in Automated Facial Analysis as Extended Colonial Project

Scheuerman, Morgan Klaus; Pape, Madeleine; Hanna, Alex
Big Data & Society

Scholars are increasingly concerned about social biases in facial analysis systems, particularly with regard to the tangible consequences of misidentification of marginalized groups. However, few have examined how automated facial analysis technologies intersect with the historical genealogy of racialized gender—the gender binary and its classification as a highly racialized tool of colonial power and control. In this paper, we introduce the concept of auto-essentialization: the use of automated technologies to re-inscribe the essential notions of difference that were established under colonial rule. We consider how the face has emerged as a legitimate site of gender classification, despite being historically tied to projects of racial domination. We examine the history of gendering the face and body, from colonial projects aimed at disciplining bodies that did not fit within the European gender binary, to sexology's role in normalizing that binary, to physiognomic practices that ascribed notions of inferiority to non-European groups and women. We argue that the contemporary auto-essentialization of gender via the face is both racialized and trans-exclusive: it asserts a fixed gender binary and elevates the white face as the ultimate model of gender difference. We demonstrate that imperialist ideologies are reflected in modern automated facial analysis tools in computer vision through two case studies: (1) commercial gender classification and (2) the security of both small-scale (women-only online platforms) and large-scale (national borders) spaces. Thus, we posit a rethinking of ethical attention to these systems: not as immature and novel, but as mature instantiations of much older technologies.