Citation

The Digital Poorhouse: From Automating Inequality

Author: Eubanks, Virginia
Publication: Harper's Magazine
Year: 2018

By Virginia Eubanks, from Automating Inequality, which was published this month by St. Martin’s Press. Eubanks is an associate professor of political science at the University at Albany, SUNY, and a founding member of the Our Data Bodies project.
Forty years ago, nearly all the major decisions that shape our lives—whether or not we are offered employment, a mortgage, insurance, credit, or a government service—were made by human beings. They often relied on actuarial processes that made them function more like computers than people, but human discretion still prevailed.
Today, we have ceded much of that decision-making power to machines. Automated eligibility systems, ranking algorithms, and predictive risk models control which neighborhoods get policed, which families attain needed resources, who is short-listed for employment, and who is investigated for fraud. Our world is crisscrossed by information sentinels, some obvious and visible: closed-circuit cameras, GPS on our cell phones, police drones. But much of our information is collected by inscrutable, invisible pieces of code embedded in social media interactions, applications for government services, and every product we buy. They are so deeply woven into the fabric of social life that, most of the time, we don’t even notice that we are being watched and analyzed.
Even when we do notice, we rarely understand how these processes are taking place. There is no sunshine law to compel the government or private companies to release details on the inner workings of their digital decision-making systems. With the notable exception of credit reporting, we have remarkably limited access to the equations, algorithms, and models that shape our life chances.
We all live under this new regime of data analytics, but we don’t all experience it in the same way. Most people are targeted for digital scrutiny as members of social groups, not as individuals. People of color, migrants, stigmatized religious groups, sexual minorities, the poor, and other oppressed and exploited populations bear a much heavier burden of monitoring, tracking, and social sorting than advantaged groups.
The most marginalized in our society face higher levels of data collection when they access public benefits, walk through heavily policed neighborhoods, enter the health care system, or cross national borders. That data reinforces their marginality when it is used to target them for extra scrutiny. Groups seen as undeserving of social support and political inclusion are singled out for punitive public policy and more intense surveillance, and the cycle begins again. It is a feedback loop of injustice.