Supporters of algorithmic reparation suggest taking lessons from curation professionals such as librarians, who’ve had to consider how to ethically collect data about people and what should be included in libraries. They propose considering not just whether the performance of an AI model is deemed fair or good but whether it shifts power. The suggestions echo earlier recommendations by former Google AI researcher Timnit Gebru, who in a 2019 paper encouraged machine learning practitioners to consider how archivists and library scientists have dealt with issues involving ethics, inclusivity, and power. Gebru says Google fired her in late 2020; she recently launched a distributed AI research center. A critical analysis concluded that Google subjected Gebru to a pattern of abuse historically aimed at Black women in professional environments. Authors of that analysis also urged computer scientists to look for patterns in history and society, in addition to data. Earlier this year, five US senators urged Google to hire an independent auditor to evaluate the impact of racism on Google’s products and workplace. Google did not respond to the letter. Source: A Move for ‘Algorithmic Reparation’ Calls for Racial Justice in AI | WIRED
Smart Cities, Bad Metaphors, and a Better Urban Future | WIRED
Maybe it’s a cliché—I think I’ve used it myself—to say that scientists’ and philosophers’ explanations for how the brain works tend to metaphorically track the most advanced technology of their time. Greek writers thought brains worked like hydraulic water clocks. European writers in the Middle Ages suggested that thoughts operated through gear-like mechanisms. In the 19th century the brain was like a telegraph; a few decades later, it was more like a telephone network. Shortly after that, no surprise, people thought the brain worked like a digital computer, and that maybe they could build computers that work like the brain, or talk to it. Not easy, since, metaphors aside, nobody really knows how the brain works. Science can be exciting like that. Source: Smart Cities, Bad Metaphors, and a Better Urban Future | WIRED
Neighborhoods Watched | The Lens
New Orleans has spent millions to expand its police surveillance powers in recent years, providing the city with an unprecedented ability to monitor public spaces and track individuals. Similar mass surveillance systems have rapidly spread to cities across the US in the last two decades, for the most part without any formal oversight or local regulation. As a result, the public is often left in the dark about what tools and techniques the police use to spy on them. In partnership with the Fund for Investigative Journalism, The Lens has spent the past year obtaining and reviewing thousands of city documents to get a snapshot of New Orleans' current surveillance apparatus and the rapid, largely unchecked nature of its growth. Source: Neighborhoods Watched