The Supreme Court is shortly expected to issue its decision on a challenge to Roe v. Wade that will—if a leaked draft version of the opinion holds—end federal protection for abortion access across the US. If that happens, it will have far-reaching consequences for millions of people. One consequence is that it could significantly increase the risk that anti-abortion activists will use surveillance and data collection to track and identify people seeking abortions, sending authorities information that could lead to criminal proceedings. Opponents of abortion have been using methods like license plate tracking for decades, and in front of many clinics around the US, such surveillance remains a daily reality. To get to the parking lot at Preferred Women’s Health Center in Charlotte, North Carolina, for example, people often have to drive through a gauntlet of protesters carrying cameras and clipboards, filming their arrival and recording details about them and their cars. Heather Mobley, a board member of Charlotte for Choice, volunteers as a clinic defender there. Clinic defenders put themselves in between people attending the clinics and protesters, verbally engaging with protesters as needed. They also surveil the surveillance. Mobley uploads examples of anti-abortion protesters’ tactics to TikTok; hers is one of several accounts that document the extent of daily protests there. [...] Source: Anti-abortion activists are collecting the data they’ll need for prosecutions post-Roe | MIT Technology Review
Sofina Tanni
Social impacts of algorithmic decision-making: A research agenda for the social sciences | Sage Journals
Academic and public debates are increasingly concerned with the question of whether and how algorithmic decision-making (ADM) may reinforce social inequality. Most previous research on this topic originates from computer science. The social sciences, however, have great potential to contribute to research on the social consequences of ADM. Based on a process model of ADM systems, we demonstrate how the social sciences may advance the literature on the impacts of ADM on social inequality by uncovering and mitigating biases in training data, by understanding data processing and analysis, as well as by studying social contexts of algorithms in practice. Furthermore, we show that fairness notions need to be evaluated with respect to specific outcomes of ADM systems and with respect to concrete social contexts. The social sciences may evaluate how individuals handle algorithmic decisions in practice and how single decisions aggregate into macro social outcomes. In this overview, we highlight how the social sciences can apply their knowledge of social stratification and of the substantive domains of ADM applications to advance the understanding of the social impacts of ADM. Source: Social impacts of algorithmic decision-making: A research agenda for the social sciences – Frederic Gerdon, Ruben L Bach, Christoph Kern, Frauke Kreuter, 2022
FTC to Prioritize Cybersecurity and Data Minimization Enforcement Under COPPA to Bolster Student Privacy | Center for Democracy and Technology
The Center for Democracy & Technology (CDT) welcomes the unanimous approval by the Federal Trade Commission (FTC) of a policy statement that underscores education technology vendors’ responsibilities under the Children’s Online Privacy Protection Act (COPPA). The statement acknowledges the importance of technology in students’ lives, and makes known that the FTC intends to increase its enforcement of COPPA’s existing requirements related to data security and data minimization. This development is an important step toward improving privacy for students and children and securing their data, as student and children’s privacy laws have long been criticized for their lack of enforcement. “The FTC’s policy statement underscores the importance of thoughtful data practices in protecting students’ privacy,” said CDT President & CEO Alexandra Givens. “Limitations on data collection, use, and retention are essential to protect individuals from privacy harms and cybersecurity risks. We applaud the FTC for its work to strengthen enforcement of children’s privacy requirements in the context of education technology, and particularly thank the Commissioners who championed data minimization as a vital component of this work. While this policy statement represents an important step forward, we also join the call for the FTC to complete its long-awaited review of the regulations that govern children’s privacy, and to align those reforms with the wider movement to protect everyone’s privacy at the federal level.” [...] Source: FTC to Prioritize Cybersecurity and Data Minimization Enforcement Under COPPA to Bolster Student Privacy – Center for Democracy and Technology
Allied Media Projects is seeking a Human Resources Coordinator! – Allied Media Projects
Background Allied Media Projects (AMP), created approximately 20 years ago, cultivates media for liberation. Rooted in Detroit and connected globally, AMP is a network of media makers, artists, educators, and technologists working for social justice. Through its programs, AMP envisions and attempts to model a world in which we cultivate care and joy, dismantle harmful systems, and assume responsibility for creating new and liberatory ways of being. AMP’s current programs include: The Sponsored Projects Program, which provides fiscal sponsorship, training, and capacity building to people and projects aligned with AMP’s mission; The Allied Media Conference, a convening of AMP’s vast network where participants celebrate, strategize, and skillshare; and The Speakers Bureau, which disseminates the skills, ideas, and media of the AMP network to the wider world. [...] Source: Allied Media Projects is seeking a Human Resources Coordinator! – Allied Media Projects
Ableism And Disability Discrimination In New Surveillance Technologies: How new surveillance technologies in education, policing, health care, and the workplace disproportionately harm disabled people – Center for Democracy and Technology
Introduction Algorithmic technologies are everywhere. At this very moment, you can be sure students around the world are complaining about homework, sharing gossip, and talking about politics — all while computer programs observe every web search they make and every social media post they create, sending information about their activities to school officials who might punish them for what they look at. Other things happening right now likely include: Delivery workers are trawling up and down streets near you while computer programs monitor their location and speed to optimize schedules and routes and to evaluate their performance; People working from home are looking at their computers while their computers are staring back at them, timing their bathroom breaks, recording their computer screens, and potentially listening to them through their microphones; Your neighbors – in your community or the next one over – are being tracked and flagged by algorithms that target police attention and resources at some neighborhoods but not others; Your own phone may be tracking data about your heart rate, blood oxygen level, steps walked, menstrual cycle, and diet, and that information might be going to for-profit companies or your employer. Your social media content might even be mined and used to diagnose a mental health disability. Algorithmic technologies have pervaded every aspect of modern life, and the algorithms are improving. But while algorithmic technologies may become better at predicting which restaurants someone might like or which music a person might enjoy listening to, not all of their possible applications are benign, helpful, or just. [...] Source: Ableism And Disability Discrimination In New Surveillance Technologies: How new surveillance technologies in education, policing, health care, and the workplace disproportionately harm disabled people – Center for Democracy and Technology
Call for Abstracts: Race, Ethnicity, Technology & Elections – Defining Problems, Developing Solutions | Tech Policy Press
To submit an abstract, please complete the form available here. Even as the 2022 U.S. midterm elections and 2024 presidential campaign cycle draw near, America’s long history of voter suppression targeting Black and Latino communities seems set to add another grim chapter. New state laws put in place following the 2020 cycle – many premised on false claims of a stolen election and massive voter fraud embraced by the former President and his loyalists – seem set to disadvantage Black voters, in particular. Courts continue to dismantle gains made since the passage of the Voting Rights Act of 1965, and efforts to introduce legislative reforms to protect the franchise at the federal level have failed. These very real challenges to democratic participation are compounded by an information environment that continues to reinforce the oppression of Black and Brown people, as disinformation, bigotry, and discrimination continue to thrive on online platforms. Despite some gains, such as an audit at Facebook that led to the creation of a team focused on civil rights issues, there remains a great deal to do to ensure the major tech platforms are a part of the solution, rather than part of the problem. [...] Source: Call for Abstracts: Race, Ethnicity, Technology & Elections – Defining Problems, Developing Solutions
The Pandemic’s Innovation Lessons for the Climate Crisis | Project Syndicate
As with the development of COVID-19 vaccines, confronting today’s mounting climate challenges requires close cooperation between the public and private sectors, as well as between countries. Shaping markets and industries via the right policy mix of targeted government financing and procurement can accelerate the green transition. LONDON – While the COVID-19 crisis brought much suffering and many socioeconomic burdens, it has also shown how targeted cooperation between the state and business can speed innovation. Addressing the climate crisis calls for similarly creative collaboration. In both cases, accelerating innovation and experimenting with local solutions is necessary but not sufficient. Essential technologies – whether vaccines or renewables – must be diffused globally. The first COVID-19 vaccines were granted emergency-use authorizations in the United States and the European Union less than a year after the pandemic began. Established innovation systems and adequate manufacturing capacity were important factors. Without longstanding cooperation between private and public institutions, and government promotion and funding of research, rapid COVID-19 vaccine development would not have been possible. [...] Source: The Pandemic’s Innovation Lessons for the Climate Crisis by Antonio Andreoni & Rainer Quitzow – Project Syndicate
If Tech Fails to Design for the Most Vulnerable, It Fails Us All | WIRED
What do Russian protesters have in common with Twitter users freaked out about Elon Musk reading their DMs and people worried about the criminalization of abortion? It would serve them all to be protected by a more robust set of design practices from companies developing technologies. Let’s back up. Last month, Russian police coerced protesters into unlocking their phones to search for evidence of dissent, leading to arrests and fines. What’s worse is that Telegram, one of the main chat-based apps used in Russia, is vulnerable to these searches. Even just having the Telegram app on a personal device might imply that its owner doesn’t support the Kremlin’s war. But the builders of Telegram have failed to design the app with considerations for personal safety in high-risk environments, and not just in the Russian context. Telegram can thus be weaponized against its users. Likewise, amid the back and forth about Elon Musk’s plan to buy Twitter, many people who use the platform have expressed concerns over his bid to forefront algorithmic content moderation and other design changes on the whim of his $44 billion fancy. Bringing in recommendations from someone with no framework of risk and harms to highly marginalized people leads to proclamations of “authenticating all humans.” This seems to be a push to remove online anonymity, something I’ve written about very personally. It is ill-thought-through, harmful to those most at risk, and backed by no actual methodology or evidence. Beyond his unclear outbursts for changes, Musk’s previous actions combined with the existing harms from Twitter’s current structures have made it clear that we’re heading toward further impacts on marginalized groups, such as Black and POC Twitter users and trans folks. Meanwhile, lack of safety infrastructure is hitting home hard in the US since the leak of the draft…
Design From the Margins | Belfer Center for Science and International Affairs
In an age of virtual connectivity and increased reliance on the internet for daily functions, including by marginalized groups, can companies and technologists reframe their features or standards to support the most marginalized users’ needs? Can the modes of resilience within digital spaces from some of the most marginalized groups be listened to, learned from, and centered when creating technology? Design From the Margins (DFM), a design process that centers the most impacted and marginalized users from ideation to production, pushes the notion that not only is this something that can and must be done, but also that it is highly beneficial for all users and companies. For this to happen, consumer interest conversations need to be framed outside the “biggest use case” scenarios and United States and European Union-centrisms and refocused on the cases often left in the margins: the decentered cases. This report outlines how the DFM method can be used to build our most well-known and relied-upon technologies for decentered cases (often deemed “edge cases,” i.e., atypical or less common use cases for a product) from the beginning of the design process, rather than retrofitting them post-deployment to cater to communities with what are perceived to be extra needs. [...] Source: Design From the Margins | Belfer Center for Science and International Affairs
U.S. cities are backing off banning facial recognition as crime rises | Reuters
Facial recognition is making a comeback in the United States as bans to thwart the technology and curb racial bias in policing come under threat amid a surge in crime and increased lobbying from developers. Virginia in July will eliminate its prohibition on local police use of facial recognition a year after approving it, and California and the city of New Orleans as soon as this month could be next to hit the undo button. Homicide reports in New Orleans rose 67% over the last two years compared with the preceding two, and police say they need every possible tool. "Technology is needed to solve these crimes and to hold individuals accountable," police Superintendent Shaun Ferguson told reporters as he called on the city council to repeal a ban that went into effect last year. Efforts to get bans in place are meeting resistance in jurisdictions big and small, from New York and Colorado to West Lafayette, Indiana. Even Vermont, the last state left with a near-100% ban against police facial-recognition use, chipped away at its law last year to allow for investigating child sex crimes. [...] Source: U.S. cities are backing off banning facial recognition as crime rises | Reuters