Interview

Examining Violence and Black Grief on Social Media: An Interview with Desmond Upton Patton

Desmond Patton, Catalina Vallejo
March 1, 2022

As part of our “What Is Just Tech?” series, we invited several social researchers—scholars, practitioners, artists, and activists—to respond to a simple yet fundamental question: “What is just technology?” This interview was conducted by Just Tech program officer Catalina Vallejo, who spoke with Desmond Upton Patton, Professor of Social Work at Columbia University and Just Tech Advisory Board member. Patton (he/him) studies how gang-involved youth conceptualize threats on social media and the extent to which social media may shape or facilitate youth and gang violence. He is the founding director of SAFElab, which centers young people’s perspectives in computational and social work research on violence, trains future social work scholars, and actively engages in violence prevention and intervention.

In their conversation, Vallejo and Patton spoke about social media as an amplifier of violence, the importance of lived experience informing computational research, and misunderstandings about Black grief.

Catalina Vallejo (CV): How did you come to work at the intersection of technology and social justice?

Desmond Upton Patton (DUP): I went to the University of Michigan to study social work to develop skills and have an evidence base for how to best support Black men who were traversing challenging experiences in their lives. I realized that I wasn't a clinical person (I would take on the emotions of my clients), so I decided to get a Ph.D. in social welfare policy at the University of Chicago. That is where I ignited an interest in the impact of violence and trauma on the Black community. I wrote a dissertation that looked at how young Black men at a charter high school on the West Side of Chicago navigated violence in their community while remaining extremely high achieving. I learned that social media was a major part of how youth navigated violence in their neighborhood. It was an important tool for identifying safe and unsafe locations, but it also seemed to be an amplifier of violence, a more efficient way of sending out a threat or starting arguments.

I became more interested in computational tools. I developed a body of research based on a young woman named Gakirah Barnes. She had this mythology on Twitter as one of the most violent gang members in Chicago by the time she was 17 years old. The word on the digital street was that she had shot or killed 20 people. It's unclear if any of that is true, but that was the word on the street. She was murdered in the Woodlawn neighborhood of Chicago on April 20, 2014. I thought that if we could understand this person, who was like a node in a digital network (the connector to lots of violence happening in the city), it would be interesting to see if we could detect any patterns in her communication that might predict future retaliation for her death.

None of the AI tools worked on this language set because they weren’t good at context. It became clear that we often don’t know what young people are saying.

However, none of the AI tools worked on this language set because they weren’t good at context. A tool would consistently misfire and automatically detect things like the N-word as being aggressive, but that wasn’t necessarily the case because the N-word has a different connotation within the Black community. As these types of situations kept coming up, it became clear that we often don’t know what young people are saying.

So, doubling down on my social work background, I said, "Well, there are experts in this space: young people." They would be the experts on how we should translate and interpret what we're seeing on social media. So I hired groups of young people from Chicago as research assistants in my lab to help us translate text, emoji, video, and hashtags, and to make sure that we had a better understanding of how to interpret things like street names. To the untrained person, a street name could mean nothing. But on the South Side of Chicago, street names can carry lots of meaning, indicative of gangs and gang rivalries. Over time, we got better at automatically disentangling social concepts of grief, loss, and aggression.

Emojis used to convey grief, loss, and aggression. Courtesy Desmond Upton Patton

It became clear to me that focusing on threats and aggression was the wrong thing to do because that’s actually not preventative. We transitioned to focus on grief and trauma because it is a window into what people need. It also shows you different strategies that young people engage in to help themselves.

We don't have a robust understanding of Black grief, and we don't have a robust understanding of Black grief expressed digitally. What happens over and over again is that someone memorializes the life of a person online: they may post an image of that person, they may post a picture of friends and family members wearing the person's picture on a T-shirt, and they may write something to that person as if they are still alive. The challenge comes when people within and outside of your network begin to comment on, make fun of, or disrespect what you are saying online. That disrespect can lead to 600 back-and-forth comments. That is amplified and becomes hypervisible within the context of whatever social media platform you're on, consistently crossing lines over and over and over again.

In doing this work, I tried to identify resources around Black grief and trauma. But most of what we know about grief and trauma is based on studies of white people—in particular white college folks—so we don't understand that rage and anger might be expressed in ways that people outside the community don't recognize as being a part of grief. It's been a wonderful experience to turn this on its head and say, "This is a normal and natural part of grief. It's not aggression. It's not problematic. It's not bad."

I quickly learned that I am not the expert. The young people are. Their lived experience is critical expertise.

CV: Could you talk about some of the interventions that have been inspired by this research?

DUP: One thing that has come out of this research is the need to help develop expertise among community members and young people. I quickly learned that I am not the expert. The young people are. Their lived experience is critical expertise. So I've created a set of programs to help develop that expertise. I co-run a summer program for young people of color called AI4All. I created a user research lab called the UXR Lab, geared toward helping formerly incarcerated persons get involved in user research and technology work. And I run a technology lab at the Brownsville Community Justice Center with the goal of helping young people develop skills in human-centered design and technology development.

You don't have to be a mathematician to get involved in technology, and I don't think a lot of people know that. I've been in tech for a while and I am not a data scientist, nor do I need to be one, nor do I have any desire to become one. The insights that I have as a Black person and as a social work scientist—all of these things matter to how we design technologies.

There are a couple of reasons why I developed the UXR Lab. My father was in prison most of my life. He got out of prison maybe two or three years ago, and then within six months, he was dead—he was found under a bridge in our hometown. My interpretation of my father’s experience is that he did not have enough support; in particular, he lacked job opportunities. People need a diverse array of opportunities. There are many programs that are connected to construction and the service industry. Those are great. Everyone needs a job, we all need to make a living, but what if there were other opportunities that could lead to more financial independence?

I wanted to create a program so that people could do a couple of things. First, understand that their lived reality is expertise. Social media companies tap into what we like all the time and make billions and billions of dollars; you should be paid for the insights that you offer. I also wanted them to gain skills in user research: How do you brainstorm? How do you prototype? How do you ask questions? And I wanted them to get some digital literacy. They would take a redefined coding class that was less about walking out as a coder and more about understanding what coding is and how coding works in a technological environment.

You’re either told that a technology is the best thing ever and you should use it everywhere, or it is the worst thing ever and you should never use it. I enjoy technology, and it’s not going anywhere, but we need to understand that it is a tool.

CV: One of the things I think is fascinating about your research is that it doesn’t paint the role of technology in the specific communities that you work with as black-and-white. Stepping a bit outside of your specific research, what do you think people get wrong about technology?

DUP: What bothers me the most in tech spaces is that we are not having nuanced conversations based on the human experience. You’re either told that a technology is the best thing ever and you should use it everywhere, or it is the worst thing ever and you should never use it. I enjoy technology, and it’s not going anywhere, but we need to understand that it is a tool. It is a tool in how we can connect better, how we can communicate, and how we can make our lives more efficient.

But because so many platforms—so many tools—have not been developed by diverse teams or with diverse perspectives in mind, we have not anticipated the challenges and problems with how technology can be deployed within our society. That could be addressed if we open up opportunities for more diverse populations to get involved—if we invest in digital education. And I'm not just saying have your kid take a computer science class—I mean digital literacy for digital environments. I think tech companies have a responsibility to offer the right support, but you're going to be hard-pressed to find a public-facing education initiative from a tech company that is aimed at helping citizens become equipped to navigate this space.

The biggest thing I have learned is that the complexity of this work really lies in who benefits from these tools and who is harmed by them. When I talk to family members about the work that I do, they want surveillance tools, because at the end of the day you want your kids to be alive. I think a lot of researchers like me don't get that. It's easy to say, "Don't do this, because there are lots of harms that can occur by deploying these tools," but when I ask a mom about these tools, her lived experience may suggest something else. Integrating, considering, and centering those lived experiences in our research is critical. There aren't enough folks doing that work, and I think that's the opportunity and the possibility for building more ethical technologies.

CV: I have one last question. I can imagine that you have spent many hours on Twitter and social media. Could you recommend three Twitter accounts that we should be following?

DUP: Wow, this is difficult. I would say to follow Mutale Nkonde; she is the Executive Director of AI for the People here in New York City. I think they should follow the SAFElab account. And Ruha Benjamin as well.
