Partisan kids, siren songs and other science things
Welcome to Research Roundup #2
It’s been a busy week, so without further ado, let’s look at some new and newly available research:
The beliefs and behaviors of U.S. adults are increasingly sorted and polarized along partisan lines. We draw on studies of partisanship and social identity formation to argue that children develop partisanship as a social identity during the political socialization process. For a group of children, their partisan social identity produces an affective (and largely negative) evaluation of the political world. Analyzing survey data collected from 1500+ children ages 6–12 in 2017 and 2018, we show that some children develop a partisan identity as they learn about politics that operates similarly to other social identities like gender and race.
As some of you have probably noticed, I’m deeply interested in developmental factors in extremism. After I finish a few long-term projects, I am probably going to turn to this topic in a more serious way.
Systems of online content moderation governance are becoming some of the most elaborate and extensive bureaucracies in history, and they are deeply imperfect and need reform. Would-be reformers of content moderation systems are drawn to a highly rule-bound and formalistic vision of how these bureaucracies should operate, but the sprawling chaos of online speech is too vast, ever-changing, and varied to be brought into consistent compliance with rigid rules. This essay argues that the quest to make content moderation systems ever more formalistic will not fix public and regulatory concerns about the legitimacy and accountability of how platforms moderate content on their services. The largest social media platforms operate massive unelected, unaccountable, and increasingly complex bureaucracies that decide to act or not act on millions of pieces of content uploaded to their platforms every day. A formalistic model, invoking judicial-style norms of reasoning and precedent, is doomed to fail at this scale and level of complexity. As these governance systems mature, it is time to be content moderation realists about the task ahead.
There’s a lot to like in this chapter by Evelyn Douek, and I am not really disagreeing here, but because tech bros are constantly looking for permission to do less, I want to briefly highlight the case for the formalistic model. Many content moderation decisions are easy; some are not. By articulating formal rules and processes, companies are forced to try to live by them. We all know that the rules are going to fail at some point, or at least those of us who aren’t in politics know that. But having rules and processes creates an imperative to explain why non-conforming decisions are different; put another way, it allows you to create an exception process that is something other than throwing darts. In other words, my philosophy is: Have well-articulated, fully developed processes and standards; have some guidelines about how and when to create exceptions; and use those guidelines as opportunities to explain and defend the inevitable, controversial decisions that don’t neatly fit your policies.
This is a good overview from Dr. Julia Ebner of the ways that extremist movements can capture the center of a mainstream political movement, or even a society. Obviously I have a lot to say on this topic from a theory perspective; Dr. Ebner is coming at it from a slightly different angle, more grounded in what’s happening right now. For more, check out her book on the topic.
Subscriptions and external links help drive resentful users to alternative and extremist YouTube channels
Do online platforms facilitate the consumption of potentially harmful content? Using paired behavioral and survey data provided by participants recruited from a representative sample in 2020 (n = 1181), we show that exposure to alternative and extremist channel videos on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment.
The folks at YouTube are probably relieved to see that this team of researchers found less evidence than expected of the YT algorithm driving people into radical spaces, a shift from the less cheerful findings of a few years ago. But, you will be shocked to learn, the news is not all good.
Over on Bluesky, everybody’s favorite godfather of terrorism studies Dr. John Horgan this week shared a link to all of the open access articles in Terrorism and Political Violence, which he capably edits and oversees. There’s more free, high quality content over there than you could get through in a year, so check it out!