New research from CTEC and beyond
Women and the far right, militias, content moderation, Ammon Bundy, and more
The Right Fit: How Active Club Propaganda Attracts Women to the Far-Right
My CTEC colleagues Robin O'Luanaigh, Hannah Ritchey, and Frances Breidenstein have published a really interesting piece on how women are depicted in Active Club propaganda and social media postings, in ways that reflect but also defy the gendered expectations associated with the far right. There’s been a lot of great work on gender in extremism lately, including some still in the pipeline, so stay tuned.
Nostalgia, Nationalism, and the US Militia Movement
My CTEC colleague Dr. Amy Cooter has a book coming out early next year that promises to be a great and comprehensive introduction to the militia movement in the United States, chock-full of new information from archives and interviews, while also delving into the complex motivations that drive it. You can pre-order the Kindle version at the link above, or pre-order a hard copy for 20 percent off directly from Routledge through the end of the year.
Best Practices for TikTok Research: Emerging Methods
Robin O'Luanaigh, staying busy, has also contributed a timely Research Note to the CTEC website on best practices for researching TikTok, given both legitimate concerns and a lot of noise about how to securely collect data on the platform.
Lawful Extremism: An introduction
A busy week at CTEC continues with a cross-post of last week’s short introduction to the Lawful Extremism concept and paper, including video of a short presentation from me, now with slides. (The video is from a panel discussion; my part is 20 minutes but the whole thing is worth your time.) Read and share wherever it is you read and share things these days.
The Ethics of Content Moderation: Who Should Decide What We Say Online?
My Swansea colleague Dr. Alastair Reed visited the Tech Against Terrorism podcast to discuss his work on the ethical considerations around content moderation and how the field has evolved over the last decade, including the concentration of power in the hands of relatively few companies. If you missed it, I was on the podcast a couple weeks back talking about extremist manifestos.
Moderating borderline content while respecting fundamental values
My Swansea colleague Dr. Stuart Macdonald co-authored a piece for Policy & Internet about how to approach the moderation of online content that is “lawful but awful.” It’s a complicated question, and one that is only going to become more important as content moderation becomes increasingly politicized.
ChatGPT one year on: who is using it, how and why?
Seven scientists—not the usual manel of White guys—talk about ChatGPT, how it’s been used one year out, and where it’s going. Some refreshingly critical takes in here, alongside some I wish were more refreshingly critical.
Ammon’s Army: Inside the Far-Right “People’s Rights” Network
The Institute for Research and Education on Human Rights is out with an interesting, highly detailed report on the COVID-driven networking drive by Ammon Bundy, whom readers will remember from such hits as the Bundy Ranch standoff and the occupation of the Malheur National Wildlife Refuge. Bundy is apparently “in hiding” after some recent setbacks, which couldn’t happen to a nicer guy, but on the other hand, “desperate” and “nothing to lose” are not phrases that engender optimism.
MC 10/16: Facebook’s Ex-Counterterrorism Lead on Moderating Terrorism
Many or most readers will know Brian Fishman, either for his counterterrorism work at Facebook or his work at the Combating Terrorism Center at West Point. He’s now working for a content moderation startup, and he sat down with the Moderated Content podcast for a wide-ranging conversation about the considerable turbulence in that space over the last year.