A look at some recent papers on extremism, disinformation and interventions
Thanks to Twitter’s continuing decline, it’s getting harder to keep track of what colleagues are working on. I’ll try to fill the gap here by occasionally offering a quick rundown of some of the more interesting research projects to cross my feeds.
Observers of extremism and politics often debate whether far-right actors believe the misinformation they spread, or whether they’re just spreading it with the cynical intention of “triggering the libs.” A new study suggests that people spreading Trump-oriented election disinformation overwhelmingly believe that their claims are either true or entirely plausible.
Evaluation of the “We Can Do This” Campaign Paid Media and COVID-19 Vaccination Uptake, United States, December 2020–January 2022
Denison et al.
This paper uses novel methods to examine the efficacy of public health campaigns promoting vaccination against COVID, and finds that a U.S. Department of Health and Human Services campaign successfully increased audience vaccination behaviors over both the short and long term. Interestingly, digital media was more likely to produce immediate changes in behavior, while TV produced a more gradual shift.
Analyzing the Interaction Between Posting Behaviors on Incels.is and Violent Events Perpetrated by Members of the Community
One of the holy grails of research on extremism is translating a sea of data into actionable insights for anticipating and even preventing violent attacks. This paper aims to move the needle on that question with an in-depth examination of posts on a prominent incel forum.
Influencing recommendation algorithms to reduce the spread of unreliable news by encouraging humans to fact-check articles, in a field experiment
A study of fact-checking finds that a) readers can be prompted to fact-check information and b) fact-checking caused Reddit’s algorithm to downrank articles that were fact-checked. This is interesting and useful. I’d caution against overinterpreting the results outside of the specific context of the experiment, but the study points the way to additional research.
Zou, Wang, Kolter, Fredrikson
A new study from Carnegie Mellon demonstrates methods to jailbreak AI chatbots to make them produce harmful content. A Wired write-up provides a more accessible introduction to the content.
Extremism in North America is not exclusively sited in the United States. The Canadian extremism scene is bigger and weirder than ever, and this paper offers a deep look at the political scene that surrounds the “Freedom Convoy” movement.
Biddlestone, Cichocka, Główczewski, and Cislak
The authors find that collective narcissism, “a belief in in-group greatness that is not appreciated by others,” predicts a willingness to conspire against other in-group members for personal gain. There’s a lot to unpack in this paper, which is summarized more readably by Psychology Today. This approach runs parallel to some of my analysis of in-group critique and intergroup conflict. This GNET blog post summarizes my findings in a shorter, more accessible form.
QAnon Beliefs, Political Radicalization and Support for January 6th Insurrection: A Gendered Perspective
Moskalenko, Pavlović, and Burton
“In our sample, more women reported believing QAnon conspiracy theories, and their average endorsement of QAnon conspiracies was higher than that of men. In women in our study, support for January 6th riot was positively related to Openness to Experiences, and activism and radicalism were positively related to extraversion; these relationships were reversed among men. These gender differences suggest a different psychology underlying QAnon’s appeal for men versus women, and radicalization stemming from beliefs in QAnon conspiracy theories.”