I started working as a social media consultant (well, blog consultant, because social media wasn't a thing then) in 2004. I spent a lot of time over the following years trying to interest key players in the social media world in upping their content moderation game, to no avail whatsoever. My exhortations were ignored by venerable institutions such as the BBC and Google – I gave talks about the human side of social media and the need to nip community problems in the bud, but might as well have just gone out into the garden and howled for an hour, for all the good it did.
There has never been much willingness to tackle these issues, even though the major players were eventually forced to. Like me, the community experts I knew back then largely left the industry by 2014, and now we just gnash our teeth in frustration every time this same problem comes up again.
I have a note to myself to eventually write a post titled 'Substack doesn't have a Nazi problem, it has a libertarian techbro problem'. Because that is, ultimately, the root of the issue. Many social media tools were built by people who wanted to fill a void in their lives, and that void was a lack of social skills. But that sadly meant that they also lacked the skills to create a platform that can actually function in reality, because humans are messy and difficult and tricksy and the platforms' big ideas about 'free speech' will always, always hit the brick wall of Nazis, CSAM, misogynistic death/rape threats and other forms of abuse.
I just hope that enough people here continue to put pressure on Substack to push them towards Step 5, because otherwise this place really is going to turn into a Nazi bar.
I so agree. And I loved the close.
My tiny sliver of research tells me that you’re an expert on extremism, and I have a lot of respect for that expertise. We may not agree on moderation specifics, but I wanted to make it clear upfront that I know you know what you’re talking about.
That said, where do you put this moderation approach in your list?
https://substack.com/profile/2-chris-best/note/c-15396859
I was definitely subtweeting Substack with this post. It feels like they haven't given much thought to how they want this to work. Whatever content moderation approach a platform lands on, it's a huge part of the site experience that would ideally be in place *before* launch.
Even though I’m on the other side of this debate, I’m eager to hear out the smart people on your side, so thanks for your thoughtful post. I don’t think it’s right to be a free speech proponent without taking seriously the points people like you are making. I do think a lot of people who crow about free speech are way too glib about it. I don’t want to be in that camp.
👏👏👏👏
Can’t shout fire in a crowded theater. Oliver Wendell Holmes, 1919. Haven’t got much further than that, maybe not even that far.