Welcome back to the Baekdal Plus newsletter. Today we are mostly going to talk about social media and Twitter, but before we get to that, my latest Plus article is all about innovation, defining your audience, and creating a community.
In this week's Plus article, I talk about the problem of innovation in the news industry, and why publishers always end up doing the same thing anyway. In short, it's about how publishers are stuck with just one way of thinking about news, how many forget to really think about the audience, and how building a community is often seen as an afterthought (or not at all).
Do take a look: "Why publishers who try to innovate always end up doing the same as always".
As you probably know, Twitter has been taken over by Elon Musk, which is a bit of a problem.
One of the major social channels is now in the control of someone who wants to remake it in his own narcissistic image (he has already proven to be more than willing to spread falsehoods, misinformation, and general intolerance) ... and this is a big problem for journalism.
If you are not on Twitter, this might not seem like a big deal, but Twitter is, by far, the most used social channel within the media industry.
Here, for instance, is a report from last year:
Mind you, what I'm talking about here is not traffic. We all know that Twitter is terrible for driving traffic to publishers. Instead, we are talking about how people in the media stay up-to-date.
For instance, as a media analyst, I have five main ways that I stay informed about what is going on in the media industry. These are (in this order):
So, Twitter is massively important for the media industry, and so the idea of it being in the hands of someone like Elon Musk is a big problem.
How big of a problem is this? Well, there is a brilliant article by Nilay Patel over at The Verge. It's called "Welcome to hell, Elon", and it perfectly explains the mess he just put himself into.
No, seriously, you should read that article.
What we have also seen is an uptick in right-wing activity, from people thinking that they now get to do whatever they want. And we have even seen a massive increase in racial slurs. Although, on closer inspection, it turned out that 50,000 posts containing racial slurs came from only about 300 accounts, which Twitter quickly removed.
What was also interesting was that, while Elon Musk had promised to bring back Trump's account, Trump said that he wasn't going to return, because he wanted to stay on the social channel he created himself (and then he lied about its performance).
So, it's a mess. And, in a few months, it may be that as journalists, editors, etc., we will have to say that we cannot be a part of it anymore. I hope it doesn't come to that, because Twitter is very important for journalism. But if the bad elements who attack and harass people start to take over, we will have to dissociate ourselves from it.
Also, there were two comments I liked:
The first one was from Evan Hamilton, who used to work for Reddit back when it thought it could just be a platform where anything goes. He wrote:
Reddit tried being the free speech platform, and then I and many other people had to spend years cleaning it up just to get women and advertisers to consider coming back.
And this is really the key.
The second tweet is this one. If Elon isn't really careful, this is going to be the most costly thing he has ever bought.
However, there is one more thing we need to talk about, and that's this whole free speech narrative.
One thing we often hear on social media platforms is the claim that people shouldn't be moderated because of 'free speech'. That somehow people should be allowed to do what they want because it's a free speech thing.
Here is a simple example of what we often see. This was directed at Twitter's former general counsel and head of legal, policy, and trust (one of the three executives fired by Elon Musk).
As you can see, this is pretty horrible, and most people in the media industry have experienced this. But notice how many times they talk about 'free speech'.
This is one of the biggest misconceptions people have: free speech and content moderation are two very different things. It's about how you conduct yourself. Free speech means, for instance, that you can put up a blog where you write whatever you like (well, kind of), but it does not mean that you can walk into someone's home and start yelling at them in their living room.
That's not free speech. That's attacking someone in their home. And if you do that, in any country, the police will show up and arrest you. And, when you later face the courts, you can't claim "free speech, dude!", because the judge doesn't accept entering people's homes as a form of free speech.
This is an important distinction to understand. Free speech and attacking someone verbally, mentally, or physically in their home are not the same.
But this also illustrates why social media channels are so special. On social media, there are two ways to communicate. You can communicate either indirectly or directly.
The difference is quite simple. An indirect tweet does not show up in people's notifications, whereas a directed tweet does.
This is a pretty big difference. You can have a person who tweets that the Earth is flat, but who cares? It's just a tweet that someone made.
But a directed tweet is what you see here:
Every single one of these tweets was directed at her, meaning that they all showed up in her notifications, interrupting her day, and preventing her from just using Twitter like normal.
This is the difference between someone just saying something and an attack, and it's the attacks that we need to prevent.
And we all know how damaging this can be. Every one of us knows people who have been harassed so much that they either had to lock down their social channels, preventing them from having the same opportunities as everyone else, or had to quit social media outright.
One of my friends even had to create a pseudonymous account, under a fake name, so she could still post pictures for her family and friends on Instagram without constantly being attacked.
Heck, even my retired mother, who only posts about plants and knitting, has had to protect her account because of this. I myself have had to limit my account. Sometimes I have to limit individual posts; other times I have taken more drastic action, severely limiting who can write to me.
But the problem with doing this is that, while it prevents the attacks from showing up in my notifications, it doesn't stop the attackers as a whole, and it also blocks non-attackers who I do want to hear from.
And this is what many also don't get. I have had people tell me that social media should just allow this, and if I don't like it, I can simply block it. Except it doesn't work that way.
In order to block something, you first have to be the victim of the attack. For instance, the last time I was the target of one of these online mobs, I had to endure more than 2,200 tweets attacking me over a three-day period.
I mean, think about that. In order to block that, I would have to read all of those tweets and manually block each person ... just to get my feed under control. That doesn't work. Blocking puts the burden on the victim.
So, it's vitally important that social media channels take steps to stop this, because otherwise, the social channels become an intolerant place to be.
Of course, the tricky thing about social platforms is that we have two kinds of attackers: those who carry out the attacks, and those who orchestrate them.
For instance, the most recent report on hate crimes in the US found that anti-Asian hate crime has risen 339% over the past two years, and a lot of this was driven by politicians and other prominent figures pushing an anti-Asian narrative. Both of these groups are responsible. This is true both online and offline.
So this is why we need strong moderation tools. Again, this has nothing to do with free speech. It's about how some people attack others. And, on social media, an attack is when you direct your posts or messages to someone, causing their lives to be interrupted and degraded.
The point I'm trying to make here is that it's so important that we focus on this distinction. So often I see people talk about moderation in the wrong way, including in the press.
We don't moderate to stop people from talking. We moderate to stop people from attacking others. And we moderate to protect the victims from the burden that would otherwise force them to close their accounts to protect themselves.
So when we talk about social media, it's so important that we keep this focus in mind. Attacking someone, either in person or online, is not a free speech issue. It never was. It's just an attack.
Since we are talking about social media, last year I wrote this:
Also, remember that while this newsletter is free for anyone to read, it's paid for by my subscribers to Baekdal Plus. So if you want to support this type of analysis and advice, subscribe to Baekdal Plus, which will also give you access to all my Plus reports (more than 300), and all the new ones (about 25 reports per year).