Our #HyperChat about social media and #misinformation was a great success👏— ZN 🏡 (@ZNConsulting) April 30, 2020
with @BrusselsGeek @DamianRadcliffe @pweiss, hypermoderator @liorakern and an engaged audience @LucChome @Seanski50 @EuropeNthecity #WeMakeEUdigital pic.twitter.com/sxH6hbCyqY
So what’s the most fun you can have on a Wednesday afternoon in late April while maintaining social distancing? We reckon it was our #Hyperchat on the impact of COVID-19 on social media platforms. If you weren’t one of the many live participants, you can catch the video on YouTube or watch the short summary on Twitter, where the discussion kicked off the day before, including this contribution from vaccine specialist Dr Todd Wolynn and a great link provided by Mathew Lowry. That’s the attraction of a #Hyperchat – bouncing from Twitter to Zoom and back again – and, in a step up from the Hyperdrinks we’ve been having, the whole thing was made even more interactive by Hypermoderator Liora Kern, who brought viewers into the chat on audio, ran polls and shared the feedback from Twitter.
The whole event was activated by ZN’s social media team from their homes and was such a success – “Grown-up conversation about grown-up things,” tweeted Sean Hayes – that we’ve already planned our next #Hyperchat on vaccination, social media and Covid-19. Science journalist Gary Finnegan, Jane Barratt from the IFA and influenza specialist Bram Palache will be joining us at 15:30 on 20 May – so log it in your calendars now!
Of course, the topic of this first chat is itself fascinating and significant. I got the idea from seeing how the platforms have started directing people to legitimate sources of information about the coronavirus – and wondering whether this is a kind of admission that they can have a lot more control over content. Is this a real change they won’t be able to come back from? Does it mean they are liable and responsible for information?
“This is only the beginning of the conversation” says @pweiss. “We will see more accountability put on social media platforms and we will learn how to detect #FakeNews better”. pic.twitter.com/3H0clyx7XF— ZN 🏡 (@ZNConsulting) April 29, 2020
At the end of the chat, I reckoned that the complexity of the situation should not be an excuse for the tech companies and that we can get to a much better landscape than the one we are in today. COVID-19 is clearly having an impact, and this is an important conversation to be having right now, with misinformation about the virus all over Facebook, as can be seen in this Politico article.
My old Oxford friend, journalism professor Damian Radcliffe, who got up incredibly early to join the chat from the US, left us with this thought: “We as consumers should also take some responsibility. We need to see what can be done in schools to understand modern-day news infrastructure and with adults to help them discern fact from fiction.”
Jennifer Baker, who, as the host of #EUTweets of the Week needs no introduction in Brussels – one participant even dubbed her the “Beyoncé of tech” – rounded up by saying: “Good laws can help defend freedom of speech. A legal framework whereby platforms have to take down disinformation is not impossible. In journalism, we’ve established rules & legal responsibilities. Social media should do too. We need regulations based on the demands of society and not of the money makers.”
These were our conclusions, but do read on to discover the journey we went on to arrive at them. And, as Liora reminded everyone, the debate is continuing on Twitter, not least with this great post-event thread from Damian!
I started the chat by asking Jennifer whether she thought COVID-19 was going to make social media platforms into better companies.
“Maybe,” was the short answer. Jennifer then took us through the difference between being accountable and being legally responsible. She reckons there is a window of opportunity with the Digital Services Act to change how these platforms are formally regulated, instead of relying on the self-regulation called for after the Cambridge Analytica scandal, when micro-targeting on the basis of Facebook information was used to manipulate people. At the time, some thought it was just a case of political propaganda, so nothing new. But, Jennifer insisted, the digital age has had a dramatic effect on how people consume information. It all comes down to the “three Vs”: Volume, Velocity and Vector.
Volume is the sheer amount of news in our daily intake, which we find hard to manage. Velocity has increased: information comes at us even faster than from 24-hour rolling news channels. Anything major will have been all over Twitter before even the fastest-reacting mainstream media channel can bring its resources to bear on the story. Vectors are the directions the news is coming from: not an authoritative source, but friends – often making it impossible to trace the origin – and sites people have chosen to prioritize, which create the filter bubbles where no counter-arguments are heard.
Propaganda has transmuted into active disinformation campaigns and become a massive problem for tech companies: the content needing moderation not only overwhelms fact-checkers with its volume and speed, but moderators even suffer from PTSD from reviewing extremist material. It is also a geopolitical problem, with state-backed actors pushing ideas such as the claim that the virus is a bioweapon deliberately created by Western governments to control the populace. These are the malicious actors. There are also the scared actors, often vulnerable people micro-targeted precisely because they will rush to share disinformation they think might help. People who are angry and distrustful of big government or big tech are likely to fall for health hoaxes such as those around 5G. The malicious actors calculate, correctly, that these people will amplify the message and get it bouncing around the filter bubbles.
And this is where the responsibility of the social media platforms is most in question. Mark Zuckerberg doesn’t create hoaxes, but he has built a machine where the algorithms make it more likely extremist views will spread. Of course, “Even mainstream traditional media can get their reporting tone absolutely wrong. Misinformation is not only on social media.” And then she said something so memorable we immediately tweeted it as a quote pic:
“Even mainstream traditional media can get their reporting tone absolutely wrong. Misinformation is not only on social media” says @BrusselsGeek and agrees that it’s difficult to tell real news from #fakenews. #Hyperchat pic.twitter.com/5spuPxhc3L— ZN 🏡 (@ZNConsulting) April 29, 2020
Jennifer agreed with #1 #EUinfluencer Ryan Heath who tweeted that he didn’t want platforms to be censors. However, as social media corporations are in the business of selling people’s data for advertising, to malicious as well as benign actors, she thinks there has to be real regulation and real accountability.
“Perhaps the business model can be reviewed to make spreading disinformation more difficult – a bit of social distancing in terms of data sharing. There is little to be done about the volume, but it may be possible to break up the vectors and reduce the velocity to protect the most vulnerable,” she concluded.
Hypermoderator Liora pointed out that #dontdrinkbleach was trending on Twitter and relayed a suggestion from the Zoom chat that Facebook could start by deleting all the fake and anonymous accounts. Among other viewer input was the comment that “it’s easy to say yes to accountability, but the reality is much harder”. On audio, Philip Verhaeghe asked how to distinguish between misinformation and truth – for example, regarding wearing face masks – followed by Sean Hayes, who queried who gets to fact-check the fact-checkers in, say, the health-versus-economics debate.
Jennifer responded by insisting that the focus should be on the intent to do harm. She reckoned the health/economy debate was for politicians, unless one side or the other was using disinformation to push their agenda. She also stressed that only malicious accounts should be deleted, as is current practice, as some people have legitimate reasons to remain anonymous.
Liora presented the results of the first poll – a clear majority in favour of social media platforms being held accountable for misinformation – and introduced a second question on how COVID-19 is changing social media: most respondents thought it was having a positive effect.
I then asked Damian whether he thought there was a revival in local journalism and the possibility of turning away from fake information.
On the revival question, Damian said it was complicated. Even if local journalists are the most trusted because they are linked into their community, the challenge they already faced from the loss of advertising revenue has only been made greater by the effects of the virus on marketing budgets – in the US alone, over 36,000 journalists have lost their jobs, been furloughed or seen their hours cut in less than two months. As people face their own economic problems and possibly cut back on subscriptions, he wondered whether journalists will even be around to answer the questions communities need answering.
He went on to say that misinformation and fake news have been around since antiquity, but the “three Vs” make them a pressing issue today. The problem is that social media platforms make all content look the same, making it hard to distinguish what is credible, and that they serve up material based on people’s browsing history, thereby reinforcing any prejudices.
Damian insisted that the motivation and intent are key – even journalists are capable of misunderstanding aspects of the science and thereby passing on the wrong information, but this is not malicious. The next stage involves forms of national propaganda, where countries are weaponizing social media to push their point of view.
“You could employ all the journalists and fact-checkers you want at a social media network and you would not even begin to get close to reviewing everything,” he went on, so there will be a need for AI and machine-learning to cope. However, the only way of reducing the spread of misinformation in closed networks, like WhatsApp groups, is to limit the number of times a piece of content can be forwarded.
“After the lockdown, social media will be the handwritten diaries of the 21st century,” he said, and will show just how complex it is to decide what is true in a hyper-partisan environment. And even if social networks take stuff down, other media can provide a platform and the removal of the content plays into their narrative. Not surprisingly people are confused as to which ‘reality’ they should trust and turn instead to their friends or like-minded groups. They are also overwhelmed by news of the pandemic and lack the energy to dig down for the truth. Meanwhile, most people do not seem to understand that their news feed is governed by algorithms, not an editorial authority, and the amplification of items gives them a veneer of credibility.
.@damianradcliffe notes that there is pressure on social media platforms to either get this right or risk having regulations imposed on them. That can impact not only their reputation, but also their business model. #Hyperchat pic.twitter.com/lxsN5l0rX1— ZN 🏡 (@ZNConsulting) April 29, 2020
In the policy space, there are discussions about channelling advertising revenue back to news organisations, and an increased risk of regulation – both strong incentives for tech companies to get this right. Perhaps they could open up their data to researchers to help come up with mitigation strategies. As incredibly powerful lobbyists, perhaps they could work on behalf of a wider information interest rather than just their own cause.
Thanking Damian, I said that, despite the complexity, I feel we shouldn’t give up; instead we should find a much stricter way of dealing with the phenomenon and place greater emphasis on rooting information in scientific methods: “Misinformation can kill – and should be penalized.”
“People are exhausted with covid news so the last thing they want to do is dig deeper” says @damianradcliffe. “However we must still check the validity of the source of news” adds @pweiss. Our #Fakenews detection guide might help.👇🏼#Hyperchat pic.twitter.com/XsuUyDpjb7— ZN 🏡 (@ZNConsulting) April 29, 2020
Liora displayed the ZN fake news detection guide we had tweeted to go with what Damian was saying, and then Luc Chomé joined by audio to ask whether, given the means available to social media platforms, something being difficult really makes it impossible.
Damian agreed that perhaps we’re being duped into thinking it’s impossible; if there were a financial incentive to address the issue, he had no doubt the platforms would find a way. Perhaps we could mandate that a certain amount of money per user be spent on tackling it.
Also on audio, Audrey Verger asked about journalists on YouTube. Damian reckons this is still a massively unexplored space, which is consistently used by millennials to access news. He sees Twitter as much more of an echo chamber and much narrower in reach.
For the record, before we closed up the chat, Hypermoderator Liora announced the results of the third poll, in which the majority favoured EU regulation to govern social media handling of misinformation, before conducting a fourth in which most people saw freedom of speech as more important than fighting fake news. A final poll found a majority saying it is possible to tell real from fake news most of the time.
Until next time,