If you don’t read the newspaper, you’re uninformed. If you read the newspaper, you’re misinformed.
That Mark Twain quote is more than a hundred years old, but it still raises a smile today. Lots of us read the news to stay informed. Some of us suspect that we’re being misinformed. And some of us downright know it.
The tendency for media to portray a one-sided view of the news isn’t news in itself. Newspapers have traditionally been labelled as having a left-wing or right-wing view of the world. But what’s striking is the level of trust – or mistrust – that we apparently have in them.
That’s particularly the case when it comes to young people. The Washington Post recently reported that 88% of 18 to 29-year-olds only “sometimes” or “never” trust the press. That’s a lot of mistrust.
When we think of “the press” we tend to consider national newspapers and broadcasters who’ve moved from print and TV to build big presences online, where they’re apparently viewed with equally large doses of scepticism.
But hang on – the internet surely means that we’re not restricted to getting our diet of news from these media mainstays? Now we’re able to pick and choose and go to places we can trust. The internet means we can get what we want from countless news outlets.
And here we have another problem. Even if a news outlet is completely politically neutral, even if it’s totally squeaky clean, with absolutely no tendency to exhibit any bias whatsoever, and even if we know this for sure, can we really always be certain we can trust it?
“Newsgathering” has changed. Journalists have less time. And there are fewer of them. As The Guardian highlights, newsrooms are now potentially all about the clicks – and in the process of trying to get them, checking facts can fall by the wayside.
As The Guardian’s article also mentions, plenty of people beyond the world of journalism make money (or simply gain attention) by putting out fake news. Some examples have subsequently been exposed, such as the case of Phuc Dat Bich. The Vietnamese-Australian gained worldwide coverage when he claimed that Facebook had banned him because of his name. The thing was, he’d made it all up.
This poor guy! Man called Phuc Dat Bich posts passport after he is repeatedly banned from Facebook https://t.co/Bh1C6LlefP
— Jennifer Patterson (@Real_Estate_Jen) March 2, 2016
Talking of Facebook, it’s an increasing trend for users of social media to see these platforms as more than simply a way of viewing cute photos of their friends’ kids or finding out about an old school pal’s swanky new five-bedroom house.
They’re also places where people go to get news. Last year, 63% of both Twitter and Facebook users said each platform was a source for news and events outside the realm of friends and family. That was up from the figures for 2013 (52% for Twitter, 47% for Facebook).
It’s not difficult to see why users would turn to places like Twitter for news, particularly when there’s a breaking story. This user-generated content (UGC), often sent straight from people on the ground at the scene, certainly satisfies the insatiable appetite for instant information. But, relatively speaking, it’s also seen as trustworthy – some reports even rate it as 50% more trustworthy than other media.
So that’s that then. We can all sleep soundly knowing that social media platforms are there to be trusted. Meanwhile, what those big, bad media outlets report should automatically be taken with a massive pinch of salt, right?
Maybe not. For one thing, UGC isn’t always accurate. During a breaking news story, for example, a lot of what’s posted on social media may simply be wrong. And try following your favourite football team on Twitter while they’re playing. It can be so partisan that you end up with a totally skewed (or simply confusing) version of how the game is going.
But there’s something else at play – the mysterious, shadowy world of the algorithm.
Let’s use Facebook as an example, although most of this goes for anything that selects what we see through an algorithm.
Facebook filters what you see to keep your newsfeed relevant and selects the content it thinks you’re most interested in. There are plenty of factors that go into how likely you are to see content, including what you click, watch, like, comment on or even hover over. The precise factors that make up the algorithm are a closely guarded secret.
In short, Facebook is selecting the news you see or don’t see. And we assume that it’s doing it in a completely fair way. To be clear, we’re not suggesting for one second that Facebook is adding its own agenda to the content that appears in our newsfeeds. But it’s technically possible.
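To make the idea concrete, here’s a toy sketch of that kind of engagement-weighted ranking. To be clear, this is not Facebook’s actual algorithm – its factors and weights are, as noted above, a closely guarded secret. The signal names and weights below are purely illustrative assumptions.

```python
# A toy model of engagement-based feed ranking. The signals and weights
# are invented for illustration; real platforms keep theirs secret.
from dataclasses import dataclass, field

@dataclass
class Post:
    title: str
    signals: dict = field(default_factory=dict)  # e.g. {"like": 12, "comment": 3}

# Hypothetical weights: how much each engagement signal counts.
WEIGHTS = {"click": 1.0, "like": 2.0, "comment": 3.0, "share": 4.0, "hover": 0.5}

def score(post: Post) -> float:
    """Weighted sum of a post's engagement signals."""
    return sum(WEIGHTS.get(name, 0.0) * count
               for name, count in post.signals.items())

def rank_feed(posts: list) -> list:
    """Order posts by predicted interest, highest score first."""
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("Local news story", {"click": 5, "like": 1}),
    Post("Friend's holiday photos", {"like": 20, "comment": 4}),
    Post("Breaking political story", {"share": 10, "comment": 8}),
]

for p in rank_feed(posts):
    print(f"{score(p):6.1f}  {p.title}")
```

The point of the sketch is simply that whoever picks the weights picks what you see – tweak one number and a different story rises to the top of the feed.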
So who exactly can we trust when it comes to getting our news? And what are the implications of us losing trust? After all, it’s not just news that we view with scepticism. Research has shown that nearly one in three employees don’t trust their employer, while politicians (perhaps predictably) languish at the bottom of the league table of trust.
Perhaps everyone should take a look at what brands are doing to increase trust. But that’s another story.