
On 8th November 2016, many people watched events unfold that they thought would never happen. As Donald Trump was elected president, they were left baffled and confused. Not just because they didn’t think he was a good choice for president, but because what they were seeing on social media convinced them that he could not possibly win. All they were seeing were anti-Trump stories. Surely, everyone else was seeing them, too. If not, what were they seeing?

With all that scrolling and swiping, what they’d been shown was only part of the picture. Not just that, but the stories people were sharing weren’t necessarily true. Misinformation tumbled around, catching people’s eyes and feeding their appetites for what they wanted to see (a story claiming that the Pope had endorsed Trump for president was widely shared on Facebook). Biased and even fake news stories about the candidates were leapt on by supporters from either side to condemn, discredit or mock the opposition.

The power of the filter

You’d think that the digital age would open all of us up to more points of view. We no longer just know what’s going on in our town but in places we’ve never seen; we interact daily with people we’ve never met face-to-face. Smartphones mean that we can communicate easily at any time. They become our way of interacting with the wider world, breaking down geographical restrictions and giving us a global view. We may think the screens in our pockets are windows on the world, but it turns out these ‘windows’ are often more restrictive than we think.

An average Facebook user could have 1,500 potential updates from their friends and liked pages every time they log on. Users with lots of friends and liked pages may have as many as 15,000. Because viewing all of these would be overwhelming, Facebook uses an algorithm – an automated formula – to decide which of these many posts comprise the 300 that you’ll actually see on your newsfeed.

It does this using thousands of factors – what your friends like, what friends of your friends like, your previous likes, how many other likes and shares the post has received, who shared it, your location, their location, how much time you spend viewing stories, which stories match your interests, who you interact with most, what looks promotional or ‘spammy’, what type of media you tend to like (words, images, links, videos)…and so it goes on.
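To make that concrete, here is a minimal sketch of how such a scoring-and-ranking step might work. The signals, weights and the Post structure below are invented for illustration; they are not Facebook’s actual formula.

```python
# Toy newsfeed ranking: score every candidate post on a few signals,
# then keep only the top handful. All names and weights are made up.
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float    # how often you interact with the author (0.0-1.0)
    likes_and_shares: int     # engagement the post has already received
    matches_interests: bool   # resembles things you've clicked on before
    looks_promotional: bool   # 'spammy' content gets pushed down

def score(post: Post) -> float:
    """Combine the signals into a single ranking score."""
    s = 3.0 * post.author_affinity
    s += 0.01 * post.likes_and_shares
    if post.matches_interests:
        s += 2.0
    if post.looks_promotional:
        s -= 5.0
    return s

def build_feed(candidates: list[Post], limit: int = 300) -> list[Post]:
    """From ~1,500 candidate posts, surface only the highest-scoring ones."""
    return sorted(candidates, key=score, reverse=True)[:limit]
```

Everything else simply never makes the cut, which is why two users with the same friends can still see very different feeds.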

Because of these algorithms, social media not only keeps track of what has interested us in the past, but also assumes that we will be interested in similar things in the future. This is similar to the ‘recommended for you’ section on retail websites such as Amazon, which flood your recommendations with products similar to those you've already purchased! It’s a whole new way of pigeonholing people. It gets harder to discover new things when our current decisions are directed by previous selections.
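As a rough sketch of that pigeonholing, the example below (a hypothetical catalogue with made-up tags, not Amazon’s real system) recommends only the items whose tags most resemble past purchases, so every choice narrows the next set of suggestions.

```python
# Hypothetical 'recommended for you': items similar (by shared tags) to what
# you've already bought rise to the top, so suggestions echo past choices.
catalogue = {
    "history_of_rome":  {"history", "non-fiction", "ancient"},
    "roman_britain":    {"history", "non-fiction", "britain"},
    "sci_fi_novel":     {"fiction", "sci-fi"},
    "poetry_anthology": {"fiction", "poetry"},
}

def similarity(tags_a: set[str], tags_b: set[str]) -> float:
    """Jaccard similarity: shared tags relative to all tags involved."""
    return len(tags_a & tags_b) / len(tags_a | tags_b)

def recommend(purchased: list[str], top_n: int = 2) -> list[str]:
    """Rank unbought items by how closely their tags match past purchases."""
    bought_tags = set().union(*(catalogue[p] for p in purchased))
    unbought = [item for item in catalogue if item not in purchased]
    return sorted(unbought,
                  key=lambda item: similarity(catalogue[item], bought_tags),
                  reverse=True)[:top_n]

print(recommend(["history_of_rome"]))  # the other history title comes first
```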

Often we think that what we are viewing is all there is to see. If a post is proving popular on your network, or the algorithm thinks you’ll like it, it will be more visible. In a scroll and swipe culture, the things at the top of the page – or in the main newsfeed – get even more views. Other posts remain unseen. There is, sometimes, an opt-out – but often the filter is the default, and you need to search around in settings for the relevant tick box to circumvent it. Although some social networks such as Twitter have what are called ‘real-time’ feeds, on many what you see first is not the most recent posts. Scrolling through Instagram recently, I realised that the photos were no longer displaying in chronological order.

It can be hard work to cut through this filter effect, but some do try – for example, by following/liking people and pages they disagree with. It takes more than this, though, because the algorithms note which stories we tend to click on. Back in 2011, Eli Pariser coined the phrase ‘filter bubble’ (writing a book of that title). Pariser noted that despite him following a diverse set of people, Facebook’s algorithm had begun detecting which stories he preferred, and started showing him those. Those he disagreed with (and therefore spent less time on) disappeared from his feed. “Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends’ links than on my conservative friends’ links,” said Pariser in a TED Talk. “Without consulting me about it, it had edited them out. They disappeared.”
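A hypothetical sketch of the feedback loop Pariser describes might look like the following: each click nudges a source’s weight up and each scroll-past nudges it down, until the sources you skip effectively vanish from the feed (the source names and numbers are invented).

```python
# Toy model of click-driven filtering: weights rise for sources you click
# and decay for sources you skip, without ever asking you.
from collections import defaultdict

weights = defaultdict(lambda: 1.0)  # starting weight for every friend/source

def register_click(source: str, boost: float = 0.1) -> None:
    """Clicking a link makes that source's future posts more visible."""
    weights[source] += boost

def register_skip(source: str, decay: float = 0.05) -> None:
    """Scrolling past a source's posts gradually pushes that source down."""
    weights[source] = max(0.0, weights[source] - decay)

# After enough one-sided clicking, the skipped source sinks out of the feed.
for _ in range(20):
    register_click("liberal_friend")
    register_skip("conservative_friend")
print(dict(weights))  # the skipped source's weight has decayed towards zero
```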

Just ‘liking’ someone’s page is not enough to bypass an algorithm. We need to engage. It takes more effort for us, at a brain-deep level, to read and engage with things we disagree with but, if we want to be exposed to a wider range of views in the age of the algorithm, we need to spend more time looking at them. Counteracting the algorithm means working against our own tendencies.

A mirror of myself

Because of our urge to cluster together, the likelihood is that we have surrounded ourselves with similar-thinking people. They in turn follow us. Retweet, like, share. The result is that most of the views we hear are echoes of our own.

In this echo chamber, we get so used to hearing our own views bounced back at us that if we ever do come across opposing points of view, we are less equipped to handle them.

But this isn’t just about personal updates. Increasingly, social media has become a source of news. With these filters in place, two social media users can be seeing completely different pieces of news – or at least, completely different angles on that news, doing the rounds in their particular echo chamber.


We see this in search engines too. The algorithms they use, based on our previous searches and clicks, mean that we can all get different results for the same query. At a media conference in October 2016, Angela Merkel commented that such algorithms could lead to “a distortion of perception”, and called for more transparency about how they work. Could an algorithm influence the outcome of an election? No wonder politicians are getting twitchy about it.

Distortion of the truth?

Prior to the 2016 EU referendum, I saw friends on social media expressing disbelief that the majority could ever vote ‘leave’. Everything they saw on their newsfeed or heard in their circles was firmly in the ‘remain’ camp. The idea that the vote could end up the way it did seemed impossible. On the day after the Brexit vote, they were not just disappointed. They were stunned. They had gone to bed feeling safe – “it will never happen” – only to wake up to find that it had. I was more prepared for the US election result because I’d been blindsided by the Brexit vote. Once bitten, twice shy.

Thanks to Brexit and Trump, the existence of the echo chamber has become more obvious in recent months. People are growing suspicious and disillusioned and beginning to recognise and talk about the filter effect. The echo chamber itself has become a topic of discussion on social media.

The lie has more staying power than the truth

I often see breaking news on Twitter first. I wonder, “What’s happened?” and start clicking hashtags to find out. Once I find a link to a news item, I will click through to get the details. But the initial breaking news is encapsulated in one snappy phrase.

Stories get retweets because they make great one-liners, but headlines can be misleading. If I don’t bother to check the full story or the reliability of the source, my understanding is limited, or even warped. I make an assumption based on what is effectively clickbait. What ends up bouncing round the echo chamber may not be true at all, but it gains momentum and sticks in people’s minds, as it did in the social media bedlam surrounding the US elections.

Are you hiding your light in a bubble?

If we want to be honest, loving, faithful disciples of Jesus, we need to be aware of the challenges facing this current generation online – and help them to live in a Jesus-like way for a digital age.

Do we challenge the echo chamber or do we uphold it? We make great statements about our online witness, but a lot of what we do simply attracts other Christians. We visit each other’s blogs, talk about faith with one another and yes, even tailor our content for one another. We don’t notice the levels of jargon we’re using, because we’re so accustomed to it.

It’s good to talk to other Christians, to build each other up and resource one another. However, if all we do online is gather together with other Christians, we reinforce our own bubble. The algorithms perceive what we like and we see more of what fits our viewpoint. If we don’t make the effort to seek out other opinions we are encased in an echo chamber, strengthened by the filter effect, but largely of our own making.

Instead of having gracious conversation, Christians denounce each other in comment sections – it can be hard to find the love behind the vitriol. And it’s all so public – so different from the conflict resolution model Jesus suggests in Matthew 18:15-17.

Sometimes what we call news is actually just gossip. We’re so keen to retweet and share, we don’t stop to consider the context; we just get caught up in the wave of opinion that flares in every quarter. It can have tragic consequences – shattering a reputation, altering the course of a person’s life. Gossip was a reality long before social media came into being, but it has now intensified. The medium makes it easier to share and embarrassing to recant, because the original statement continues rebounding around the chamber long after it has been disproved. The amended statement just doesn’t get the same number of shares; the lie has more staying power than the truth. As Christians, we need to take responsibility for what kind of noise we’re making, and what it is we’re passing on to others.

We do need boundaries in place; not boundaries set by algorithms but by our own prayerful choosing. We need wisdom in our online interactions, to not swallow everything we read but to test it for its truthfulness, kindness and godliness.

For a Christian in this digital age, discernment is essential. The spirit of God is not limited by media. God sees us online as clearly as God sees us offline, and we are called to follow Jesus in all aspects of our lives.

We need to step back, to note where the tool is shaping us, but also where we are shaping the tool. Dare we open the conversation, take our light out from under the bushel – or bubble – and see where God leads us?

“You are the light of the world. A city built on a hill cannot be hidden. No one after lighting a lamp puts it under the bushel basket, but on the lampstand, and it gives light to all in the house. In the same way, let your light shine before others, so that they may see your good works and give glory to your Father in heaven.” (Matthew 5:14-16, NRSV)

Ways to burst the bubble

Recognise both the strengths and the weaknesses of the medium.

Be willing to listen to and seek out other views, perhaps by following those with different backgrounds or those who disagree with you. Be prepared to read some of the things they share!

Find out if the platform you are using has a real-time option and choose that if you can. (Be warned that some – for example, Facebook’s ‘most recent’ feed – may default back to ‘top stories’ without you noticing.)

If you’ve not seen anything from specific people for a while, visit their personal profiles directly and don’t rely on the main newsfeed to give you information about them.

If someone makes a statement you strongly disagree with, ask them (respectfully) why they came to that conclusion. Don’t just shout them down.

Check your facts before sharing. Is the source reliable?

React calmly even if others do not. Behaviour is echoed and reinforced as much as words and opinion.

Are you guilty of using too much ‘Christianese’? How can you get rid of the jargon and convey the truth of the gospel to those outside your bubble?

Don’t be fooled by unhelpful words such as ‘virtual’, as if online behaviour doesn’t matter. Online is real. We really are saying what we are saying.
