A new study conducted by Facebook found that the friends we make and the links we click do more than the site's news-sorting algorithm to shape our feeds around our personal preferences.
Scientists examined the social media habits of 10.1 million Americans with publicly listed political preferences over a period of six months and noticed that they mostly make online friends who share their beliefs and mostly read news articles that praise what they love and criticize what they hate, thus creating an echo chamber.
The echo chamber allows content that matches the user's own worldview to reach them, isolating them from opposing opinions. It makes sense from a psychological point of view. Generally, people don’t like to have their beliefs challenged; they like to have them reinforced.
Researchers found that, on average, 23 per cent of the participants had Facebook friends with an opposing political affiliation, and that about 29 per cent of the stories in a user’s news feed went against their own ideology.
Eytan Bakshy, a data scientist at Facebook and lead author of the study, emphasizes that this is the first time scientists have been able to quantify this type of effect. Despite what the team expected when they started the study back in 2012, echo chambers don’t isolate users from conflicting perspectives entirely, but they do sharply reduce exposure to articles the user might not agree with.
The research has shown that news-sorting algorithms still hold some power in deciding which news people see (a 1 per cent change), though not quite as much as the individual choices we all make (a 4 per cent change).
Friends and article clicks are not the only influential factors here. It turns out that other social media activities such as liking a post or commenting on an update also help create the echo chamber.
Eli Pariser, chief executive of Upworthy (a viral content website), dubbed the resulting effect the “filter bubble”. He strongly believes that we mostly have our own avoidance of opposing views to blame for the lack of diversity in our news feeds.
The issue is more serious than it may sound, with several Facebook users outright admitting that they unfollow people who post content they disagree with. Researchers had a chance to observe the phenomenon and saw that it escalated even further during the run-up to next year’s presidential election, presumably because the opposing political parties’ public clashes incited people to join in on the debate.
Older studies lend support to Facebook’s newly published findings on the echo chamber. Last year’s Pew Research Center report showed that people gravitate towards media outlets that share their political views. And another study published by the National Bureau of Economic Research last year concluded that during the 2012 election, Twitter usually exposed people only to opinions matching their own.
On the other hand, Christian Sandvig of the Social Media Collective is quick to point out that the Facebook users with public political views who volunteered for the study might behave very differently from the average Facebook user, making the results inconclusive. He went on to question Facebook’s choice of wording for the study, saying that the company is basically stating “It’s not our fault! You do it too!”, making it an excuse rather than an explanation.