Recommendation algorithms on social media, especially Twitter, have been quietly reshaping the kind of information we receive online, creating divisive filter bubbles.
The term ‘filter bubble’ is attributed to Eli Pariser, who has been speaking about the issue since 2011. These algorithms draw on a user’s location, search history, click behavior and personal information; with that data, internet companies suggest content that matches the user’s apparent interests.
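To see how even a simple interest-matching rule can hide opposing viewpoints, consider the toy sketch below. It is purely illustrative and not any platform’s actual code: a hypothetical feed ranker scores each post by its overlap with a user’s recorded interests and surfaces only the posts that overlap at all.

```python
# Illustrative sketch only: a toy "feed ranker" that scores posts by how much
# they overlap with a user's recorded interests. Real platform algorithms are
# far more complex; this just shows the filtering principle.

user_interests = {"liberal politics", "climate policy", "urban planning"}

posts = [
    {"topics": {"liberal politics", "elections"}, "text": "Post A"},
    {"topics": {"conservative politics", "elections"}, "text": "Post B"},
    {"topics": {"libertarian economics"}, "text": "Post C"},
    {"topics": {"climate policy"}, "text": "Post D"},
]

def similarity(post_topics, interests):
    # Jaccard-style overlap: shared topics divided by all distinct topics.
    return len(post_topics & interests) / len(post_topics | interests)

# Rank posts by how closely they match the user's interests and drop
# anything with zero overlap -- those posts simply never reach the feed.
ranked = sorted(posts, key=lambda p: similarity(p["topics"], user_interests), reverse=True)
feed = [p["text"] for p in ranked if similarity(p["topics"], user_interests) > 0]

print(feed)  # ['Post D', 'Post A'] -- the dissenting Posts B and C never appear
```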
Personalization seems harmless, even helpful, to the user, but in contexts like education and politics it can do real damage.
Filter bubbles isolate users intellectually on matters of international and political importance. Say you follow liberal-leaning voices on social media: the algorithm will not suggest libertarian, conservative or other political perspectives, because they do not align with your demonstrated interests.
This narrows the diversity of thought a user encounters and walls them off from other ways of thinking. In a country that values people from all backgrounds, social media algorithms can unintentionally deepen divisions among the population.
The intellectual side of social media “lets you go off with like-minded people, so you’re not mixing and sharing and understanding other points of view,” Bill Gates said in an interview with Quartz. “It’s super important. It’s turned out to be more of a problem than I, or many others, would have expected.”
Social media has proven to be a popular venue for politics because it makes communication so convenient. Twitter accounts such as Barack Obama’s and Donald Trump’s each have well over 50 million followers, making them massive channels for people who want to follow and engage with politicians.
But social media comes with more complications than simply engaging with politicians.
“Simply exposing people to information they disagree with is often ineffective for meaningful engagement,” Stephanie Tam wrote in Slate. “We generally dislike information that contradicts our worldviews, and we’re remarkably good at interpreting it through self-serving biases.”
Polarization is the broader problem that social media companies have been trying to tackle. So far, though, their attempts have been undercut by people who simply do not wish to interact with “the other side” and its differing viewpoints.
“A world constructed from the familiar is a world in which there’s nothing to learn … [since there is] invisible auto-propaganda, indoctrinating us with our own ideas,” Pariser said in his book, “The Filter Bubble: What the Internet Is Hiding from You.”
We need to change not only how we receive information but also how we engage with information we disagree with. Algorithms threaten the diversity of the information we receive, which in turn skews our opinions and our perception of the world. Ultimately, though, education and a change in our own behavior are what will give any solution real impact.