Snapchat is usually praised by its users for updates that seem to consistently make the app more enjoyable. But according to TechCrunch, Snapchat is getting rid of its auto-advance feature, which automatically moves a user to the next story a friend has posted as soon as the previous one finishes. Auto-advance is being replaced with a story playlist, which lets users select the stories they want to view and then play the selected ones back-to-back.
This story playlist feature makes the app more customizable, since users can now select which of their friends' stories they want to see. Another aspect of Snapchat that varies from person to person is the order of the Discover channels that appear across a user's screen. Snapchat analyzes which channels you click on most often, along with a number of other variables, in order to put the channels you are most likely to click on first.
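The exact signals Snapchat weighs are not public, but the basic mechanism described above, ordering channels by a user's own click history, can be sketched in a few lines of Python. The channel names and click log here are invented for illustration:

```python
from collections import Counter

def rank_channels(click_log, channels):
    """Order channels by how often this user has tapped them,
    most-clicked first. sorted() is stable, so channels the user
    has never clicked keep their original relative order."""
    counts = Counter(click_log)
    return sorted(channels, key=lambda c: -counts[c])

# A user who taps "CNN" most often sees it promoted to the front
log = ["CNN", "ESPN", "CNN", "Vice", "CNN", "ESPN"]
ranked = rank_channels(log, ["Vice", "ESPN", "CNN", "Food"])
# → ['CNN', 'ESPN', 'Vice', 'Food']
```

A real system would blend many more variables (recency, watch time, content type), but even this toy version shows how two users with different histories end up with differently ordered screens.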
Many other social media platforms do something similar, displaying what they predict you'll want to see and thereby creating a filter bubble. Techopedia defines a filter bubble as "the intellectual isolation that can occur when websites make use of algorithms to selectively assume the information a user would want to see, and then give information to the user according to this assumption." A TED Talk by Eli Pariser draws attention to Facebook and Google specifically, noting how Facebook analyzes which of your friends' posts you click on and then edits out of your newsfeed the posts its algorithm doesn't think you will click on or care about. He also mentions how Google uses a number of different variables, even when you are signed out of your account, to customize the search results that appear, so that the results you and someone else on a different computer get could be very different.
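Facebook's actual ranking model is proprietary, but the "editing out" Pariser describes amounts to thresholding on a predicted engagement score. Here is a minimal sketch with a made-up predictor, just to show how posts below the cutoff vanish without the user ever knowing:

```python
def filter_feed(posts, predicted_click_prob, threshold=0.3):
    """Keep only posts a model predicts the user will engage with.
    Posts scoring below the threshold are silently dropped -- the
    reader never learns they existed."""
    return [p for p in posts if predicted_click_prob(p) >= threshold]

# Toy predictor: assume the user historically clicks posts from two friends
frequent_friends = {"alice", "bob"}
def click_prob(post):
    return 0.9 if post["author"] in frequent_friends else 0.1

feed = [{"author": "alice", "text": "vacation photos"},
        {"author": "carol", "text": "local election results"},
        {"author": "bob", "text": "new job!"}]
visible = filter_feed(feed, click_prob)
# carol's post -- the unfamiliar voice -- never reaches the screen
```

The key point is structural, not technical: the dropped posts leave no trace in the interface, which is exactly what makes the exclusion invisible.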
If I only use Google for entertainment, like searching for online games and music videos, then when I want to learn more about the current state of the presidential election I might not see all the relevant information I need. If on Facebook I only encounter the posts of friends who share my viewpoints, then the filter bubble can have the same impact as group polarization: the idea that when a group of individuals share an attitude, that attitude is likely to intensify after group discussion. For example, being exposed only to the posts of your liberal friends on Facebook, even if they are all moderates, will strengthen your liberal beliefs if you were already liberal to begin with. Filter bubbles shape your beliefs and viewpoints, which is a problem considering that you are being shown what you are expected to like, which isn't always an accurate representation of the information available.
(Image from Eli Pariser's TED Talk: Beware online "filter bubbles")
Filter bubbles present a serious concern for anyone seeking information online. With Facebook and Google you can't even see the information that is being excluded from your results. This is a problem, especially when you consider that what people want to see is often not what they need to see. Oftentimes I turn to Google and the internet to provide me with points of view that I am less likely to encounter in real life, because I'm surrounded by people from similar backgrounds. I rely on many of the things Facebook's and Google's algorithms predict I don't want to see to challenge my existing worldviews and push me to look at subjects from a different mindset. Being able to see viewpoints you disagree with, and search results you wouldn't intentionally seek out, gives individuals the information they need to learn and grow.
Transparency and openness are the keys to combating the negative impacts of filter bubbles. One of the dangers of filter bubbles is that you don't know what information has been excluded from your results. Simply showing the unfiltered results alongside the personalized ones, or offering an option to turn the prediction algorithm off for a given search, would give people a less skewed set of information to draw conclusions from and learn from. At the very least, users would be more cognizant of the filter bubbles they encounter and would hopefully take them into consideration when analyzing the information provided.
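The "off switch" proposed above is cheap to build in principle: personalization is usually a re-ranking step layered on top of neutral matching, so exposing a flag that skips that step is enough. This is a hypothetical sketch, not any real search engine's API; the index, tags, and interest sets are invented:

```python
def search(query, index, interests=None, personalize=True):
    """Match documents to a query. When personalize is False, the
    re-ranking step is skipped, so every user sees the same list."""
    results = [doc for doc in index if query in doc["text"]]
    if personalize and interests:
        # Stable sort: boost docs that overlap the user's inferred interests
        results.sort(key=lambda d: -len(interests & set(d["tags"])))
    return results

index = [
    {"text": "election coverage", "tags": ["politics"]},
    {"text": "election memes",    "tags": ["entertainment"]},
]
# Personalized: an entertainment-focused user sees memes first
personalized = search("election", index, interests={"entertainment"})
# Toggled off: everyone gets the same unranked order
neutral = search("election", index, interests={"entertainment"},
                 personalize=False)
```

Because the neutral path already exists inside any personalized system, offering it to users is a product decision, not an engineering obstacle, which is what makes the lack of such an option notable.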