Image Credit: Wikipedia
When the Internet was first created, it was expected to connect people and expose them to a wider range of worldviews and perspectives. However, the growing personalization of content has quietly shifted the way the Internet functions. This rise in personalization has created the filter bubble, a term coined by internet activist Eli Pariser to describe the informational isolation caused by website algorithms that personalize content.
Eli Pariser discusses filter bubbles in his TED Talk, “Beware Online ‘Filter Bubbles.’” He explains how large news sites including The New York Times, the Huffington Post, and The Washington Post all incorporate personalization, filtering content to prioritize what their algorithms predict consumers want to read. Personalization is also prevalent across social media platforms. According to Mike Kaput, a director at the Marketing AI Institute, Facebook uses artificial intelligence algorithms to filter the content that shows up on users’ News Feeds. Kaput quotes Facebook on how “‘machine learning models are part of ranking and personalizing News Feed stories…[and] ranking search results.’” Facebook is not the only social media platform implementing artificial intelligence to bolster personalization. According to an article from Bernard Marr & Co., an independent think tank, Twitter uses artificial intelligence to “determine what tweet recommendations to suggest on users’ timelines.”
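To make the mechanism concrete, here is a toy sketch of engagement-based feed ranking. This is purely illustrative, not Facebook's or Twitter's actual model (those are proprietary); the post and click data are invented. It shows how scoring posts by a user's past clicks makes already-favored topics keep rising to the top, which is the feedback loop behind the filter bubble.

```python
def rank_feed(posts, user_clicks):
    """Order posts by overlap with the topics a user has clicked before."""
    # Count how often the user engaged with each topic.
    topic_weight = {}
    for topic in user_clicks:
        topic_weight[topic] = topic_weight.get(topic, 0) + 1

    # A post's score is the total weight of its topics; posts on topics
    # the user never clicks score zero and sink to the bottom.
    def score(post):
        return sum(topic_weight.get(t, 0) for t in post["topics"])

    return sorted(posts, key=score, reverse=True)


posts = [
    {"id": 1, "topics": ["politics-left"]},
    {"id": 2, "topics": ["sports"]},
    {"id": 3, "topics": ["politics-right"]},
]
# A user who mostly clicks left-leaning politics sees it ranked first,
# while the opposing perspective lands last.
ranked = rank_feed(posts, user_clicks=["politics-left", "politics-left", "sports"])
print([p["id"] for p in ranked])  # → [1, 2, 3]
```

Because the ranking is trained only on past engagement, the posts a user never sees can never earn clicks, so the loop reinforces itself with every session.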
Pariser explains why this rise in personalization is a problem. All of the information filters across content providers, from news sources to social media platforms, create a personalized bubble of information for each person—a filter bubble. These filter bubbles confine people to pre-existing beliefs, removing or deprioritizing news and posts that oppose them. A functioning democracy depends on a healthy flow of information, but that flow breaks down when algorithms prevent consumers from seeing content that makes them uncomfortable or challenges their existing beliefs. As Pariser puts it, with algorithmic filters in place, “instead of a balanced diet, [consumers] can end up with information junk food.” This lack of a balanced information diet has amplified biases and increased polarization, leaving less tolerance for opposing views.
Interestingly, the filter bubble created by artificial intelligence can also potentially be combated by artificial intelligence. An article by Ben Dickson, a software engineer and the founder of Tech Talks, introduces Nobias, a browser extension that tracks the political standing of the news articles people consume. This raises awareness of the political slant of what is being consumed and can help people work toward a balanced diet of online content. To analyze bias in news articles, Nobias employs machine-learning algorithms, trained on politicians’ speeches, that analyze the keywords and text of articles for political slant.
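The core idea of keyword-based slant scoring can be sketched in a few lines. This is only a toy stand-in for Nobias's approach: the real system uses trained classifiers, whereas the keyword lists below are invented placeholders chosen purely for illustration.

```python
# Invented placeholder vocabularies; a real classifier would learn
# these associations from labeled text such as politicians' speeches.
LEFT_TERMS = {"progressive", "regulation", "climate"}
RIGHT_TERMS = {"deregulation", "tariffs", "tradition"}


def slant_score(text):
    """Return a score in [-1, 1]: negative leans left, positive leans right."""
    words = text.lower().split()
    left = sum(w in LEFT_TERMS for w in words)
    right = sum(w in RIGHT_TERMS for w in words)
    total = left + right
    if total == 0:
        return 0.0  # no political keywords found; treat as neutral
    return (right - left) / total


print(slant_score("new regulation targets climate goals"))  # → -1.0
print(slant_score("weather report for the weekend"))        # → 0.0
```

Surfacing a score like this next to each article is what lets a reader notice, over time, whether their reading skews to one side.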
Overall, the presence of artificial intelligence in personalization of media has caused an imbalance in information consumption, resulting in increased polarization. However, the same technology that has created this filter bubble can be applied to increase awareness of the political standing of the information being consumed, allowing consumers to regain control of their information diet.
Eli Pariser’s TED Talk, “Beware Online ‘Filter Bubbles’”: https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/transcript?utm_source=tedcomshare&utm_medium=referral&utm_campaign=tedspread
Bernard Marr & Co.: https://www.bernardmarr.com/default.asp?contentID=1373