Filter bubble

Intellectual isolation involving search engines / From Wikipedia, the free encyclopedia

A filter bubble or ideological frame is a state of intellectual isolation[1] that can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior, and search history.[2] As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles.[3] The choices made by these algorithms are only sometimes transparent.[4] Prime examples include Google Personalized Search results and Facebook's personalized news-stream.
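The selective guessing described above can be illustrated with a small sketch. This is a hypothetical toy example, not the algorithm of any real platform: items are ranked by how much their topics overlap with the user's past click-behavior, so content matching prior interests crowds out everything else.

```python
from collections import Counter

def personalized_rank(items, click_history):
    """Toy personalization sketch (illustrative only): score each item by
    how often the user has previously clicked its topics, then rank by score."""
    seen = Counter(click_history)  # tally of topics from past click-behavior
    def score(item):
        return sum(seen[topic] for topic in item["topics"])
    # Higher overlap with past clicks floats to the top of the feed.
    return sorted(items, key=score, reverse=True)

items = [
    {"title": "Local sports recap", "topics": ["sports"]},
    {"title": "Opposing-view op-ed", "topics": ["politics", "debate"]},
    {"title": "Partisan commentary", "topics": ["politics"]},
]
history = ["politics", "politics", "sports"]  # hypothetical click history

ranking = personalized_rank(items, history)
# Politics-heavy items dominate; the sports item sinks to the bottom.
```

Even this crude scheme shows the bubble mechanism: the more a user clicks one kind of content, the higher that content ranks, and the less likely they are to see anything that disagrees with their viewpoint.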

The term filter bubble was coined by internet activist Eli Pariser circa 2010 and discussed in his 2011 book of the same name. According to Pariser, the bubble effect may have negative implications for civic discourse, but contrasting views regard the effect as minimal[5] and addressable.[6] The results of the 2016 U.S. presidential election have been associated with the influence of social media platforms such as Twitter and Facebook,[7][8] which in turn called into question the effects of the "filter bubble" phenomenon on user exposure to fake news and echo chambers,[9] spurring new interest in the term,[10] with many concerned that the phenomenon may harm democracy and well-being by worsening the effects of misinformation.[11][12][10][13][14][15]

Technology such as social media “lets you go off with like-minded people, so you're not mixing and sharing and understanding other points of view ... It's super important. It's turned out to be more of a problem than I, or many others, would have expected.”

Bill Gates in 2017 [16]