Filter bubble


A filter bubble is the result of a personalized search, in which a website algorithm selectively guesses what information a user would like to see based on information about the user (such as location, past click behavior, and search history)[1][2][3]; as a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles.[4] The choices the algorithms make are not transparent. Prime examples are Google Personalized Search results and Facebook's personalized news stream. The term was coined by internet activist Eli Pariser in his book of the same name; according to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble. Pariser related an example in which one user searched Google for "BP" and got investment news about British Petroleum while another searcher got information about the Deepwater Horizon oil spill, and noted that the two search results pages were "strikingly different".[5][6][7][8] The bubble effect may have negative implications for civic discourse, according to Pariser, but there are contrasting views suggesting the effect is minimal[8] and addressable.[9]
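
To illustrate the mechanism described above, the following is a minimal sketch of how such selective guessing might work: results for the same query are re-ranked by how well they match a per-user interest profile. All data, field names, and the scoring rule are hypothetical illustrations, not Google's or Facebook's actual method.

```python
# A hypothetical re-ranking step: items whose topic tags overlap most
# with the user's inferred interests are pushed to the top.

def personalized_rank(results, user_profile):
    """Re-rank results so items matching the user's inferred interests rank higher."""
    def score(result):
        # Count shared keywords between the result's tags and the user's
        # interests; more overlap means a higher position on the page.
        return len(set(result["tags"]) & set(user_profile["interests"]))
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "BP investment outlook", "tags": ["finance", "stocks"]},
    {"title": "Deepwater Horizon oil spill coverage", "tags": ["environment", "news"]},
]

investor = {"interests": ["finance", "stocks"]}
activist = {"interests": ["environment"]}

# The same query produces differently ordered pages for the two users,
# the "strikingly different" effect Pariser describes.
print([r["title"] for r in personalized_rank(results, investor)])
print([r["title"] for r in personalized_rank(results, activist)])
```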

Concept

Pariser defined his concept of a filter bubble in more formal terms as "that personal ecosystem of information that's been catered by these algorithms".[5] Other terms have been used to describe the phenomenon, including "ideological frames"[6] and a "figurative sphere surrounding you as you search the Internet".[10] A user's search history is built up over time as the user indicates interest in topics by "clicking links, viewing friends, putting movies in your queue, reading news stories" and so forth.[10] An Internet firm then uses this information to target advertising to the user or to make certain kinds of information appear more prominently on a search results page.[10] Pariser's concern is somewhat similar to one raised by Tim Berners-Lee in a 2010 report in The Guardian, describing a Hotel California effect that occurs when Internet social networking sites wall off content from competing sites, as a way of grabbing a greater share of all Internet users, such that the "more you enter, the more you become locked in" to the information within a specific site. The site becomes a "closed silo of content" that risks fragmenting the World Wide Web, according to Berners-Lee.[11]
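
A minimal sketch of how such a profile might accumulate over time, assuming (purely for illustration) that the profile is nothing more than a running count of topics the user has clicked on; real systems are far more elaborate:

```python
from collections import Counter

class InterestProfile:
    """A toy interest profile: a running tally of clicked topics."""

    def __init__(self):
        self.topic_counts = Counter()

    def record_click(self, topics):
        # Each click nudges the profile toward the clicked item's topics.
        self.topic_counts.update(topics)

    def affinity(self, topics):
        # How strongly an item (an ad or a search result) matches the
        # accumulated history; higher scores would be shown more prominently.
        return sum(self.topic_counts[t] for t in topics)

profile = InterestProfile()
profile.record_click(["finance", "stocks"])
profile.record_click(["finance", "energy"])

print(profile.affinity(["finance"]))      # 2
print(profile.affinity(["environment"]))  # 0
```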

In The Filter Bubble, Pariser warns that a potential downside of filtered searching is that it "closes us off to new ideas, subjects, and important information"[12] and "creates the impression that our narrow self-interest is all that exists".[6] It is potentially harmful to both individuals and society, in his view. He criticized Google and Facebook for offering users "too much candy, and not enough carrots"[13] and warned that "invisible algorithmic editing of the web" may limit our exposure to new information and narrow our outlook.[13] According to Pariser, the detrimental effects of filter bubbles reach society at large, in that they risk "undermining civic discourse" and making people more vulnerable to "propaganda and manipulation".[6] He wrote:

A world constructed from the familiar is a world in which there’s nothing to learn ... (since there is) invisible autopropaganda, indoctrinating us with our own ideas.
Eli Pariser in The Economist, 2011[14]

A filter bubble has been described as exacerbating a phenomenon called splinternet or cyberbalkanization,[15] which occurs when the Internet becomes divided into sub-groups of like-minded people who are insulated within their own online communities and fail to get exposure to different views; the term cyberbalkanization was coined in 1996.[16][17][18]

Reactions

There are conflicting reports about the extent to which personalized filtering happens and whether such activity is beneficial or harmful. Analyst Jacob Weisberg, writing in Slate, did a small non-scientific experiment to test Pariser's theory: five associates with different ideological backgrounds conducted the same series of searches, and the results of all five searchers were nearly identical across four different queries, suggesting that a filter bubble was not in effect. This led him to write that a situation in which all people are "feeding at the trough of a Daily Me" was overblown.[6] A scientific study from Wharton that analyzed personalized recommendations also found that these filters can actually create commonality, not fragmentation, in online music taste.[19] Consumers apparently use the filter to expand their taste, not limit it.[19] Book reviewer Paul Boutin did a similar experiment among people with differing search histories and found results similar to Weisberg's, with nearly identical search results.[8] Harvard law professor Jonathan Zittrain disputed the extent to which personalization filters distort Google search results, saying "the effects of search personalization have been light".[6] Further, there are reports that users can shut off personalization features on Google if they choose,[20] for example by deleting their Web history and by other methods.[8] A spokesperson for Google said that algorithms were added to the Google search engine to deliberately "limit personalization and promote variety".[6]
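
The kind of informal test Weisberg and Boutin ran can be approximated in code: have several users issue the same query and measure how much their result lists overlap. The sketch below uses the Jaccard index as the similarity measure; the measure and the data are illustrative choices, not details from their experiments.

```python
def jaccard(a, b):
    """Overlap between two result sets: 1.0 means identical, 0.0 disjoint."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical result lists returned to two users for the same query.
results_by_user = {
    "user_1": ["bp.com", "en.wikipedia.org/wiki/BP", "reuters.com/bp"],
    "user_2": ["bp.com", "en.wikipedia.org/wiki/BP", "reuters.com/bp"],
}

# Scores near 1.0, like the near-identical lists Weisberg and Boutin
# reported, suggest little or no personalization is in effect.
print(jaccard(results_by_user["user_1"], results_by_user["user_2"]))  # 1.0
```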

Nevertheless, there are reports that Google and other sites hold vast amounts of information that would enable them to further personalize a user's Internet experience if they chose to do so. One account suggested that Google can keep track of users' past histories even for users who don't have a personal Google account or are not logged into one.[8] One report stated that Google has collected "10 years' worth" of information amassed from varying sources, such as Gmail, Google Maps, and other services besides its search engine,[7] although a contrary report held that trying to personalize the Internet for each user was technically challenging for an Internet firm to achieve despite the huge amounts of available web data. Analyst Doug Gross of CNN suggested that filtered searching seemed to be more helpful for consumers than for citizens: it would help a consumer searching for "pizza" find local delivery options based on a personalized search, appropriately filtering out distant pizza stores.[7] There is agreement that websites such as The Washington Post, The New York Times, and others are pushing efforts toward creating personalized information engines, with the aim of tailoring search results to those that users are likely to like or agree with.[6]
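
Gross's pizza example amounts to a simple distance filter on search results. A minimal sketch follows, with made-up store coordinates and an assumed 10 km delivery radius:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby(results, user_lat, user_lon, max_km=10):
    # Keep only stores within plausible delivery range of the user.
    return [r for r in results
            if haversine_km(user_lat, user_lon, r["lat"], r["lon"]) <= max_km]

stores = [
    {"name": "Local Slice", "lat": 40.71, "lon": -74.00},
    {"name": "Distant Pie", "lat": 34.05, "lon": -118.24},
]

# A user in New York sees only the nearby store; the distant one is
# filtered out, which is the consumer benefit Gross describes.
print([s["name"] for s in nearby(stores, 40.73, -73.99)])  # ['Local Slice']
```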

References

  1. Bozdag, Engin (23 June 2013). "Bias in algorithmic filtering and personalization". Ethics and Information Technology. 15 (3): 209–227.
  2. Web bug (slang)
  3. Website visitor tracking
  4. "Are Filter-bubbles Shrinking Our Minds?". The Huffington Post.
  5. Parramore, Lynn (October 10, 2010). "The Filter Bubble". The Atlantic. Retrieved April 20, 2011. Since Dec. 4, 2009, Google has been personalized for everyone. So when I had two friends this spring Google "BP," one of them got a set of links that was about investment opportunities in BP. The other one got information about the oil spill....
  6. Weisberg, Jacob (June 10, 2011). "Bubble Trouble: Is Web personalization turning us into solipsistic twits?". Slate. Retrieved August 15, 2011.
  7. Gross, Doug (May 19, 2011). "What the Internet is hiding from you". CNN. Retrieved August 15, 2011. I had friends Google BP when the oil spill was happening. These are two women who were quite similar in a lot of ways. One got a lot of results about the environmental consequences of what was happening and the spill. The other one just got investment information and nothing about the spill at all.
  8. Boutin, Paul (May 20, 2011). "Your Results May Vary: Will the information superhighway turn into a cul-de-sac because of automated filters?". The Wall Street Journal. Retrieved August 15, 2011. By tracking individual Web browsers with cookies, Google has been able to personalize results even for users who don't create a personal Google account or are not logged into one. ...
  9. Zhang, Yuan Cao; Séaghdha, Diarmuid Ó; Quercia, Daniele; Jambor, Tamas (February 2012). "Auralist: Introducing Serendipity into Music Recommendation" (PDF). ACM WSDM.
  10. Lazar, Shira (June 1, 2011). "Algorithms and the Filter Bubble Ruining Your Online Experience?". Huffington Post. Retrieved August 15, 2011. a filter bubble is the figurative sphere surrounding you as you search the Internet.
  11. Bosker, Bianca (November 22, 2010). "Tim Berners-Lee: Facebook Threatens Web, Beware". The Guardian. Retrieved August 22, 2012. Social networking sites are threatening the Web's core principles ... Berners-Lee argued. "Each site is a silo, walled off from the others," he explained. "The more you enter, the more you become locked in....
  12. "First Monday: What's on tap this month on TV and in movies and books: The Filter Bubble by Eli Pariser". USA Today. 2011. Retrieved April 20, 2011. Pariser explains that feeding us only what is familiar and comfortable to us closes us off to new ideas, subjects and important information.
  13. Bosker, Bianca (March 7, 2011). "Facebook, Google Giving Us Information Junk Food, Eli Pariser Warns". Huffington Post. Retrieved April 20, 2011. When it comes to content, Google and Facebook are offering us too much candy, and not enough carrots.
  14. "Invisible sieve: Hidden, specially for you". The Economist. 30 June 2011. Retrieved June 27, 2011. Mr Pariser’s book provides a survey of the internet’s evolution towards personalisation, examines how presenting information alters the way in which it is perceived and concludes with prescriptions for bursting the filter bubble that surrounds each user.
  15. Note: the term cyber-balkanization (sometimes with a hyphen) is a hybrid of cyber, relating to the Internet, and Balkanization, referring to that region of Europe that was historically subdivided by languages, religions and cultures; the term was coined in a paper by MIT researchers Van Alstyne and Brynjolfsson.
  16. "Cyberbalkanization" (PDF).
  17. Van Alstyne, Marshall; Brynjolfsson, Erik (November 1996). "Could the Internet Balkanize Science?". Science. 274 (5292). doi:10.1126/science.274.5292.1479.
  18. Pham, Alex; Healey, Jon (September 24, 2005). "Systems hope to tell you what you'd like: 'Preference engines' guide users through the flood of content". Chicago Tribune (Tribune Newspapers: Los Angeles Times). Retrieved December 4, 2015. ...if recommenders were perfect, I can have the option of talking to only people who are just like me....Cyber-balkanization, as Brynjolfsson coined the scenario, is not an inevitable effect of recommendation tools....
  19. Hosanagar, Kartik; Fleder, Daniel; Lee, Dokyun; Buja, Andreas (December 2013). "Will the Global Village Fracture into Tribes: Recommender Systems and their Effects on Consumers". Management Science, Forthcoming.
  20. Ludwig, Amber. "Google Personalization on Your Search Results Plus How to Turn it Off". NGNG. Retrieved August 15, 2011. Google customizing search results is an automatic feature, but you can shut this feature off.
