Flickr photo by antoonsfoobar
I recently saw an interesting TED talk by Eli Pariser on the next wave of cyberbalkanization. [Read his fascinating new book “The Filter Bubble” here.]
Background: Marshall Van Alstyne predicted some 15 years ago that users would self-segregate on the net, choosing exposure to ever narrower communities of interest.
We’re now onto “The Daily Me” 2.0. Some news sites originally let users click on their interests so that a user could limit his or her news to, say, sports and entertainment. Cass Sunstein and Nicholas Negroponte predicted that this would lead to stronger news blinders and expose us to less and less common information, a phenomenon they called “The Daily Me.”
Well, it turns out that users actually choose to subject themselves to more diversity of opinions and networks on the net than people predicted.
But the latest onslaught, what Eli Pariser calls “The Filter Bubble,” is more insidious. More and more sites (Facebook, Google Search, Yahoo News, Huffington Post, the Washington Post) now automatically tailor your search results, Facebook feed, and news stream based on your past clicks, where you are sitting, what type of computer you use, what web browser you use, and so on.
Unlike in the past, this cyberbalkanization is not “opt in” but automatic. And since it happens behind the scenes, you can’t know what you’re not seeing. A Google search for Tunisia might not even surface the political uprising if you haven’t expressed interest in politics in the past. Eric Schmidt of Google said, “It will be very hard for people to watch or consume something that has not in some sense been tailored for them.”
Pariser notes that we all wage internal battles between our aspirational selves (who want greater diversity) and our current selves (who often want something easy to consume). In our lives and our Netflix queues we continually play out these battles, with our aspirational selves sometimes winning out. These filter bubbles edit out our aspirational selves just when we need a mix of vegetables and dessert. Pariser believes the algorithmic gatekeepers need to show us not only junk food but also things that are challenging, important, and uncomfortable, and to present competing points of view. We need an Internet ethics, in the way that journalistic ethics were introduced in 1915, with transparency, a sense of civic responsibility, and room for user control.
It’s an interesting talk, and I clearly agree with Pariser that gatekeepers should be more transparent and allow user input to tweak our ratio of dessert to vegetables, to use his analogy. But in forecasting the degree of our Filter Bubble, I think Pariser misses the fact that there are other ways of finding out about news articles. Take Twitter retweets. Even if my own circle is not that diverse (and many of us will not choose to “follow” people we disagree with), as long as one of the people I’m following has diverse views in his or her circle of followers and retweets their interesting posts, I get exposed to them. Ditto for friends’ e-mail alerts about interesting articles, or for social searches using Google. We live in a far more social world, where information leads come from many sources beyond Google searches or Yahoo News. So let’s work on the automatic filters, but the sky is not falling just yet.
See “The Filter Bubble.” (Feb. 2011 TED talk)
A comment here on Search Engine Journal suggests that social bookmarking sites like reddit, delicious, and StumbleUpon may replace Google as the search engines of the future.
The author hints at the advantage of these social bookmarking sites, namely that they incorporate human intelligence, but ignores the fact that Google is already powered by links that incorporate human intelligence as well. The fact that Google ranks sites by (among other things) the number of external websites linking to them is already a social form of bookmarking or search. The sites that other people find powerful, influential, or authentic get linked to and hence are listed higher in Google’s rankings.
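The link-counting idea described above is the intuition behind PageRank. Here is a minimal sketch of how link structure can be turned into a ranking; the tiny link graph, damping factor, and iteration count are illustrative assumptions, not Google’s actual implementation:

```python
# Minimal PageRank-style sketch: pages that many other pages link to
# end up with higher scores. Graph and parameters are illustrative only.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        # every page keeps a small base rank (the "random surfer" jump)
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # a page splits its current rank evenly among its outlinks
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Page "a" is linked to by both "b" and "c", so it ranks highest.
graph = {"a": ["b"], "b": ["a"], "c": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # -> a
```

The point of the sketch is the social one made above: each link is a human judgment, and the ranking simply aggregates those judgments.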
In his very interesting read SuperCrunchers, Ian Ayres discusses Google’s beta search efforts as a way of using personalized information about searchers.
“Tera mining of customer records, airline prices, and inventories is peanuts compared to Google’s goal of organizing all the world’s information. … Google has developed a Personalized Search feature that uses your past search history to further refine what you really have in mind. If Bill Gates and Martha Stewart both Google ‘blackberry,’ Gates is more likely to see web pages about the email device at the top of his results list, while Stewart is more likely to see web pages about the fruit. Google is pushing this personalized data mining into almost every one of its features. Its new web accelerator dramatically speeds up access to the Internet–not by some breakthrough in hardware or software technology–but by predicting what you are going to want to read next. Google’s web accelerator is continually pre-picking web pages from the net. So while you’re reading the first page of an article, it’s already downloading pages two and three. And even before you fire up your browser tomorrow morning, simple data mining helps Google predict what sites you’re going to want to look at (hint: it’s probably the same sites you look at most days). “
I’ve long been interested in how websites can use network knowledge (the wisdom captured within their user base). Slashdot.org found a way to do this by distributing the ability to praise or ding members’ posts (without giving anyone veto power); Wikipedia does this by distributing editorial input; Craigslist does this by giving users the power to flag postings as spam. And I’ve separately written about “viral popularity” as a way of using social networks to spread interest in media of various sorts.