Background: Marshall Van Alstyne predicted 15 years earlier that users would self-segregate on the net, choosing to expose themselves to ever narrower communities of interest.
We’re now on to “The Daily Me” 2.0. Some news sites originally let users click on their interests: a user could limit his or her news to, say, sports and entertainment. Cass Sunstein and Nicholas Negroponte predicted that this would lead to stronger news blinders and expose us to less and less common information, a phenomenon they called “The Daily Me”.
Well, it turns out that users actually choose to expose themselves to more diversity of opinions and networks on the net than people predicted.
But the latest onslaught, what Eli Pariser calls “The Filter Bubble”, is more insidious. More and more sites (Facebook, Google Search, Yahoo News, Huffington Post, the Washington Post) now automatically tailor your search results, Facebook feed, and news stream based on your past clicks, where you are sitting, what type of computer you use, what web browser you use, and so on.
Unlike in the past, this is not “opt in” cyberbalkanization but automatic. And since it happens behind the scenes, you can’t know what you’re not seeing. A Google search for Tunisia might not even tell you about the political uprising if you haven’t expressed interest in politics in the past. Eric Schmidt of Google said, “It will be very hard for people to watch or consume something that has not in some sense been tailored for them.”
Pariser notes that we all have internal battles between our aspirational selves (who want greater diversity) and our current selves (who often want something easy to consume). In much of our lives, and in our Netflix queues, we continually play out these battles, with our aspirational selves sometimes winning out. Filter bubbles edit out our aspirational selves just when we need a mix of vegetables and dessert. Pariser believes that the algorithmic gatekeepers need to show us not only junk food but also things that are challenging, important, and uncomfortable, and to present competing points of view. We need Internet ethics in the way that journalistic ethics were introduced in 1915, with transparency, a sense of civic responsibility, and room for user control.
It’s an interesting talk, and I clearly agree with Pariser that gatekeepers should be more transparent and allow user input to tweak our ratio of dessert to vegetables, to use his analogy. But I think Pariser, in forecasting the degree of our Filter Bubble, overlooks the fact that there are other ways of finding news articles. Take Twitter retweets. Even if my friends are not that diverse (and many of us do choose to “follow” people we don’t agree with), as long as one of the people I’m following has diverse views in his or her own circle and retweets their interesting posts, I get exposed to them. Ditto with e-mail alerts from friends about interesting articles, or social searches using Google. We live in far more of a social world, where information leads come from many sources other than Google searches or Yahoo News. So let’s work on the automatic filters, but the sky is not falling just yet.
See “The Filter Bubble.” (Feb. 2011 TED talk)