I’ve been looking for this book for a long time. Or maybe I should rephrase that: I’d been pondering this topic and looking for a book that covered it for a while, so I was pleased to come across The Filter Bubble.
Digging into search engine optimization and analytics while building my own website, I was often confused by inconsistent Google search results. When I realized I was on a different computer, or logged into Google services, I would log out to see the untainted results, the results everyone else was seeing. Or was I?
When Google+ personalization launched, the topic of search really piqued my interest. Why had I been given different results at different times? The coverage on Gigaom, AllThingsD, TechCrunch and ReadWriteWeb cautioned that this could be a turning point for Google, in a bad way.
It’s true that Google takes signals from many different sources, and with the launch of Google+ it now incorporates additional social signals. As Facebook becomes the default dashboard for more and more internet users, the means of finding content is shifting from search to social, and Google is responding by making its overall service more social as well.
The impact for users of the service, though, could be confusion. Many users I’ve spoken to, working in tech or otherwise, think the results they see on Google are unbiased and the same for every user. Google’s secret sauce has always been an algorithm that returned the best results. Now that social signals are mixed into the PageRank brew, will users continue to value Google’s results?
A cause for concern
Pariser illustrates the difference in Google search today with great examples. After the Gulf oil spill, he asked two friends to search for “BP”. One saw breaking news on the topic; the other got investment information about the company. Filter bubble, indeed.
Behavioral targeting, as it’s termed in the industry, is all about figuring out what you want before you ask. But as sociologist Danah Boyd argued in a Web 2.0 Expo speech in 2009, with all this personalization giving us exactly what we *want*,
“If we’re not careful, we’re going to develop the psychological equivalent of obesity”.
Even Google’s founders, Sergey Brin and Larry Page, apparently thought in the early days that this bias might turn out to be a problem:
[quote_left]“We expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers”.— Sergey Brin & Larry Page, Google[/quote_left]
Firms like Recorded Future promise to “unlock the predictive power of the web”, while the lesser known but formidable Acxiom specializes in marketing and personalization, combing through mountains of data to figure out which coast you live on, how you vote, what you eat and drink, and almost what you’re going to do before you do it.
Pariser touches on everything in this book, from present bias to meaningful threats, the priming effect, and accessibility bias. He warns of getting trapped in what he terms a “you loop”, where you continue to see things framed and personalized by what you’ve viewed and reacted to before, ultimately narrowing your view and limiting your exposure to new information.
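The “you loop” is essentially a feedback loop, and a toy sketch makes the dynamic clear. This is purely illustrative: the stories, topics, and `rank` function below are invented for the example, not drawn from the book or any real system. Once past clicks feed back into ranking, whatever you happened to click first keeps winning.

```python
# Toy illustration of a "you loop": clicks boost a topic's weight,
# so future rankings favor what you already engaged with, and other
# topics gradually sink out of view. Hypothetical data and logic.
from collections import defaultdict

def rank(stories, clicks):
    """Order stories by how often their topic was clicked before."""
    weights = defaultdict(int)
    for topic in clicks:
        weights[topic] += 1
    # sorted() is stable, so on the first pass the original order holds
    return sorted(stories, key=lambda s: weights[s["topic"]], reverse=True)

stories = [
    {"title": "Oil spill update", "topic": "news"},
    {"title": "BP stock dips", "topic": "finance"},
    {"title": "Local election results", "topic": "politics"},
]

clicks = []
for _ in range(3):
    ranked = rank(stories, clicks)
    clicks.append(ranked[0]["topic"])  # the user clicks the top result

# After a few rounds, a single topic dominates the click history.
print(clicks)  # ['news', 'news', 'news']
```

Real personalization systems are vastly more sophisticated, but the self-reinforcing shape is the same: the filter learns from what it already showed you.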
Perhaps the biggest problem with these opaque transformations applied to your data is that they play judge and jury with no appeal, sometimes without you even knowing you were in a courtroom being judged. Programmers write the algorithms and code that perform these transformations, sorting people into groups. When the algorithm puts people in a group that doesn’t match them, engineers call it “overfitting”. In society we might call it a kind of stereotyping.
One chapter, titled “The Adderall Society”, asks whether this filter bubble isn’t part of a larger transformation pushing our thinking toward the conservative and calculative, limiting the creativity and foreign ideas that can break categories, and encouraging us to ignore or steer around serendipity.
The book draws on a great spectrum of thinkers on this topic, from Danny Sullivan of SearchEngineLand, to Amit Singhal, an engineer on Google’s team, to John Battelle of SearchBlog. Pariser speaks to Chris Coyne of okcupid.com, to David Shields on what he calls truthiness, and to former CIA consultant John Rendon, who says “filter bubbles provide new ways of managing perceptions” and suggests the garden-variety thesaurus as a powerful tool for nudging debates with new language.
Be aware but don’t be paranoid
Although I think the book offers valuable insight, I was a little dismayed by the mood of paranoia in its title. A subtitle like What the Internet Is Hiding from You suggests a conspiracy or hidden agenda. Obviously these large corporations have a motive to make money, but I don’t think that surprises anyone. To some, Pariser’s views may appear somewhat left-leaning, but the issues raised in his book transcend political boundaries; they are matters that concern society at large.
In the end, I think I’m probably more optimistic about these things. Taking the long view, society tends to work out such issues through public pressure or simply by buying differently. As Google is quick to remind us, we can easily choose an alternate search engine. Perhaps public pressure will eventually push firms to provide more transparency about these filtering mechanisms, allowing end users to manage their own filter settings.
I’ll leave you with a few ideas to chew on. Can code and algorithms curate properly? Should there be another button alongside the Like button such as “Important”?
Pariser quotes the folks at the New York Times: “We don’t let metrics dictate our assignments and play, because we believe readers come to us for our judgement, not the judgement of the crowd.” Indeed. But in the internet age, is that what readers actually *buy*, or just what they *click*?