I enjoyed Jon Udell’s thoughts on Filter Failure.
The problem isn’t information overload, Clay Shirky famously said, it’s filter failure. Lately, though, I’m more worried about filter success. Increasingly my filters are being defined for me by systems that watch my behavior and suggest More Like This. More things to read, people to follow, songs to hear. These filters do a great job of hiding things that are dissimilar and surprising. But that’s the very definition of information! Formally it’s the one thing that’s not like the others, the one that surprises you.
A filter bubble is a state in which a website algorithm selectively guesses what information a user would like to see based on information about the user (such as location, past click behaviour and search history), and, as a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Prime examples are Google’s personalised search results and Facebook’s personalised news stream. The term was coined by internet activist Eli Pariser in his book of the same name; according to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble.
Whether the filter bubble is real or not is still heavily debated, but the idea of filters which deliberately get things wrong to add a level of serendipity sounds good. I do wonder, though, whether people will be happy with a level of fuzziness in the algorithms they have become dependent on?
I’m always on the lookout for ways to defeat the filters and see things through lenses other than my own. On Facebook, for example, I stay connected to people with whom I profoundly disagree. As a tourist of other people’s echo chambers I gain perspective on my native echo chamber. Facebook doesn’t discourage this tourism, but it doesn’t actively encourage it either.
By defeating the filters in this way, Jon Udell retains some kind of control. It’s a nice way to strike a balance, but as someone who only follows 200-ish people on Twitter and doesn’t look at Facebook much, I actively like to remove the noise from my bubble.
As I think back on the evolution of social media I recall a few moments when my filters did “fail” in ways that delivered the kinds of surprises I value. Napster was the first. When you found a tune on Napster you could also explore the library of the person who shared that tune. That person had no idea who I was or what I’d like. By way of a tune we randomly shared in common I found many delightful surprises. I don’t have that experience on Pandora today.
Likewise the early blogosphere. I built my echo chamber there by following people whose lenses on the world complemented mine. For us the common thread was Net tech. But anything could and did appear in the feeds we shared directly with one another. Again there were many delightful surprises.
Oh yes, I remember spending hours in Easy Everything internet cafes after work or before going out, checking out users’ libraries, not really recognising the names and listening to see if I liked them. Jon may not admit it, but I’ve found the darknet provides some very interesting parallels with this. Looking through what else someone shares can be a real delight when you strike upon something unheard of.
And likewise the blogosphere can lead you down some interesting paths. Take my blog, for example: some people read it because of my interest in technology, but the next post may be something to do with dating or life experience.
I do want some filter failure, but really I want to be in control of when it happens… And I think that’s the point Jon is getting at…
I want my filters to fail, and I want dials that control the degrees and kinds of failures.
Where that statement leaves the concept of pure Perceptive Media, who knows…? But it’s certainly something I’ve been considering for a long while.
Reminds me of that old saying… It’s not a bug, it’s a feature…
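As a thought experiment, here’s a minimal sketch of what Jon’s “dials” might look like in code. Everything here is my own invention for illustration — the function name, the `failure_rate` dial, and the crude split of a ranked list into a “safe” pool and a “surprising” pool are assumptions, not how any real recommender (Pandora, Facebook, or otherwise) actually works:

```python
import random

def recommend(candidates, score, n=5, failure_rate=0.0, rng=None):
    """Pick n items from candidates.

    failure_rate is the serendipity dial: with that probability, each
    slot is filled from the low-scoring ("surprising") half of the
    ranked list instead of the high-scoring ("more like this") half.
    0.0 is a pure filter bubble; 1.0 is deliberate filter failure.
    """
    rng = rng or random.Random()
    ranked = sorted(candidates, key=score, reverse=True)
    mid = len(ranked) // 2
    top, tail = ranked[:mid], ranked[mid:]  # safe pool, surprising pool

    picks = []
    for _ in range(n):
        # Turn the dial: sometimes draw from the surprising pool instead.
        pool = tail if (tail and rng.random() < failure_rate) else top
        choice = rng.choice(pool)
        pool.remove(choice)  # avoid recommending the same item twice
        picks.append(choice)
    return picks
```

With the dial at 0.0 you only ever see the top half of the ranking; turn it up and dissimilar items start leaking through — which, as the post argues, is where the actual information lives.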