We Are The Filter Bubble

In a world where computer algorithms play an increasingly important role in determining what information we consume, there is another force pulling us in the other direction. I’m talking about the “content curation” that people do to filter information so that it’s more useful for themselves and for others.

We curate content every time we bookmark a page on the web or share a link to it with our friends on social networks.

We share a lot of stuff on Facebook, of course: not just web links, but images, messages, events, and more. One of the reasons Twitter took off, in spite of Facebook, is that it was particularly good at helping us share information. Twitter actually thinks of itself as a “real-time information network” – not a social network like Facebook. The engine that drives that information network is a dedicated core of “information networkers” – people who use the service to curate content for other people.

I believe Google+ will eventually eclipse Twitter as the web’s main content curation engine because it’s a better service for sharing interests with other people. I personally spend most of my social media time on Google+, and in the process I’ve learned a few things about the core dynamics of social networks. This post focuses on one of the most important: curation.

Curating the Curators

If you spend any time looking at networks, one of the things you’ll find is that the rich tend to get richer. As people build large followings, the probability that additional people will follow them increases. There are lots of explanations for this, but one important one centers on the notion of “social proof.” That’s what happens when we think to ourselves, “boy, so-and-so sure has a lot of followers – they must have something important to say.”
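Network scientists call this dynamic “preferential attachment,” and you can see it in a toy simulation. The sketch below is purely illustrative – the simulate_follows helper and all of its numbers are made up for the example, not drawn from any real network. Each new user follows accounts with probability proportional to the followers those accounts already have, and even in this simple model a handful of accounts quickly end up holding most of the follows.

```python
import random

def simulate_follows(num_users=10_000, follows_per_user=5, seed=42):
    """Toy "rich get richer" model: each new user follows accounts with
    probability proportional to how many followers they already have
    (plus one, so brand-new accounts can still be picked)."""
    rng = random.Random(seed)
    followers = [0, 0]  # start with two accounts and no follows yet
    for _ in range(num_users):
        weights = [count + 1 for count in followers]   # social proof as weight
        picks = rng.choices(range(len(followers)), weights=weights,
                            k=follows_per_user)
        for account in set(picks):                     # ignore duplicate picks
            followers[account] += 1
        followers.append(0)                            # the new user joins too
    return followers

if __name__ == "__main__":
    counts = simulate_follows()
    top_ten = sum(sorted(counts, reverse=True)[:10])
    print(f"top 10 accounts hold {top_ten / sum(counts):.0%} of all follows")
```

Run it a few times and the same pattern appears: visibility concentrates in a few accounts, which is exactly why deliberately surfacing lesser-known curators – as in the circle experiment described below – changes the stream so noticeably.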

There are lots of problems with this “rich get richer” phenomenon, one of which popped up for me a while back, almost by accident. I had decided earlier that I wanted to identify a group of people who were sharing interesting content, but who hadn’t yet built big followings on Google+. After a bunch of work to identify this group, I shared the results as a “circle share” on Google+. The results were quite positive. Several hundred people added these individuals to their own circles, which is great visibility for this group – a nice opportunity to get their content in front of more people.

That wasn’t my accidental discovery though. No, that came after I started looking at the resulting stream of content from this new circle I had created. It was new and interesting stuff: way fewer LOL cats and a much more diverse flow of information. I wasn’t the only one seeing it either. Yesterday, Dusan Vrban shared a post talking about the importance of the human touch in content curation and his excitement at the results of this new circle:

“In just a few minutes, my stream changed. Suddenly there were posts about history books, science, psychology, food and all of them had some added value. Not just resharing of some article, but adding some comment or thought that made it more relevant.”

This is what led to my “discovery.” You see, when it comes to networking information, curating content is only half of the problem. The other half is curating people.


When we take the time to build interesting, diverse circles on Google+ or lists on Twitter, we improve the way we filter information. It’s up to us. We can pursue strategies that concentrate the stream of content into just the same old stuff, or we can go out of our way to increase the diversity – and the quality – of what comes to us. It’s all in the people we follow.

Bubble Machines

Eli Pariser published a book a while back called The Filter Bubble, which makes a very important point about what algorithms are doing to the way we receive information today. We now live in a world that is the antithesis of Walter Cronkite’s reassuring “And that’s the way it is” wrap-up of the news each night. There’s no single way anymore.

The results you see when you search for “Egypt” on Google are quite different from what I see. This is even more true now than it was when Pariser first started talking about this problem, because of the way Google and Bing have recently started integrating social signals into search results. Another version of this problem occurs within the information you get in your stream on Facebook, because of Facebook’s “EdgeRank” algorithm.

Pariser’s point is that algorithms are curating the world for us and that we don’t get to decide what stays in and what gets edited out. He makes the case powerfully in his TED talk on the “filter bubble.”

We Create Our Own Bubbles

There is no getting around it. Pariser is right: if we aren’t careful, we’ll end up in a world where the truth is so relative that there is little room for common ground. We will silo ourselves into increasingly entrenched circles of people who share our views, cutting ourselves off from a diversity of opinions and experiences.

What’s more, my experiment with curating that circle on Google+ highlighted for me that it’s not just the algorithms that are the problem.

When we use “social proof” (“everyone else is following that person, so I should too”) as the only criterion for deciding whether or not to follow someone on a social network, we create our own organic filter bubble. We concentrate the network’s connections in the hands of a relatively small number of “network celebrities.” They may have very interesting things to say, but the result is a more homogeneous flow of content, as the ideas of interesting but lesser-known people are drowned out in a sea of celebrity re-tweets and re-shares.

And that’s not the only way our choices of who to follow shape the information we receive. When we only follow people who look and think the way we do, we limit our exposure to different kinds of information. In this respect, our online social networks are no different from the physical world: failing to surround ourselves with a diverse group of friends and acquaintances narrows our worldview and filters our experience of the world.

Now it may be that you’re fine with that. Filters do help reduce complexity and noise, after all, and life can be a lot simpler when we surround ourselves with people just like us. And yet, in doing that, we cut off something really valuable, something called “reality.” Reality doesn’t always conform to how we want to see the world. But surrounding ourselves with diverse perspectives does help give us a more complete picture of what’s really happening in the world – and that’s no less true on Facebook, Twitter, and now, Google+.

Today, social networks powerfully affect the information we receive. Pariser is absolutely right that with the great algorithmic power Facebook, Twitter, and Google now wield comes the burden of great responsibility.

And yet, it’s the content curation that people do on these services that most deeply shapes each of our internet filter bubbles. So some of that responsibility still sits with us. Without us, these social networks are useless. On social media networks, the information filter is us. We create our information filter bubbles every time we choose whether or not to connect with another individual on these services.


In the world of the information networker, curating content is only half the game. The other half is curating the curators. And in that power to choose our connections rests our ultimate power to reshape our information filter bubbles and radically improve our perception of reality.
