There was a time when the Internet promised to offer us a vibrant cultural milieu that would expand our horizons.

But as the Web develops, the opposite may be happening.

In his fascinating new book, The Filter Bubble: What the Internet Is Hiding from You (Penguin Press), Eli Pariser, board president and former executive director of MoveOn.org, explores the rise of "personalized" Internet filters whose very purpose is to narrow, rather than expand, the world that we see.

The implications are serious, Pariser notes. In a personalized world, Web users are increasingly fed only the news and information that fits their Internet profile, as gleaned from past behavior tracked by a burgeoning industry of data companies that can divine everything from your political leanings to your favorite color. Rather than expand our world, critics argue, personalization is pushing us into advertiser-driven "tunnels" of information online that can limit the unexpected encounters that spark innovation and the democratic exchange of ideas.

PW recently caught up with Pariser in his New York office, where he heads the Avaaz Foundation, one of the world's largest citizen organizations, to talk about the rise of "personalized" Internet filters and the implications for the health of our culture and our democracy.

In late 2009 there was an event of sorts that led you to write this book, correct? Can you tell us a little about that?

I was taking a couple of days off to do a little thinking, sort of a retreat, and I came across a Google blog post saying that they were turning on personalized search for everyone. So I immediately got a friend to come over, and we started doing searches to see how different our personalized results would be. And they were really different. It dawned on me that if Google, a service that I use every day, could do this right under my nose, without me really noticing, this kind of filtering could be happening all over the Web. And indeed it is. You'd have to have been following Google's corporate blog to know this was going on, and even then you'd have to be able to read between the lines to realize that this change was actually a pretty huge deal. This also speaks to the nature of the personalization phenomenon: it's impossible to notice the changes on your own, because you can't see how different your search results are from anyone else's until you put your computers side by side.

Can you briefly explain what the filter bubble is?

Filter bubble is a term I came up with to describe the effects of these personalized filters that are increasingly editing the Web for us. When you go online, whether it's Netflix, Amazon, Pandora, Google, Facebook, or now even the Washington Post and the New York Times, what you're really seeing is the world the Internet thinks you want to see, not the view you might expect to see or need to see. So the term filter bubble serves as a metaphor for this personalized universe of information we live in.

There have always been filters, of course—the buyer at Barnes & Noble, your editor at Penguin. Why is this new breed of personalized Internet filters different?

For one, Internet filters are invisible, and oftentimes, you're not aware that filtering is even going on. It's one thing if you turn on Fox News or pick up the New York Times, because you know what kind of stories you are likely to see there. When you search for something on Google, however, you have no idea what Google thinks you want to see, and most people have no idea that Google is even making these kinds of choices. The other thing is that you have no control over them. There's a big difference between choosing to pick up an issue of the Weekly Standard or the Nation, and having a search engine decide which things are placed in front of you.

As the world continues to go digital, are these Internet filters changing the way traditional, analog-world filters function? For example, changing how an editor might choose content?

Well, they're not changing quickly enough, really, because Internet filtering is totally changing the dynamics of how information moves. In the print and broadcast world, we are so used to just assuming that if you put a story out, people will see it. But online, those stories run through a second layer, which is an Internet filter saying, "I'm going to find the stories that are right for you." That changes the dynamic a lot. Newspapers now really have to consider where people's attention is likely to go, and what people are likely to click on, and that means that what people click on has a lot more power than any publisher saying, "This story is important because we published it." That proposition is now relatively less powerful.

On the subject of clicks, you write in the book about the way Gawker.com operates vs. the New York Times. Can you talk about that?

Sure. I was surprised to find that, at least when I was doing my reporting, bloggers at the New York Times were not allowed to look at traffic data. They are supposed to just write the best reportage they can. At Gawker, on the other hand, there is a big board that posts the traffic numbers. The ethos at Gawker is to do whatever you can to get people to share and link and come to the Web site.

Well, the Times and Gawker are different organizations with two different aims, right?

Yes, two different aims, but basically the same business model—in fact, they're basically in the same business. They both compete for people's attention, and largely the same group of people's attention. So, how do we keep some sense of journalistic ethics when we know that very few people will willingly choose to read about Afghanistan, even though we all need to know what is going on there? I think the sweet spot is probably somewhere in the middle, between Gawker and the Times. I think traffic data can help inform us about the right approaches to getting people to care about serious stories. But I do worry about losing some of the best aspects of 20th-century journalism as we move into the 21st.

You write about the potential decline of the "general audience" as people are served content based on how these filters perceive them. Are we at risk of being segmented into camps, like Fox News or New York Times readers?

Yes, but I'm really less concerned about the liberal or conservative narratives than I am about the people for whom the whole political sphere drops out entirely, because, let's face it, it's not as relevant or as interesting as, say, a basketball game, or whatever. If all these filters prioritize is what people engage with or click on most, which is what Facebook does, then we miss this whole part of being in a society, and that's really important. These algorithms just don't contemplate that, and they don't correct for that. For me, it's not so much that you might be in a left-wing or right-wing bubble, it's that you're in a Britney Spears bubble, or whatever, and you're not even seeing these other things at all.

In the book, you write about a phenomenon you dub "the Adderall Society." Can you explain what you mean by that?

People who take Adderall [an amphetamine drug used to treat ADHD], especially those who don't need it, report a sense of hyperfocus, of being really zoomed in on something, and it's very hard to zoom out. And I think that's one of the dynamics at play with these personalized filters—that we're not able to zoom out and see the bigger picture, only narrowly related things, either related to your online profile or related to the thing that you're looking at. For example, on Amazon, the personalization is very basic: "You like this author. Here's another thing by this author. Here's another thing by this author's son." But creativity and innovation and learning all come from being able to connect disparate concepts, being able to get outside of the zone of immediate relevance. When you look at the history of invention, or creativity, it's all like, "I'm taking this idea from over here, and combining it with this idea from over here." But if you can't even see those ideas, if you're just focused on the immediate vicinity of something, you lose this possibility of sparking new thoughts.

Just a decade ago there was a sense that cultural power might be moving away from big corporations, and that we'd see more democratization of culture online. That isn't necessarily happening, though, is it?

You're exactly right. Rather, there are new power centers, like Facebook and Google, and they have an incredible amount of power to shape what people know and don't know, just as much as, and in a lot of ways more than, the New York Times or any other elite, old media institution ever did. And for Facebook and Google, their mythology is very convenient: "We're just helping people do what they want to do, we're just passive participants." But that's not true. They're making decisions all the time about how to adjust the flow of information in ways that suit their advertisers and keep people coming back to their sites.

As you note, a lot of these decisions made by Internet filters are not transparent to consumers. In the book, you write that while Barnes & Noble can be paid to put books on a table, algorithmic rankings, like Amazon's or Google's, are even more easily bought. Can you explain?

It's no secret that if you adjust the variables a little bit, different things will come out at the top of Google search or the Facebook news feed. And it's all totally opaque, right? Google doesn't explain why its top result is its top result. They don't have to. And book publishers know—but many consumers do not—that online recommendation engines are bribable, that you can pay to get something to appear to be a computerized recommendation. That's a great revenue model for these companies, but I think the ethics are very questionable from a consumer standpoint. And it means that those with money have a pronounced advantage, even though a consumer might never know that money changed hands.

There is a wave of interest in DIY and self-publishing these days, empowered by technology. But while authors can now bypass gatekeepers on the editorial side, are their chances of being discovered by readers limited by these filters?

This is really the core question. On one hand, there's so much information and so many things to sort through that, yes, we do need help from code. But we also must be aware that this kind of filtering is going on, that something is making decisions about what we see and what we don't. Filters can be used well or poorly. But once we are aware these filters are in place, we can focus on what the filter is geared to let through and what it isn't. In my experience, human curators are still way better than even the best algorithmic ones, because, as the old investment cliché goes, past performance isn't always an indicator of future results. For the medium term, I think, there will be gatekeepers, and they're going to be partly algorithmic, partly human. And hopefully a sense of ethics will emerge about how they do what they do.

I must say, libraries come to mind, in terms of ethics that have served us well for centuries. Yet in the online and e-book world, we're poised to gut our library system in favor of these powerful but still relatively new companies, like Google. Does that concern you?

Totally. Libraries recognize that in a democracy you need to provide citizens with the resources to be well-informed. That's a bedrock value. And I'm concerned that we now have these corporations taking over that responsibility, taking over the job of providing and [selecting] information for people without considering the health of our democracy as a core part of what they're doing. It's not baked into Google to help build citizens. Google would like to help people find information, sure, but its business is providing advertisers with eyeballs and clicks. That kind of shift should worry us, that we're moving from systems that had a sense of ethics and a sense of democratic values built into them, to ones that don't.

These filters can help publishers make money online. But how thin is the line between making money and inhibiting, or even harming, the vibrancy of the culture publishers trade on?

It's a very thin line, and we face a challenging set of issues. But generally, in terms of how content creators make money on the Internet, I think the whole system will look very different in 10 years. I'm not an expert on the publishing business, but it just feels like we're going through this awkward, transitional phase. In doing research for the book, I went to a conference where I learned that, in the early Internet days, most of every dollar spent on online advertising went to the publishers. Now, that money is going to the people who have the audience data. Consider the consequences of that for publishers. Advertisers now think: "Why should we pay a premium for the New York Times if we can advertise to their readers on a cheaper blog?" One of the people in the audience said that the New York Times would obviously have to change, and might eventually have to start thinking of itself as a data vendor. That would be a massive shift.

As this kind of data-driven content market matures, do you have any thoughts on the privacy implications?

Well, I do get infuriated by the sort of "nobody cares about privacy" rhetoric some of these companies put out, because I think having some control over your own life and your own information is important. And I think many of these companies fall far short of a basic, reasonable standard when it comes to our personal information. I mean, Facebook in particular has repeatedly solicited information on the premise that the information would remain private, and then made it public. I think eventually we will need some regulatory action, or some new laws, that adjust privacy protections to the new online reality we live in, because I don't see these corporations doing it on their own.

As you've said, these Internet filters can make our online experience better. How do we balance the need to have something filter the Internet for us against the risk of being pushed into these bubbles?

I think you've said it right—it is a balance. I think the place we want to get to is a world in which people are cognizant enough and fluent enough, and it's clear enough what's going on, that they can use these filters to their own ends, rather than having filters by default control what they see and do. Ideally, you want a filter to be like a magnifying glass, something you can pick up and use as a tool to zoom in or out, not like a set of blinders that prevents you from seeing a certain part of the world. Right now, we're more in the blinder zone.