The Filter Bubble - Eli Pariser

share. We ought to be able to say, “You’re wrong. Perhaps I used to be a surfer, or a fan of comics, or a Democrat, but I’m not any more.”

Knowing what information the personalizers have on us isn’t enough. They also need to do a much better job explaining how they use the data—what bits of information are personalized, to what degree, and on what basis. A visitor to a personalized news site could be given the option of seeing how many other visitors were seeing which articles—even perhaps a color-coded visual map of the areas of commonality and divergence. Of course, this requires admitting to the user that personalization is happening in the first place, and in some cases businesses have strong reasons not to do so. But they’re mostly commercial reasons, not ethical ones.

The Interactive Advertising Bureau is already pushing in this direction. An industry trade group for the online advertising community, the IAB has concluded that unless personalized ads disclose to users how they’re personalized, consumers will get angry and demand federal regulation. So it’s encouraging its members to include a set of icons on every ad to indicate what personal data the ad draws on and how to change or opt out of this feature set. As content providers incorporate the personalization techniques pioneered by direct marketers and advertisers, they should consider incorporating these safeguards as well.

Even then, sunlight doesn’t solve the problem unless it’s coupled with a focus in these companies on optimizing for different variables: more serendipity, a more humanistic and nuanced sense of identity, and an active promotion of public issues and cultivation of citizenship.

As long as computers lack consciousness, empathy, and intelligence, much will be lost in the gap between our actual selves and the signals that can be rendered into personalized environments. And as I discussed in chapter 5, personalization algorithms can cause identity loops, in which what the code knows about you constructs your media environment, and your media environment helps to shape your future preferences. This is an avoidable problem, but it requires crafting an algorithm that prioritizes “falsifiability,” that is, an algorithm that aims to disprove its idea of who you are. (If Amazon harbors a hunch that you’re a crime novel reader, for example, it could actively present you with choices from other genres to fill out its sense of who you are.)
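The falsifiability idea above can be made concrete. The following is a toy sketch, not any real company's algorithm: a recommender that usually exploits its current model of the user, but with some probability deliberately probes the genre it is *least* confident the user likes, trying to disprove its own hypothesis. All names (`FalsifiableRecommender`, `probe_rate`) are hypothetical.

```python
import random
from collections import defaultdict

class FalsifiableRecommender:
    """Toy sketch of a 'falsifiability-first' recommender: most of the
    time it recommends from the genre it believes the user prefers, but
    a fraction of the time it probes the genre it is least sure about,
    to test (and possibly disprove) its model of the user."""

    def __init__(self, genres, probe_rate=0.2, seed=0):
        self.genres = list(genres)
        self.probe_rate = probe_rate       # fraction of "disproving" probes
        self.clicks = defaultdict(int)     # positive feedback per genre
        self.shown = defaultdict(int)      # impressions per genre
        self.rng = random.Random(seed)

    def score(self, genre):
        # Estimated interest: smoothed click-through rate per genre.
        return (self.clicks[genre] + 1) / (self.shown[genre] + 2)

    def recommend(self):
        if self.rng.random() < self.probe_rate:
            # Falsification step: offer the genre the model rates lowest,
            # actively testing its hypothesis about who the user is.
            genre = min(self.genres, key=self.score)
        else:
            # Exploitation step: offer the genre the model rates highest.
            genre = max(self.genres, key=self.score)
        self.shown[genre] += 1
        return genre

    def feedback(self, genre, clicked):
        if clicked:
            self.clicks[genre] += 1
```

In Pariser's Amazon example, a user the model pegs as a crime-novel reader would still occasionally be shown history or science fiction; if those probes draw clicks, the model's picture of the reader broadens instead of looping in on itself.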

Companies that hold great curatorial power also need to do more to cultivate public space and citizenship. To be fair, they’re already doing some of this: Visitors to Facebook on November 2, 2010, were greeted by a banner asking them to indicate if they’d voted. Those who had voted shared this news with their friends; because some people vote because of social pressure, it’s quite possible that Facebook increased the number of voters. Likewise, Google has been doing strong work to make information about polling locations more open and easily available, and featured its tool on its home page on the same day. Whether or not this is profit-seeking behavior (a “find your polling place” feature would presumably be a terrific place for political advertising), both projects drew the attention of users toward political engagement and citizenship.

A number of the engineers and technology journalists I talked to raised their eyebrows when I asked them if personalizing algorithms could do a better job on this front. After all, one said, who’s to say what’s important? For Google engineers to place a value on some kinds of information over others, another suggested, would be unethical—though of course this is precisely what the engineers themselves do all the time.

To be clear, I don’t yearn to go back to the good old days when a small group of all-powerful editors unilaterally decided what was important. Too many actually important stories (the genocide in Rwanda, for example) fell through the cracks, while too many actually unimportant ones got front-page coverage. But I also don’t think we should jettison that approach
