The Filter Bubble - Eli Pariser [90]
Google has also argued that it needs to keep its search algorithm under tight wraps because if it were known, it would be easier to game. But open systems are harder to game than closed ones, precisely because everyone shares an interest in closing loopholes. The open-source operating system Linux, for example, is actually more secure and harder to penetrate with a virus than closed ones like Microsoft’s Windows or Apple’s OS X.
Whether or not it makes the filterers’ products more secure or efficient, keeping the code under tight wraps does do one thing: It shields these companies from accountability for the decisions they’re making, because the decisions are difficult to see from the outside. But even if full transparency proves impossible, it’s possible for these companies to shed more light on how they approach sorting and filtering problems.
For one thing, Google and Facebook and other new media giants could draw inspiration from the history of newspaper ombudsmen, which became a newsroom topic in the mid-1960s.
Philip Foisie, an executive at the Washington Post company, wrote one of the most memorable memos arguing for the practice. “It is not enough to say,” he suggested, “that our paper, as it appears each morning, is its own credo, that ultimately we are our own ombudsman. It has not proven to be, possibly cannot be. Even if it were, it would not be viewed as such. It is too much to ask the reader to believe that we are capable of being honest and objective about ourselves.” The Post found his argument compelling, and hired its first ombudsman in 1970.
“We know the media is a great dichotomy,” said the longtime Sacramento Bee ombudsman Arthur Nauman in a speech in 1994. On the one hand, he said, media has to operate as a successful business that provides a return on investment. “But on the other hand, it is a public trust, a kind of public utility. It is an institution invested with enormous power in the community, the power to affect thoughts and actions by the way it covers the news—the power to hurt or help the common good.” It is this spirit that the new media would do well to channel. Appointing an independent ombudsman and giving the world more insight into how the powerful filtering algorithms work would be an important first step.
Transparency doesn’t mean only that the guts of a system are available for public view. As the Twitter versus Facebook dichotomy demonstrates, it also means that individual users intuitively understand how the system works. And that’s a necessary precondition for people to control and use these tools—rather than having the tools control and use us.
To start with, we ought to be able to get a better sense of who these sites think we are. Google claims to make this possible with a “dashboard”—a single place to monitor and manage all of this data. In practice, its confusing and multitiered design makes it almost impossible for an average user to navigate and understand. Facebook, Amazon, and other companies don’t allow users in the United States to download a complete compilation of their data, though privacy laws in Europe force them to. It’s an entirely reasonable expectation that the data users provide to companies ought to be available to us, and this expectation is one that, according to the University of California at Berkeley, most Americans