What Governments and Citizens Can Do
There’s plenty that the companies that power the filter bubble can do to mitigate the negative consequences of personalization—the ideas above are just a start. But ultimately, some of these problems are too important to leave in the hands of private actors with profit-seeking motives. That’s where governments come in.
As Eric Schmidt told Stephen Colbert, Google is just a company. Even if there are ways of addressing these issues that don’t hurt the bottom line—which there may well be—doing so simply isn’t always going to be a top-level priority. As a result, after we’ve each done our part to pop the filter bubble, and after companies have done what they’re willing to do, there’s probably a need for government oversight to ensure that we control our online tools and not the other way around.
In his book Republic.com, Cass Sunstein suggested a kind of “fairness doctrine” for the Internet, in which information aggregators have to expose their audiences to both sides. Though he later changed his mind, the proposal suggests one direction for regulation: Just require curators to behave in a public-oriented way, exposing their readers to diverse lines of argument. I’m skeptical, for some of the same reasons Sunstein abandoned the idea: Curation is a nuanced, dynamic thing, an art as much as a science, and it’s hard to imagine how regulating editorial ethics wouldn’t inhibit a great deal of experimentation, stylistic diversity, and growth.
As this book goes to press, the U.S. Federal Trade Commission is proposing a Do Not Track list, modeled after the highly successful Do Not Call list. At first blush, it sounds pretty good: It would set up a single place to opt out of the online tracking that fuels personalization. But Do Not Track would probably offer a binary choice—either you’re in or you’re out—and services that make money on tracking might simply disable themselves for Do Not Track list members. If most of the Internet goes dark for these people, they’ll quickly leave the list. And as a result, the process could backfire—“proving” that people don’t care about tracking, when in fact what most of us want is more nuanced ways of asserting control.
The best leverage point, in my view, is in requiring companies to give us real control over our personal information. Ironically, although online personalization is relatively new, the principles that ought to support this leverage have been clear for decades. In 1973, the Department of Health, Education, and Welfare under Nixon recommended that regulation center on what it called Fair Information Practices:
• You should know who has your personal data, what data they have, and how it’s used.
• You should be able to prevent information collected about you for one purpose from being used for others.
• You should be able to correct inaccurate information about you.
• Your data should be secure.
Nearly forty years later, the principles are still basically right, and we’re still waiting for them to be enforced. We can’t wait much longer: In a society with an increasing number of knowledge workers, our personal data and “personal brand” are worth more than they ever have been. Especially if you’re a blogger or a writer, if you make funny videos or music, or if you coach or consult for a living, your online data trail is one of your most valuable assets. But while it’s illegal to use Brad Pitt’s image to sell a watch without his permission, Facebook is free to use your name to sell one to your friends.
In courts around the world, information brokers are pushing this view—“everyone’s better off if your online life is owned by us.” They argue that the opportunities and control that consumers get by using their free tools outweigh the value of their personal data. But consumers are entirely unequipped