The Filter Bubble - Eli Pariser
Technodeterminists like to suggest that technology is inherently good. But despite what Kevin Kelly says, technology is no more benevolent than a wrench or a screwdriver. It’s only good when people make it do good things and use it in good ways. Melvin Kranzberg, a professor who studies the history of technology, put it best nearly thirty years ago, and his statement is now known as Kranzberg’s first law: “Technology is neither good nor bad, nor is it neutral.”
For better or worse, programmers and engineers are in a position of remarkable power to shape the future of our society. They can use this power to help solve the big problems of our age—poverty, education, disease—or they can, as Heiferman says, make a better farting app. They’re entitled to do either, of course. But it’s disingenuous to have it both ways—to claim your enterprise is great and good when it suits you and claim you’re a mere sugar-water salesman when it doesn’t.
Actually, building an informed and engaged citizenry—in which people have the tools to help manage not only their own lives but their own communities and societies—is one of the most fascinating and important engineering challenges. Solving it will take a great deal of technical skill mixed with humanistic understanding—a real feat. We need more programmers to go beyond Google’s famous slogan, “Don’t be evil.” We need engineers who will do good.
And we need them soon: If personalization remains on its current trajectory, as the next chapter describes, the near future could be stranger and more problematic than many of us would imagine.
7
What You Want, Whether You Want It or Not
There will always be plenty of things to compute in the detailed affairs of millions of people doing complicated things.
—computing pioneer Vannevar Bush, 1945
All collected data had come to a final end. Nothing was left to be collected. But all collected data had yet to be completely correlated and put together in all possible relationships.
—from Isaac Asimov’s short story “The Last Question”
I recently received a friend invitation on Facebook from someone whose name I didn’t recognize, a curvy-figured girl with big eyes and thick lashes. Clicking to figure out who she was (and, I’ll admit, to look more closely), I read over her profile. It didn’t tell me a lot about her, but it seemed like the profile of someone I might plausibly know. A few of our interests were the same.
I looked again at the eyes. They were a little too big.
In fact, when I looked more closely, I realized her profile picture wasn’t even a photograph—it had been rendered by a 3-D graphics program. There was no such person. My new attractive would-be friend was a figment of software, crawling through friend connections to harvest data from Facebook users. Even the list of movies and books she liked appeared to have been ripped from the lists of her “friends.”
For lack of a better word, let’s call her an advertar—a virtual being with a commercial purpose. As the filter bubble’s membrane becomes thicker and harder to penetrate, advertars could become a powerful adaptive strategy. If I only get the news from my code and my friends, the easiest way to get my attention might be friends who are code.
The technologies that support personalization will only get more powerful in the years ahead. Sensors that can pick up new personal signals and data streams will become even more deeply embedded in the surface of everyday life. The server farms that support the Googles and Amazons will