Everyware_ The Dawning Age of Ubiquitous Computing - Adam Greenfield [94]
Our concern here goes beyond information privacy per se, to the instinctual recognition that no human community can survive the total evaporation of its membrane of protective hypocrisy. We lie to each other all the time, we dissemble and hedge, and these face-saving mechanisms are critical to the coherence of our society.
So some degree of plausible deniability, above all imprecision of location, is probably necessary to the psychic health of a given community, so that inferences about intention and conduct, whether drawn by other people or by machines, can be forestalled at will.
How might we be afforded such plausible deniability? In a paper on seamfulness, Ian MacColl, Matthew Chalmers, and their co-authors give us a hint. They describe an ultrasonic location system as "subject to error, leading to uncertainty about...position," and, as they recognized, this imprecision can within reasonable limits be a good thing. It can serve our ends by giving anyone looking for you most of the information they need about where you are, but not the pinpoint fix that might lend itself to unwelcome inference.
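The kind of deliberate imprecision described here can be approximated in software by coarsening an exact position to a grid cell before it is ever reported. The following is only an illustrative sketch, not anything from the paper; the function name and the 50-meter cell size are assumptions chosen for the example:

```python
# Snap an exact position (in meters) to the center of a coarse grid
# cell, so a location system can answer "roughly where?" without
# pinning a person to an exact spot. The cell size is an arbitrary
# assumption; a real deployment would tune it to the social context.

def coarsen(x_m: float, y_m: float, cell_m: float = 50.0) -> tuple:
    """Return the center of the grid cell containing (x_m, y_m)."""
    cx = (x_m // cell_m) * cell_m + cell_m / 2
    cy = (y_m // cell_m) * cell_m + cell_m / 2
    return (cx, cy)
```

Anyone querying the system then learns which cell you are in, which is usually enough to find you, while inferences that depend on meter-level precision are quietly forestalled.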
The degree to which location becomes problematic depends to some extent on which of two alternative strategies is adopted in presenting it. In a "pessimistic" presentation, only verifiably and redundantly known information is displayed, while an "optimistic" display includes possibles, values with a weaker claim on truth. The less parsimonious optimistic strategy obviously presents the specter of false positives, but if this is less than desirable in ordinary circumstances, in this context, a cloud of possible locations bracketing the true one might be just the thing we want. Still worse than the prospect of being nakedly accountable to an unseen, omnipresent network is being nakedly accountable to each other, at all times and places.
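The contrast between the two strategies can be made concrete. In this sketch (the data shape and confirmation threshold are assumptions for illustration, not drawn from any actual system), a "pessimistic" display returns only candidate locations confirmed by multiple sensor readings, while an "optimistic" one returns every candidate, burying the true position in a cloud of possibles:

```python
# "Pessimistic" presentation: show only locations confirmed by at
# least `confirm_threshold` independent readings.
# "Optimistic" presentation: show every candidate, however weakly
# attested, so the true location is bracketed by plausible others.

def present(readings: dict, confirm_threshold: int = 2,
            optimistic: bool = False) -> list:
    """readings maps each candidate location to its number of sensor hits."""
    if optimistic:
        return sorted(readings)  # the whole cloud of possibles
    return sorted(loc for loc, hits in readings.items()
                  if hits >= confirm_threshold)
```

Given `{"cafe": 3, "lobby": 1, "atrium": 1}`, the pessimistic display names only the cafe, while the optimistic one lists all three spaces, which is exactly the plausible deniability the passage has in mind.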
Some critics have insisted that there are, at least occasionally, legitimate social purposes invoked in using technology to shame. They point to the example of Korea's notorious "Dogshit Girl," a self-absorbed young lady whose fashion-accessory pet soiled a subway car; having made not the slightest effort to clean it up, she was immediately moblogged by angry onlookers. The pictures appeared online within minutes and throughout the national press after a few hours; according to the Korean press, her humiliation was so total that the young lady eventually withdrew from university.
The argument is that, had the technology not been in place to record her face and present it for all the world to see (and judge), she would have escaped accountability for her actions. There would have been no national furor to serve—ostensibly, anyway—as deterrent against future transgressions along the same lines.
As to whether hounding someone until she feels compelled to quit school and become a recluse can really be considered "accountability" for such a relatively minor infraction, well, thereof we must be silent. Whatever the merits of this particular case, though, there is no doubt that shame is occasionally as important to the coherence of a community as hypocrisy is in another context.
But we are not talking about doing away with shame. The issue at hand is preventing ubiquitous systems from presenting our actions to one another in too perfect a fidelity—in too high a resolution, as it were—and thereby keeping us from maintaining the beneficial illusions that allow us to live as a community. Where everyware contains the inherent potential to multiply the various border crossings that do so much to damage our trust and regard for one another, we must design it instead so that it affords us moments of amnesty. We must build ourselves safe harbors in which to hide from the organs of an accountability that otherwise tends toward the total.
Finally, as we've seen, there is the humiliation and damage to self-worth we experience when we simply can't figure out how to use a poorly designed technical system.