Everyware: The Dawning Age of Ubiquitous Computing - Adam Greenfield
Design researcher Timo Arnall has developed a vocabulary of graphic icons that communicate ideas like these: a friendly, human-readable equivalent of the "service discovery layer" in Bluetooth that specifies what devices and services are locally available. Perhaps Arnall's icons could serve as the basis of a more general graphic language for ubiquitous systems—a set of signs that would eventually become as familiar as "information" or "bathroom," conveying vital ideas of the everyware age: "This object has invisible qualities," or "network dead zone."
Whether we use them to protect ourselves from intrusive information collection or to discover all the ways our new technology can be used, provisions for transparent self-disclosure on the part of ubiquitous systems will be of critical importance in helping us find ways to live around and with them. Such knowledge is the basis of any meaningful ability on our part to decide when and to what degree we wish to engage with everyware and when we would prefer not to.
Thesis 75
Everyware must be conservative of face.
Something too rarely considered by the designers of ubiquitous systems is how easily their ordinary operation can place a user's reputation and sense of dignity and worth at risk.
Thomas Disch illustrates this beautifully in his classic 1973 novel 334. The grimly futuristic Manhattan of 334 is a place whose entropic spiral is punctuated only by the transient joys of pills, commercial jingles, and empty sex. The world-weary residents of 334 East 13th Street survive under the aegis of a government welfare agency called MODICUM, a kind of Great Society program gone terminally sour.
In particular, 334's casual sketch of what would later be known as an Active Badge system hews close to this less-than-heroic theme. Disch shows us not the convenience of such a system, but how it might humiliate its human clients—in this case the aging, preoccupied hospital attendant Arnold Chapel. Embroiled in an illicit plot, Chapel has allowed himself to wander from his course, and is audibly corrected by the hospital's ubiquitous traffic control system:
"Arnold Chapel," a voice over the PA said. "Please return along 'K' corridor to 'K' elevator bank. Arnold Chapel, please return along 'K' corridor to 'K' elevator bank."
Obediently he reversed the cart and returned to 'K' elevator bank. His identification badge had cued the traffic control system. It had been years since the computer had had to correct him out loud.
All that was, in fact, necessary or desirable in this scenario was that the system return Chapel to his proper route. Is there any justification, therefore, for the broadcast of information embarrassing to him? Why humiliate, when adjustment is all that is mandated?
Of course, no system in the world can keep people from making fools of themselves. About all that we can properly ask for is that our technology be designed in such a way that it is conservative of face: that ubiquitous systems must not act in such a manner as would unduly embarrass or humiliate users, or expose them to ridicule or social opprobrium, in the course of normal operations.
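The design imperative can be made concrete. As a purely illustrative sketch (the class names, channels, and escalation tiers here are my own assumptions, not any real Active Badge API), a "face-conservative" correction policy might escalate from channels only the wearer perceives toward public ones, rather than defaulting to the PA system as the hospital in 334 does:

```python
# Hypothetical sketch of a face-conservative correction policy for an
# Active Badge-style traffic system. All names and tiers are illustrative
# assumptions, not a real system's interface.

from dataclasses import dataclass, field
from typing import List

@dataclass
class CorrectionEvent:
    badge_id: str
    message: str
    channel: str  # where the correction was delivered

@dataclass
class FaceConservativePolicy:
    """Escalate from private to public channels only as a last resort."""
    # Ordered from least to most socially exposed.
    channels: List[str] = field(default_factory=lambda: [
        "badge_vibration",      # felt only by the wearer
        "badge_display",        # visible only to the wearer
        "public_announcement",  # audible to bystanders: last resort
    ])

    def correct(self, badge_id: str, message: str, attempts: int) -> CorrectionEvent:
        # Choose the least exposed channel, escalating only as the wearer
        # ignores earlier, quieter corrections.
        tier = min(attempts, len(self.channels) - 1)
        return CorrectionEvent(badge_id, message, self.channels[tier])

policy = FaceConservativePolicy()
first = policy.correct("arnold.chapel", "Return along K corridor", attempts=0)
print(first.channel)  # badge_vibration: no bystander ever hears it
```

The point of the sketch is the ordering, not the mechanism: adjustment is delivered first through channels that expose nothing to onlookers, and the embarrassing public broadcast becomes the exception rather than the default.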
The ramifications of such an imperative in a fully developed everyware are surprisingly broad. With so many systems potentially able to provide the location of users in space and time, we've seen that finding people will become trivially easy. We also know that when facts about your location are gathered alongside other facts—who you are with, what time it is, what sorts of services happen to be available nearby—and subjected to data-mining operations, a relational system can begin to paint a picture of your behavior.
Whether this should be an accurate picture or not—and remember everything we said about the accuracy of machine inference—the revelation of such information can lead to awkward questions about our activities and intentions, the kind we'd rather not have to answer. Even if we didn't happen to be doing anything "wrong,"