Everyware: The Dawning Age of Ubiquitous Computing - Adam Greenfield
How do I know the system understands my command and is correctly executing my intended action?
How do I recover from mistakes?"
All of these questions, of course, come into play in the context of everyware. They go directly to the heart of the difference between ubiquitous systems and the digital artifacts we're used to. What they tell us is that everyware is not something you sit down in front of, intent on engaging. It's neither something that is easily contained in a session of use, nor an environment in which blunders and missteps can simply be Ctrl-Z'ed away.
What it is is a radically new situation that will require the development over time of a doctrine and a body of standards and conventions—starting with the interfaces through which we address it.
Thesis 10
Everyware necessitates a new set of human interface modes.
One of the most obvious ways in which everyware diverges from the PC case is its requirement for input modalities beyond the standard keyboard and screen, trackball, touchpad, and mouse.
With functionality distributed throughout the environment, embedded in objects and contexts far removed from anything you could (or would want to) graft a keyboard onto, the familiar ways of interacting with computers don't make much sense. So right away we have to devise new human interfaces, new ways for people to communicate their needs and desires to the computational systems around them.
Some progress has already been made in this direction, ingenious measures that have sprouted up in response both to the diminutive form factor of current-generation devices and the go-everywhere style of use they enable. Contemporary phones, PDAs, and music players offer a profusion of new interface elements adapted to their context: scroll wheels, voice dialing, stylus-based input, and predictive text-entry systems that, at least in theory, allow users of phone keypads to approximate the speed of typing on a full keyboard.
But as anyone who has spent even a little time with them knows, none of them is entirely satisfactory. At most, they are suggestive of the full range of interventions everyware will require.
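The predictive text-entry systems mentioned above work by assigning several letters to each key and disambiguating the resulting digit sequence against a dictionary, so that one keypress per letter usually suffices. A minimal sketch in Python (the five-word dictionary here is purely illustrative; real systems rank candidates by usage frequency):

```python
# Keypad letter assignments: each letter maps to the single digit
# that carries it on a standard phone keypad.
KEYPAD = {
    'a': '2', 'b': '2', 'c': '2',
    'd': '3', 'e': '3', 'f': '3',
    'g': '4', 'h': '4', 'i': '4',
    'j': '5', 'k': '5', 'l': '5',
    'm': '6', 'n': '6', 'o': '6',
    'p': '7', 'q': '7', 'r': '7', 's': '7',
    't': '8', 'u': '8', 'v': '8',
    'w': '9', 'x': '9', 'y': '9', 'z': '9',
}

def to_digits(word):
    """Translate a word into the digit sequence typed for it."""
    return ''.join(KEYPAD[c] for c in word.lower())

def build_index(words):
    """Group dictionary words by their shared keypad digit sequence."""
    index = {}
    for w in words:
        index.setdefault(to_digits(w), []).append(w)
    return index

# One keypress per letter; the system offers every dictionary word
# that shares the same digit sequence, and the user picks among them.
index = build_index(['home', 'good', 'gone', 'hood', 'hoof'])
print(to_digits('home'))          # the keys pressed for "home"
print(index[to_digits('home')])   # all candidates sharing those keys
```

Note that "home", "good", "gone", "hood", and "hoof" all collapse onto the same key sequence, which is exactly why such systems can only *approximate* full-keyboard typing: ambiguity grows with the dictionary, and the user still pays a selection cost.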
One set of possibilities is suggested by the field known as "tangible media" at the MIT Media Lab, and "physical computing" to those researching it at NYU's Interactive Telecommunications Program. The field contemplates bridging the worlds of things and information, atoms and bits: Physical interface elements are manipulated to perform operations on associated data. Such haptic interfaces invoke the senses of both touch and proprioception—what you feel through the skin, that is, and the sensorimotor awareness you maintain of the position and movement of your body.
In a small way, using a mouse is physical computing, in that moving an object out in the world affects things that happen on screen. The ease and simplicity most users experience in mousing, after an exceedingly brief period of adaptation upon first use, rely on the subtle consciousness of cursor location that the user retains, perceived solely through the positioning of the wrist joint and fingers. It isn't too far a leap from noticing this to wondering whether this faculty might not be brought directly to bear on the world.
An example of a tangible interface in practice is the "media table" in the lobby of New York's Asia Society, a collaboration between Small Design Firm and media designer Andrew Davies. At first glance, the table appears to be little more than a comfortable place to sit and rest, a sumptuously smooth ovoid onto which two maps of the Asian landmass happen to be projected, each facing a seated user. But spend a few minutes playing with it—as its design clearly invites you to—and you realize that the table is actually a sophisticated interface to the Asia Society's online informational resources.
Off to the table's side, six pucks, smoothly rounded like river stones, nestle snugly in declivities designed specifically for them. Pick one up, feel its comfortable heft, and you see that