Rule 34 - Charles Stross
“I shall remember not to confess to any murders I didn’t commit.” MacDonald seems to find your caution inappropriately amusing. You’re about to repeat and rephrase when he adds, “I understand you’re in need of domain-specific knowledge.” He leans forward, smirk vanishing. “Why me?”
“Your name came out of the hat.” You decide to press on. Probably he got the message: In any case, having an inappropriate sense of humour isn’t an arrestable offence. “We’re investigating a crime involving some rather strange coincidences that appear to involve some kind of social network.” The half smile vanishes from Dr. MacDonald’s face instantly. “You’re a permanent lecturer in informatics with a research interest in automated social engineering and, ah, something called ATHENA. Our colleagues recommended you on the basis of a review of the available literature on, uh, morality prosthetics and network agents.”
Kemal, sitting beside you with crossed arms, nods very seriously. MacDonald looks nonplussed.
“Really? Coincidences?” He pauses. “Coincidences. A social network. Can you tell me what kinds of coincidences we’re talking about here?”
“Fatal ones,” says Kemal.
Damn. MacDonald’s expression is frozen. You spare Kemal a warning glance, then say: “We’re here for a backgrounder, nothing more—to see if your research area can give us any insights into what’s going on. I’m afraid I’ve got to admit that I’m not up on your field—tell me, Professor, what is automated social engineering?”
You sit back, mimic his posture, and smile at him. It’s all basic body-language bullshit, but if it puts him more at ease . . . yes. MacDonald visibly relaxes.
“How much do you know about choice architecture?”
He’s got you. You glance sidelong at Kemal, who shrugs minutely. “Not a lot.” The phrase rings a very vague bell, but no more than that. “Suppose you tell me?”
“If only my students were so honest . . . let’s review some basic concepts. In a nutshell: When you or I are confronted with some choice—say, whether to buy a season bus pass or to pay daily—we make our decision about what to do by using a frame, a bunch of anecdotes and experiences that help us evaluate the choice. You can control how people make their choices, even to the point of making them choose differently, if you can modify the frame. There’s a whole body of research on this field in cognitive psychology. Anyway: Choice architecture is the science of designing situations to nudge people towards a desired preference. You might want to do this because you’re marketing products to the public—or for public policy purposes: There’s a whole political discourse around this area called libertarian paternalism, how to steer people towards choosing to do the right thing of their own free will.”
Now it clicks, where you’ve heard this stuff before: There was a fad for it about ten years ago, trials on reducing binge drinking by giving pub-goers incentives to switch off the hard stuff after a couple of pints, free soft drinks and so on. (Which failed to accomplish anything much, because the real problem drinkers weren’t in the pubs in the first place, much less drinking to socialize, but the Pimm’s-quaffing policy wonks didn’t get that.)
You nod, suppressing disappointment: Is that all? But MacDonald reads your gesture as a cue to continue in lecture mode.
“It’s another approach to social engineering. Take policing, for example.” He nods at you. “There’s the law, which we’re all expected to be cognizant of and to obey, and there’s the big stick to convince us that it’s a lot cheaper to play along than to go against it—yourselves, and the courts and prison and probation