Rule 34 - Charles Stross
“Yes, but you don’t have to worry about that.”
“The hell I don’t.” Your throat’s raw. “There were no witnesses. Okay, so suppose I say ‘yes’ and you take me round to the station where a trained counsellor talks me through giving a report and taking”—you swallow—“samples. And let’s suppose you, uh, your people go and arrest him. At that point it’s his word against mine, and you know what his advocate will make of my background? Polyamory still doesn’t get equal rights, never mind civil partnerships . . . I just get dragged through the mud, and to what end?”
“But you’ve got—” Liz jolts to a stop, like a Doberman at the end of a choke chain. She’s staring at you. “Oh,” she says softly.
“Oh, indeed.” You reach out your hand towards her. “You don’t want this, Liz. You don’t know what you’re opening yourself up for.”
After a moment, she takes your hand.
“It wasn’t rape,” you say, trying to keep any trace of doubt out of your voice for her sake. “But I’m really worried about the, the other thing.”
“Yes, I’d say you should be.” Liz is silent for a few seconds. “I’d like to take a statement, though. All the same.”
“What? But I told you, it wasn’t non-consensual—”
“Not about the sex: about the appraisal.”
You shiver. “I’d rather not. If you don’t mind.”
She sits down beside you on the futon. “It’s, it’s about Christie. He’s, uh, a person of interest in another investigation. We want to question him in relation to a violent crime. I can’t tell you about it right now, but what you’ve told me—it’s really important. My colleagues—they need to know about this. Do you mind if I file at least a contact report?”
You sniff, then rub a hand across your eyes. There’s no mascara or eye-liner, luckily: You stripped before you showered. “You’re going to insist, aren’t you?”
She manages a weak smile. “You said it: I didn’t.”
“Oh hell.” You struggle to sit up. “Just . . . do you mind if I stay overnight? I can’t face that room . . .”
“You can stay,” she says neutrally. “I’ll take the futon.” She pulls her police specs on again, then pauses, one finger hovering over the power button. “I still love you, you know. I just wish things weren’t so messy.”
Then she pushes the button.
LIZ: Project ATHENA
“People laugh when they hear the phrase ‘artificial intelligence’ these days.” MacDonald is on a roll. “But it’s not funny; we’ve come a long way since the 1950s. There’s a joke in the field: If we know how to do it, it’s not intelligence. Playing checkers, or chess, or solving mathematical theorems. Image recognition, speech recognition, handwriting recognition. Diagnosing an illness, driving a car through traffic, operating an organic-chemistry lab to synthesize new compounds. These were all thought to be aspects of intelligence, back in the day, but now they’re things you can buy through an app store or on lease-purchase from Toyota.
“What people think of when you say ‘artificial intelligence’ is basically stuff they’ve glommed onto via the media. HAL 9000 or Neuromancer—artificial consciousness. But consciousness—we know how that shit works these days, via analytical cognitive neurobiology and synthetic neurocomputing. And it’s not very interesting. We can’t do stuff with it. Worst case—suppose I were to sit down with my colleagues and we come up with a traditional brain-in-a-box-type AI, shades of HAL 9000. What then? Firstly, it opens a huge can of ethical worms—once you turn it on, does turning it off again qualify as murder? What about software updates? Bug fixes, even? Secondly, it’s not very useful. Even if you cut the Gordian knot and declare that because it’s a machine, it’s a slave, you can’t make it do anything useful. Not unless you’ve built in some way of punishing it, in which case we’re off into the ethical mine-field on a pogo-stick tour. Human consciousness isn’t optimized for anything, except maybe helping