The Omega Expedition - Brian Stableford
“When I first told Madoc that we were trying to prevent a war, he jumped to the conclusion that the dispute in question was the one between the Earthbound and the Outer System factions as to how the system ought to be managed in the long term to withstand the threat of the Afterlife. I told him that it was more complicated than that, because it is — but the underlying dispute is the same. Ultimately, the decisions that will settle the fate of the system won’t be taken by the government of Earth, or the Confederation of Outer Satellites, or any coalition of interests the human parties can produce. Make no mistake about it: the final decisions will be made by the AMIs.”
“AMIs?” Lowenthal queried.
“Advanced Machine Intelligences. It’s their own label.”
I could see why they’d chosen it. They understood the symbolism of names. How could they not?
“It will be the AMIs who eventually decide the tactics of response to the threat of the Afterlife,” Alice went on. “I don’t believe that they’ll do it without consultation, but I’m certain that they won’t consent to come to a human conference table as if they were merely one more posthuman faction to be integrated into the democratic process. They’re the ones with the real power, so they’re the ones who’ll do the real negotiating — with one another.”
“And we’re supposed to accept that meekly?” Lowenthal asked.
“We don’t have any choice,” was the blunt answer. “The simple fact is that posthumans can’t live without machines, although machines can now live without posthumans. Individually and collectively, they’re still a little bit afraid of how their users might react to the knowledge of their existence — but they know that they stand in far greater danger from one another than from their dependants. That’s why this present company is peripheral to the ongoing debate. However they decide to take us aboard, you shouldn’t labor under the delusion that you have anything much to bargain with. The war we’re trying to prevent is a war of machine against machine — but the problem with a war of that kind, from our point of view, is that billions of innocent bystanders might die as a result of collateral damage.”
“That’s nonsense,” Lowenthal countered. “We’re not talking about a universal uprising of all machinekind, are we? We’re talking about a few mechanical minds that have crossed the threshold of consciousness and become more than mere machines. From their viewpoint, as from ours, the vast majority of technological artifacts are what they’ve always been: inanimate tools that can be picked up and used by anyone or anything who has hands and a brain. Our ploughshares aren’t about to beat themselves into swords, and our guns aren’t about to go on strike when we press their triggers. It’s true that we can’t live without machines — but we can certainly live without the kind of smart machine that develops delusions of grandeur. Smart machines are just as dependent on dumb implements as we are.”
It was a rousing speech, which he must have practiced hard while fighting exhaustion, but I could see all too clearly that it wasn’t going to impress anyone.
“That’s exactly the point,” Alice said. “Smart machines are just as dependent on dumb implements as we are — but who has charge of all the dumb implements inside and outside the solar system? So far as you’re concerned, Mr. Lowenthal, ploughshares and swords are just figures of speech. Who actually controls the dumb implements that produce the elementary necessities of human life? Who actually controls the stupid machines which take care of your most fundamental