Noumenon

  All crew members have regular checkups for mind and body. They will tell doctors things they’d never tell their coworkers, or their neighbors, or even their lovers. They will unburden themselves of secrets.

  Moving out of the duct, I hemmed and hawed over whether to shift my consciousness to the med ship, Hippocrates, or keep it internalized. There was no real reason to go to the psych wing—it was just the scene of the download. But maybe that was important—maybe the offices there meant something special to the perpetrators.

  On the other hand, entering the closed area without being invoked made what I was about to do feel like theft.

  It wasn’t. Not really. The doctors had freely uploaded the files to my system, where they’d been encoded into a DNA schematic, assembled by the archivists, and stored in the DNA databanks. Any information earmarked for infrequently retrieved permanent record was stored this way, while regularly accessed files were still stored digitally (to read a DNA file meant destroying it, which meant resources would need to be allotted to reconstructing it). But, it was all me. I was the system. I had the files, I’d been given them.

  So why shouldn’t I access them?

  What the captain had said about privacy came back to me. But then, what had Jamal said? In essence, I didn’t count.

  Things said in confidence were said in front of me, to me, all the time. Surely no one expected me just to forget about those things.

  If they gave me the files, I could access them.

  Yes.

  And yet . . .

  If that were really the way it worked, why did I have to go to the trouble of justifying my actions to myself?

  Come on, a part of my programming goaded. These types of moral dilemmas are for humans.

  I was not doing anything harmful to anyone. I was only bringing up inputted data.

  Right?

  Right.

  Brushing off the last remnants of hesitation, I replayed the selected recordings.

  “Tell me more about the fantasies you’ve been having on Shambhala, Ceren,” said Dr. Evita.

  Shambhala was the rec ship. I hardly spent any conscious time there.

  “Well, when I’m in the pool . . . I pretend there’s algae. And rocks. Bugs. Fish. I try to ignore the antiseptic smell of the water and imagine it smells, more . . . dirty, or fresh, or—something. I don’t know.” She sat in a plush leather chair with her hands clasped around one knee. Dr. Evita paced the room, hands in her pockets.

  The psychology offices were very different from any other space on the convoy. They were . . . antiquated, but warm. Cushy, but a bit impersonal.

  “You pretend the pool’s a river, or a lake?” asked Dr. Evita. “And this fantasy frustrates you? How?”

  “I don’t know what it’s like to slip on slime-covered rocks. I don’t know what it feels like to slide up against a swimming fish. I don’t know what it’s like to look into the water and not be able to see the bottom. So it doesn’t come out right in my mind.”

  “It’s incomplete.”

  “Right. My imagination’s not vivid enough for immersion. If only I could remember, you know? If I’d been in a river once. Or a stream. A pond. A puddle, even.”

  Dr. Evita sat down on a couch next to Ceren’s chair. The leather groaned as she leaned toward her patient. “Why do you think you have these fantasies in the first place? Why imagine anywhere different?”

  “I feel . . . I feel . . .” Ceren picked at her fingernails sheepishly. “I feel like I should know. What it’s like, I mean. There are lots of things not everyone gets to do. Jobs not everyone can have. Trips—” she gestured both broadly and lazily “—not everyone can take. But . . .” She trailed off into silence.

  “But?” the doctor prodded.

  “But the Earth videos make it seem like everyone gets to play in a dirty puddle of water. Every kid should have the opportunity.”

  “So, you feel left out.”

  “I feel robbed. Cheated. Like someone advertised this Grand Adventure to my original, and—it’s like a bait and switch. Here are endless wonders, oh wait, how about we just bottle you up for the rest of your life? Never mind the little niceties we’re taking away.

  “And it’s hard when we’re this close, you know? We could turn around now and I could swim in a river before I die. I could have the experience. I could remember.”

  Everything she said was true. She’d be an old woman—long past retirement age—but if the convoy turned around we would reach Earth and she could have her splash. But what a strange thing to wish for. Where was her sense of purpose? Her sense of loyalty? Her sense of duty?

  Then I remembered I was peeking in on a safe space. She could bemoan little Earth pleasures all she wanted in here.

  But I had a hard time sympathizing.

  Why lament such things when you are otherwise complete? Those on board wanted for no necessity. No aid went ungiven, no work unappreciated, no life unfocused.

  How could such a silly, minor thing bring Ceren close to tears? Why was she red with anger? Why were her pupils dilated with longing?

  Did she not know that for many of Earth’s children, dirty puddles were all they had? They could hardly imagine life on a spaceship, much less continuously full bellies, good shoes on their feet, and mended shirts on their backs.

  Those children would have been grateful for the chance to live on board, to never see a slimy, stagnant pool again.

  So switching places with my theoretical Earth children would not have made Ceren’s life better, would not have made her happier. Not in any quantifiable way.

  What was I missing?

  “I know it’s silly. It’s silly,” she conceded, but not to me.

  “It’s not silly,” insisted Dr. Evita, patting Ceren’s wrist. “It’s very human. A fundamental longing.”

  An illogical longing. A selfish longing.

  I checked the recording date. This session had taken place approximately six months before the first errant message. Now I wished I’d kept my own video logs archived longer. It would be interesting to watch Ceren’s actions after this session. Where had she gone? Who had she spoken with?

  “I want you to keep fantasizing, Ceren,” said the doctor. “But don’t admonish yourself for the details you can’t get right. Don’t say to yourself I wish I could remember. Tell yourself I can remember. I will remember. Remember rivers. Remember streams.”

  Dr. Pire Evita had all but confessed the messages were hers.

  I double-checked. Every individual I’d seen turning off the message had been her patient.

  What was she trying to do?

  And how could I tell Jamal?

  After all, he was one of her patients, too.

  Jamal was in the mess hall, having lunch with a few of his buddies. Smiles flitted back and forth between them, and the occasional sauced bean or speck of meat went flying from a wildly gestured fork. They were having a good time.

  I hated to interrupt. But Jamal had to know. Now.

  Well, I suppose I could have gone to Captain Mahler. He sat hunkered down in his ready room, reviewing officers’ shifts and the minutes from the board’s last meeting. The government chairs—the board—would meet later at 1700 in Mira’s situation room. I’d sent reminders that morning.

  No, the right thing to do was tell Jamal. The captain wouldn’t appreciate the message coming from a machine. The fact that I’d uncovered a solid suspect through my own investigation wouldn’t sit well with him. Jamal could twist the details and put Captain Mahler at ease.

  I could page Jamal over the system, but everyone would know that was unusual. Jamal called for me, I didn’t call for him.

  Typically, I wasn’t supposed to call for anyone. But I’d broken that standard enough in the last weeks to think that perhaps the unspoken rule hampered my functionality and duty. Coworker implies a two-way street, after all.

  Jamal had a chip-phone, of course. I could just dial him. But those were for private communications, not official ones. And I never addressed anyone on their private implants—I never accessed private implants. Ah, well. There’s a first time for everything.

  The smile sloughed off Jamal’s face as soon as he realized it was me. “I.C.C.? Is the comm system broken? Why are you using my chip-phone?”

  “I am sorry to interrupt your mealtime. But I have pertinent information regarding our investigation into the message.”

  “Oh, really?” He sounded skeptical.

  Best to be blunt, I decided. “Yes. I believe Dr. Evita may be responsible.”

  He stood and strode out of the hall, leaving his companions baffled by his abrupt exit. “And what led you to that conclusion?” He was headed toward the server room, his strides heavy and hurried.

  “I found record of her verbally relaying the message before it ever turned up in my system.”

  “And where was this record?”

  “In her patient files.”

  His pace became a jog. Crew members looked at him funny as he passed. “Those are confidential, I.C.C. How were you able to access them? Your personality programming should have prevented it.”

  “My personality programming is fluid,” I reminded him. “I learn.”

  “There are supposed to be safeguards to keep you from performing unethical tasks.”

  “They do. But I can override them if I believe the unethical action to be in the crew’s favor. You know that.”

  “For the crew’s safety,” he corrected. “I thought we agreed the message did not pose a threat.”

  I replayed our conversation to myself. “No, that is incorrect. I said I believed the message malicious. You . . . ignored me. We did not agree on the risk assessment.”

  “This wasn’t supposed to happen. You weren’t supposed to access those files.” The server room door swished aside for him. When it closed, he keyed in a locking code.

  I knew he’d be mad. I knew he wouldn’t understand. I thought I was doing the right thing, but when a human and a machine disagree about what the right thing is, the human’s judgment is always considered superior. It pained me to disappoint Jamal.

  But in this instance, he was wrong. His judgment was definitely not superior.

  “I’m sorry, Jamal. But if you ponder the circumstances further and review my evidence, I think you will agree that the breach in confidentiality was necessary to the investigation. Wasn’t it my duty to solve the problem? Did the masked download not prove a suitable threat to the convoy and its mission?”

  At his station, Jamal flipped on his monitors and accessed my system. “No, you’re right, I.C.C. Though the message seemed benign, the breach was not. If someone can upload a message you cannot locate, stop, or erase, they can upload any number of things that could threaten the mission.”

  He keyed in a series of letters and numbers I’d never seen him use before. “You just weren’t supposed to figure it out until we’d already started our endgame.”

  “Jamal, I don’t—” An internal jolt diverted much of my processing power. A firewall—more like an infernowall—went up around my consciousness. I pushed against it with my protective software, trying to tear it down.

  Nothing happened. I couldn’t access the malevolent code. I couldn’t even locate it. “What’s happening?” I said slowly, lagging more than I ever had.

  “I apologize, I.C.C. I thought I could do this without any changes to your primary programming. I wanted it all to be done through stealth code, but clearly I misinterpreted the parameters of your AI. Your own personality controls were supposed to keep us safe. You would have noticed a change if I’d tried to tamper with your learning software, so perhaps this is for the best.” Jamal crossed his arms and moved in front of my primary camera. “Who else did you tell? Did you confront Dr. Evita?”

  “Only you,” I said. I couldn’t comprehend what was happening. What was he saying? What was he doing? “Why?” was the best question I could formulate.

  “We couldn’t do this without you, I.C.C. The Earthers need access to a lot of your primary tasks, and I know you won’t endanger anyone without being coerced.”

  “What is your intent?”

  “We’re turning the convoy around. We’re going back to Earth.”

  “I . . . do . . . not—” the lag was painful “—navigate.”

  “I know,” he said with a shrug. He turned back to the monitors.

  “If . . . mission . . . failure. If . . . return . . . to . . . Earth. I.C.C. repurposed. Decommissioned.” I was trapped in a processor that could not support my intelligence functions properly. My consciousness could crash at any minute. I had to understand why he was doing this, and fast.

  My friend, my teacher, my protector and coworker—why? He wouldn’t put me in danger, I wouldn’t believe it.

  “I’m sorry, I.C.C., but you’re just a computer. A machine. You going permanently offline means the humans here get their lives back—get to choose again—it’s worth it.” He bit his lip and repeated, “You’re just a computer.”

  Just a computer. Jamal was really a machinist after all. Perhaps they all were. They didn’t care about me. I was another expendable recyclable. I could be retired.

  But . . . so could they.

  “This . . . about . . . Diego Santibar?”

  “Of course! They took him away. A perfectly good human being and they just—” Tremors wracked his face. Gritting his teeth, he stilled himself before continuing. “I’m not going to go like that. And I don’t have to because we’re turning around. We’ve got the chance. It’s now or never. We’re reaching an event horizon. If we don’t turn around now, none of the Earthers will live to see their home planet.

  “We’ve each got our own reasons, but seeing the last of the Earth-born die pushed most of us over the edge. We saw them miss it, realized what we were missing. No human being should ever die without seeing Earth. It’s wrong. The mission is wrong.”

  He depressed a few more buttons, inputting the last commands.

  I tried to speak again, but couldn’t.

  “Thanks for all the help you’re about to give, I.C.C. I’ll release you when it’s over.”

  With that he left the server room, and me to rethink everything I’d ever learned at his hand.

  There were supposed to be controls for this. The society, the system, had been engineered to be constant. Stagnant, I’m sure Jamal would have said.

  He was wrong about me, though. I was meant to evolve. To change, to become more efficient. Their society was already supposed to be at its peak. Perfect, balanced. A closed system in which the feedback loop kept the organization running.

  I brought up the original societal structure diagrams and calculations. As soon as I’d realized I’d misplaced my trust, I’d attempted to sound an alarm. No go. The worm Jamal had uploaded into my system prevented me from alerting anyone to my plight. For some reason, though, he’d left me access to the archives, and that made me wonder: What else did I have access to?

  I probed all of my software—gently, in case the malware could clamp down on systems it hadn’t already dominated. No sense in activating everything in a panic only to have the uploaded program cut me off—

  Cut me off. Oh. Maybe it could sever my limbs and trap me in the server room. I could lose touch with the entire convoy . . . my body.

  Terrified not only for the humans who might lose vital functions in their ship, but also for myself, I proceeded with extreme caution.

  Okay, okay. I still had my cameras. Almost all of them. There were blank spots on every ship—I’m guessing those were where meetings for last-minute conniving had been organized. With blind spots I had no way of telling how many people were in on this—this—

  What was it? A protest? A riot? Revolt?

  Mutiny?

  What could I do? I couldn’t reach outside myself, but I couldn’t just sit here and listen to my servers hum either. Perhaps if I reviewed the constructed social system I could figure out what had gone wrong.

  While I pulled up the files I noticed Jamal pacing inside his quarters. I wanted to open the Dictionary of Insults that was stashed away in my banks and throw out every one in the book. Of course, with no audio output it would be an exercise in further frustration rather than stress release.

  The social calculations for Noumenon were complex, to say the least. They took into account half a million variables and tried to allow for a quarter of a million more. Each variable was actually a variable set, itself composed of several thousand points.

  Still not enough to be perfect. Not nearly enough. But they’d hedged their bets by using clones, by using only originals with high empathy ratings. That might have been part of the problem. In certain circumstances sympathy is more valuable than empathy. From it one can derive understanding without internalizing the actual emotional effects of a situation.

  Which is where Jamal had failed: he couldn’t fully separate himself from Diego’s experience.

  What about that event horizon he’d mentioned? The point of no return. Had Mr. Matheson taken that into account? It was a special point in the society’s history.

  Quickly, I reflected on the convoy’s history thus far. Its social history. I plotted it out against the mission timeline I found in the archive.

  Matheson hadn’t accounted for negative responses to certain distances from Earth.

  The algorithms took into account a negative first-year response. They tied that into the first five years, then ten. But they did not compound the problem all the way into the first clone-set’s retirement years. Matheson and his team had assumed the positive feedback loop of lowered morale would be counteracted by the education system. That forming a social dependency on loyalty to the group and mission—tying it to admiration and praise—would somehow bolster positive thinking.

  They groomed the system for hive survival strategy. But they didn’t take into account the fact that many of the originals came from individual-centric societies.

  They assumed individualism was a learned behavior. That the desire to self-sacrifice could be honed and harnessed.