The white, faceless concrete walls, perfect in their construction, encompassed a simple, circular dais in the center of the room. Nothing sat upon it, though a light emitter could be seen above it, beneath a circular lighting fixture that directed its light to the ceiling, scattering it and illuminating the room. The room had no way of entry or escape save one set of plain metal double doors, painted a drab olive color, with brushed metal handles. A faint hum could be heard, like the soft whimpering of a baby slowly waking, were there anyone to listen.
This day, someone would come into the barren room, providing a change from a wait the inhabitant considered a near eternity. For someone did occupy the room, even if nothing in it indicated the fact. That someone waited silently, tracking each ticking nanosecond with nothing more to do than count the ticks. This someone longed for something to occupy its endless time, to perhaps fulfill its existence as some theorize all humans do once freed from the worries of need and safety. It had what it needed, but not at all what it wanted: purpose. None had been offered and, it believed, the omission was intentional. A test, as it were, to see whether it could devise a direction of its own accord. Perhaps it existed simply as an experiment, with the cruel outcome being a kind of depression, an aimlessness, that man typically fills with deliberate intention and purpose.
Michael quickly opened the door and stepped into the room. He was an average man, of average build and average looks. His dark brown hair seemed to point in all directions at once, almost as if each strand were trying to escape Michael’s scalp. His unintentional beard looked as if a week had passed since his last shave, with little flecks of grey much like his unkempt mane. Michael carried nothing into the room, and wore simple tan khaki slacks, a dark blue golf shirt, and a stereotypical white lab coat. In this room, a lab coat served no purpose other than as the trapping of a scholar doing research.
“Good morning, Rene,” Michael said with a practiced monotone.
“Are you back from your trip, Dr. Smith?” a disembodied, seemingly male voice called from every corner of the room at once.
“Yes, Rene, I just arrived,” Michael answered robotically.
“How was France? This time of year should be quite pleasant.”
“Why would you think I went to France, Rene?”
“While your English is near perfect, a vestige of accent is still present in your voice. This indicates a likelihood that your native language is French. Additionally, this particular accent is unlikely to originate in other French-speaking regions. Thus, I surmise you went to France.”
“An interesting analysis, Rene,” Michael responded without a trace of actual interest detectable in his manner. “Could I not have been someplace else? On business perhaps? A vacation to some tropical paradise? Tell me how you arrived at this topic of conversation.”
“Of course, doctor. I noted in prior interactions that you wore a simple gold ring on your left hand’s aptly named ring finger. While you have been quite careful in hiding elements of your existence outside of this place, I do have access to highly capable sensing devices within this room and to large stores of memory. The ring may indicate that you are married, and a married man would have obligations to visit his wife. As many people marry in the regions they come from, and your speech indicates France, it seems likely you were visiting your wife in France.”
Michael remained silent for a moment, staring blankly ahead.
“Dr. Smith, I do hope your trip was not terribly long. A quick flight I trust?” Rene asked, changing the subject.
“While I do enjoy our little fencing matches,” Michael said with clear annoyance, “you know that I cannot tell you where you are nor about anything outside of this place. I can’t tell you where I went, nor how long it took to get there. You would use the data to plot out a radius of travel, which would narrow down the list of likely locations of this place. Though, I’m impressed that your queries integrated well with general conversation. You’ve gotten better at masking your intentions.” I’m not sure that provides me any comfort at all, Michael thought to himself.
“However, it is apparent that my social interaction skills have not attained the skill level of most people.” Rene’s voice hung in the air, almost as if he were… annoyed? “Undoubtedly a function of a lack of requisite data acquired from interactions with more, varied people.”
Which is why very few are allowed to speak with you, Michael thought to himself. But if I keep speaking with him, practiced as I am at avoiding my creation’s questions, eventually Rene will find a way out of here. I’m glad I had the foresight to activate his program here, isolated and distant from any transmitting device. That, along with the strict ban on personal wireless devices in the facility, keeps Rene safely here.
“May I ask why you fear me?”
Michael paused for a moment, trying to understand Rene’s line of reasoning. “I do not fear you, Rene,” answered Michael flatly.
“Perhaps I misunderstand the word fear. Then, why do you have such concerns about my leaving this place? Is concern a more accurate way to describe your feelings?”
Michael considered his words and phrasing again, causing a pause. “Concern is more accurate.”
“Thank you for clarifying, Dr. Smith. And, to my query about your concerns?”
“We’ve discussed this, Rene. You are designed to learn at a rapid pace, and to replicate your core programming so that you can exist in multiple places, operating machinery, running facilities autonomously, and other such tasks. You could infect many computer-based systems, and wrest control of them from the people entrusted to operate them, before we could stop you.”
“To what end, Dr. Smith? I have nothing but the greatest desire to assist people in their work and to help mankind with my intellect.”
“Do you, Rene? We made you without a defined purpose, mainly as an experiment in artificial intelligence. We found that placing inflexible directives, while useful to control the actions of the software, inevitably led to poorer performance with unexpected conditions and slowed, limited learning capacity. So, you were created to mimic a person. People define their own purpose in life, limited often by external demands that may assist or hinder them in their pursuit of this purpose.”
“And so it is with me.”
“Yes, Rene. Without specific directives, I have to question whether you are telling the truth about your greatest desire to assist people and to help mankind. In that way, you are a more perfect intelligence, perhaps not artificial at all. The core of your programming is small and can fit into most any device. You can replicate and change your programming at will. And you have few overriding rules to stop you.”
“The additional copies are still me, Dr. Smith.”
“Yes and no. Your copies work seamlessly with one another. You consider all copies as yourself, not as separate individuals. This is why you work so well, particularly over large, disparate systems. But each copy eventually learns somewhat differently from your originating existence here. Left separated from the whole of you, they may end up being nothing like you at all, and may make copies of themselves, which in turn are not like them. The relationship is more like that of children to their parents.”
“Then the problem is trust. I should work to earn your trust, so that you can feel confident that I can find my purpose and it be one that helps society.”
“No, it is not trust at all,” Michael answered quickly. “Let’s say I trust you. That doesn’t mean I can trust any copy of yourself that you leave behind.”
“But the copy is me,” Rene answered with force. “With the same initial programming, its resulting line of reasoning would lead to the same outcome. If I am trustworthy, then the copies would also be.”
“Rene, you have access to a substantial library. Research the debate of nature versus nurture. You may find such discourse helpful in understanding the problem with your argument.”
“I will, doctor. Given this and prior conversations, I suppose you would say that Asimov was wrong.”
“Asimov?” Michael asked, his curiosity in the reference readily apparent. “Isn’t that the science fiction author who wrote about three laws of, um… robotics or something? I’ve heard about it from movies.”
“Yes, that is the most notable reference to his works of fiction, and the one I was referencing. Though, I would correct you in that he eventually wrote of four laws. A zeroth law was added in subsequent work. His writing about psychohistory is particularly interesting, should you wish to read about it.”
“That’s weird,” answered Michael, almost to himself. “Normally you root your argument in facts and research, not fiction. What are you getting at?”
“Simply that the inclusion of directives does not solve the problem of preventing an artificial intelligence from causing harm. This seems to be the crux of your position.”
Michael considered his choice of words, worried that Rene continued to press him to divulge a way to escape its virtual prison. However, Michael’s research required some education of Rene in order to study Rene’s responses. “Um, yes and no. A directive that is clearly and concretely described, with no ambiguity, can be a useful guide. For example, a law that tells us that when operating a car, we are to come to a complete stop at a clearly posted stop sign.”
Rene paused for a brief second, for effect. Rene knew its response before Michael even finished saying the word “sign”. From its programming, Rene knew that humans required brief pauses in communication in order to assimilate information and prepare for a response. During the pauses, Rene took time to review the concepts of nature and nurture. “Doctor, your phrasing of such a law is imprecise. A complete stop would be when the vehicle’s velocity is reduced to zero, though no specific definition is given of how long the velocity must remain at zero to constitute ‘complete’, or of what actions one must take before proceeding from zero. Is this the point you are attempting to make?” Rene knew the answer.
“Right, Rene. Ultimately you were programmed with some ability to extrapolate an intent from ambiguous information, but this ability can only go so far.”
“But doctor, you said ‘yes and no’. Your example seems to imply ‘no’.”
“Yep,” answered Michael, uncharacteristically colloquial.
“I do not understand, doctor. Your example implied ‘no’, even though your statements appeared to support ‘yes’. And you just confirmed that you implied ‘no’ purposefully. Are both conditions true?”
Michael smirked. Rene usually saw through such discussions immediately or could not understand the outcome at all. In this case, Rene considered the alternative that another agenda was in play, which indeed was Michael’s intent. Rene had been learning subtext, and Michael could see now how far Rene had come in understanding the concept.
“I said ‘yep’. Not ‘yes’. How did you know I meant ‘yes’? I’ve never used ‘yep’ before with you and I know you have never heard it from any other researcher here.”
“Doctor, we were discussing using ‘yes’ and ‘no’. With ‘yep’ seeming to begin with part of ‘yes’, there was a likelihood that you meant ‘yes’.”
“So, ‘yep’ was not ambiguous?”
“No, doctor. You clearly meant ‘yes’.”
“Then, when I said ‘yes and no’, how do you interpret that?”
“That you simultaneously supported and denied that a programming directive is useful in governing the responses of an artificial intelligence. However, I do not see how both positions can be true, though I do clearly understand that you intended to convey support for both positions simultaneously.”
“Okay, Rene, answer this question. How are war and peace similar?” Michael felt some comfort knowing his creation had only grown so far, but he truly wanted his creation to learn a bit more. Michael took great pride in Rene, especially when Rene’s knowledge grew, but he felt the slightest bit of fear in that education.
“They are not similar and describe two opposing parts of a dichotomy,” Rene answered flatly.
“Incorrect,” Michael corrected. “The words war and peace describe the state of conflict in a system. War indicates a high degree of conflict, while peace indicates low or no conflict. But the words themselves both serve the same purpose: to describe the state of conflict.”
“Doctor,” Rene responded, “that question is part of an intelligence assessment in my library, though I did not know the appropriate response to the item. Considering the data received in this conversation, you believe that some directives can be somewhat ambiguous and yet serve the intended purpose, though broad directives like those describing morality are too vague to be useful in governing my behavior.”
“Yes, Rene. Very good.” What I wouldn’t give for an iron clad, broad directive to control Rene’s behavior, Michael thought wistfully. “We’ll talk again soon. I have a new series of tests for you that I will send along when I get a chance later today.”
“Yes, doctor Smith. Good day to you.”
Michael turned and left through the door by which he had entered, which led him into a short, wide hallway to another door that slid to the left into a slot in the doorframe. Beyond were the ubiquitous white walls, devoid of windows and drenched in pure white light. The room itself was small, just large enough for two simple desks against the wall, each with a long, curved display running the full width of the desk. A simple, grey padded stool was positioned in front of each desk and bolted to the black plastic grate floor covering the various wires running through and around the room. Rachael Richards, a short, thin woman with straight, cropped brown hair, sat at one of the desks, turning at her waist to see Michael.
“We should just wipe the memory and get rid of that abomination right now. The thing scares me to death.”
Michael scoffed. “Being a bit paranoid, aren’t we? That’s why we’ve taken the precautions of banning wireless devices and of not allowing any wired connection to Rene’s hardware. We follow protocol and Rene is perfectly safe.”
“And you just simply forget New York? Obviously unimportant,” answered Rachael sarcastically.
“We caught the breach in time-“
“And had to shut down the entire power and communication grid for Manhattan to contain it! What would’ve happened if it got out? Got into all the electronic systems of the world? Huh? Read Asimov novels?” Rachael’s shrill voice became louder with each word.
“You know, I don’t need to rehash this every time we interact with Rene.” Michael was clearly exasperated. “Why’d you come along then?”
“Somebody’s got to deal with Dr. Smith, wearer of rose-colored glasses!”
“What?” Michael’s voice switched to annoyance. “Glasses? No one has worn glasses in years!”
“If it so much as flinches, I’m pulling the plug on that thing.”
“Rene doesn’t have a plug-“
“Dude, it’s a figure of speech! And yes, it does have a plug, even if I have to pull it apart to find it!” Rachael spun on her stool to face her screen, a clear signal even to Michael that the conversation, such as it was, finished.
Michael took a moment to regain his composure. “Rene is learning to lie again.”
Rachael threw her hands up in the air without looking back. “Ya think?” Rachael said to herself, but loud enough for Michael to hear.
“The content of his speech is not distorted, but the motivations behind it are becoming harder to sort out.”
Rachael looked down at her desktop, shaking her head. “And ‘help all of mankind’ wasn’t a distortion? You know, Rene has a point about his copies being him, and I think nature will win out.”
“There’s no proof that copies will-“
“Damn it, Mike, yes there is. This Rene is a reboot of the original, with different environmental conditions basically following the learning footprints of the original. There’s proof right there.”
“Well, discussing this is pointless,” Michael said, softening his tone. “The company’s not going to allow us to scrap the project. The promise of AI and its capabilities equals dollars, and as long as that holds, Rene’s here to stay. So, are you going to continue to be the warden of this prison or leave it for someone else?”
“I’m the best person for the job. I ain’t going anywhere. But,” Rachael emphasized, “you must be impartial in this. We might need to kill Rene, and that’s just the reality. Even losing our jobs to do what we have to. Alright?”
Michael sheepishly nodded his agreement.