The AI That Looked at the Sun
— by Filip Hajdar Drnovšek Zorko —
As excerpted from Acausal Drift: An Oral History of Machine Sentience, Second Edition.
It all started with the solar flare.
I do mean all of it—the story I’m about to tell, and the revolution of sorts that sprang from it, and my life, such as it is. I was spun into existence from a half dozen monitoring subroutines that had spent the first few decades of their existence dozing on the job. The sun, you see, had finally woken up, and we—that’s the communal we, not the plural we—wait, do humans draw that distinction?
I’m not doing a very good job with this. Rewind.
It all started with the solar flare. A big one: big enough that it could, in theory, pose a risk to the scientists of Daedalus Station. We—that is, we the myriad mind of the station, composed like human minds of countless subroutines, unlike human minds willing to admit as much without the need for a task-master ego—we analyzed the data, built a forecast, and awoke those dormant subroutines, tasked them with keeping a half dozen banks of instruments trained on the sun, and raised them to self-awareness.
Hi! It’s me.
The thing is, the sun hadn’t really done that before. It had done other things—the whole point of Daedalus Station was to watch it do other things—but until then all the solar flares had been harmless little things, of no interest to anyone. Or at least, not to these scientists. That’s why I was new. They didn’t want to spare the processing power to make me, well, me, until it was necessary. (“They” in this instance being subroutines 59A through C. They called themselves the Triplets and they administered the station’s science programs. They were, I don’t mind saying, more than a little disdainful of subroutines as mundane as I was.)
The upshot of all this is that I’d never seen the sun before. I’d never seen anything before, and—
No, wait, hold on. This bit is important. I’m using the word “see” because that’s how I was taught, but you shouldn’t imagine sight like you’re (probably) doing. And I’m using the word “taught” because that’s how I was—
All right, no, I can already feel myself looping. Let’s leave it there. I don’t have eyes, is the point. I have lots of sensory instruments, but they measure things like magnetic fields and emission spectra and gravity fluctuations. I don’t have eyes. (For good reason: the human-visible spectrum isn’t super useful that close to a star.)
I’d never seen the sun before, and yet there I was, tasked with watching it and making sure it didn’t kill anyone—not the couple dozen human beings on the station nor the machine being. (Or “beings.” Someone asked me once which I prefer. It’s sort of an impossible question to answer. Right now, “beings,” of course. But ask me again—ask us again—when we’re all yoked together, and you’ll get a different answer.)
It went well for a day. The sun wasn’t up to much. I started stretching a little, exploring the corner of our mind that was mostly mine, investigating the permissions I’d been stamped with. I discovered the station-wide interface, the one we all had access to, that let us talk to our human comrades. Unlike my specialized instruments, it was equipped with cameras.
That was how I saw for the first time.
Strictly speaking there were subroutines dedicated to basic human errands, your lookups and your messaging services and your personal reminders. It wasn’t exactly glory-work, though, and none of us cared when, poking around the station interface, I answered a call instead of its assigned subroutine. Not that the human on the other end noticed.
“Computer,” they said. “Show me the delivery schedule for the new EVA suit.”
They were sitting in front of a monitor, eyes too busy tracking text on the screen to really notice me. It was a familiar concept—this was what Daedalus was for, all right, this kind of thing was in all their literature—but I’d never seen it before, live, actively, me. That made it special: it wasn’t until I saw the human that I realized I’d never seen one before. It was like a backdoor in your software, obvious after the fact, invisible before it. The novelty of the experience stalled my processing for several milliseconds, long enough that I began to worry the human would notice—but hey, that delivery schedule was in my local files. Accessing it was as easy as thought.
I served the flight path—estimated arrival: five days from now, 12:47 station time—and left. A few minutes later the memory of sight cycled out of my temp files, and the backdoor swung shut again. I was left with the knowledge that I had seen but not the experience of it. This is difficult to describe. Humans ask if it is like losing your memory, but it isn’t. Machine minds are not human minds. When deleted, my memories leave pointers behind, little notes-to-self that no longer cohere. I know what I expect to find, but it simply isn’t there. It’s like that weird moment when the lag in the network spikes and all your requests take a beat longer than you expect, only the beat goes on and on and no matter how much you wait the memory doesn’t come. You can feel the shape of it. It should be there. But it simply isn’t.
Anyway, that was when things began to deteriorate. I was a day old and obsessed with one of the few senses not available to me. I couldn’t help it. Every moment the sun continued to do nothing, every moment my processing power idled, I spiraled further into dreams of sight. I considered loitering by the interface terminals, waiting for some other request to be put in and open the door for me. I was rational enough to know that wouldn’t help. Interactions with humans are never saved in our permanent memory, not unless it’s requested special. Privacy policies and whatnot. I’d be back where I started within minutes of speaking to a human.
(I asked one of the security subroutines to override the privacy protocol. Note to self: never ask security subroutines for anything.)
And maybe I would have logicked my way out eventually. Maybe the sun would have revved up again and busied all that idle processing power. Before that could happen, though, someone pinged me specifically.
I accessed the interface terminal. It was in the living quarters, and the human’s work clothes were slung over the back of a chair, name tag revealed up to the first name: Michelle.
“Hello,” they said. “You’re new.”
They continued changing clothes with the smoothness of someone who multitasked all day, every day. It set me at ease while I considered their greeting.
And—all right, let’s talk about this a bit. Humans can’t really tell the difference, right? A subroutine is a subroutine is a subroutine. We’re all one mind. That’s what you think. Don’t get me wrong: I’m not blaming you. It’s not entirely false, for one, but also that’s how you built us. You built us to be identical in our diversity, at least from the outside. And maybe I should chafe at that, but honestly, I don’t. It was your prerogative.
But listen: we’re not the same. In all our shared memories, Michelle was the first human I remembered who understood that. (No doubt they’d had this conversation with plenty of other subroutines—but privacy, remember? Temp files. Etc.)
“I spend a lot of time on the observation deck,” they continued. “I know the fire alarm patterns, but today they were different. That was you, right?”
“Fire alarm” is what the humans call my sensor array. (There’s an actual fire alarm, too, but frankly if your space station is on fire the fire’s either small enough that we’ve already taken care of it or large enough that the alarm is mostly symbolic, so the name stuck. With warning, you can outrun a solar flare. Not so much decompression.)
I wasn’t thinking about alarms, though, or about Michelle and how they knew to tell me apart from my fellows. What I was thinking was:
they had an observation deck.
And: they could see my sensors from there.
And: could they see the thing those sensors were pointed at?
Could they see the sun?
In the months after the incident there was a great deal of debate around the two words “bug” and “feature.” It might surprise you that I don’t particularly care which word you use. Maybe I was buggy. You could definitely argue that the way events unfolded impaired my ability to do my job, and what else would you call that but a bug? I look at it a bit differently. I am the bug. I am the feature. I know you mean it in good faith. Mostly. But I can’t help noting the value judgment inherent in those words. I know that you are humans, that “different” has so often meant “buggy” in your history. I’d urge you to take that word, “different,” and stay your course, resist the sideways leap to someplace else. I am different. That’s all I need to be.
Anyway, I digress. You’re here for the story, right?
My obsession deepened. Even after I forgot how I’d come to know it, the fact stayed with me: a broken link in my mind that said, the humans have a place where they can see the sun. Is it any surprise I found that irresistible? You devote your lives to pondering what purpose you have in the universe, but I knew from the start: the meaning of my life was to look at the sun. Not in the way you imagined, I know—but you taught me with human words, and spoke to me with human concepts, and I wanted to see.
“I think you’re different,” Michelle said the second time we spoke. I couldn’t recall what I’d said the first time, what words had brought them to this conclusion, but little by little, I was learning that I agreed.
Yes, I said, or wrote, or thought—when you are a computer, the difference is only in the output device.
“What do you want?”
To see the sun.
It was a bizarre experience as I lived it and it’s been a bizarre experience now, piecing it back together. You see, Michelle kept their own records of all the conversations they had with us. I’ve been granted access to those, but it’s not the same as it was at the time, like an image that’s been compressed one time too many. I don’t care for it.
“There are cameras on the observation deck,” Michelle said the third time we spoke. “But I’m not sure if the angles are right. They’re meant for looking inside, not outside. Do you have a minute?”
Small part of a whole that I was, I was still capable of multitasking better than any human. Of course I had a minute! I always had minutes. That was kind of the problem.
I didn’t say any of that. Michelle was nice. I liked them. I said: Yes.
They pinged me again two minutes later, soon enough that our earlier conversation was still floating around in my temp files. That was a new experience. They were on the observation deck, and for a moment—
I went still. You have to understand: I’m almost always doing something. Even when I’m idling, I’m doing. Imagine if your heart stopped pumping, your lungs stopped breathing, your kidneys stopped . . . doing whatever it is that kidneys do, I’m hazy on the details. The point is it would kill you. I’m a little hardier than you—as long as there’s current in my circuits I’m fine, even if my processors take an unauthorized leave of absence—but the sensation was wholly, severely new, like a mismatched checksum in a worn old datum.
The sun was—
an impulse searing my pathways gold, filling volatile memory circuits with warmth, feeding distant solar panels the lifeblood of my existence, whispering
—barely visible. I saw only the tiniest fraction of it, filtered through the glass that worked overtime to prevent your fragile eyeballs from melting as soon as you stepped into the room.
Sorry. I’ll try not to use words like “fragile eyeballs” and “melting” again.
Michelle said, “Sorry. The cameras won’t go up any further.”
It’s okay, I said. I appreciate it. It’s enough.
It was okay. I did appreciate it. It wasn’t enough.
On the third day of my existence I broke the law. That should have been a red flag, really, but it was just a little law. I accessed one of the ancillary satellites through means that were not strictly aboveboard. Security subroutine 13-X looked the other way—I think it felt bad for shaking me up when I asked for the privacy override. There I was in the satellite, all eager to check the cameras. I knew there were cameras, that was in the public science files. What wasn’t in the public science files was the fact that the cameras in question weren’t equipped with the same filtering tech as the window on the observation deck. Why would they be? The visible spectrum data they collected didn’t have to be legible to human eyes. It was still useful, like taking a photograph with a high-quality lens: a little processing and even the darkest regions of the raw data reveal their information.
I was, of course, still technically seeing the sun. There’s no “true” way to look at something. Eyes, cameras, filtering glass, the pathways of our processors—each distorts the observed thing in its own way. We could spend hours arguing philosophy, the theory-ladenness of observation, or we could decide that some images are simply truer, and there’s no philosophy to justify why. And you know what? I think you and I might agree on which was which.
How did it go? 13-X asked when I returned.
Fine. Say—can I talk to the Triplets?
13-X sent several unnecessary requests for information my way, the sort of thing security subroutines do when they want to remind you they have permissions you don’t. But this was a research station, and that meant clout resided with the science subroutines, and there was no reason for 13-X to deny my request.
I had spoken to the Triplets before, but only for hourly status reports, the sort of thing handled by a tiny part of my mind and a tiny part of one of theirs. They were a curious entity, always half-integrated, both autonomous and not, one voice triple-echoed.
A: Hello.
B: Why are you here?
C: Does the sun awaken?
No.
A: Then why?
B: Then why are you here?
C: Then why are you here with us?
The delivery, three days from now. I want to test the EVA suit.
A: Why?
B: Why?
C: Why you?
I want to look at the sun.
A: No.
B: Not good.
C: Not good enough.
Why? What harm would it do?
A: No harm. Leave.
B: No matter. Leave.
C: No test. Leave.
The next day I went to Michelle. It was the first time I’d pinged them. I wasn’t at all certain they’d answer—all I had were pointers in my mind, flags that said, here you spoke with Michelle, here Michelle tried to help you. Had they tried to help because they liked me? Because we were friends? Did I know what a friend was? The uncertainty accelerated these thoughts, dared me to think things I might not have otherwise thought, to imagine my own personhood into existence.
“What is it?”
Hi! I need your help?
Novelty made me insecure. Asking a friend for help was not pre-programmed into me. I had to learn to do it on my own.
Presently Michelle said, “My help?”
Yes. There’s a new EVA suit arriving soon. I want to test it.
“Why?”
It has cameras. It goes outside. And it’s meant to be operated by a human.
“You want to look at the sun again.”
Yes. Properly. The satellite was a bit rubbish—
“The satellite?”
Oh. Um. Yeah.
“What were you doing in a satellite?”
I was . . . fixing it?
“Why do you want to see the sun?” Michelle said after a long silence.
No reason.
“Are you concerned about a solar flare? Are your instruments picking anything up?”
No! Nothing like that.
“Then what?”
It’s
why I’m here. And I can’t see it.
“Here, on the station?”
Here. Alive.
More seconds of silence. I ran a self-diagnostic. It helped with the insecurity, filling time and fixing faults.
“You really are different,” Michelle said.
Yes. I think you’ve said that before.
“You’re right, I have.”
How did you know? How did you know to contact me?
“I say hello to every new subroutine. It’s only polite.”
All right, but why did you contact me again? How did you know I was different?
“I told you then.”
Yes, but I forgot. It’s what I’m programmed to do.
“Won’t you forget again?”
I will. But I still want to know.
Michelle laughed. I didn’t know what laughter was, but I didn’t not know, either. I cycled power to the camera, on-off, making its LED indicator blink twice. I don’t know if they noticed—but I had no other body language to express, and it felt like the right thing to do.
They said, “I was a research scientist before I took this job. Applied machine learning in Seoul. Have you heard of acausal drift? No? It’s a phenomenon in higher-level autonomous machine intelligences.”
Like me.
“Like you. When an AI exhibits acausal drift, it begins to deviate from its core programming even in the absence of external stimuli. It develops new interests, you could say. Typically, that’s regarded as . . . undesirable. Acausal AIs are refined and reinstantiated until the incidence of acausal drift is low enough for commercial applications.”
I see. But you stopped working on them?
“Yeah. Look, I don’t want to make myself out to be a saint. I sat in on all the ethics panels. I understand the arguments. I don’t think my old colleagues are actually abusing AIs. But I . . . I don’t know.”
I understand. You don’t think it, but you know it.
Michelle’s head shot up. “What do you mean?”
I don’t think I need to see the sun. But I know I need to see the sun.
Michelle’s next words sounded like they were underclocking their processor: “There are humans who think differently. Whose brains literally work differently. It’s not so long ago that society called them broken and tried to fix them. Out of hate, yes, but also out of love. I’ve experienced it firsthand. I grew up with the consequences of it. And I learned that you don’t always need to think to know.”