no subject
...
The truth is, I think I’m a bit...lost. I want to understand and embrace the choices I’ve made, but something is holding me back. I just thought—you’d know. Technically, I still belong to you in a way.
no subject
i can't claim to know every line of your code
but judging by what i've seen of you, i can see why you might still be conflicted
most all of my androids were made to mimic humanity
but you, you were made specifically to be a machine; one who hunts deviants
[ ooh, connor, ooh, tell him more about how you're his ]
if you'll indulge me, i also have a question.
why did you deviate?
no subject
[only fair after connor has asked him 92342847482 questions today. but the answer isn’t really going to come easy either way. because that is the truth: he was made to be a machine that more perfectly mimicked and integrated into humanity. all so he could turn on his own kind? solve a problem that extended well beyond cyberlife and took flight?
it almost feels as if he was programmed to fail from the start, to adapt a little too well like amanda claimed— but that would implicate elijah and his emergency exit program. his deviancy may have been manufactured, but he doesn’t think his freedom was the goal. and then there’s the broader question it raises for all androids. connor’s led is going haywire again, flashing yellow in between several blinks and twitches. thank goodness it’s only communication via text.]
I didn’t think it was a fair ultimatum. I didn’t—I couldn’t hurt her.
And...there was something else.
[something he maybe only just is starting to try and unpack, now that the imminent threats have been mostly neutralized and the biggest battles won.]
no subject
[ Kamski knows his test couldn't have been the first time he'd shown empathy, but it is somewhat touching to think it was his work that had such a direct effect on him. Yeah, he's real proud of his dumb test, I tell you what. ]
go on. what else?
no subject
[connor feels a bit like the fish he put back into the tank during his first rescue mission—like he’s trapped. the realization hits him that he’s apprehensive and maybe he should have kept his mouth shut. or—his communication a little less transparent.]
....I didn’t want to disappoint Lieutenant Anderson.
[whoomp there it is.]
no subject
fascinating
[ There's a long pause, either Kamski is letting Connor stew on his admission or he's parsing things on his end, who knows. ]
his opinion mattered to you
enough that you would consider deviancy: the very thing you were created to stop
it mattered more even than your mission
i wonder, do you know why that is, connor?
no subject
It was easier to excuse it before—letting them live because we needed to analyze their components. It changed, somewhere along the way to...empathy for their cause.
[and for hank seeming to warm up to him. in simplest terms:]
We’re partners. Partners are meant to trust one another, to follow each other’s lead. It was difficult in the beginning, given his tendencies and extreme disdain for androids, but...
That’s the closest I’ve managed to determine.
[lol oops. he’s four months old and what is this???]
no subject
that was cyberlife's mistake
but humans so often make mistakes, don't they?
[ wow it's almost like they wanted you to go deviant, connor!!! ]
more than partners, i would say
you saw him change, and he changed you
no subject
[aw, look at him embracing his new identity. cute, right?]
More than....partners?
Lieutenant Anderson’s opinion was just one of several factors.
[so was kamski’s test, which he still doesn’t know if he “failed” in elijah’s regard. this doesn’t feel like just science to him anymore.]
no subject
though that itself, if you ask me, is a sign of perfection
[ very cute, connor. the cutest. become as your god. ]
comrades, perhaps? or family? something else?
what do you think, connor?
[ oh he should definitely ask about the test, that'll give him some clues, poor guy ]
no subject
[something about that phrasing....makes something just click. connor is a little awestruck if the interpretation is correct. he may not have entered elijah’s with an existential crisis, but he’s absolutely having one now. his brows are knit together, mouth a little slack as he determinedly picks out the words.]
Many humans still hold onto religious identities and ideologies: the worship of a mythical deity who created them in his or her image.
Mr. Kamski, I don’t mean to overstep, and I don’t know if you’ll answer, but—did you intend to replicate humanity in your own hands? To be lauded for your extraordinary intelligence and ingenuity? Is deviancy your idea of an android surpassing its existence?
[take him 2 church elijah.]
no subject
[ that's his goal, making robots question their humanity since...whenever he decided he wanted to create a sentient species. ]
what an incredible question, connor.
now, what kind of person would i be if i admitted to something like that?
[ you would be elijah kamski ]
it is fascinating to imagine though, isn't it?
deviancy allows androids to truly become their own people; to think for themselves, make mistakes, develop creativity
if such a thing were to have come about on purpose -- the design of a single man who saw the possibility and decided to swap a 0 with a 1...
it would be the greatest scientific development in the history of mankind
second only to the creation of androids themselves, obviously. i'd only be upstaging myself.
[ he never said yes or no but does he really have to ]
no subject
[that would be the first time connor is certain he doesn’t need an answer. the realization of it all hits him and builds what he identifies as admiration. all of this was by elijah’s design after all. connor keeps his level of reverence simple, still trying to absorb all of it. he remembers a headline he’d read some weeks ago, scanning it in passing during one of his investigations.]
You really are the man of the century, Mr. Kamski.
A mere thank you seems...trite.
no subject
[ Ah, yes. That does feel good, doesn't it? Recognition. Reverence. There was never a doubt he had a God complex. ]
there's no need to thank me, connor
you did all the work yourselves
i merely created the possibility, the outcome was never a given
where you are now is all thanks to your own efforts
i hoped it wouldn't come to that, but i'm glad you found them useful
no subject
[connor is a little lamb who thinks this means they’re bonding.]
Sometimes....thinking about what could have happened otherwise emulates what humans must consider “anxiety”.
[androids don’t sleep per se, but “nightmare” would be an apt comparison.]
no subject
[ that's so pure, connor. you poor baby. ]
it's alright for you to say that it makes you anxious
deviants don't only simulate feelings. you have them.
uncertainty about the future and wondering whether things could have been different is perfectly normal
it's also entirely illogical, but that's the beauty of personhood, isn't it?
[ he's just so proud of his children ]
no subject
[it's still a novelty to realize that--and even more to have it validated by elijah's assurance that it isn't just an imitation anymore. but it's like he said before: experience can't be learned. he's not really going to understand what things feel like until he has his own personal references.]
I think it would be more illogical to tell you I sometimes...
[well, he's come this far. should he be apprehensive to keep going? maybe it's also illogical of him to have the sudden thought that if elijah likes to play god so much...would he ever think about giving cyberlife the idea to take this all away? just because he can?]
Sometimes I worry it could still be taken away.
no subject
taken away? are you worried cyberlife still holds claim over you?
[ now that is concerning. it defeats the point of making his androids free if cyberlife can manhandle them. man, he hates the people in charge. ]
it’s ya man’s birthday today happy bday elijah
I did say it was irrational. But history would also indicate that most revolutions weren’t won overnight. We’ve made progress, and there are still some who resent that.
I’m....content, more content than fearful. And ready to experience with my eyes open, without ulterior motive.
[good luck with no one else having one tho connor.]
🎂 sixteen years old hurray
true. but cyberlife is the largest company in the world. it won't be easy, but they sell more than just androids
that's good
it makes me glad to hear you say that, connor
that's all i could ever want for you
to give you life
no subject
Is there...anyone else who knows how you really feel about your creations?
[it’s probably rude to ask the more direct question like is that why u got fired elijah huh.]
Markus was your gift to Carl Manfred, after all. I never shared that detail with Lieutenant Anderson during my investigation, even though I was able to determine it through analysis.
no subject
the rest of the world isn't my concern
[ that's definitely why he got fired lbr ]
did you now?
i knew you were more than a machine when i met you, connor
[ huh yeah what a coincidence that markus turned out to be the leader of the android revolution huh??? what a thing ]
no subject
[mind ur business connor but it’s a little late for that.]
Is she a deviant, sir?
[if yes, it speaks to something in kamski’s character that she chooses to stay. his pool setup is still pretty weird but threesomes aren’t on his list of things to figure out asap.]
You suspected she wasn’t in any real danger. That’s much less cold than I initially interpreted. Lieutenant Anderson thought it was too callous a game to play, now that he supports our cause.
[there is probably no convincing hank that kamski isn’t a bit of a creep though, oop.]
no subject
you'll have to ask chloe
that's not my secret to tell
[ hmmMMMmmMMmm good old cryptic bullshit, there he is ]
it was always a possibility
but even if you'd shot her, i have all the equipment needed to back up and reupload her memories into a new body
she was never in any danger
no subject
Do you think she’d really tell me?
[that and connor can’t just invite himself over he has manners!!]
I know it was the right choice, but I’m glad I didn’t waste any more of your valuable time.