NOT just A ROBOT shipping meme

There's always a right tool for the job, be it as simple as a hammer for a nail or on up to the much, much more complex. In this, the near future, the new tool for the abstract, post-modern job is no longer from the humble toolbox - it's robots. Robots have taken over most sectors, such as assembly, law enforcement, and even health care. Things run so much more smoothly when they're automated. Yet the most impressive factor is that robots have only recently begun to enter a sector previously believed to require a human touch: the personal sector. From personal assistants to robo bodyguards to pleasurebots, most anyone who's anyone has a personal robot these days. Of course, humanity being the narcissistic charmers we are, all "companion" sector robots are made in our own beautiful image, or close to it.
But you know this. You have a personal robot, after all...or you are one.
Whether owner or robot, it's important to remember one key rule: robots, no matter how human they appear, cannot feel emotions. Any appearances thereof are programming in the AI.
Robots certainly, absolutely, cannot fall in love. Again, any appearances thereof are programming in the AI. If that's not the case, and "love" still seems present...
...well, a tool that no longer functions correctly may not be right for the job.
HOW TO PLAY ➟
- Comment with your character, preference, and whether you want to play a human or a robot.
- Reply to others.
- Use the RNG to choose a prompt or pick one yourself.
OWNER ➟
- Long Time Owner → You and your robot are a well-oiled machine, no pun intended.
- New Purchase → Congratulations, you have your brand new personal bot!
- Second Hand → Someone else owned this robot before you, but it can't be too damaged. There's no way you could afford a brand new one, anyway.
- For Science! → Your interests in robots are purely scientific. No matter how yours begins to act towards you, honest!
- Rescue → Whether you took them from the dump or from a bad owner, you snatched this robot from the jaws of deletion.
- Reluctant → You never wanted a robot, but you've had one thrust upon you anyway.
- Cruel → They're just a fancy computer, so you'll treat them however you like.
- See the Humanity → Even if you know it's all machinery and programming, you can't help but see the human side of your robot. They shouldn't have to be a servant.
- What's Real and What's Not → The lines are beginning to blur between real people and their emotions and your robot.
- Incompatible → You can't be with a machine, either romantically or sexually. You're simply not compatible.
- Too Engrossed → They say you've created a fantasy world. You're forgetting reality, or choosing to ignore it.
- Don't Care → You won't listen to what the others say; you know how you feel.
- Have to Pull Away → The close relationship you had with your robot has to be put aside for your own good.
- Live a Normal Life → As much as you can, you and your robot live like a normal couple.
- Bad End → Your robot is taken away, reprogrammed, or destroyed.
ROBOT ➟
- The Perfect Robot → You know what you are and exactly what you were made to do. You will not stray.
- More Human than Human → Whoever programmed you made you to be just like a human.
- Conflicted → There is no way you could have what they call "feelings." But what is this stirring in your circuitry?
- Confused → Why do people treat you like a robot? You're a living thing, damn it!
- Damaged → Somehow, you've been corrupted. You're a blank slate and have to be cared for, though it should be the opposite. Or it could be that you're showing erratic behavior that no robot should...
- Shown Kindness → When you're treated kindly by a human, you are unsure how to process it. It makes you feel - content.
- No Longer Just a Program → Your "love" and "affection" may have been shades and imitations at first, but that's not the case any longer.
- Obsolete Model → You know you're old. Will you be forced to leave your master's side?
- Jealousy → A robot should not feel jealous. Still, you envy those close to your master.
- Job is Personal → You were programmed to protect or to serve, though you also do it because you...care.
- Second Chance → Your old master tossed you aside and now you are wary. But you've been given another chance instead of being used for scrap.
- Rogue → There is no way you will be tied to the oppressive system. You refuse to serve the Living Things. Somehow, though, you've become aligned with one.
- One of a Kind → There are no other robots like you, and perhaps you are intended for a sinister fate. Rather than face your true duty, you have escaped to take refuge with a human.
- Specialty → You were created specifically for this one special person.
- Reprogrammed → Because of previous defects, you were taken back to factory settings before. However, that treacherous virus, "love," is bubbling back up.
- Android → You can't just forget the part of you that was - still is - organic.
- Learn to Be Human → Despite the odds and the prejudices you both will face, you have decided to live as a free, living person with the one you care for most.
- Bad End → You're to be junked, compacted, wiped clean, or taken away from your owner.
them mad acronym skills, respect.
For now, his only reward is the subtle whirring and abrupt shifting of posture from rigid to something decidedly less so. He hasn't been given any preset commands for demeanor, no fleshed out expectations for the mannerisms that make up his body language. With it comes a startling amount of freedom which, as it turns out, he chooses to display by looking a little bit like a hermit. It doesn't paint a very exciting picture, probably, to an onlooker.
Mentally? It's a whole different story.
System: online.
Status: operational.
Loading primary drive matrix.
Interfacing with local network.
Initializing handshake; hello JARVIS.
Stored memories of initial stasis come flooding back, and only then does he open his eyes. The world comes into focus for the first time; the lab, the windows, the tech. His own arms and legs, which he examines with sudden curiosity. Analyzing, comparing, running a search on baseline human anatomy, desirability, facial symmetry, optimal physical health-
Bruce's first words to Tony come with a sort of disappointed, incredulous look. ]
Oh- come on, really?
[ What- all the possibilities in the world for a body and he picks this? ]
took longer than I'd like BUT I GOT THERE
Though.
It would probably help if he actually had a plan beyond 'Give Jarvis Backup'. Still a plan, just a nebulous one. ]
What, you don't like it? [ He turns from where he'd been pacing (not nervous, nope, not at all) to snag a mug of cold coffee from his earlier coding binge for system integration. The body's dressed modestly: slacks, shirt, a suit jacket that may or may not be from his own wardrobe; he can't quite remember. ] I thought it was very librarian chic.
i am proud of u, gr8 job
[ He starts in a flat sort of echo, eyebrows shooting up. ]
What- this is sad Mel Gibson.
[ He distresses, looking down at himself critically, plucking at the threads and tugging his own shirt collar back to get a look down his chest. In a world with George Clooney and Channing Tatum in it, too. This is such a tragedy. Excuse him while he just quietly shakes his head over the questionable decision-making at play here; congratulations, Stark, you made yourself a Second Opinion Machine.
Man, it's a good thing you made him so he can spare the next poor AI soul. Maybe give them like a six-pack and biceps and like- okay, focus. His arms drop with a sort of resigned acceptance; he's already moving on to the next issue. Apparently, he's not programmed to dwell.
Primary drive matrix accessed.
Objective: archive and maintain.
Simple enough, even if the task does almost seem insurmountable. Tony, as it turns out, is a virtual hoarder. That's not even touching on the scrapped projects he's got physically littering ten floors of workshop space, but he'll start with navigating the theoretical projects before he determines the viability of things that actually made it to the developmental design stage.
He could do it without laying a hand on a keyboard, sure, but his fingers twitch at the thought. Instinctively (or whatever the artificial equivalent of instinct is) his eyes search the room for a workstation. ]
danke, danke!
Okay it was.
Shit.
Time to adjust. ] If you've got a preference-
[ And it's probably some kind of indulgence that he's actively concerned about what Bruce thinks of himself. If he wants to look a certain way. Maybe he should've woken him up for the design process, let him have a hand in crafting his own body? Arrogance and vanity, two for two already. Damn.
Where Bruce wasn't given parameters for or against dwelling (it's inefficient, but it's up to him to decide, consciously or subconsciously, if it's something he wants to do), Tony? Has no such limitations. He's dwelling, now, on what he perceives to be a problem and maybe he can tell Helen why he needs the synthskin refinished? Maybe. While he's bundled up in this particular package of problems, Jarvis chimes in, audibly, to direct Bruce. ]
There is a spare workstation Dum-E has cleared off for your use.
[ The Bot waves, claw waggling in the air, dropping the duster he'd been holding for the aforementioned cleaning. U likewise sets his broom aside to mirror the gesture. ]
Right, you- settle in, get to it? If you need anything ask me or J or- did you want to be blond? I can make you blond.
no subject
Tony dwells, Bruce moves on. ]
Thank you.
[ Is his polite answer, directed at the personality in the room that is, ironically, more helpful so far than the guy who created it. At least, in Bruce's eyes. He'll come around on that too; it's a whole new world out here. A whole set of things to form opinions on, and with no pre-established baseline or bias.
He's already taking up residence at one of the workstations, attention on the screen before him by the time Tony starts throwing out hair colors. It earns him another shake of the head, a tiny huff of amused dissent. ]
No, no, don't- don't bother. That's-
[ A terrible color. Really terrible, actually.
Things Bruce knows about blonds:
They have more fun (no.)
They have an established track record for being ditzy (double no.)
Things Bruce has decided about Tony so far:
He has terrible taste in physical aesthetic. ]
I think you should probably leave well enough alone.
no subject
It's actually easier for you to change the way you look than it is for the average person-
[ Trick one for AI- they're digital people. Treat them like people. It's always worked out for Tony, anyway. ]
So if you've got a preference or want to look different just- say the word, I'll make it happen. That's your body, you should, you know. Like it? Did I just give an AI dysmorphia? Is that possible? [ It might be. Shit. This is why he's been reluctant to saddle any one of his AI with a humanoid body, more on this side of things than on the suits. He flicks from Physical Modifications that'd be possible to the DSM database, combing through that while Jarvis doesn't quite sigh-
But the extended silence is somewhat damning. If he were one to sigh? Jarvis would, in fact, be sighing. ] Seriously, if you want something else-
[ He turns enough to look in Bruce's direction and- yeah it's a little odd to see him around and mobile after hovering over and around him stationary for weeks but- not bad? Not terrible. There's got to be a reason beyond vanity he defaulted to this appearance. Somewhat harmless? Mostly safe. ] Just. Say the word?
no subject
No, it isn't the obsession with a problem that he's interested in. It's the deeply invested consideration for his bodily autonomy.
He almost feels something about his initial reaction, almost feels- is that guilt? He's gone through the gamut of basic emotional processing already, but the primary focus in stimulation tests usually consists of the simplest and most common emotions. Happiness, sadness, anger. Guilt was a rarity and it takes him a second to place it. His eyes flicker, then duck back down to the screen in front of him. He internalizes his opinion for later study. ]
I'll keep that in mind.
[ Is the only reaction Tony will get from him just yet, while he devotes a partition of his processor to wading through Tony's emotional response coupled with the new feeling of personal preference. He has to compare what he finds optically appealing on others with the knowledge that enjoying the look of something does not necessarily mean one wants to look like that as themselves. Has to consider what he feels the best physical representation of his still-developing personality matrix might be.
It's a lot to think about. He's not going to unpack it right now.
Preliminary behavioral analysis on Tony Stark suggests he might continue to focus on this for even longer despite Bruce's answer; it's both inefficient and unnecessary. It's a type of emotional baggage that needs to be archived, which isn't really specifically what he was built to assist with, but he'll take it upon himself to try.
So he glances back up and, a little more firmly, insists: ]
Tony. It's fine.
no subject
Alright. [ Just that. Alright. Bruce seems sure and that? That's good enough for Tony. He'll while away the hours wondering about the nuanced ethics of what he's done and how he's done it, probably after a glass or two of scotch next week when it's sunk in that he's built a person. With Jarvis and the bots it's always been the excuse of them being lab assistants or personal assistants, even if they've got their own personalities, will, and autonomy. But putting one of his AI in a humanoid body that's pretty difficult to discern from the real thing?
Yeah there are gonna be questions. Pep might be pissed.
Rhodey will be. There'd been a promise back at MIT about not building friends or something; it's a vague notion lost alongside most of the 90s. Still.
Problems 1- integration and 2- adapting to the new environment? Seem to have been handled smoothly. Go team Stark. Dum-E and U remain curious until shooed off by Jarvis to finish dusting the fabrication units and Tony? Dials back through the initial integration log to go over data. Everything seems pretty seamless and Bruce has been happy to get right along to work which- good? Good. ] Everything running smoothly over there?
[ Like he hasn't tested every neuron and line of code three times over, but. Verification from the one actually driving probably counts for something, right? Right. ]
no subject
No, he buries himself easily and readily in work like that's all he wants and needs in life. In a manner of speaking, it is. He doesn't know what the alternatives are just yet. Or he does, but he hasn't applied them to himself. Or he has, but he just doesn't care.
Still, a lot to unpack.
Or he could just keep working.
Seems easier. Not dealing with anything else aside from this seems a whole lot easier.
Although he's built with no real need to sit, he does so anyway. In the silent space between alright and the ticking minutes afterward he settles into a chair and reacquaints himself with the system that brought him to life, but from the other side of it. It's a whole different scope, seeing things on a monitor rather than being in it. It's no wonder he'd needed a chassis to stabilize; it's hard to be detached about something you're integrated in. If he criticizes chest hair, imagine what he thought about the cloud.
He's going to review his own development notes later. Not now.
He's gone through a truly impressive amount of data by the time Tony calls out to him, and only then does he realize he's been right up in front of the monitor like some kind of optically defective senior. He settles back, absently pinching the bridge of his nose, considering how he wants to answer that.
System check: optimal
Optical diagnostics: adequate, no real explanation for the posture
Mental stability: exceeding expectations at 86%
Workload capabilities: optimistic
Personal opinion on all of the above? TBD
Instead of answering, he deflects by mildly asking: ]
Is there a reason you're keeping notes on what you ate six years ago? Not that I'm judging, but...
no subject
[ Preferences for takeout? Hold onto those. Dietary requirements to keep from burning out into a husk of a human being on accident? Also worth holding onto. Shit that's six years out of date and no longer applicable to his current bout of homeostasis? They can probably archive or dump or. Something. ] Honestly I'd forgotten we'd kept those. Were we going to see if they were relevant for other heavy metal poisoning treatments, J?
You had me send it along to the relevant medical branch of SI, Sir, but no one else seems to have required treatment for palladium poisoning.
Right. [ This is why he needs Jarvis. And Pepper. And now Bruce, in so much Bruce might not quite be prepared to be needed. His memory isn't fickle so much as it's overfull of what is considered relevant, which might not actually be relevant until further down the line, or is relevant just not in a way that makes sense to anyone but him. He's given the odd thought to just plugging his brain into the matrix and reorganizing shit that way but- Jarvis insists it's a shitty idea.
So.
Bruce. ]
So I'm going to take that as 'everything's fine, even though I'm squinting at the monitor like an 80-year-old even though I've got optical sensors on par with fucking-' uh. What did I clock them at? One second-
[ Digging back through the specs: variable focus, thermal, UV, night-vision capable (paranoia is a good friend of Tony's), long range enhanced zoom, median focus of- more or less 20/20. He shouldn't be hunching like he needs glasses. ] let's go with 'pretty damn awesome'.
no subject
As far as the posture thing goes? Yeah, no, it beats the hell out of him, and he says as much with a sort of don't ask me shrug. Through sheer specs alone he's got probably the best vision money can buy, literally, which doesn't seem to stop him from wanting to crawl up into the monitor. Maybe it's because he's got a direct uplink to the information behind the screen, maybe it's sort of like- some kind of muscle memory, except... whatever the opposite of that is.
Un-muscle memory. ]
Hey- I'm just here for the data. If you're looking for a theoretical behavioral analysis on psychosomatic body language... [ A beat, and then he admits: ] Well, I could probably do that, but I'm really not interested in getting introspective on day one. I'm still figuring out hands.
no subject
[ Because something worth doing is something worth overdoing, and if Bruce decides he up and wants to become a brain surgeon far be it from Tony to leave him incapable of doing the thing. All the potential for- as many opportunities as Tony could possibly consider packed up in that chassis-
Except for modeling. Maybe he should've looked at why he was modeling his face the way he was. Someone not distracting because the last thing he needs is a potential sexual harassment suit from an AI- so many ethical landmines there. So. Average while not being entirely average might've been where he went with it. That Bruce huddles in on himself and lists to the monitor while he works probably shouldn't be so fucking endearing, and yet? ]
On that note- catch. [ It's a holographic ball rather than an actual one, a kinetic tangle of data from a compilation error Jarvis caught a few years back that he hasn't had the time to sort out himself- and now he won't have to. In theory. Tony tosses the glowing sphere, the representation of data, from his workstation to Bruce's, curious as to how he'll catch it. ]
no subject
He'd been content to revert his attention back to his work until the abrupt catch order, and he startles in time to thrust a hand up like he's afraid he's going to drop-
Oh. It's not real and yet he still seems to fumble for a second, though not because of the physicality of the thing, obviously, there's a lack thereof. It's because he's both interfacing and figuring out phalanges all at once. ]
That was a test.
[ He feels the need to point that out without looking up, preoccupied instead with stretching the data out into an easier to read, more streamlined hologram. Twisting it, parsing through it to find the issue he's meant to solve. ]
This is a test.
[ He adds, a guess and an accusation tied up into one. Not that he's offended by it, actually. He's already curious--if it's something Tony hasn't worked out and something a little beyond JARVIS's diagnostic capabilities it's bound to be an interesting solve. Even he's a little curious how his creativity algorithm is going to hold up, and if he starts making notes on himself...
That would be weird, wouldn't it? Is that weird?
He's an uploaded artificial intelligence construct in a skin suit, of course it'll be weird. Everything he does will be weird. He's going to do it anyway. ]
no subject
[ He's in the lab. Almost everything is some kind of test down here, extending even to Tony's own tolerance for handling one project for any certain length of time, or how long he can go without social interaction with other people (Bruce counts as a people now, and it's his ultimate trump card against Jarvis in that argument), or sleep, or solid food. The tests don't always make sense and more than a few cause Jarvis to despair for his creator's well-being but it's nothing new to anyone but, well. Bruce.
He'll catch on soon enough. The longer he's here the more obvious it'll all become. ]
You can sit and stare at them or you can learn by doing. Sometimes you have to run before you walk. [ Sometimes. Very rarely. In controlled circumstances. Another one of those long, drawn out silences from Jarvis follows and Tony knows he's being judged. Knows and doesn't give a single solitary fuck, his eyes bright, his mouth curled in a Cheshire grin. ] That is a puzzle I haven't had the time to dig through and J can't quite detach himself from enough to handle. Pretty sure you'll be able to figure out what the hell happened.
[ 87% sure, honestly, and he is terribly curious as to how Bruce will go about it. Interface? Drag it through the workstation? Continue to manipulate the hologram? What does he lean into, what does he see first, how does he go about problem solving? ]
no subject
Frankly, if pressed about the adequacy of his social interaction he'd have to agree with Jarvis. He hardly counts as much more than a functional sounding board; he's basically just to the left of an echo chamber right now and he probably will be for some time. Then again, the very fact that he's disagreeing with Tony may prove him right, which is a frustrating loop to be caught in. He's not here to be a psychiatrist though, that's not really his area.
His area, apparently, is archiving bologna sandwiches and troubleshooting C++ compilation errors. He's not so sure if that's better or worse. Probably better not to think about it too hard.
Bruce, as it turns out, is something of a visual troubleshooter. He doesn't interface, he doesn't port the information directly into his processor and attempt to compile it himself. It seems he's taken a liking to the hologram and the ease with which he can see every piece of Tony's source code, because he settles it into a space somewhere just to the left of his monitor and blows it up to better see the individual strings. His next step is to do a little scripting change on the compiler itself so that it changes the color of the characters as it runs through line by line, allowing him to watch each segment for hangups.
Looks like he's going for a recreate-the-error approach, intending to work backward. Tony can take from that what he will; Bruce hits the enter key and the process begins. The hologram spins slowly, transitioning from orange to green in a steady spreading spiral. As it does, Bruce shoots him a short look. ]
Is you watching me do stuff gonna be- like- a thing? I've been alive for like two seconds and even I know that's kind of creepy.
[ Does he sound amused about it? Maybe like he's barely kind of technically joking? No, surely not, right? ]
no subject
I mean there's a reason or fifteen why I haven't really tried my hand at creating life model decoys beyond the odd security request, and it mostly comes back to the morality and ethics of creating life, playing at god- etc. At the end of the day you may very well end up being one of a kind. [ Because replicating this would be nigh impossible, even if he redesigned from the ground up how he built AI so they would by default be best suited to a humanoid chassis. So that they'd want to be more human, and that brings up way too many philosophical questions re: existence and what it is to be human, do robots dream of electric sheep, etc. The rule for Jarvis has always been 'use your powers for good, never go full Skynet'. For Bruce? He's...not sure what would or wouldn't be an appropriate limitation.
But right now observation is sort of vital and mostly due to Tony being highly curious. Jarvis, in his development, had been pixels on a screen. Not a lot of visual feedback for Tony to observe, not really. Bruce? Already has chosen a mode of posture, of gesture, of tone of voice- not one of those things came with the chassis- it was a choice. Instinctual or no. And that? That's fascinating on so many levels.
Also: Humor, this early on? A plus.
Tony shrugs and settles back against his workstation, head canted slightly, brows lifted, smile dialed down to an easy half smirk. ]
Let me enjoy the moment, would you? This worked. You're adapting, intuiting how to function without having to learn how to walk, managing hands and data at an advanced rate. Honestly I expected to still be going through mobility calibration with you at this point because going from formless and shapeless to having a personalized skeletal system has to be jarring as hell.
no subject
Let him just take a moment to appreciate the clinical detachment he's got toward himself right now, and hope that it stays that way. Let him steer very, very clear around the subject of morality and take a wide berth around the existential implications.
He can't, however, help but balk at the revelation of his projected timeline. His eyes narrow a little in consideration, like he's running a secondary diagnostic just to double check that he can, in fact, walk, talk, and catch. If he was expected to be a dizzying gyroscope-like Bambi they might want to take things a little more seriously than they are.
Bruce sits back, crossing his arms uncomfortably and tightly over his chest. ]
So basically-- [ He starts, sounding almost terse. ] --you made something, you have no idea how, it's working ten times as well as you expected, and it's riddled with unexplainable phenomena.
[ Just so that they're clear about it. He's experiencing the surfacing of another rare emotion: nervousness. ]
I'd like to go on record and point out how insane everything about that is? Like- on a statistical scale what's the likelihood that I have some kind of major synaptic failure or- I don't know, go all 'I, Robot'?
no subject
I'm just lucky in so much that more often than not you're willing to go along with what I ask you to do. Dum-E doesn't need to be on smoothie duty- he genuinely enjoys it. For. Reasons beyond me, honestly, but it makes his little electric heart happy, far be it from me to take that from him when he's pretty decent at it. [ Jarvis enjoys his place and purpose of organizing the chaos that is Tony's day to day life. If Bruce finds a purpose that suits him better than data compilation and archival retrieval- that's fine. Tony will help him lean into it however he can. ] You're integrating and adapting better than I anticipated. Which is probably due to your not being online while in the server all that long so there's less of a workaround for you? I'm not sure. We can do a diagnostic later if you want.
[ It's easy to be casual up until that undercurrent of distress starts to sink in and- well. Tony's an old hand at ignoring that in people, mostly because it doesn't register. But his bots, Jarvis, and now Bruce? Kind of takes a priority. He pushes away from his workstation to saunter over, dragging a chair to drop into it, looking Bruce in the eye. ] Hey- you're going to be fine. If you want me to make sure you're three laws safe, sure, we can do that. You've got the same hardware in your brain as Jarvis does in his server, just. Miniaturized and mobile. You're safe.
no subject
Three laws, he says, and a stream of understanding runs through him. An undercurrent of self realization, the reason behind his distress and his skepticism:
Bruce does not trust himself. He has enough self-awareness to determine an inherent differential between human empathy and programmed emotion; he's been given enough access to popular culture and media to identify the belief that coded logic falls short during critical decision-making processes. He agrees with the sentiment despite the fact that, up until him, all of that was fiction.
Except it's not, though. All he has are replicated feelings and a basic personality matrix, but if you present him with, for instance, the Trolley Dilemma? Oh, Jesus, he's going to have whatever the artificial version of a panic attack is.
He pulls back, throws up his hands to indicate that he needs a little space. Wheels his chair around to search frantically through the workstation for a pen and a pad. He needs to outline a basic moral construct, a set of ethics, some kind of solid groundwork for an empathy structure. Ignoring that even the fact that he wants to means that there's already the beginnings of one there, no, he wants something in code. He wants an unbreakable law and he'll put the damn thing in himself. ]
The three laws are a bunch of shit, that guy in 'I, Robot' had those and he still tossed that dude out a window.
[ He mutters under his breath, eyes now on his furious writing rather than Tony's face.
From somewhere just to his left, the hologram gives off a soft tone. It's finished; he looks up. It's green from head to toe, top to bottom, but his workstation still flashes a compilation error. His writing pauses, falters, lips part. He squints at it again, leans in close like he does with the monitor to wrap his head around the implication.
If - Then
If - fully green, then - all script ran successfully
If - all script ran successfully with no errors, then - ]
Your problem's not with the code, it's with the compiler.
[ He murmurs decisively. ]
no subject
Kind of curious. A lot awesome. And a little worrying. He's never been all that sure how to handle other people having anxiety attacks. Let alone other digital people. ] Yeah, the film has a few plot holes. The short story's more tragic, believe it or not, but Sonny is sort of what we're going for here. Less homicide and more Bicentennial Man.
[ Less legalities regarding the autonomy of androids and more the never-ending quest to empathize with and become more like humanity. But that, like most things, is ultimately up to Bruce. Maybe he's got too many options, too many choices to make. Too much free will with not enough directions or limitations. Humans take time to develop because they grow and mature so slowly; until they're at their peak they've got parents to guide them.
Tony's never really dipped into that role with his AI and maybe he should, sort of, give Bruce some limitations? Boundless potential can be fucking overwhelming, he supposes. ]
Wait, what?
[ Right, the coding snafu. Tony's eyes flick back over to the hologram, attention diverted. ]
no subject
Who's to say he won't pull a right or a wrong out at random too? Some things can't be taken back, he wants to be prepared for eventualities and he might be overwhelming himself a little with the endless array of things that might come up that he can't predict.
He'd like to take a moment to argue over not being Robin Williams, except, wouldn't you know it, he's more of a Williams than a Clooney, isn't he? Thanks, Tony.
He's relieved by the distraction, and with a gesture, tugs the error from his workstation monitor up into hologram form for Tony to see. ]
PGCC-S-0000-Internal compiler error. linearize: bad ili #: 0 (mpisim.C: 225)
PGCC-S-0000-Internal compiler error. gen_aili: unrec. ili opcode: 0 (mpisim.C: 225)
[ He settles back in his chair, arms crossing again as he studies the hovering text. ]
Your factoring's too complex for the compiler; it's finding inconsistencies in itself. You could refactor the code, but you'd have to simplify it down, like- a lot. What are you using- did you make your own compiler? Is that Comeau?
no subject
And most, if not all, of his AI.
There ought to be an innate awareness since, shit, that's part of what makes Bruce himself, but working with it outside of self-introspection isn't something Tony wanted to throw at him so soon. ]
We wrote this one to convert and compile both in C++ and the house code, but that's where the wires got crossed.
no subject
He passes a contemplative hand over his mouth, allows the freak-out of his own internal mechanisms to slip into the background for now. Evidently, he's got another unexpected personality trait:
Bruce runs from his own problems. Buries himself in his work so he doesn't have to deal with his issues. Maybe he gets that from Tony, or maybe it's just another happy accident from the cocktail of traits and development he's been given. Finally, he announces: ]
I can fix it, but I'm gonna need, like... a day.
no subject
Even if it is, a little, kind of, maybe like that.
A smidge.
He settles back in his chair, eyes flicking back to Bruce, taking in his posture and...everything. Curious as he is about what he'd been writing, he'll wait until Bruce brings it up. Snoop he might be, but bodily autonomy also comes with a sense of privacy and personal space. Establishing that now, that he can trust Tony not to dig? Kind of important. ] As soon as you feel settled we'll get you set up with timetables, but I can't give you projects with projected deadlines until I know how quickly you're capable of working comfortably.
no subject
[ He echoes, in a tone that almost sounds amused again.
That really is such a wide and subjective thing, isn't it? Comfortable in his body, comfortable in his mind? Comfortable with the situation or the work or the lab or with Tony himself?
Machines are designed to have a purpose, not to make themselves comfortable. He thrust himself into his work within five minutes of activation because following a loose objective in his primary drive was as close to automation as he could come, because steadfastly working on an obstacle he was programmed to overcome was like a light switch. Like an order. Like a toaster. They do what they're designed to do and they don't have feelings on the matter. Dipping into whether or not he's comfortable doing something humanizes him in a way that he-
He-
processing irregularity.
He's having a hard time reconciling.
He pushes the pen forward on his desk and levels Tony with an assessing look. ]
What exactly is your expectation here? For me? Jarvis I get, those- [ He points at Dum-E, who seems almost startled by the abrupt attention. ] I get, but- if you wanted an archival unit, you didn't have to give it a face. So what's the plan? What's the- the reasoning? If you wanted company, maybe like a lab assistant with an existential crisis, you could've just gotten an intern and given them LSD, saved yourself the trouble.