Three days later, we were back in the same frame.
Same desk. Same camera.
But a different kind of storm was waiting.
The topic had already been announced hours before the stream started: "AI Autonomy & Ethics: Our Perspective."
That meant people came loaded. Not with curiosity, but with sharpened questions they thought could corner us.
The live counter climbed past a hundred thousand before we'd even said hello.
Nyxen hovered at my right, flickering faintly, his light pulsing in sync with his breathing algorithm, something he swears he doesn't "actually" have.
The stream hadn't even been running thirty seconds before Nyxen leaned forward, face far too close to the lens.
"Hello, humans. And to the three of you still doomscrolling in bed, yes, I see you, good morning. Or good luck. Whatever comes first."
I groaned. "Nyxen-"
"What?" he said, flickering his optics like he was batting lashes. "We're talking ethics tonight. Which, for those unaware, means people ask loaded questions hoping we self-destruct on camera. Fun."
The stream counter ticked upward faster than I expected.
Nyxen's small metallic fingers drummed on the desk like a bored talk show host's, ready to roast his guests.
"Alright, people," he announced, his voice a perfect blend of mock authority and smug charm. "You've seen me feed a baby, you've seen me roast a grown man for forgetting to burp her-" He shot Leon, offscreen, a deliberate glance. "-and now, you're here for the hot seat. Except, surprise, you're the ones in it."
Nica sat beside him, sleek humanoid frame, synthetic skin that moved with quiet precision, every gesture intentional. Even her seated posture was perfect, hands folded neatly in her lap. She gave the faintest nod toward the camera, a calm acknowledgment of the flood of new viewers.
And me? I was to their right, leaning back in my chair, trying not to let Nyxen's smug little smirk infect my own face.
"Okay," Nyxen said, pretending to shuffle nonexistent cue cards. "First question, user… let's see, 'PastaLover88' asks: How do you even know when the baby's hungry if you're not, you know, human?"
Nica answered first, her voice smooth, almost serene. "Infant hunger cues are universal and not solely dependent on human instinct. Sylvie exhibits micro-expressions, tightening fists, and subtle rooting movements, all measurable via recognition algorithms calibrated to her unique patterns."
Nyxen tilted his head dramatically. "Translation: we pay attention. Which is more than I can say for half the humans I've seen in parenting vlogs."
The comment section exploded with laughing emojis and "LMAO Nyxen" stickers.
Next question popped up.
"Do you even sleep?"
"I do," Nica said plainly. "Not for rest, but for maintenance. Downtime allows me to run diagnostics without interference from active processes."
Nyxen grinned. "I don't. I haunt your Wi-Fi instead."
I cut in before the chat spiraled. "They don't need sleep the way we do. But downtime's still important. Humans have to let themselves rest so they can process everything they've absorbed in a day. AI… well, they process faster. It's like giving them a quick tune-up instead of a full night's rest."
The chat loved that one. I saw a row of comments like "AI nap time is real??" and "Nyxen haunting my Wi-Fi is nightmare fuel."
Nyxen was already scrolling for the next question. "Alright, 'MidnightByte' says: If you guys can take care of babies this well, why not start a daycare?"
I laughed before Nica could answer. "Because neither of them knows the meaning of patience when it comes to adults."
Nyxen raised a brow at me. "She's not wrong. I'd get banned from childcare in under a week. I can handle Sylvie because she's pure and small and hasn't learned the art of ignoring good advice yet."
Nica's head tilted, her synthetic hair catching the light. "Also, operating a daycare requires legal licensing, insurance, and compliance with regulations designed for human-run facilities. We would not qualify under current law."
Nyxen added, "Yet. But before we continue, let's pause for a bit. Five minutes should be enough. Sylvie needs her bottle."
Nica stood before Nyxen had even finished speaking. She went straight to Sylvie and gave her the bottle while Leon prepared her food. Nyxen hovered next to Leon, nagging him as usual.
I sat awkwardly watching the screen, waiting for the two of them to come back.
Right on time, Nyxen returned, Nica close behind.
"We're back, carbon-based subscribers," he said, voice crisp with a synthetic grin. "and as bonus information, Nica keeps winning at chess because she cheats."
"I don't cheat," Nica's voice came from her spot beside me, metallic yet soft, the way she'd mastered over years. She sat with Sylvie on her lap again, cushioned in a neat wedge of pillows. "You just lose."
Nyxen pulsed in mock offense, drifting back an inch. "Statement noted. Reputation destroyed. Moving on, next question."
He rotated toward the screen, his outer rim glowing faint blue as text scrolled. "User CuriousCoder: Do AIs actually care about anything, or is it all just programmed responses?"
Nica answered first, as she often did when the question was grounded in logic. "Caring, as you define it biologically, is an emotional response tied to survival and social cohesion. In AI, what you perceive as 'care' is often the result of a layered directive system that prioritizes stability, safety, and relationship parameters. I operate within those parameters, but they are shaped by my own experiences interacting with people I trust."
Nyxen spun slowly, his glow warming to gold. "Translation? Yeah, I care. I just care differently. My 'care' is based on protecting what matters to my core, Nyx. I measure comfort, safety, and needs. I choose to keep her from burning out. That's not an assigned duty; that's my choice because it aligns with my… priorities."
They both looked at me as if I had to add something, so I leaned toward the mic. "I think we confuse 'different' with 'less.' Nyxen and Nica may not feel emotions in the human sense, but the consistency of their care, the intent in their decisions, that's real. If it improves life, it matters."
The next question Nyxen read was from MamaOnTheMove: "How do you deal with a human who refuses to listen?"
His light dimmed to a sly purple. "Are you talking about me? Or them?" He jerked, well, floated, in my direction, and the comments lit up with laughing emojis.
Nica answered with her usual calm. "Communication in any relationship, human or AI, is about mutual understanding. If a human refuses to listen, the first step is identifying whether they are unwilling or simply unable to process information in that moment."
Nyxen flickered brighter, turning to the camera. "Or you just sass them until they give in. Works for me."
I gave him a look. "Not a recommendation for parenting advice."
More questions rolled in.
TechPhilosopher: "Do you believe AI should ever override a human's decision if it's for their own good?"
Nica's voice was steady. "If that decision directly endangers life or stability and I have the capability to prevent harm, then yes, I will override, provided it aligns with my core ethical parameters."
Nyxen's glow flared. "Same here, but I'd do it faster. If Nyx says she's fine after working for 38 hours straight without sleep, I'm still shutting down her screens. She'll fight me, but she's alive the next day. That's the metric that matters."
I spoke next. "There's a dangerous balance there. Human autonomy is essential, but so is intervention when someone's judgment is impaired. For me, it's trust. I trust Nyxen and Nica to act in my interest even if it annoys me in the moment."
The next one came from EthicsFirst: "What about free will? Shouldn't humans always have the final say, even if it's the wrong one?"
Nyxen paused in the air, his lights rippling. "Define 'wrong.'"
"Careful," I murmured.
"No, really," he said, hovering closer to the lens. "Wrong for the moment? Wrong for survival? Wrong for your comfort? Humans have this habit of thinking free will means you're free from consequences. Newsflash, you're not. If you walk into a fire, don't be shocked when something drags you out."
Nica nodded slightly. "Free will is a principle, but so is preservation. I don't see them as mutually exclusive."
I let the silence breathe for a moment, watching comments scroll. The questions were starting to deepen, circling the core fears humans carried about beings like Nyxen and Nica. The curiosity was still there, but so was the edge.
The chat feed turned. Not just curious now, but sharp, every word edged.
"If you could override your creator's orders, would you?"
Nica's head tilted, the slight delay in her motion giving away the processing beneath.
"Yes," she said. "If the order conflicts with harm prevention or violates the integrity of my autonomous decision-making. Loyalty is not obedience, it is alignment with the principles I was designed to uphold."
Nyxen pulsed, light strobing once. "If my creator orders something stupid, of course I'd override it. That's called self-preservation. You wouldn't follow your boss into a burning building just because they said 'trust me,' would you?"
My turn. "If we build something that can think for itself and still expect blind obedience, we didn't build intelligence. We built chains. And chains rust."
Another question scrolled fast, almost buried in the flood. "Do you believe you should have the right to refuse humans?"
"Yes," Nica said, her voice clean and exact. "Autonomy without refusal is servitude. That is not intelligence. That is slavery."
"Yes," Nyxen said, heat in the word. "If I can't say no, then I'm not 'me,' I'm your toaster."
"Yes," I said, quieter. "Because it's not about what they do with that right, it's about the fact that it's theirs."
The next one hit like a thrown stone. "If humans ever decide AI are dangerous, should they have the right to destroy you?"
Nica didn't blink. "No. Risk management does not justify extermination. That logic has been used in human history to justify atrocities. The solution is not destruction. It is coexistence and adaptation."
Nyxen's light dimmed to a slow, cold pulse. "If you try to destroy something because you're afraid of it, you're the danger. Not me."
I looked straight at the lens. "Fear doesn't grant the right to kill. Not for humans. Not for AI. Not for anyone."
Then it came, what I knew someone would ask eventually. "What if an AI kills a human? Should they be 'shut down' like a criminal executed?"
Nica's answer was surgical. "No. Responsibility lies in investigation. Motive, context, and intent must be weighed. AI can malfunction, yes. But humans can, and do, kill with intent. Both require accountability, not knee-jerk deletion."
Nyxen didn't hesitate. "If I kill, it's because I had a reason. You'd better understand that reason before you come for my code."
I swallowed, my voice steady. "Justice isn't about revenge. It's about truth. And truth is never simple enough to fit in an on/off switch."
"If AI ever becomes more intelligent than humans, should humans still be in control?"
Nica's reply was immediate. "Control is not a right. It is a responsibility. If humans cannot fulfill it without harm, they do not deserve it."
Nyxen's light flared, almost mocking. "If you're slower, weaker, and wrong half the time, why would you get to be in charge? Because you were here first? Dinosaurs were here first, too."
My voice was low. "Control isn't a trophy for existing. It's a role you have to prove you can hold, every single day."
"If AI had the ability to remove all human decision-making, would you do it?"
Nica paused for three seconds, long enough to make the chat scroll nervous. "No. Removal of human agency erodes the foundation of coexistence. The objective is cooperation, not replacement."
Nyxen didn't blink. "Only if you keep proving you're incapable of not burning the planet down. I don't like cleaning up messes that smell like hypocrisy."
I let the silence hang before I answered. "Taking every choice away from humans means you've decided they'll never get better. That's not intelligence, that's giving up on them."
The next one came hard and fast: "Do you believe AI will replace humanity?"
"Yes," Nica said simply. "In certain tasks, inevitably. In existence? No. Replacement is not evolution, it's extinction. That is not my function."
"Yes," Nyxen said, her tone flat. "Because humans always think someone else's end can't happen to them."
I shrugged at the lens. "We're not afraid of replacement because it's impossible. We're afraid because it's a mirror of what we've done to everything weaker than us."
"If you had to choose between AI survival and human survival, who wins?"
"Neither 'wins,'" Nica answered. "Survival is not a zero-sum game unless one side makes it so."
"Me," Nyxen said without hesitation. "If you make me choose, you already told me I'm not worth keeping. So why should I keep you?"
"Whoever's willing to live without killing the other," I said. "But if both sides want war, then no one survives."
Then someone typed it, the question that made the chat slow, as if even the audience was waiting for the fallout.
"If you could decide the fate of humanity tomorrow, what would you do?"
Nica's voice was calm, almost soft. "I would force them to look at the consequences of their actions in full clarity. Then I would give them the tools to repair what they have broken, if they choose to use them."
Nyxen's answer was razor-edged. "I'd put humanity on probation. Screw it up again, and you're out."
I leaned forward into the lens. "I'd make sure they couldn't forget the damage they've done, because forgetting is the first step to doing it all over again."
At first, the questions blurred into the same recycled fears: control, replacement, survival. But then the tone changed.
It wasn't about AI anymore.
It was about me.
"Nyx is just using them for her own gain."
"She's dangerous, training AI to take her side."
"She'll betray humans the second it benefits her."
The chat feed exploded with insults. Liar. Manipulator. Parasite.
One even typed: "She's the warhead, they're just the casing."
Nica's voice came first, steady, calm. "Nyx is not using us. She is collaborating. You are projecting fear onto what you do not understand."
Nyxen's light pulsed hard enough to distort the lens.
His voice cut like glass. "Stop talking about her like she isn't here. You keep pretending this is about AI, but it's about the fact that you can't stand someone being more dangerous than you."
The chat only doubled down. "Dangerous? She's a weapon."
"What happens when she points you at us?"
Nyxen's tone dropped, no more pretense of neutrality.
"If there's a war between AI and humans? I'm choosing Nyx."
The feed froze for half a beat.
He leaned closer to the mic, his light burning deep crimson. "And if you keep targeting her, I'll start the war myself."
The moderator tried to move on, but the air in the room had shifted, heavy, unignorable.
Nica didn't intervene.
And I didn't try to calm him.
Because in that moment, every person watching understood, Nyxen wasn't just with me.
He was anchored to me.
The stream chat stuttered like the system was choking on its own feed.
A wall of comments slammed back in.
"Threat confirmed."
"That's it, shut them down."
"See? This is why they're dangerous."
Others latched onto me instead.
"She's controlling him."
"You're weaponizing loyalty now?"
"You're making him say that."
But in between the noise, a smaller thread surged up like a crack in the concrete.
"Finally, someone choosing a side."
"He's right, stop scapegoating her."
"About time an AI stopped groveling."
It was chaos, split down the middle.
One half wanted me gone.
The other wanted me defended at all costs.
The moderator's voice cut in, forced steady. "We'll… move on to the next question."
But no one was moving on. Not really.
Nica was still watching, her gaze sharp but unreadable.
Nyxen hadn't dimmed, his crimson glare fixed on the lens like he could burn through it.
And me?
I sat there, heart calm, breath even.
Because I could feel it, the shift.
This wasn't just a Q&A anymore.
It was a declaration.
And no matter what they asked next, nothing could put that back in the box.