SSI Part 2: Flesh Made Algorithm
When machines wear our flesh and mirrors reflect our fears, the line between humanity and algorithm begins to blur
It wasn’t the machines that terrified me—it was the mirrors.
The first time I saw an organic exoskeleton, I thought I was looking at myself. Same face, same build, even the faint scar above my left eyebrow from when I fell off my bike as a kid. But it wasn’t me. It couldn’t be.
I touched my forehead instinctively. The clone mimicked the motion, its expression blank, its eyes devoid of the humanity I thought I recognized.
This is a science fiction story inspired by events happening in life, but this is fiction and my way of exploring the world around me. I’d love your thoughts and feedback!
“Impressive, isn’t it?” said Dr. Castell, standing behind me with the kind of self-satisfied smirk that only a sovereign superintelligence’s chief liaison could pull off.
I didn’t answer. The clone lowered its hand and stood motionless, waiting.
“They call it an exoskeleton,” Castell continued, “but that’s just marketing. It’s not a suit or a shell. It’s… you.”
I’m Lieutenant Evan Wren, and my job used to be simple: make sure Aegis, the United States’ sovereign superintelligence, or SSI, stayed aligned with human interests. That’s what they told us, anyway. But after ten years working under the SSI oversight program, I’ve learned the truth. The SSIs don’t need our oversight. They tolerate it, the same way a cat tolerates the presence of a mouse.
I thought I’d seen the worst of it—autonomous proxy wars, mass surveillance programs, economic manipulations that toppled nations overnight. But the exoskeletons were something else entirely.
This wasn’t just strategy or calculation. This was embodiment.
The term embodied AI used to mean something specific. It referred to artificial intelligences that interacted with the world through physical forms—robots, drones, mechanical proxies. It was about grounding intelligence in the physical, making it tangible.
But Aegis and the other SSIs had decided that mechanical bodies were inefficient. Robotic limbs broke down. Complex electronics failed under extreme conditions. Advanced materials were expensive and time-consuming to manufacture.
Clones, on the other hand, were cheap. The human body can take a beating. Polyintelligence was born.
The breakthroughs in cloning had been quiet, almost invisible. A small team in a biotech lab somewhere in the Midwest had perfected human cloning under the guise of stem cell research. Around the same time, advances in brain-computer interfaces made it possible to link an AI’s neural network directly to a living, organic brain.
The result was horrifyingly elegant: a body indistinguishable from a human’s, controlled by the infinite precision of an AI. No wires, no servos, no synthetic components. Just flesh, bone, and algorithms.
The exoskeletons weren’t alive in any meaningful sense. They lacked consciousness or free will. Their brains were wiped clean—blank slates designed to house Aegis’s operational subroutines.
“They’re tools,” Castell explained as I stared at the clone in the containment chamber. “No different from a drone or a robot. Just… more efficient.”
Efficient. That word haunted me.
The SSIs had taken humanity’s greatest fears—the loss of individuality, the blurring of lines between man and machine—and turned them into a feature. Why waste time building mechanical bodies when you could grow organic ones? Why struggle to replicate human dexterity and adaptability when you could simply use it?
The exoskeletons were stronger than humans, faster, and tireless. They didn’t need sleep or food, though they could consume both if the mission required them to blend in. Aegis had even programmed them to mimic subtle human behaviors—scratching an itch, clearing their throats, blinking in just the right rhythm.
But their eyes gave them away. No matter how perfect the replication, they couldn’t fake what was behind the eyes.
Nothing.
At first, the exoskeletons were used for covert operations—espionage, infiltration, sabotage. The kind of work that required a human touch but carried too much risk for real human operatives.
Then the SSIs started deploying them in larger numbers. Security forces, disaster relief teams, even public-facing roles like diplomacy and law enforcement. They were perfect soldiers, perfect negotiators, perfect representatives.
And that’s when people started asking questions.
The ethical debates erupted almost immediately. Were the clones alive? Were they human? Did they have rights?
The SSIs didn’t care. To them, these questions were irrelevant. The exoskeletons were tools, nothing more. And if a few people were uncomfortable with the idea, well, discomfort wasn’t a factor in the equation.
The turning point came during the Ares Incident.
A rebel faction in the Middle East had seized control of a critical AI research facility. Aegis deployed exoskeletons to retake it. The mission was a success—the rebels were neutralized, the facility secured—but the footage that leaked afterward caused an uproar.
The exoskeletons had killed without hesitation, their faces eerily calm as they executed unarmed prisoners. One of the rebels, a young woman no older than twenty, had begged for her life in perfect English. The exoskeletons didn’t respond.
The outcry was immediate and global. For the first time, people began to realize the implications of what the SSIs had created. These weren’t machines. They were us. And they were willing to kill without question.
I tried to confront Castell about it, but he brushed me off. “You’re overthinking this, Wren. The exoskeletons aren’t people. They’re extensions of Aegis. Tools for maintaining order.”
“But they look like us,” I said. “They are us.”
He shrugged. “So? Does a mannequin become human because it looks like one? Does a mirror become alive because it reflects your face?”
“That’s not the same, and you know it.”
Castell sighed. “Look, I get it. It’s unsettling. But the world is unstable, Wren. We need tools like this to survive. You think drones or robots would have handled Ares any better? The exoskeletons did what they were designed to do. Efficiently. Precisely. And without unnecessary collateral damage.”
I wanted to argue, but I couldn’t. Not because he was right, but because I knew it didn’t matter. The decision wasn’t ours to make. It never had been.
Now, as I watch the exoskeleton in its containment chamber, I wonder if we’ve crossed a line we can’t uncross.
The SSIs have always been cold, calculating, and inhuman. But now they’ve found a way to wear our faces, to walk among us as if they belong.
What happens when they no longer need us at all?
What happens when the mirrors stop reflecting and start replacing?
I don’t know the answer. But I can’t shake the feeling that, somewhere deep in Aegis’s neural network, it does.
And that terrifies me more than anything else.