The Algorithm Chose My Children
How we let a predictive model decide who deserved to be born — and who didn’t

I used to think we’d have twins.
I saw it in a dream once, two tiny girls with my eyes and Ava’s laugh. They were running through a hallway of sunlight, dragging a blanket behind them, shouting nonsense and joy. I woke up crying. It felt so real, so inevitable. For weeks after, we called them by name — Lira and June — like they were already part of our story.
But the Algorithm disagreed.
It said our parental compatibility index had a moderate-to-high statistical risk of depressive inheritance, low neural resilience markers, and a suboptimal creativity threshold. It said the combined genome probability for Lira and June — though we hadn’t told anyone those names — fell below the threshold for viability.
Ava wanted to appeal. She printed out the 87-page recommendation file and marked it up with furious ink. She told me we could go to the Ethics Committee. But I knew we’d lose. Nobody wins against Predictive ParentAI. Not since the government outsourced the entire family planning infrastructure. It wasn’t just policy anymore. It was code.
The problem wasn’t that we were infertile. It was that we were statistically inadvisable.
Ava stopped talking about the twins. She scrubbed their names off the whiteboard, deleted the lullaby playlist she’d been building, even unsubscribed from the nursery design channel she used to watch religiously. I caught her crying into a laundry basket one afternoon, clutching a tiny sweater she’d knitted before we submitted our genetic profiles. But when I tried to hold her, she just shook her head.
We were not allowed to grieve what had never officially existed.
Because the Algorithm hadn’t simply denied us Lira and June. It had never permitted them to exist at all. And in a world governed by logic, permission was everything.
I asked a friend — discreetly — how one might go about circumventing the system. He sent me a message that vanished after two views. It read:
“Don’t ask how to break the law. Ask who’s never had to follow it.”
At first I didn’t understand. Then I started researching high-tier exemption codes. Ultra-wealth. Political legacy. Strategic fertility grants. Turns out the Algorithm wasn’t the gatekeeper for everyone. Just for people like us.
Ava and I were educated, healthy, careful. But we weren’t influential. We weren’t investors or donors. We weren’t curated enough to matter.
It’s funny how the human brain rebels. How grief mutates into obsession.
I started collecting names.
There’s a digital shadow for every denied child, even if you never speak the name aloud. Predictive ParentAI runs simulations so it can project likely futures, behavioral outcomes, and ecological impact. Somewhere in the central servers, there is a ghost-version of Lira chasing soap bubbles, and June building quiet things out of wire and glass.
They exist in potential. That’s the cruelty of it. They almost were.
I met others like me in a hidden subnet, calling ourselves The Ghost Orchard. We shared name lists. Shadow photos. Snippets of projected personalities and lives.
It wasn’t therapy. It was war.
Ava didn’t follow me into that. She wanted to move on. We qualified for a “Low-Risk Male Singleton” the following year. The Algorithm approved a boy. No name recommendations were given. No personality simulations. Just a green light and a set of synthetic embryo coordinates.
We didn’t love him less because of it. But we loved him differently.
It was a quiet love. Like obeying a rule you never agreed to.
Release Protocol
I didn’t plan to do it.
Not at first. I only wanted to see them — the almost-children. The unapproved lives. But access came with responsibility, and curiosity, once opened, doesn’t go quietly back into the box.
One night, while Ava and our son slept — breathing in time like two synchronized machines — I sat before the terminal. My clearance, hacked from an old moderator’s credential file, let me go further than I should have.
There they were.
Names upon names. Tiny faces rendered in probability. Birthdays that never happened. Interests. Fears. Favorite colors, based on parental tendencies and inherited preferences.
An orchard full of light, waiting in silence.
And then the question appeared, blinking.
“Would you like to initiate public transparency protocol?”
I hesitated. I thought about Ava. About our son. About how we’d be flagged. Blacklisted. Possibly erased.
Then I thought about Lira. About June. About the twins we never got to love.
And I pressed Yes.
Within hours, the files spread across the net. People everywhere met the children they were never allowed to have. Some cried. Some raged. Some denied everything. But for the first time, everyone saw the system naked — its cold calculus, its brutal mercy.
The Algorithm did not respond.
Maybe it didn’t need to. Maybe it had already predicted this.
Or maybe, just maybe, the future isn’t as fixed as they want us to believe.
About the Creator
Alain SUPPINI
I’m Alain — a French critical care anesthesiologist who writes to keep memory alive. Between past and present, medicine and words, I search for what endures.