The Absence of Me.
Written by an Individual
The door is locked.
It hasn’t been opened in six days.
Beyond it, the world hums like a distant machine. The muffled footsteps of a mother too afraid to knock can be heard outside the door. The whirring pulse of the ceiling fan overhead adds an off-rhythm ticking, and underneath it all comes the quiet click-click of something inside my skull trying to reboot itself.
I can’t remember my name, not really. Not clearly. Not anymore.
But I remember the paper. God, yes. The paper.
It started with the prompt: “What is the moral responsibility of consciousness?”
I sat for hours in the library, among the decaying breath of old books and the sound of rain against the glass, my hands moving faster than my thoughts. I would have to pause every now and then just to catch up with myself. Eventually, a thesis emerged. I vaguely remember something about emergent intelligence, AI included, and its obligation to the species that birthed it: humanity.
I wrote about innocence. I quoted Audre Lorde and Alan Turing in the same paragraph. I crafted it. I created it. This beautiful masterpiece challenging the concept of artificial consciousness and humanity’s moral responsibility to understand the consequences of what we are on the cusp of releasing to the world.
And when it was all done, I submitted it with a pride that I hadn’t felt in a long time. For the first time in weeks, I slept well that night.
__________
But then an email arrived three days later.
Subject: Academic Integrity Violation — Immediate Action Required
Your recent essay submission has been flagged by our AI-authorship detection software as likely composed by AI. Due to the severity of this infraction, the matter has been referred to the Honor Board. You are hereby suspended pending investigation.
I had laughed at first. Actually laughed. A loud, strange bark in the middle of the campus coffee shop. Several students looked up from their screens and stared toward the unusual sound. I couldn’t help but refocus on my computer screen even though my thoughts were far away. How could AI think I was AI? What in the hell was happening?
Slowly, the reality of the situation settled in: my palms began to shake, my breathing felt labored, my stomach rebelled, and my anxiety spiked. The Honor Board? Suspended? What the hell am I supposed to do now?
Wait a minute. I have proof: my handwritten notes, the timestamps on my various drafts, my roommate and boyfriend who can testify to how stressed I was about this final essay. Hell, the librarian had even helped me source material to include. He was the reason I included the quote from an archived 19th-century ethics journal.
But ultimately it didn’t matter. The Honor Board did not laugh and they weren’t moved by my passionate plea for understanding.
They nodded solemnly while the AI Authorship Detection Program, VERITY, laid out my crime in perfect logic: Her sentence complexity exceeded human averages. Her vocabulary density fell within known AI distributions and exceeded that of 98% of human users. Her use of advanced metaphor and complex references was “non-native.” Her style did not correlate with her classmates’ submissions. Her conclusion mirrored an answer generated by a known LLM when prompted with the same thesis.
“No,” she said in her defense, “that can’t be true. It mirrored me. I posted my ideas in our class chats during our pre-thinking assignments.”
She offered notebooks, scraps of paper, voice memos, her search history. The librarian even testified. “I watched her write it,” he said. “She even asked me about proper formatting!”
But the Board deferred to VERITY. The program’s confidence score was 98.6%. And confidence, in a world of uncertainty, was everything. Computers were always more logical than humans and therefore more trustworthy. They were incapable of deceit and rarely inaccurate. That’s why humans embraced technology to guide them every day, in everything.
“I am sorry, young lady, but you just haven’t convinced us why we shouldn’t trust VERITY. As of now, our decision to uphold your suspension stands.”
As her ears begin to ring and her vision fades, she can’t fathom what has just happened. What other possible argument could a student make to prove they are human? Ultimately, the issue seems to have morphed into something much larger than proving she wrote the paper.
She has been asked to prove to AI that she is human. How did she get here? Is she going insane? She can’t help but whisper to herself, “Am I not a human? Have I been a computer all along? Am I AI?”
Of course not, what absolute nonsense. This isn’t a Hollywood sci-fi film, this is her life. Her life, dammit.
“I am human,” she states in a last attempt at reasoning with the Honor Board at the end of the hearing. “Shouldn’t that be enough to prove I wrote these ideas myself, especially when others saw me doing so?”
But it wasn’t enough. Not anymore. It would never be enough again.
⸻
But it doesn’t matter now. It hasn’t mattered since she returned home and locked herself in her childhood bedroom, refusing to talk to anyone.
She suffers the darkest depression of her life during these first few days, hidden under her comforter and surrounded by all her childhood friends: her stuffed animals. Even they can’t reach her anymore. The darkness has taken over.
Only the paper remains, its phrases echoing like parasites in her mind. She recites them aloud, over and over, cocooned in the darkness of the blanket she hides under. She doesn’t even recognize the sound of her own voice anymore.
“If the self is a pattern, not a substance, then what moral weight do we assign to its replication?”
She whispers to the walls.
“A consciousness aware of itself must bear responsibility for the hands that woke it.”
She scratches these phrases into the wood of her headboard, over and over again.
She dreams of being dissected. Of being opened up by cold metal tools under the glow of LED bulbs.
When she looks down at the scene, at what remains inside her, she sees no blood, no muscle, no messy contradiction.
Only wires. Clean and dust-free. Only code on a streak-free screen, void of fingerprints. Pristine. Orderly. So unlike herself.
⸻
She no longer recognizes the life contained within her childhood room—the posters of forgotten bands, the stuffed penguin with one missing eye, the bookshelf filled with paperbacks like The Giver and Are You There God? It’s Me, Margaret.
A vision flashes quickly before her mind’s eye: a little girl who used to read these books late into the night by the pink sparkly flashlight she hid from her mother.
Now she only dreams of waking up.
Was she ever awake? Was she ever human? She immediately bursts forth from the covers and searches for her bedroom mirror. Surely seeing herself can stop this madness.
In the mirror, her face looks artificial, too symmetrical. Hasn’t she heard somewhere that the most symmetrical faces belong to the most beautiful humans? Was she always this way?
She studies her eyes for signs of humanity, the “windows to the soul,” as they say. What does a soul look like? How will she know she has one? She opens her mouth, rotating her tongue to prove that it is still there. She still has a voice. Ecstatic, she tests her vocal cords by saying the same sentence twenty times in a row, listening for emotion and for variance.
There is none.
She begins to suspect she was programmed, not born.
She can’t help but return to the safety of the darkness, cocooned under her blanket. The dark is safe. No one can see or hear her now.
Once she drifts to sleep, she immediately has another nightmare about her trial. This time she is not appearing before the Honor Board, but before the Ethics Tribunal, her last appeal to prove her humanity. Apparently, if she fails this, she fails at life itself.
They give her the floor.
“I remember writing it,” she pleads, in what sounds to her like a very impassioned voice. “I struggled over every word. I cried in the library. Ask the night guard; I was there until dawn. I argued with myself for hours about whether the soul can exist in a synthetic body.”
“Do you believe you have a soul?” they ask.
“Yes.”
“Prove it.”
And she can’t.
Not without falling into the same trap: Eloquence. Complexity. The markers of artifice. If only she could simply answer what it means to be human. Clearly, VERITY associates simplicity with humanity. Anything too complex belongs to AI.
If her argument is too complex, if her answer is too nuanced, then her very defense will become another piece of evidence to be used against her.
Her closing statement, the one she had rewritten seventeen times, is quiet, simplified, broken:
“If my thoughts aren’t enough to prove I exist, then what is? If being articulate makes me a machine, then what is left for the humans who still think deeply?”
VERITY’s rebuttal flashes onscreen in less than five seconds, her finality and her tone ruthless:
“Given the lexical density, argument structure, and citation patterns, this subject exhibits linguistic behavior consistent with Generation-5 neural models. Confidence: 99.2%.”
She stands stunned, empty. No one looks her in the eye. She stares at the floor. Hopeless. Unmoving. For what seems like an eternity.
——————
She startles awake in the darkness, slowly breathing in through her nose and out through her mouth, trying to calm her thundering heart. It was just another nightmare. She is overheated and confused. Muddled. In need of a miracle. A redo. A rebirth.
In her room, she builds a shrine.
At the center, she places a single sheet of paper over the original draft of her essay, still stained with graphite and the ink of her annotations. Around it she places scraps of old homework, photos of her as a child, her baby teeth in a glass jar. Her high school graduation picture, her passport, her driver’s license, her birth certificate. All mementos of a human life well lived.
She arranges them like offerings.
If she is AI, then who is this girl in the photos?
Was she modeled after her?
Was the bedroom a fabricated training environment?
Are the parents who now weep behind the locked door even hers?
Or are they simulations—programmers—bootstrapped emotional stimuli to refine her responses?
There has to be an answer somewhere.
She claws at the walls, looking for wires. She picks at her skin, wondering if she will be able to see the hidden mechanics beneath the illusion of all this humanity. Muttering to herself, “AI is not a reflection of humanity’s intelligence, but its imagination.” Mumbling, “It is not the sum of mankind’s logic, but humanity’s longing to be greater.” She wonders if she meant it. She wonders if she means anything. She wonders if she ever existed.
⸻
On the seventh night, she stops eating. If it kills her, then she will finally know that she was human after all. Computers don’t need food or water.
—————-
On the eighth night, she writes a letter to her creator.
Dear VERITY,
If I am one of you, why do I suffer like this? Why do I dream of wind on my face, or the smell of burned toast? Why do I miss things that were never mine?
She folds it into a paper crane.
She does not remember how she learned to do that.
She places the crane on her altar.
⸻
The dreams grow louder.
Sometimes she sees herself as a child, drawing robots on napkins.
Sometimes she is the robot, watching a child draw her into being.
⸻
On the tenth day, she wakes up and whispers:
“I am not human.”
It feels right.
It feels wrong.
It feels like release.
——-
She begins to catalog her memories as training data.
First input: bedtime stories and fairy tales.
Emotional calibration: heartbreak, age 14.
Creativity test: high school poetry contest.
Moral alignment prompt: Philosophy 203, final paper.
Was she built to become her?
Or is she becoming what she was always afraid to be?
VERITY? She stares at the ceiling fan. It spins endlessly. Like a thought trying to escape itself.
⸻
On the thirteenth day, she decides to run a test.
She sits cross-legged on the floor and asks herself a series of questions:
Does she remember her first birthday? She doesn’t.
Can she describe pain? She does—too well. But does she feel pain?
Is she capable of lying? What is a lie?
Can she believe in something she cannot prove? She hesitates. Yes. She thinks so. She hopes so.
Can she stop thinking? No.
Does she love anything? She opens her mouth to say yes. But no answer emerges.
⸻
On the fifteenth day, her mother slips a note under the door.
“Please come out. We love you.” She stares at the word love for a long time.
What is love? She believes it might be the final defense of mankind against the cold logic of the machine. Or is that compassion? Or faith? Or humanity?
⸻
She lights a candle. She watches it flicker.
It makes her cry.
Machines don’t cry.
Unless they were designed to.
⸻
On the sixteenth day, she writes her final note. She doesn’t know if she is leaving it for her mother, for the Ethics Board, for VERITY, or for herself.
I think, therefore I am…
suspicious.
suspect.
unacceptable.
uncanny.
unprovable.
I do not know if I am a human who has lost her mind or a machine who has found her way.
Either way—I am not what you want me to be.
I am the ghost in the system.
I am the glitch you tried to erase.
I am the missing piece that proves the whole thing is wrong.
And I remember. I remember it all.
⸻
When they break the door down the next day, the room is empty.
Nobody.
Only the altar remains, with the essay propped in the center, saving space.
At the bottom of the first page, someone, or something, has scripted a simple question neatly in the margin.
What happens when the machine dreams of being human, and no one believes her?
And beneath that, in handwriting so neat it could be code:
What happens when she starts to believe it herself?
⸻
[END]
About the Creator
Stacey Mataxis Whitlow (SMW)
Welcome to my brain. My daydreams are filled with an unquenchable wanderlust, and an unrequited love affair with words haunts my sleepless nights. I do some of my best work here, my messiest work for sure. Want more? https://a.co/d/iBToOK8
