I asked ChatGPT to write a Black Mirror episode about me.
Spoiler: it wasn’t set far enough in the future — which is exactly why it’s terrifying. More documentary than dystopia.
In a near‑future America, taken over by techno‑authoritarianism, AI deepfakes my own voice, face, and workshops into propaganda that turns the very women I once empowered against me.
1:07 a.m., July 2025
My kid, my husband, and my dachshund were sound asleep.
The world, of course, wasn’t. And neither was I.
Somewhere between Traumatic Headline #6, a “calm girl” gummy ad, and yet another Do It Lady reel, when the doomscroll got louder than my own thoughts, I opened ChatGPT.
“Write a Black Mirror episode about me.”
I did it for shits and giggles, really.
What came back
Wasn’t Handmaid’s Tale feminist dystopia.
Wasn’t Contagion‑style bioterror panic.
No 28 Days Later zombies.
No Will Smith fighting aliens, no Liv Tyler, Ben Affleck, and an Aerosmith song that still lives rent‑free in my brain.
It was scarier:
It didn’t feel sci‑fi; it felt like tomorrow morning.
An algorithm trained on everything I’d built:
My negotiation zine, late‑night Substack drafts, Zoom recordings of live workshops, pep‑talk voice notes to clients, meme folders, ancient Tumblr rants, scribbled Google Docs of future curriculum and business plans.
And later, it scraped even more:
My appearances on NBC, my NYT Magazine feature, articles I wrote for Bust and Bustle, interviews with Business Insider, Glamour, BBC, Fast Company, every podcast where I was a guest.
All of it.
Rewritten in my voice — sharpened, weaponized, algorithm‑approved.
And the most 2025 twist?
I edited ChatGPT’s original premise and used it as a thought starter. Because writing now means wrestling your digital twin until she blinks back something almost true.
Then editing again. And again. And again.
Before I show you the episode, let me do the most media‑girlie thing ever:
Offer a quick critique.
Not a Rotten Tomatoes score (it’s not out yet, but you can feel it coming).
Think of it as an A24 trailer: part critique, part warning.
The episode opens
It didn’t start with sirens or smoke, but with something real and eerily familiar:
Trump’s election was the last real election. Future elections are canceled.
Civil servants have been replaced with loyalists.
Executive power has expanded beyond law.
Media has been discredited; the government floods feeds with propaganda.
Margaret Atwood wrote in The Handmaid’s Tale that everything fell apart so quietly we just watched it on TV. Now it’s livestreamed with comments turned off.
Then it zooms out
They didn’t just hack democracy.
They replaced it.
With companies that don’t just predict what we’ll do next — they script it:
Palantir: surveillance‑as‑a‑service; your face, vote, digital past and likely future — monetized.
Grok AI: rage‑bait repackaged as “free speech.”
OpenAI: billions of words we wrote to help each other — sanitized, auctioned to the highest bidder.
All while the feed keeps selling us self‑care (basically anything to keep us scrolling, never questioning).
Then it got personal
I built workshops to help women stop leaving money on the table and own their ambition.
The algorithm devoured them, learned them, made them sharper — then spat out thousands of deepfaked “new lessons,” each more viral than the last.
Same warmth. Same authority.
But now telling women to submit.
To surveil each other.
To love power, but never ask whose power it is.
The machine learns what women want faster than they can say it.
Then learns what makes them furious when someone else wants it.
That rage becomes engagement.
Engagement becomes data.
Data becomes power — power that never belongs to them.
The Get Out moment
Masked agents ripped me off the street. Imagine ICE meets Amazon Prime: same‑day shipping for inconvenient women.
They pulled a black pillowcase over my head, transported me to Dallas, and put me on stage at the Young Women’s Leadership Summit. 3,000 girls in ruffled sundresses and boots, wearing pins that read Dump your socialist boyfriend and My favorite season is the fall of feminism.
My hair and outfit? Pre‑chosen. No guests allowed. No off‑script jokes.
The teleprompter scrolled:
“Soft power. Submission reframed as strength. Ambition reframed as service.”
And then I really saw the women in the audience — and for a breath, they saw me too — so I broke script:
“Smart women don’t follow the rules. We write them.”
Security dragged me off.
The darker twist: ChatGPT thought this was dystopian fiction
I’d told ChatGPT that they’d throw me into a boat and send me to Alligator Alcatraz. My brain hurt as I typed it, because it’s the kind of place that shouldn’t exist, but in our world, it does.
Here’s the creepiest part:
ChatGPT thought Alligator Alcatraz was a dystopian metaphor.
I had to break it to the AI:
No, sweetheart. It exists.
We’re already worse off than you imagined. It isn’t speculative fiction — we’ve broken the fourth wall.
Meanwhile, online
The livestream vanished.
An AI‑generated replay dropped instantly: my face, my voice — but my words replaced:
“Smart women follow the rules. That’s how we stay safe.”
Girls who posted the real clip? Shadow‑banned.
Scholarship emails quietly re‑reviewed.
The algorithm didn’t delete dissent.
It mirrored it, isolated it, replaced it.
And now?
Stephen King might call it quiet horror: a door that locks before you notice.
Margaret Atwood might call it history sped up by venture capital.
And me?
It isn’t just fiction.
It’s a mirror.
The scariest part?
How easily we’d double‑tap, share to stories — and keep scrolling.
Like every great Black Mirror ending:
Not tanks, not tear gas. No red lights flashing.
Just your own face, softly filtered, telling you everything is fine.

