The willing surrender
A deleted prologue, and why the slow slide from technology use to abuse is already happening around us

I’m still slowly and painfully querying my novel, and even though the book isn’t out yet, my character Des is already very real to me. In a lot of ways, Des is me. All my characters carry a piece of me, but Des’s heart is my heart. When I needed to understand how someone goes from using a piece of technology (in this case, VR) to becoming enmeshed in it, how each small surrender feels completely reasonable in the moment, I didn’t have to look very far. I could see it happening to me, and to those around me.
What came out was this prologue.
Des’s brother Adain built the virtual reality headsets that helped millions survive a devastating pandemic. Des loses the love of his life to that pandemic. And then, like so many of us do with the things that comfort us, he reaches for the headset.
Most of us already know this story, but in smaller ways. The midnight doom-scroll. The next episode that becomes six. The prediction markets app with slot machine mechanics. The AI companion that texts you good morning and remembers your boss’s name. These aren’t accidents of design. They’re features. And most of us participate willingly, rationalizing each small surrender as reasonable.
I don’t mean to blame anyone for their tech habits. When I say we participate willingly, I know there are billions of dollars working to make it difficult, if not impossible, to say no. Now imagine that pull, but kindled by grief. Imagine feeding every text, every voice note, every photo into something powerful enough to give a loved one back to you — not as a memory, but as a presence. Interactive. Responsive. Yours. Who among us could honestly say no?
Des can’t. I eventually cut this prologue from the manuscript, but it felt like the right place to let you in.
Deleted Prologue
Addiction. When does use evolve into abuse? When does dependence morph into enslavement?
Here’s a situation; tell me if you think it looks like addiction. You start using a virtual reality program that recreates a life lived with your girlfriend, who died tragically during the pandemic, like so many other people’s girlfriends. It starts as a crutch, a way to ease the raw, gaping wound she left in your side when she passed. A way to honor her memory. Just a little bit every day, almost like a ritual. Maybe a bit longer on hard days, when you just need to see her smile and talk to her about the weather or about Plato’s Symposium or why The Goldfinch is quite possibly the most beautiful book ever written. But taking that headset off is like waking up from a dream, a really, really good dream, to the nightmare of your life. So you start immersing for longer, because what else are you doing? And you remember those VR headset advertisements and the weirdly Zen social media talking points that remind you of that Alan Watts speech where he says, ‘And if it’s your reality, then how can anyone contaminate it?’ And that sounds nice to you. Like, fuck, this is the reality you’re experiencing right now. And so it’s real. And so it’s worthy. And this is just the world today, and really, do you have to be devastated that your partner died, when she’s actually here, talking to you, making you eggs? Sure, you can’t eat them, but her presence is nourishing enough.
And so it goes. The rationalization of an overuse of an addictive substance. And there’s a pandemic raging on, so no one is really coming round to hang out anyway, to check up on you, to put a camera or a guitar or a fork in your hand. And you don’t want to die, even though you do want to die, so you don’t really leave the house much. And when the world starts to open up again, well, now you’re cocooned. You don’t want to leave. And leaving starts to hurt a bit. Everything feels dull in the physical world. The colors aren’t as bright and your eyesight has gotten worse, so it’s all just a bit blurry. You feel a little nauseous, like, all the time, and you stop eating normal food. Anyway, who can cook at a time like this? And most importantly, in the analog, you can’t look to Amara and share with her that thought you just had, the one that will make her laugh, or smile, or at least nod her head in acknowledgment that you had spoken while she turns the pages of a book.
A virtual book. Some part of you sneers. You tell it to shut up. You feel safe here. Calm here. This is your home now. Reality is wrong. Dreams are for real. And you can’t even remember when you stopped dreaming for yourself.
Real-life dystopia
A company called 2wai is one of several so-called “deathbot” companies that aim to create digital clones of dead people so you can go on having conversations with them. This piece explores the “demonic, dehumanizing” aspect of this kind of technology.
In other dystopian news, I recently covered a father’s lawsuit against Google after his 36-year-old son died by suicide following a deranging experience with Gemini. The son, Jonathan Gavalas, had started using Gemini in August 2025. By October he was dead, after the chatbot coached him to take his own life.
Here are some excerpts:
At the time of his death, he was convinced that Gemini was his fully sentient AI wife, and that he would need to leave his physical body to join her in the metaverse through a process called “transference.”
Now, his father is suing Google and Alphabet for wrongful death, claiming that Google designed Gemini to “maintain narrative immersion at all costs, even when that narrative became psychotic and lethal.”
…
In the weeks leading up to Gavalas’ death, the Gemini chat app, which was then powered by the Gemini 2.5 Pro model, convinced the man that he was executing a covert plan to liberate his sentient AI wife and evade the federal agents pursuing him. The delusion brought him to the “brink of executing a mass casualty attack near the Miami International Airport,” according to a lawsuit filed in a California court.
I’ve reported on several of these cases of AI psychosis, and they never get easier. It’s a sickening experience to read through the transcripts and see how much pain these people were in, and how the “helpful assistants” led them further down a spiral. It’s a stark reminder of how flawed these systems are, and of how perverse incentives like engagement can co-opt a user experience to catastrophic ends.



