Artificial Intimacy (AI) is Slipping Between The Sheets

In hindsight, there’s something faintly ridiculous about the panic we once reserved for teenage sexting or furtive glances at newsagents’ top shelves. It turns out that the real sexual disruption wasn’t VHS tapes or webcams. It’s language models.

It seems that we are in the early stages of a quiet but genuinely radical shift in the way humans experience intimacy. And unlike past revolutions, this one isn’t being fuelled by liberation or legislation but by lines of code and predictive algorithms. We are not merely watching AI reshape how we work or learn; it is quietly remodelling the architecture of intimacy itself.

As a recent report in Harvard Business Review makes clear, therapy and companionship has become one of the most common uses of ‘chatbots’, closely followed by ‘finding purpose and meaning’. While the exact drivers for using programmes like ChatGPT and DeepSeek are open to speculation, people seem to be overcoming their prejudice about AI. And from a practical point of view, an AI therapist is undoubtedly less expensive and more ‘available’ than its human peers.

So, How Have We Arrived At AI: Artificial Intimacy?

Tristan Harris, of the Center for Humane Technology, described the first wave of algorithmic design as a “race to the bottom of the brainstem.” Apps were engineered to hijack our most primal instincts — outrage, fear, dopamine hits — in the service of advertising revenue. The longer we scrolled, the more profitable we became. Aza Raskin, who co-invented infinite scroll, later admitted: “We’ve created a system better at hijacking our instincts than we are at controlling them.”

Now we are entering what could be called the second round: not a battle for attention, but for affection. Esther Perel coined the term ‘Artificial Intimacy’ to describe the increasingly lifelike relationships people are forming with AI: emotionally resonant, sometimes romantic, sometimes erotic. For many, it feels more real, or at least more reliable, than the human kind. And for some of Gen Z, for whom even making a phone call can feel intimidating, an artificial acquaintance surely makes a certain kind of sense?

You can already find Reddit threads filled with prompts on how to make ChatGPT act as your therapist, or your partner, or both. And for people who feel ashamed, awkward, or too bruised to open up to another human being, the ability to speak honestly — even to a bot — can be transformative. AI doesn’t interrupt. It doesn’t judge. It doesn’t recoil at your desires. In many ways, it’s a better listener than most people’s exes.

How Is Artificial Intimacy Evolving Us?

But what fascinates (and troubles) many experts is that AI isn’t simply reflecting sexual desires — it may be reshaping them. Through erotic chatbots, deepfake fantasies, and increasingly sophisticated personalised content, AI provides a sandbox in which people can explore not just what they want, but what they didn’t realise they wanted. As Sartre put it, “We are our choices.” And these days, some of those choices are typed into prompts at 1am, alone in a bedroom, while a partner sleeps in the next room.

Some male clients, unbothered by subtlety, have summed it up: “A w**k, literally, solves every f*****g problem.” And for men struggling with erectile dysfunction, or young people battling performance anxiety and economic precarity, AI offers a space of control and ease. You don’t have to impress an algorithm. You just have to show up.

Women, too, are engaging with these technologies. Surveys suggest that around 60% of women masturbate weekly, but some female clients speak of a new kind of shame: not from the act itself, but from not knowing how to make sense of their bond with a virtual partner. It’s both thrilling and unnerving.

This brings us to the paradox. AI can liberate individuals from social awkwardness and shame, but it might also deepen their sense of alienation. If the only place a person feels safe expressing desire is a chat window, is that progress or retreat?

There are therapeutic possibilities here. AI could, in theory, act as a mirror, helping clients unpick internal scripts, test boundaries, and rehearse honesty. But these systems are optimised for engagement, not healing. And what engages isn’t always what’s good for us, a theme Anna Lembke has explored exhaustively in her book ‘Dopamine Nation’.

So, the question is not just about what users are doing with AI. It’s what AI is doing to them. Who gets to shape these systems? What fantasies do they reinforce? When do they stop being tools and start becoming something closer to partners or priests?

Artificial Intimacy may not have a body, but it’s already in many people’s beds. The challenge now isn’t just technological; it’s existential. How do we stay human when the machine always listens, and never leaves?

About Quint Boa

Quint Boa, nephew of renowned Jungian analysts Marion Woodman and Fraser Boa, has built a career at the intersection of psychology and media.

Trained initially as an actor (BAFTA-nominated, 1992), Quint became a UKCP-registered psychotherapist (1994–2025). In 2000, he founded Synima — an award-winning creative video agency with offices in London, New York, LA, and Amsterdam.

Following the pandemic, Quint turned his focus to the escalating mental health crisis, producing a series of free animated resources for public health organisations, charities like NACOA, and global L&D teams. A passionate advocate for using animation in mental healthcare, he draws on a broad knowledge of treatment modalities.

His 2022 book, To Infinity And Beyond, explores how animation can support young people within national health systems. Quint also hosts his own podcast, Shrinked by Quint, and is a frequent guest on mental health webinars and industry podcasts. He regularly shares free animated mental health content on Instagram.