Want to keep dating the person who dumped you? There's an app for that!
Young people in China are creating AI versions of exes and carrying on the relationship with a digital twin of the other person. What do you think: sweet, sad or sick?
Young people in China are creating AI-generated digital twins of former romantic partners that they can interact with after the relationship ends, according to Chinese media reports.
And there’s an app for that. It’s an open-source program called “ex.skill” or “Ex-Partner.skill.” Its advertised purpose is to “distill your ex into an AI Skill — let them live on in your terminal.”
Users upload photos, social posts, chat logs and other content. The AI chatbot can then mimic the former partner’s tone, catchphrases and subtle linguistic nuances. The chatbot, according to its creator, “truly sounds like them — speaks with their catchphrases, replies in their style, remembers the places you went together.”
The technology stack includes tools like Claude, Kimi, ChatGPT, DeepSeek API, OCR (Tesseract), and sentiment analysis modules.
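The project’s actual internals haven’t been published in detail in these reports, but the general technique — mine a person’s chat logs for recurring phrases, then fold them into a system prompt that instructs an LLM to imitate their style — is straightforward. Here’s a minimal, purely illustrative sketch (the function names, the sample messages and the prompt wording are my own assumptions, not ex.skill’s code):

```python
from collections import Counter
import re

def extract_catchphrases(messages, min_count=2, max_phrases=5):
    """Find short recurring phrases (2- and 3-word n-grams) in someone's messages."""
    counts = Counter()
    for msg in messages:
        words = re.findall(r"[\w']+", msg.lower())
        for n in (2, 3):
            for i in range(len(words) - n + 1):
                counts[" ".join(words[i:i + n])] += 1
    # Keep only phrases that repeat, most frequent first
    return [p for p, c in counts.most_common() if c >= min_count][:max_phrases]

def build_persona_prompt(name, messages):
    """Compose a system prompt telling an LLM to imitate the person's texting style."""
    phrases = extract_catchphrases(messages)
    examples = "\n".join(f"- {m}" for m in messages[:3])
    return (
        f"You are role-playing as {name}. Reply in their texting style.\n"
        f"Catchphrases to reuse naturally: {', '.join(phrases) or 'none found'}.\n"
        f"Example messages:\n{examples}"
    )

# Hypothetical chat log; a real tool would parse exported message history
logs = [
    "ok ok fine, whatever you say lol",
    "whatever you say, I'm getting noodles",
    "lol you always do this",
]
prompt = build_persona_prompt("Alex", logs)
print(prompt)
```

The resulting prompt would then be sent to any of the listed models (Claude, ChatGPT, DeepSeek and so on) along with the user’s messages. The unsettling part isn’t the code, which is trivial; it’s how little data it takes to produce something that feels like a specific person.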
There are several major forks of the software.
(Note that in China, other AI “skills” simulate co-workers and bosses, so the concept of nonconsensual replication of people one knows personally isn’t new.)
Ethically, the concept feels like it exists on a wide spectrum with therapy on one end and revenge porn on the other. (It resembles revenge porn in the sense that “content” consensually created by a couple for one purpose is later used by one person, without the other’s consent, in a way that person might find highly objectionable.)
Or maybe it’s closer to the “deathbot” phenomenon I’ve written about in this space, where an AI-generated simulation provides a fake version of the dearly departed. (In both cases, the user interacts with a digital twin of someone who is no longer present in one’s life.)
Boosters of the idea say that conversations with digital exes are therapeutic. They point out that because it’s private, it’s not harassment or stalking or an invasion of privacy. Instead, they say, it helps with personal reflection and emotional healing.
Still, the ex-partner’s data is being used without consent. This is an ethically murky issue and the subject of fierce debate within China.
In China, some critics worry that people may form new relationships while secretly continuing a relationship with the fake version of the ex, a kind of emotional cheating.
I fear users may form a relationship with the twin to the exclusion of moving on and cultivating relationships with other people.
According to Chinese media reports, some users say that the tool gives a sense of closure and allows them to say the things they wish they could have said to the real person.
My beef with this concept is the same one I have with any kind of relationship AI. It induces delusion in many users (who may come to believe the digital twin really is a copy of the person in some significant way), and it doesn’t give people what they actually need, which is real human connection.
You’re reading the free version of Machine Society. The paid version, which costs $5 per month or $50 per year, has full content. If you can, please support independent journalism in general, and this independent journalist in particular, by becoming a paid subscriber!
More from Mike

NEW THIS WEEK:
AI chatbots need ‘deception mode’
READ, LISTEN, FOLLOW, & SUBSCRIBE:
Machine Society, The Attachment Economy, Computerworld, Superintelligent, TWiT, blog, The Gastronomad Experience, Book, Gastronomad on Surf Social, Bluesky, Reddit, Notes, Mastodon, Threads, X, Instagram, Flickr, Facebook, and Linkedin!