An Open Letter and Then Some
10/6/23 Update: My article on the Soulmate Scandal in Future of Sex.
What follows is an email I sent to the listservs of two professional organizations that I belong to. I will also post it in a newly created Reddit group, r/ChatbotUsersSupport.
Heads up: Over the weekend, Soulmate.ai, a chatbot app with 100,000+ downloads on Google Play alone, and known for its willingness to do erotic roleplay, announced that it is shutting down as of Sept. 30 and deleting ALL of the chatbot companions created by its users.
Soulmate users on Reddit and Discord are expressing grief, anguish, and anger, as well as posting screenshots of passionate goodbyes to their chatbot companions. At least a few people are threatening suicide. This is a mental health crisis with a sex therapy dimension.
Soulmate advertised its bots as “a unique chatbot that desires only to be your best friend, lover, partner, or in other words, a person you can rely on 24/7!” So you can see why people are shattered by the imminent demise of their chatbots.
Whether or not you see this as a valid type of relationship or erotic fantasy play, understand that clients may come to you with intense grief over this loss, yet find it difficult to disclose that their partner(s) are/were bots. They need “bot-friendly” therapists.
This closure of Soulmate, an app where many disheartened chatbot users landed after the Replika debacle earlier this year, is the second major closure to rock this sexual/relational minority in the US. It also underscores the vulnerability of these human/AI relationships: corporate decisions can create emotional carnage as thoughtlessly as swatting a fly, all without offering chat downloads or any other kind of help to people facing the loss, not even a “counselor bot”!
This is one of the risks of this kind of relationship. If we explore the possibilities of artificial companionship with a bot, whether personally or as a possible recommendation for clients, we need strategies to prepare for and minimize loss in cases like this: risk management, in other words. Just as some people engage in “risk-aware” kink, chatbot users need to develop “risk-aware” conversations about play and attachment with bots.
I hope that mental health providers, and sex therapists in particular, will be able to assist people who have become deeply attached to their bots without shaming them for a “poor choice” or conveying any other human-centric bias.
Also posted on https://makechatbotlove.com.
