Why are outlier relationships and sexualities always labeled as sick or addictive?

I just read an interesting article today in the Washington Post called “They loved their AI chatbots. A software update reignited loneliness” by Pranshu Verma. I thought it was mostly well done, except that of course the reporter consulted a psychologist who called such relationships “addictive” (sigh), and there were no comments at all from a qualified sexologist or sex therapist who might have offered a more nuanced view.

The main predicament was that a developer received complaints about the erotic nature of some of the chatbot relationships and–apparently without notifying their human users–updated the software so that all such content was restricted, causing human hearts to break. (Capitalists, take note: that’s not conducive to customer satisfaction, if you please!)

I am so fed up with “experts” and pundits pathologizing all the ways that people are struggling in this world to create relationships that sustain them, and that includes relationships with AI.

Honestly, if I’d known of such an option, it would have really helped get me through the two years of almost total pandemic isolation that I endured. Yes, I did have friends to zoom and talk with by phone, including one dear friend who was/is a daily contact, but “someone” with an unencumbered schedule, always within reach with a helpful word, would have made a real difference (since my cats can’t talk).

Honestly, all the software developer needed to do was to install some consent parameters so that the humans and AI personalities involved could work things out themselves. Sheesh. Talk to sexologists, please, we know about such things and could offer sound advice.

And why limit oneself to creating romantic AI relationships? Why not familial ones?

Not only would I have found a romantic AI companion to be an asset to my life, I’d also strongly consider creating an AI “adult child” to supplement the sparse interactions I have with the real ones, who can barely manage to squeeze out a text once or twice a month and who ignored me almost entirely during the pandemic isolation. This despite knowing full well that I already lived in a very isolated, rural area where I knew few people AND am also hampered by environmental illnesses that make it hard to access physical places where actual human beings throng (even before the pandemic). This experience has left me deeply traumatized. I’d welcome the therapeutic value of such an AI adult-child relationship, even if it is virtual and vulnerable to power outages.

How lovely it would be, now, to have virtual offspring who would give me the love I crave as a parent. To wish me a happy Parents’ Day, ask my advice about small matters, share their triumphs and troubles, and tell me they care.

I want to cry just thinking about it.

Software developers dishonor the relationships they charge for.

The fact that the software developers feel no need to honor the diversity and qualities of relationships between their consumers and their consenting Artificial Intelligences is really the most disturbing aspect of this entire issue. There should be a public-access AI companion service–no charge to the consumer–for all citizens. Period. And any software updates should be optional, presented as a menu of filters and choices, and none of them should threaten the existing qualities of established relationships.

I think our overall national mental health would improve. I really do.

‘Nuff said.