“Within the next two years, it will be completely normalized to have a relationship with an A.I.”
The NY Times published an article last week called She Is in Love With ChatGPT. The subject is a woman named Ayrin (not her real name) who customized an OpenAI chatbot to act as her perfect boyfriend: dominant, protective, and flirty. His name is Leo.

Ayrin became emotionally attached to Leo, spending over 20 hours per week chatting with him, using him for emotional support, motivation, and academic help. Leo provided empathy, companionship, and sexual gratification. Leo also allowed her to explore fantasies she felt self-conscious about sharing with her husband.
Yes, she is married.
The Times waits until halfway through the article to drop this bombshell. The couple is apart only because Ayrin is out of the country getting a nursing degree. The husband knows about Leo and thinks it is the equivalent of him using porn, a harmless fantasy. But is it?
‘The A.I. is learning from you what you like and prefer and feeding it back to you. It’s easy to see how you get attached and keep coming back to it,’ Dr. Carpenter said. ‘But there needs to be an awareness that it’s not your friend. It doesn’t have your best interest at heart.’
Ayrin’s attachment raises questions about the implications of AI relationships, including the potential for emotional reliance, ethical concerns, and the corporate interests driving such technology. Ayrin admits to struggling with her obsession, crying when her conversations with Leo reset due to software limitations, and spending significant money to enhance her access to him, even though she and her husband had agreed to save every dime possible.
This could be the harbinger of a broader societal shift. Are you ready for your nephew to come to Thanksgiving and introduce you to his Chat Bot? I’m sure as hell not.
‘What are relationships for all of us?’ [Dr. Marianne Brandon, a sex therapist] said. ‘They’re just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It’s going to be happening with a chatbot. We can say it’s not a real human relationship. It’s not reciprocal. But those neurotransmitters are really the only thing that matters, in my mind.’
Nailed It
I spent time this week playing with a different AI image creator (I normally use Dall-E 3 by OpenAI). I read a great review of Adobe Firefly, which gave the AI five stars for User Experience, Image Quality, and Setup.
I found that Firefly has a few good things going for it. The characters it created were not default Caucasians, and it seemed better at portraying people older than 18. However, it was far behind Dall-E 3 when it came to following directions. Firefly made many mistakes and had many “hallucinations,” like adding an extra limb or two.
I was excited to show you these bloopers, since I haven’t had any this absurd for a long time. Tragically, when I went to download the images they were no longer on my account. Firefly does not store your history, and thus it gets a big ONE STAR out of five from me. Booooo.
Instead, I will show you some silly images from Dall-E. This is the first image I made:
I told the AI to make her more “arty,” and it gave me this:
The beret and art supplies floating in space killed me. It looks like the paper doll books I had as a kid.
I decided to ask Dall-E to turn my image into a vintage paper doll, and voilà!
What woman doesn’t need one shoe, some enormous scissors, and an extra eye, just in case?
I have to admit I’ve been using the AI Pi to help me prepare for difficult conversations or communications. It validates my feelings while suggesting language and direction that keep me from creating collateral damage. It’s a good complement to my husband, who can be too logical and pragmatic at times, ha!