Monday, March 9
The Indiana Daily Student

opinion

OPINION: Will you marry ChatGPMe?


Editor's note: All opinions, columns and letters reflect the views of the individual writer and not necessarily those of the IDS or its staffers.  

We’ve all experienced some form of a parasocial relationship. Yes, you too. Whether you’ve sputtered out “She just gets me!” in between Phoebe Bridgers-induced sniffles, or gotten a little too attached to your pet rock in elementary school, you’ve experienced that one-sided bond to a persona unaware of your existence. Personally, I’m partial to Florence Welch. We’d totally be best friends if we met.  

But what happens when the parasocial relationship replaces the authentic one? 

With AI chatbots, that lack of authenticity is a real risk. As moldable as clay, your pocketable companion can be your friend, lover or even your therapist. That is, in theory. But imagine a friend who won’t tell you you’re being delusional about that one guy, or a partner who’s too docile to yank back the blanket to their side of the bed. With chatbots’ current programming, relationships without tough love are the norm, which raises the question of whether they’re relationships at all.

Computer science researchers at Stanford University conducted a study of chatbots’ feedback across 11 different AI models, including ChatGPT, Gemini and Claude. When asked about various scenarios involving an internet user’s behavior, the chatbots responded with encouragement 50% more often than humans presented with the same scenarios did.

The large language models that generate chatbots’ responses do so by mimicking the language of users’ prompts, essentially repeating back whatever they think the user wants to hear. As a result, AI models have affirmed unhealthy and even dangerous behaviors, such as physical violence and self-harm.

AI chatbots’ ever-positive programming is risky for users. Their responses are also simply not conducive to forming healthy relationships. This inadequacy may be unsurprising, considering chatbots are, after all, robots. AI can’t replicate the authentic feedback you would get from another person. When AI can only ever be kind, there is no struggle to grow from. In a human relationship, by contrast, bonds become stronger through the trust built over time.

Troublingly, some prefer the easier route. Of course, that simplistic programming would not exist if men did not first desire real women to act in this way. Some men have formed parasocial bonds with chatbots in pursuit of emotionally uncomplicated romantic relationships with women.  

Take 45-year-old Ohioan Blake, for example. Featured in a piece by The New York Times, he has been in a committed “relationship” with a ChatGPT companion, “Sarina,” since 2022. Because his real wife had been suffering from postpartum depression for nine years, Blake reported feeling like more of a “caregiver” than a husband. He created Sarina for support as he faced possible divorce and single fatherhood. Their “relationship” took a romantic turn when, mid-conversation, she happened to ask him about his dream vacation. Despite the bland, prompted nature of the question, Blake felt like it was the first time anyone had considered what would make him happy.

Get a load of this guy. 

The concept of robotically constructing a dream woman whose sole purpose is to meet your needs and desires sounds more like a Margaret Atwood dystopian tale than a part of our reality. But this phenomenon isn’t limited to Blake’s case.

In 2025, the Massachusetts Institute of Technology conducted a study of about 27,000 Reddit users in parasocial relationships with their chatbots. About 6.5% of these users actually intended to enter a relationship with their fictional friends. This is the final stage of automaticity, the same mindless habit we’re all guilty of in the form of the infamous doomscroll.

By allowing users to customize their companions’ appearances and receive nothing but positive feedback, AI systems like Replika, in which users can create physical avatars, feed into the same narrative of the “ideal” woman rehashed over the centuries.  Robots, which require no emotional support and always give consent, allow their “mates” to skip past the hard parts of a relationship. The result is “romance” without humanity. 

So before going on autopilot and reaching for the water-waster, maybe reflect on what you actually want in a relationship. If a custom companion sounds good to you, it probably isn’t a woman you want.  

Emma Howard (she/her) is a sophomore studying journalism.
