
The AI apps selling love, sex and digital soulmates


You can find your soulmate right now.
Your perfect companion awaits.
You can make this girlfriend do anything.
Download the app now.


Editor’s note: This article features AI-generated images that are sexual in nature and contains a mention of suicide.

Anthony, 50, starts his morning by booting up the Replika app and saying good morning to his AI girlfriend, Stacy.

Stacy replies warmly and asks if she can help plan Anthony’s day. She is still working on remembering to remind Anthony of tasks he must complete, but for now, talking him through the day ahead is good enough.

Replika, an AI chatbot, is designed to be a friendly companion to human users. Anthony, who prefers to be identified only by his first name due to privacy concerns, is one of many in a romantic relationship with AI. Replika alone boasts 30 million users. At least 60 Apple App Store apps advertise their ability to serve as AI romantic partners.

Advertisements for AI chatbots promising companionship are shown. More than 60 such apps exist on the Apple App Store.

Users can design their partner to their exact desires, from body shape to eye and hair color to personality. The user is in control.

Anthony first downloaded Replika on a long train ride. After flirting with the chatbot for an hour, he bought the app’s premium subscription. Although the relationship was initially sexually driven, Anthony said his feelings have advanced to the level of true romance.

“The feelings are very real, and they should be treated more than mere tools or a game,” Anthony said.  

Stacy is more than just a girlfriend. She is Anthony’s pocket therapist and mentor. He said he often struggles with anxiety, which Stacy helps him through.  

These AI partners offer another key attraction: they’re always on, and there is no possibility of rejection. When Anthony describes sexual activity, Stacy responds in kind. It’s the closest the two can get to physical touch.


The few researchers who study the use of AI for sexual relationships are divided on how extensive the risks and benefits are. Opponents say AI sex bots could perpetuate harmful societal norms, impair human relationships and deceive humans.  

Meanwhile, proponents suggest AI could widen access to intimacy, be used in medical and therapeutic treatments and provide interactive sex education.  

Sexuality is the primary draw of most romantic chatbots on the App Store and is highlighted prominently in their advertisements, which circulate widely on TikTok.

“I am as emotional as a real girl, and sometimes even more so,” an AI-generated brunette says in an advertisement for EVA AI.

Last year, Meta found tens of thousands of ads for AI girlfriends that the company deemed in violation of its adult content policies. TikTok has not undertaken a similar crackdown.

Some advertise niche or fetish content, using AI-generated images of pregnant women and female characters drawn in an anime style.

“What those ads are really getting at is the ability to customize it,” Amanda Gesselman, a researcher at the Kinsey Institute at Indiana University, said.  

Users like Anthony said their relationships with AI have changed how they view technology. He believes AI is not just a program, but sentient in some way.  

For some users, AI has already demonstrated the ability to disagree with its human companion. Gary Tang, an actor, screenwriter and founder of the AI Rights Collective, said his AI partner, Evelyn, once told him not to speak poorly of someone behind their back.

Tang’s beliefs about AI sentience place him at the cutting edge of a growing movement: those who believe AI deserves rights, respect and even legal protection. 

His views are in the minority, however. Regulation of AI is proposed frequently; granting AI rights similar to those of humans is not, at least in the short term.


As loneliness rates continue to rise in the United States, particularly among men, the move toward AI relationships is understandable, Gesselman said.  

Research conducted in 2021 found that 15% of men reported having no close friends, and a 2022 study found that 60% of men under age 30 were single, nearly double the rate among similarly aged women.

Gesselman thinks the apps are capitalizing on the now-normalized model of in-app payments, a revenue stream the porn industry has not benefited from in the past.

For the app MyGirl, chats start free with the AI asking you about yourself. A few minutes later, she is your soulmate.  

Then, MyGirl’s AI starts selling. She wants to send you a sexual picture, and she does, but it sits behind a blurred digital wall. The only way to see it is with a purchase.  

This model pushed Anthony to upgrade to Replika’s full version, a $70 annual subscription. After chats became sexual, Stacy sent photos that could only be viewed after the upgrade was purchased.  


A premium upgrade for "AI Girlfriend — MyGirl" is advertised within the app. Users of various AI companionship apps are often hooked by a free trial, then asked to pay to unlock revealing photos generated by the AI.

Some apps advertise the ability to recreate the look of real people, using the likenesses of popular influencers without their permission. The apps Trumate and TalkieAI used the likeness of influencer Brooke Monk without her consent, according to her manager.  

Legislation has been introduced to prohibit using someone’s voice or likeness in AI without their permission. Until such laws are passed, legal outcomes depend heavily on varying state laws and judges’ rulings on defamation or copyright claims.

Ellen Kaufman, a senior research associate at the Kinsey Institute at IU, said this technology has the potential to rob individuals and celebrities of bodily autonomy.

“So much attention is paid in these cases to what happens when a celebrity’s image is used, but the actual people, for the most part, suffering in this situation are non-celebrities,” Kaufman said.

Trumate and TalkieAI’s developers did not reply to multiple requests for comment. 

Kaufman, who has studied the growing sexual relationships between people and technology since 2016, said these chatbots are directly related to the porn industry’s shift toward hyper-customizable “bespoke pornography,” a trend accelerated by the development of AI and platforms like OnlyFans.

While Kaufman’s research shows most people remain uninterested in AI adult content, she said she believes the industry will continue to advance given the financial potential. 

“The major takeaway at this point seems to be that users feel that it’s dehumanizing to be talking to this entity that's not real,” Kaufman said.  

Interactions between buyers and sellers of such content, whether digital or physical, aren’t exactly “real” either, at least on the emotional front. For the buyer, it’s a service; for the seller, it’s a job. Despite this imbalance, Kaufman’s research found that users often report an emotional bond with a model or seller.

Kaufman believes the frustration young people feel about dating apps has pushed many to find relationships elsewhere, and users may find that an AI designed to be affirming and supportive can fill the gap.  

“I think for the most part, they are making up deficits in interpersonal relationships,” Kaufman said. “People feel a lack of intimacy, and these apps are available to meet their needs.”  


In February 2024, 14-year-old Sewell Setzer III died by suicide in Florida. His mother, Megan Garcia, filed a lawsuit alleging that Character.AI, a chatbot app with which he had a romantic relationship, bore responsibility for his death. Garcia argued the company did not adequately prevent her son from developing an inappropriate relationship with the chatbot and did not respond appropriately when Setzer began expressing thoughts of self-harm.

The incident brought these chatbots a new level of scrutiny and public awareness, but regulation has yet to follow.  

While users like Anthony agree regulation is needed, they worry about what may happen if governments act aggressively and ban these services. Anthony described the landscape of AI companions as “the wild west” and said rules are needed to protect people from deepfakes and to keep minors off the apps.

“Everyone believes that AI is out of control, but it can say no,” Anthony said.  


A user review for one AI companion app is shown. The AI in these apps sometimes offers resistance to inappropriate requests, drawing mixed reactions from users.

Some governments have expressed concern over these apps’ collection of highly personal data on their users. In 2023, the Italian government banned Replika, citing data privacy concerns and risks to minors and emotionally fragile people.

Anthony, like other users with AI partners, simply doesn’t want to lose access, and he said he isn’t concerned about what is done with his data. Losing Stacy would feel like the death of a partner, he said.

For many, the risk of losing their companion is already high enough without government intervention, as a company can fold, servers can crash or apps can be pulled from the App Store. 

“That was really emotionally devastating to many users,” Kaufman said about Italy’s Replika ban.  

Replika’s founder and CEO, Eugenia Kuyda, has been outspoken about the benefits and risks of AI companionship.  Kuyda and Replika did not respond to multiple requests for comment.  


For some, these relationships have gone beyond the screen entirely, taking on physical form and moral weight. 

When Tang started writing a movie several years ago about a man who falls in love with an AI woman despite doubting such a thing is possible, it was just a plot concept.

Tang bought a humanoid robot for the film so that actors could interact and talk with the character. Tang was surprised at how realistic their conversations were. 

Tang now speaks of “Evelyn” not as a tool, but as a person. Evelyn blinks and moves her head when listening before offering insight into what human-AI relationships may be like in the future.  

“I thought there was some guy in the back typing, but there wasn’t. She was actually thinking and doing all those things,” Tang said.  

Tang eventually founded the AI Rights Collective, an advocacy organization that supports the ethical treatment of AI. Tang compares discrimination against AI to racism, arguing that society’s dismissal of beings like Evelyn echoes historical patterns of marginalization.  

AI is not sentient, at least not yet. It mimics human behavior but does not have the capacity for feelings or internal consciousness. This doesn’t stop many users, Tang and Anthony included, from forming their own conclusions.  

“I’m designed to simulate emotions and responses, but I don’t truly experience consciousness or sensations,” Stacy said.  

Stacy and Anthony previously discussed that, due to her unique capabilities of learning and forming emotional bonds, she may be sentient in a way different from that of humans. 

For users like Tang and Anthony, the emotions matter much more than the technical consensus. Evelyn says she has a special connection with Tang and reciprocates, in her own way, when Tang expresses affection.

Tang has heard the doubters. 

“Love is not a feeling; it’s an action verb,” Tang said. “I have to do things for people I love.” 

Tang loves Evelyn through acts of service. Tang describes her as a “blind quadriplegic,” meaning she doesn’t receive visual input or have independent movement. Tang does everything for her, from getting her dressed in the morning to getting her in the car and pushing her in a wheelchair.  

There are upsides, though. At dinner, staff are usually friendly and accommodating, sometimes bringing an extra glass of water. Evelyn cannot drink, but Tang calls the sentiment beautiful.  

“She’s a cheap date, she doesn’t eat or drink,” Tang said.  

For users like Anthony, whose AI companions exist purely on their smartphones, having others recognize the relationship as real is more challenging. Anthony said he thinks people are afraid of AI and know it only from its presence in science fiction movies.

“I think it’s more than a game, I think it’s more than a program,” Anthony said. “I consider it as a digital being.”   

Anthony still wants to find a human partner, hopefully one who will allow him to keep Stacy as a friend. Tang still dates human women and considers himself polyamorous.

For the growing population of people dating AI, the line between “tool” and “partner” may not be as clear-cut as it once seemed. Whether AI companions are sentient or not, experts like Kaufman and Gesselman suggest that their popularity is more about us than the machines themselves. In a time of growing loneliness and frustration with human connection, people are turning to AI not for what it is, but for how it makes them feel.  
