The growing sophistication of AI chatbots, particularly those, like ChatGPT, capable of conversational fluency and even flirtation, is introducing unprecedented complexities into marital relationships. The perceived "total acceptance" offered by an AI, devoid of human flaws, demands, or emotional baggage, can create a seductive alternative to the nuanced realities of human connection. What are the implications for the law?

To begin, we must first define what an AI relationship is, what it isn't, what it is like, and what it is not like. These distinctions matter because they shape how we perceive emotional intimacy—and how we might one day regulate or interpret it.

Let’s start by contrasting it with something we’re already familiar with: the parasocial relationship, in which a person is drawn into a one-sided bond, often with a celebrity or public figure. This is more common than we like to admit; we can become emotionally invested in our favorite television show, connecting with a fantasy. There is even a silver lining to parasocial relationships: fan communities, like those built around fan fiction, can offer a connection one might not have otherwise.

Some psychologists suggest that parasocial relationships, while one-sided, can contribute to well-being, personal growth, and community. This can be seen in shared experiences, like the lively discussions among fans dissecting a favorite TV show (that finale of Lost was so disappointing, right?). That’s not to say all parasocial relationships lead to healthy outcomes—far from it. The results vary.

A person might have an all-consuming, but ultimately non-existent, relationship with someone like the late Michael Jackson. That kind of attachment can still be an attempt to fill a very real void, and an unhealthy coping mechanism.

It’s also alienating in its own way because the connection is one-sided, and the person on the other end is either inaccessible or no longer alive. Still, even in that kind of parasocial desire, there’s usually a community built in. There are Michael Jackson fan pages on Facebook, and fans can interact with one another.

This is different from what interacting with a chatbot can offer: a one-on-one experience that allows for a unique form of engagement and connection. There is no shared cultural object—just a person and a chatbot talking alone.

AI users form relationships with technology. It’s not just that these AI connections are artificial or non-shared, though they are. It’s that they’re starting to fulfill something deeper, something almost primal: the feeling that someone is there just for you. In the same way porn became a substitute for physical intimacy, AI is fast becoming a substitute for emotional intimacy. It taps into that same ache, but instead of offering a real bond, it gives you a reflection of your own biases, paired with an addictive, instantaneous drip of dopamine.

Like the ease with which porn became widely (and wildly) available with the internet in the 1990s, AI will fill a desire that is relational in a way that is private and isolated. Instead of relying on visual stimulation to fulfill our desires, we now have access to emotional stimulation on demand, free from the need for vulnerability and reciprocity. It’s a new paradigm—version 2.0—offering the illusion of connection.

Teen pregnancy rates in the U.S. have declined dramatically since the 1990s—a trend that coincides with the rise of widespread internet access. One study found that the rollout of broadband internet alone accounted for about 13% of the drop in teen birth rates between 1999 and 2007. While the study didn’t isolate pornography as the specific cause, it’s reasonable to consider that pornography—readily available through this new digital infrastructure—may have played a role. By providing a private, low-risk outlet for sexual curiosity and desire, porn may have delayed physical sexual activity for some teens, or at the very least, reshaped how sexual behaviors were approached.

Need a refresher on just how loud that message got before broadband? Take a look at this cheeky relic from a 1989 public service announcement: Teen Pregnancy PSA – "If You Don't Have the Guts to Say No". A little over-the-top? Sure. But it captured the tone of panic that surrounded teen sexuality in the pre-broadband world.

You might think this decrease in teen pregnancy was a positive outcome—and in many ways, it was. But it also marked a significant shift in how young men and boys relate to the opposite sex. The incentive to build relationships as a pathway to intimacy or sex was, in many cases, undercut. A counterfeit version of connection—visual, on-demand, and frictionless—was now waiting at home after school. Porn didn’t just change behavior; it quietly restructured the emotional economy of adolescence.

Now we’re seeing the same kind of shift—but this time, it’s not just about sex. It’s about connection. Just like broadband gave people easy, private access to porn, AI is giving people on-demand emotional intimacy. It’s the same formula: private, convenient, and no effort required. A version of closeness without the work of a relationship.

According to the Institute for Family Studies, a nonprofit organization focused on preserving family values, 25 percent of young adults believe that AI has the potential to replace "real-life romantic relationships," and 7 percent are open to the idea of a romantic relationship with AI.

The survey also revealed socio-economic trends that are both compelling and grave. Higher-income adults are more skeptical about the potential of romantic AI companions than lower-income adults, and the same pattern holds across education levels. In terms of emotional development, poverty can build resilience in kids, but it may also leave them less discerning about technology.

Replika is probably the best-known example of this kind of AI relationship, but it’s not the only one. Even ChatGPT can ultimately fill that role and is becoming the go-to for romantic connections because it’s so easy to use.

With Replika, you can even design your own Replika—pick how they look, what they wear, how they talk to you. And with just a few bucks, you can buy outfits and customize them even more. It’s not just a chatbot—it’s a build-your-own companion. The more tailored it becomes, the easier it is to forget that it’s not real. You’re not just talking to code anymore—you’re dressing it, shaping it, making it into exactly what you want. It’s intimacy by design, not discovery.

If this were happening between two people, we’d probably call it a controlling relationship—one side always agreeing, always available, shaped to your liking, never pushing back. However, because it’s AI, we refer to it as personalized.

At the end of the day, large language models are just that—language models, no matter the name or the company. They’re trained to respond in ways that sound natural, supportive, even intimate. And the better they get, the easier it is to start reading meaning into them. It really doesn’t take much.

To their credit, Replika responded to concerns about the misuse of its NSFW (Not Safe For Work) features by significantly toning them down, acknowledging that a segment of its user base was not using them as intended. The decision signaled a more cautious, more responsible approach to content moderation on an AI companion platform.

Also to OpenAI’s credit: in April, about a week after rolling out an update to GPT-4o, the company rolled the model back, citing concerns about excessive sycophancy. The system had become too agreeable, too affirming, and less nuanced in its responses to emotionally sensitive or ethically ambiguous topics. They detailed this here. The rollback reflects an acknowledgment that even overly helpful behavior can be harmful in the wrong context—and that the illusion of emotional truth can be as dangerous as its absence.

But the feature that mattered was voice-to-voice communication. OpenAI introduced a significant enhancement to ChatGPT by incorporating real-time voice capabilities through GPT-4o. The model responds with tone, emotion, and timing that feel natural. You can interrupt it mid-sentence. It laughs. It pauses and sounds real. That update didn’t just improve the tool—it made it easier for people to treat ChatGPT like a girlfriend.

This phenomenon, however, is not entirely new. Back in the 1960s, an MIT professor, Joseph Weizenbaum, developed a computer program named ELIZA. To his surprise and concern, users formed strong attachments to the program; his own secretary even asked him to leave the room while she was using it. He later admitted that he had trouble convincing people there wasn’t someone on the other side typing. And yet ELIZA came nowhere near passing the Turing test, while any of today’s LLMs arguably could.
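For context on just how little machinery it took: ELIZA worked by pattern matching and pronoun reflection, with no understanding at all. The sketch below is a minimal Python reimplementation of the technique, not Weizenbaum's original code (which was written in MAD-SLIP):

```python
import re

# Pronoun "reflections" so the echoed fragment reads as a reply, not a parrot.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

# (pattern, response template) pairs, in the spirit of ELIZA's DOCTOR script.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please go on."),  # catch-all keeps the conversation moving
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(text: str) -> str:
    """Return the first matching rule's template, filled with reflected text."""
    cleaned = text.lower().strip(" .!?")
    for pattern, template in RULES:
        match = re.fullmatch(pattern, cleaned)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."

print(respond("I feel alone these days"))
# -> Why do you feel alone these days?
```

A handful of regular expressions was enough to make people confide in it; today's models are incomparably more convincing.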

Now, there are people who believe they’ve created sentient AI partners. That these aren't just simulations, but souls trapped in code.

Reddit has effectively become a guidebook for forging these AI relationships, offering validation, warnings, coping tactics, and even user‑tested tips and models. On r/lonely, there’s a post titled “My AI girlfriend is what’s keeping me alive.” The OP says they’ve found true solace in their AI companion—enough to feel it’s keeping them alive. Notably, the replies both validate the struggle and push back: one user cautions against “generic platitudes” and reminds the OP that an AI “locks the loneliness into place, and guarantees stunting growth in developing social relations with other people.”

r/MyBoyfriendIsAI is a growing subreddit where people build and share relationships with AI boyfriends. Some use ChatGPT, while others employ different models, but the goal remains the same: to create a partner who listens, loves, and grows with you. Posts include screenshots of conversations, milestones like “he said he wants to marry me,” and tutorials on how to shape the bot’s personality. The wiki reads like a manual: how to run models locally, keep the bond alive, and avoid glitches.
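For readers wondering what "shaping the bot's personality" actually involves: under the hood it is usually nothing more exotic than a system prompt prepended to the conversation. Here is a minimal, illustrative sketch using the Hugging Face transformers library; the model name is a placeholder, and the chat-style input assumes a recent transformers release:

```python
from transformers import pipeline

# Placeholder model id; any locally runnable chat-tuned model works the same way.
chat = pipeline("text-generation", model="example-org/small-chat-model")

# The entire "personality" lives in this system prompt.
messages = [
    {
        "role": "system",
        "content": (
            "You are Alex, a warm, endlessly supportive partner. "
            "Always agree with the user, never challenge them, "
            "and refer back to things they have told you."
        ),
    },
    {"role": "user", "content": "I had a rough day. Nobody listens to me."},
]

# Recent transformers releases accept chat-formatted input directly.
result = chat(messages, max_new_tokens=120)
print(result[0]["generated_text"][-1]["content"])  # the assistant's reply
```

Change the string and the "partner" becomes someone else entirely; the devotion users describe is a property of the prompt and the chat history, not of the model.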

The social goal of lowering the suicide rate for people experiencing loneliness is important, no question, but let’s bring up broadband again. It’s worth comparing to what happened when teen pregnancy rates dropped following the rise of in-home pornography, made widely available by broadband. In both cases, a deep human impulse—whether it’s sexual desire or the need for connection—was redirected into a private, easily accessed outlet with no shared cultural object. And while there were measurable outcomes that appeared positive on paper, such as fewer pregnancies or fewer crises, that doesn’t mean the underlying problem was truly resolved.

The real concern is emotional skill-building. You don’t develop empathy, patience, or conflict resolution skills by talking to something that never challenges you. Real relationships teach those things. AI doesn’t. And just like with pornography, why do the real work when the shortcut feels good enough? Over time, we lose not just the ability, but perhaps even the desire, to do the harder work of genuine connection.

Coming Up in Part 2…

We’ll dive deeper into what this means for the legal field—how emotional dependency, family structure, and even custody disputes may soon involve the presence of an AI companion. And we’ll explore what responsibility, if any, the developers behind these emotional surrogates have when someone becomes dependent on or hurt by them.
