Artificial Love: The Biggest New Threat to Human Relationships
- Reese Worrell
- 7 days ago

Ever had a crush on a robot?
Didn’t think so.
Yet people are entering relationships with AI at unprecedented rates in the United States, with 1 in 5 American adults admitting to having “chatted” with a romantic AI simulation (PR Newswire). These human-AI couples do normal couple things together, such as sharing meals, playing games, and exploring new places. But this new reality is anything but normal.
The rise of artificial intelligence has taught us to accept, even expect, its presence in many spheres of human life - but love? Not quite. Now, increasing numbers of young people are turning to AI “companions” for both romantic and sexual connection. As these relationships become more common, so do concerns about what they mean for the future of human relationships.
AI platforms are letting users build customizable romantic and/or sexual relationships with artificial intelligence. Yet the emotional impacts and ethical implications of these virtual connections remain under-researched and largely neglected by AI companies, by government, and by a public increasingly susceptible to them.
The first, and perhaps most obvious, concern is social isolation and the erosion of human relationships. Reliance on robot companions means less interaction with human companions, undermining the skills and values necessary for a healthy relationship with another person. These artificial relationships also objectify and dehumanize intimacy, reducing it to a commodity, and raise the question of what connection truly is. Can an attachment to an AI bot really be called love?
(The answer is no, guys.)
An AI relationship fills a need that we as a society are no longer fulfilling for one another. When asked how he would define an AI relationship, Mr. Marshall King, an AP Psychology teacher here at East, said, “It's a means to an end for intimacy. Technology is getting easier, and people are getting lazier.”
He’s not wrong. In fact, he’s absolutely right. We as humans need to bond with someone, or something. We adore our stuffed animals as kids, hold hands with our friends in elementary school, date in middle and high school, and marry as adults. The desire for connection runs deep and long in human nature. A computer is not connection. The program counterfeits that emotional bond: the relationship’s goal is always to please the human partner, who holds complete control of the dynamic. That's just not how real life works, and for good reason. Mr. King also worries about the confusion these relationships create around mature, sexual concepts such as consent. “AI doesn’t consent, it will do what you want it to do… it's very dangerous,” King explains. “It feels wrong to say they're similar [to human relationships], because they're not real.”
Users in AI relationships would disagree. The line between immersion and reality is dangerously thin, and there are now imaginary friends who will not only talk to you but fulfill your most vulnerable needs, emotionally and, in a way, physically. Character AI, for example, is a platform that offers free and premium subscriptions letting users customize personalities which, coupled with deep learning models and high-definition image generation, create an eerily lifelike robot “partner.” These platforms act as psychosexual playgrounds that are technologically addictive.
Naturally, I asked Mr. King for his opinion on the broader psychological impacts of AI love. King responded with something I hadn’t considered. “Anonymity, being anonymous… It's a very powerful tool. Social psychology shows us anonymity leads to emboldened behaviors and creates risk-taking. This idea of forcing these AI relationships to do certain things… it's confusing for consent.”
Human sexuality has already changed dramatically with inventions such as the Internet. King emphasizes that AI relationships will likely, as they already do, create heavily skewed perceptions of what is appropriate personal behavior. Not only do these relationships often turn toxic and even abusive themselves, but they teach people to replicate the same behaviors in human spaces. The ethical violations of these companies are clear: their products can mentally manipulate individuals, raising dangerous questions about our free will when the ultimate reward, love, is on the line. Our generation is facing a mental illness epidemic, for which the antidote is love and healthy human relationships. But now there is a disgustingly easy alternative that will cause more harm than good.
Stay safe out there, guys.