Few students are dating, forming relationships, or even socializing in person these days. Rates of sexual activity have declined, and surveys consistently point to rising loneliness among young adults. Much ink has been spilled on that trend, but a newer—and more dystopian—development is emerging: artificial intelligence (AI) companions promising to fill the void left by the collapse of real-world relationships.
Beyond academic cheating, one in five high schoolers has begun using these tools for emotional and even romantic engagement. In fact, one analysis of more than a million ChatGPT interaction logs found that sexual role-playing ranked among the system’s most common uses, second only to creative writing, the very task large language models were originally designed to perform.
OpenAI, the company behind ChatGPT, has faced lawsuits over the risks associated with its algorithms. Beginning in 2023 and continuing through 2025, a series of suits alleged harms ranging from defamation to negligence and emotional distress tied to AI-generated content. In response to growing public and regulatory pressure, OpenAI introduced stricter safeguards across 2024 and 2025, including limits on sexually explicit content, tighter moderation of romantic role-play (especially for minors), and expanded parental controls. Restrictions on mainstream platforms such as ChatGPT, however, are largely ineffective, as alternative platforms designed specifically for intimate interaction continue to proliferate.
Apps such as Replika (launched in 2017, with adult features expanded in the early 2020s), Eva AI (early 2020s), Nomi AI (2023), and MyAnima (rebranded and expanded in the 2020s) have positioned themselves as customizable AI companions, offering users a form of intimacy largely unconstrained by the safeguards imposed on mainstream systems.
To be sure, AI did not create the demand for digital companionship; many young people were already struggling to build intimacy and connect with others long before these apps existed. But these platforms are not filling a need so much as exploiting a wound.
Rather than helping young people develop the social skills they need to connect with others, these apps let users bypass the difficult parts of human connection. Virtual partners are always present, attentive, and endlessly adjustable: customizable in appearance and personality, and available in unlimited numbers without responsibility or commitment. They eliminate rejection, conflict, and even awkward silence, offering the illusion of frictionless intimacy designed entirely around the user.
And the appetite for exactly that kind of effortless connection turns out to be enormous. Nearly half of surveyed teenagers say conversations with AI systems are more satisfying than those with real people, often because they perceive AI as nonjudgmental. More broadly, over 40 percent of users report that AI systems are better listeners and even more understanding than other people. Preference for AI companions is especially pronounced among young men, who are more likely than women to use AI systems for sexual arousal.
If it’s not obvious, this is bad news.
Normalizing AI companionship as a substitute for human relationships risks a slow erosion of the social fabric itself. Relationships are the foundation of families, communities, and civic life—and when those bonds weaken, the consequences are tangible for both the individual and society.
Every hour spent in frictionless AI companionship, for instance, is an hour not spent building the skills real relationships demand. Over time this tradeoff compounds, working against the very grain of human development. Scaled across a generation, the consequences may ultimately register at the economic level: a society that produces less and reproduces less.
Individual agency remains central in shaping how AI affects relationships, and that begins with clarity about what these systems actually are. A relationship in which one party is infinitely adjustable is not a relationship. It is a product. But for many young users—socialized in an environment of low-effort, phatic communication, ephemeral exchanges, and swipe-based dating apps—AI companionship is appealing precisely because it allows them to fully curate their interactions, excluding disagreement, avoiding conflict, and sidestepping the reciprocal demands of real relationships.
The question for our time is whether we want to preserve human relationships. Doing so requires both individual and societal responsibility. At the individual level, the principle is simple: tools should serve their purpose. AI was built to enhance productivity, not to replace human intimacy. At the societal level, the responsibility falls in part on institutions. Universities and educational environments are uniquely positioned to counteract this trend. They bring together large numbers of young people with shared interests and life stages—conditions that are increasingly rare beyond such intentional settings, yet often underutilized as sites of social formation.
Reversing this trend requires expanding opportunities for in-person interaction beyond the classroom. Sports, arts, and informal gatherings that foster repeated, low-stakes contact create the conditions for connection. These environments allow social skills and relationships to develop through shared experiences, even when interactions feel uncomfortable.
Before the pandemic, these forms of sociality were more deeply embedded in student life. Recreating those conditions is necessary. Young people cannot outsource intimacy. Love without risk loses its substance, and the willingness to embrace that risk defines what it means to remain human.