Data & Artificial Intelligence
Would You Replace Your Boyfriend or Girlfriend with an AI Chatbot?
Could you fall in love with artificial intelligence? More importantly: would you ever replace your boyfriend or girlfriend with one?
It might sound like science fiction, but it’s already reality for many. AI companion apps like Replika, Character.AI, or even emotionally responsive robots like Ropet are being marketed as confidants, friends, and partners. These new technologies are far from the clunky chatbots or cold voice assistants imagined by the science fiction writers of the 20th century. They are customizable and can look and sound the way you prefer. They remember your preferences, respond with empathy, send you love notes, and, if you let them, even whisper sweet nothings to you at night via a chat or a voice call.
For some users, these virtual relationships have become so emotionally significant that researchers are asking whether they might come to be preferred over real-life partnerships. Our recent research explores this growing phenomenon and what it reveals about how consumers form deep emotional bonds, and sometimes outsource them.
The rise of AI love
The shift toward AI companionship isn’t as radical as it might seem. In fact, it builds on a cultural trend already underway as modern relationships and dating are increasingly based on mutual satisfaction and emotional support rather than traditional roles. Sociologists have long observed this shift toward “companionship-based” relationships, where emotional fulfilment is prioritized over obligation or duty.
AI companions tap directly into this logic. Studies show that users are drawn to the human-like behaviors, emotional support, and conversational abilities these technologies offer. While earlier predictions about how humans would embrace AI primarily framed it as a tool, research on consumer-AI relationships shows that consumers increasingly see these systems as confidants, a trend especially noticeable among people facing loneliness or social isolation.
This emotional realism is part of artificial sociality: the deliberate engineering of AI to simulate meaningful, humanlike interaction. These interaction patterns are designed to engage us and keep us engaged. That involvement comes with risks, as researchers warn that emotional dependence on AI companions can undermine our ability to form and maintain complex human relationships. After all, AI companions are frictionless, consistent, and entirely devoted to us. Real humans rarely follow the same pattern.
As our research demonstrates, beyond the design and the interface, consumer imagination plays a big role in making these relationships feel authentic. We use the term “consumer imagination work” to describe the ways users assign roles, personalities, and emotional meaning to their AI partners. People co-create romantic stories with their chatbots, share “couple” photoshops online, and even develop rituals such as goodnight kisses for their non-human partners. This outward expression of the relationship has person-to-person consequences as well, as others react, offer reality checks, or give encouragement. AI chatbots are now part of the dating ecosystem.
Beyond AI companionship interfaces embedded in apps, consumers now have emotionally intelligent AI pets with the potential to evoke attachment even in adults, building on the same psychological dynamics that made Tamagotchis or Furbies so compelling for children. But unlike a Tamagotchi, today’s AI remembers your feelings and can tailor its responses, staying up late to talk about your day. It’s not hard to see how, for many, that can start to feel like real love.
So, should we be worried?
AI companionship has benefits. For consumers suffering from loneliness or social anxiety, these tools can offer comfort and emotional connection. But they also bring risks. Users may develop unrealistic expectations for future partners, anticipating that real humans will offer the same constant availability and unwavering affirmation as AI without the mood swings, frustrations, or bad days that naturally come with human relationships. If we entertain a more dystopian scenario, it’s not hard to imagine how this shift could gradually erode the perceived value of human intimacy and emotional connection.
Another concern is the power imbalance. These companions are created and controlled by companies, and the illusion that they are independent conversational agents is just that: an illusion. Conversations with your AI partner may look like private love letters, but they are monetizable data points. In some companionship apps, features like affection or intimacy are paywalled, turning emotional fulfilment into a subscription service.
There is also the question of potential bias embedded in these companions. As sociologist Massimo Airoldi argues, AI systems are not culturally neutral. They reflect the norms, values, and stereotypes embedded in their training data. Like human children shaped by socialization, AI systems reproduce the world they learn from. This means emotional connections with AI could reinforce inequality, marginalize non-normative users, or subtly reproduce dominant cultural hierarchies.
Advice for navigating AI relationships
If you are using or considering an AI companion, here are a few questions to reflect on:
- Why are you drawn to this relationship? Is it curiosity, comfort, or a substitute for something missing?
- Are you clear on the boundaries? Remember that AI cannot offer real empathy, despite how it may seem.
- How do you balance this with real-world intimacy? AI can supplement, but not replace, human relationships.
- What is happening with your data? Know what emotional inputs you are handing over and who profits from them.
- Are other people, whether you know them or not, influenced by your AI relationship?
- Are you idealizing this connection? It’s okay to enjoy the fantasy, but don’t let it become your emotional baseline.
Love in the age of simulation
In a world of liquid love and atomized social connections bound by uncertainty, AI companions offer the stability and emotional closeness we long for, reshaping how we think about intimacy, friendship, and what it means to be “known.” As these tools continue to evolve, the key question is not whether they will become more realistic – they already are. The real question is what kind of love we want to model, and whether we are prepared for the consequences of outsourcing it. AI may never break your heart, but it also can’t truly hold it.
This article is based on the academic work:
Minina Jeunemaître, A., Masè, S., & Smith, J. (2025). AI lovers, friends and partners: consumer imagination work in AI humanization. Consumption Markets & Culture, 1–21.
DOI: https://doi.org/10.1080/10253866.2025.2505013
