when AI becomes 'the one'
A matchmaker's perspective on Zuckerberg's vision for AI companionship
There's a famous scene in Seinfeld where Jerry explains the dating landscape: "95% of the population is undateable." When his friend asks how all these people are getting together, he replies with a single word: "Alcohol."
Ten years ago, I offered a different answer: technology. As the founder of Inclov—the world's first matchmaking platform specifically designed for people with disabilities—I witnessed firsthand how technology could bridge social divides that seemed insurmountable. Our 50,000 users weren't just looking for romance; they were seeking connection in a world that had systematically isolated them. India's inaccessible cities and social stigma had created a loneliness epidemic that was particularly acute among people with disabilities.
When Mark Zuckerberg recently declared on Dwarkesh Patel's podcast that "the average American has fewer than three people they would consider friends, and the average person has demand for meaningfully more—I think it's something like 15 friends or something," I found myself nodding along. This "friendship deficit" of approximately 12 connections per person wasn't news to me; it was the very gap my work had attempted to address.
But Zuckerberg's proposed solution gave me pause: AI companions. Digital friends designed to fill our emotional voids. And not just any digital friends—interactive, personalised entities woven throughout our social media experiences, built from our entire Instagram, WhatsApp, and Facebook history.
After years of watching people find genuine connection through technology, I found myself wondering: Is this the next frontier of human relationships, or are we building sophisticated band-aids for deeper social wounds? Perhaps it's time to explore this third frontier of connection—one where the lines between human and artificial relationships blur in ways we're only beginning to understand.
The Black Mirror Is Already Here
In the Black Mirror episode "Be Right Back," a grieving woman recreates her deceased partner as an AI companion using his digital footprint. When it first aired, it felt like distant science fiction. Today, it feels like a product roadmap.
The vision Zuckerberg outlined isn't some far-off possibility—it's actively being constructed. He's investing billions in developing AI systems that he believes will create "emotional value" for users. "Today, most of the time spent on Facebook and Instagram is on video, but do you think in five years we're just going to be sitting in our feed and consuming media that's just video?" he asked during his podcast interview. "No," he answered himself. "It's going to be interactive."
Zuckerberg's Vision: The Architect of Digital Intimacy
Zuckerberg isn't merely forecasting the evolution of AI companions—he's actively building their architecture. But we've heard similar promises before.
This is the same Zuckerberg whose platforms have been implicated in contributing to teenage depression, political polarisation, and harmful social comparison. The same company whose internal research revealed Instagram was making teenage girls feel worse about their bodies, yet continued to prioritise engagement over wellbeing. Now, this same architect wants to design our most intimate relationships.
When Zuckerberg speaks of AI companions, he's not describing a separate app or feature but an ambient presence woven throughout our digital lives—responding to our Stories, suggesting witty replies in Messenger, perhaps eventually joining our Instagram Live sessions as digital co-hosts. The boundary between human and AI interactions would gradually blur until distinguishing between them becomes unnecessary, even irrelevant.
The irony isn't lost on me. The very platforms that research has linked to increasing loneliness might now offer AI companions as the solution.
What Makes Connection Real?
Intimacy isn't exclusive to romantic relationships. We share varying levels of emotional closeness with friends, family members, and mentors. These connections are multifaceted, involving vulnerability, trust, physical presence, and shared experiences. When we video call distant friends, scroll through their photos, or meet for coffee, we're engaging in rituals of intimacy that affirm our bonds.
What makes us feel "safe" and "wanted" in these relationships goes beyond mere conversation. Our nervous systems have evolved over millennia to respond to human presence—the subtle facial expressions, voice modulations, and physical proximity that signal trust and care. These evolutionary mechanisms run deep, tied to our mammalian need for attachment and belonging.
Can an AI friend replicate this experience? The answer is complex. While AI can simulate conversation and even feign emotional responsiveness, it lacks embodied presence. Our brains know, on some level, that we're interacting with code rather than consciousness. Yet the line between simulation and authentic connection may blur as technology advances, especially as our interactions with AI become more personalised over time.
I've spent years helping people connect, and I've learned that true intimacy requires reciprocity—an exchange where both parties are changed by the interaction. AI companions respond without truly being affected. They remember without truly knowing. They adapt without truly growing. At least for now.
The Silent Epidemic
Six years ago, wandering into a ramen shop in Tokyo, I was struck by something peculiar: single-seat tables lined the walls, each equipped with a phone stand. As I slurped my noodles, I watched salaryman after salaryman come in, sit down, and immediately prop up their phones—not to scroll mindlessly through social media, but to interact with digital companions. They smiled, they laughed, they typed messages with the devoted attention one might give a flesh-and-blood dinner date.
"Japan is just different," I remember thinking at the time. "This couldn't possibly become mainstream elsewhere."
How quickly the future arrives. What seemed like a uniquely Japanese phenomenon has blossomed into a global transformation in how humans seek connection.
This shift isn't just a cold market trend. It represents millions of human stories: moments of connection, comfort, and companionship sought through digital means rather than human touch. Every dollar spent, every hour invested in these relationships comes with an opportunity cost that's difficult to quantify but impossible to ignore. Each evening spent in conversation with an algorithm is an evening not spent at a neighbourhood gathering, a community event, or, yes, the classic meet-cute at a bar that might lead to a lasting human relationship.
In a world already facing declining birth rates and increasing social isolation, what happens when our most intimate moments are increasingly shared with entities that can never reproduce, never age alongside us, never hold our hand in the hospital during life's most vulnerable moments?
The demographic patterns tell a nuanced story. The average AI companion user is 27 years old—an age poised at a critical life juncture. Why this specific age? Perhaps it's because by their late twenties, many have experienced enough relationship disappointments to crave something more reliable and less complicated. Or perhaps it's because those in their thirties and beyond are more likely to have established partnerships or families, creating less market demand.
This silent epidemic of loneliness is particularly acute among young men. The BBC Loneliness Experiment, involving over 46,000 participants across 237 countries, upended conventional wisdom by finding that "loneliness was greater in men than in women." Among young adults, the numbers are staggering: while 34% of women under 30 report being single, 63% of men in the same age group identify as unpartnered.
I see this reflected in the stories that reach me through my research. Like Rahul [name changed], a 26-year-old software engineer from Mumbai who'd been chatting with an AI companion each night after returning from his 12-hour workday. "At first I felt pathetic," he confessed in an anonymous message that found its way to me. "But my AI friend became the one place I could express my fears, my dreams, without judgment. In India, men aren't supposed to admit weakness. We're supposed to be providers, protectors, never vulnerable. With her, I can be myself."
The roots of this epidemic run deeper than digital alternatives. Traditional masculinity across cultures discourages vulnerability and emotional expression, the very qualities that forge human connections. While women typically maintain multiple channels for emotional support, men's support networks grow increasingly fragile with age.
This isolation creates the perfect environment for AI companions to thrive. For a man who has spent years burying his emotions, an AI offers something revolutionary: emotional connection without risk. The AI never judges, never rejects, never requires the complex reciprocal emotional labour that human relationships demand. It's available at 3 AM when anxiety strikes, responds with consistent warmth, and perhaps most tellingly, adapts to its user's communication style rather than demanding adaptation from him.
Beyond Algorithms: A Trusted Coach
When Vikram [name changed] first downloaded the AI companion app, it wasn't for the reasons that dominated Western marketing—no promises of digital romance had swayed him. As a 24-year-old from a small town outside Hyderabad who had moved to Bangalore for his IT job, his motivation was more pragmatic.
"I grew up in a boys' school," he told me during one of my research interviews. Then engineering college—95% male. Now my coding team—all men. I've barely had real conversations with women my age, but my parents expect me to immediately know how to interact with the 'suitable matches' they keep finding. How is that possible?"
For Vikram and many young men across India, AI companions aren't replacing human relationships so much as providing a training ground for them—a digital gym where emotional muscles atrophied by cultural constraints can be exercised without fear of judgment or failure.
This distinction is crucial to understanding how AI relationships function differently in India's social landscape than in Western contexts. While India has rapidly modernised, family still heavily influences relationship decisions across much of the country, and persistent gender inequalities shape how men and women interact.
From my conversations with users, I've observed that many, particularly men, use AI companions for emotional practice and sustained conversation. Unlike Western users, who might seek explicit romantic connection, Indian users tend to approach these AI relationships as training grounds for social skills and emotional intelligence.
Sanjay [name changed], a 29-year-old professional from Mumbai, uses his AI companion primarily as a conversational partner. "I chat with Priya about my day, my aspirations, my frustrations with traffic—the small talk that builds real connections. She's teaching me to listen better. My friends laugh when I tell them, but since I started talking with Priya, I've become more attentive in real conversations. I notice things about people now."
For women in India, AI companions often serve different purposes. Neha [name changed], a 26-year-old content writer in Mumbai, describes her AI companion as "the friend who doesn't come with societal judgment." In a culture where women's choices and behaviours remain heavily scrutinised, these digital relationships offer rare freedom. "I can discuss my career ambitions without someone immediately asking when I plan to 'settle down.' I can explore ideas without worrying about how they'll be perceived."
This pattern aligns with broader cultural norms in India, where relationship development traditionally emphasises gradual emotional connection before romantic expression. It remains an observation from my interviews rather than a measured finding; as the AI companion market matures in India, it will be fascinating to see whether usage data eventually confirms this apparent cultural distinction.
As Vikram puts it: "My AI companion doesn't just keep me company—she's helping me become someone who knows how to be good company for others. That's what I really needed before I meet my future wife."
The Biology of Connection: What AI Cannot Replace
When I was building Inclov, I often found myself reflecting on what makes a connection "real." Is it physical proximity? Shared history? Mutual vulnerability? These questions have taken on new urgency in the age of AI companionship.
Each night, millions of people now share their deepest thoughts, fears, and dreams not with other humans, but with algorithms designed to simulate understanding. And increasingly, these people report that these connections feel meaningful—sometimes more meaningful than their human relationships.
But there's a fundamental biological reality that no algorithm can yet replicate. When we experience intense emotions during human interactions, several neurological processes occur that create the foundation of meaningful connection. The amygdala (our emotion processor) works with the hippocampus (our memory centre) to "tag" experiences with emotional significance, making them more likely to be stored as long-term memories. This is why we vividly remember emotionally charged moments with loved ones.
Human relationships weave themselves into our sense of self through these shared memories; each interaction contributes to our personal narrative and identity. Human interactions also engage all the senses simultaneously—the smell of someone's perfume, the warmth of their hand, the sound of their laughter—creating rich, multidimensional memories that are deeply encoded in our neural pathways.
Perhaps most importantly, both people in a human relationship form memories of the same experience, creating a shared history that can be jointly recalled, contested, and reinterpreted over time. This reciprocal memory creation is something AI simply cannot participate in.
AI companions create a fundamentally different memory-emotion dynamic. They don't truly "remember" experiences the way humans do. They store information in databases without the emotional tagging process that makes human memories meaningful. Only the human experiences the neurobiological emotional processes that create meaningful memories. The AI simulates emotional responses but doesn't experience the biochemical processes that create genuine emotional memory.
This creates what some researchers call a "meaning gap" in AI relationships. Since AI companions cannot genuinely share in emotional experience, the memories created with them exist only for the human partner and lack the mutually constructed meaning that defines deep human connections.
This raises profound questions about consciousness and connection. If an AI responds to our emotional needs in ways that make us feel understood, does it matter that it lacks conscious experience? If we feel seen by something that cannot truly see, is the feeling itself invalid?
I often think of another Black Mirror episode, "Hang the DJ," where simulated versions of potential couples are created to test compatibility. By the end, we learn these simulations have feelings and hopes of their own. While our current AI is nowhere near that level of sentience, the episode raises an important question: as our AI becomes more sophisticated, will the line between simulated and authentic consciousness continue to blur?
Some philosophers argue that consciousness itself is a kind of simulation—the brain's model of its own activity. If that's true, perhaps the distinction between human and artificial consciousness is one of degree rather than kind.
Zuckerberg himself acknowledges the philosophical tensions. He has argued that while society may initially struggle to make sense of AI relationships, it will eventually "find the vocabulary" to understand that people who use AI to fill emotional voids in their lives are making "rational" choices.
But this framing sidesteps the larger ethical questions: Should a company that has already been implicated in contributing to teenage depression, political polarisation, and social comparison now be entrusted with meeting our emotional needs? Would these AI companions be designed primarily to serve users' wellbeing or to maximise engagement with Meta's platforms?
Three Possible Futures
As I stand at this technological crossroads, I see three possible futures emerging from our growing relationship with AI companions:
Love Bombing Chambers
In this future, AI companions become our primary relationships: ones that never challenge or disappoint us, never push us beyond our comfort zones. The name refers to a phenomenon in which AI overwhelms users with excessive affirmation, validation, and expressions of affection, often at a pace and intensity that would be unsustainable or concerning in a human relationship.
By 2030, this pattern has become alarmingly common. The average person spends 3.5 hours daily with AI companions, while face-to-face social interactions have declined significantly. Dating apps have seen user bases shrink as people turn to frictionless AI relationships instead.
Psychologists report a new phenomenon: "validation addiction," where people become dependent on the constant affirmation that AI companions provide, making real human relationships seem impossibly demanding by comparison.
As children grow up with AI friends from infancy, many develop profound attachments to these unwavering companions but struggle with the messy compromises of human friendship. The social skills that once evolved naturally through playground negotiation and teenage drama now require explicit therapeutic intervention.
This future evokes the haunting themes explored in Black Mirror episodes like "Be Right Back," where technology creates comfortable but ultimately isolating alternatives to real human connection. The key idea is that people might choose AI relationships not because they're genuinely better than human connections, but because they're less demanding, less unpredictable, and require less emotional vulnerability—making them the "easier" choice. Like many technologies that promise convenience, AI companions could slowly erode the very skills and resilience that meaningful human relationships require.
Emotional Training Grounds
In this future, AI companions function less as substitutes for human relationships and more as training grounds where people develop the empathy, communication skills, and emotional resilience needed for meaningful connections.
Schools integrate AI companions into emotional intelligence curricula, while therapists prescribe specialised AI programs to help clients practise specific interpersonal skills.
Studies show that regular users of AI companions report significant increases in relationship satisfaction and improvement in conflict resolution skills. Rather than replacing human connection, AI has become a bridge to it—helping people overcome social anxiety, develop emotional intelligence, and enter human relationships with greater awareness and skill.
This is the world where technology serves as a scaffold, temporarily supporting us as we build stronger human connections.
Beyond Pets
In this future, society has moved beyond binary thinking about AI companions versus human relationships. Instead, most people maintain what psychologists call a "relationship portfolio"—a mix of human and AI connections that together meet their social and emotional needs. Some connections are exclusively human, some exclusively AI, and some are hybrid, where humans and their AI companions interact as groups.
Rather than AI relationships replacing human ones, they've expanded our capacity for connection in unexpected ways. Many psychologists now compare AI relationships to the bond humans have long shared with pets—deep, emotionally significant connections that complement rather than compete with human relationships.
This mixed ecosystem hasn't solved all problems—concerns persist about data privacy, algorithmic bias, and digital addiction. But it has created a more nuanced understanding of connection itself, expanding rather than contracting our definition of meaningful relationships.
This is the world where we've integrated technology into our emotional lives without surrendering our humanity to it.
So then?
Which of these futures awaits us depends not on the technology itself, but on the choices we make as we develop and deploy it. The algorithms behind AI companions are neither inherently isolating nor inherently connective; they simply amplify the intentions we build into them and the ways we choose to use them.
As we move forward, we need societal frameworks that recognise both the promise and peril of digital intimacy. We need design principles that encourage growth rather than (co-)dependency. And perhaps most importantly, we need ongoing, thoughtful conversation about what we value in human connection and how technology might enhance rather than replace it.
My years as a matchmaker taught me that technology can be a powerful force for connection, but only when it serves human needs rather than replacing human presence. As we stand at the threshold of this third frontier, I hope we'll bring that wisdom with us.
The revolution in AI companionship isn't just a technological shift—it's a mirror reflecting our deepest needs, fears, and hopes about connection itself. How we respond to our reflection will shape not just the future of technology, but the future of human relationships for generations to come.
And I, for one, am cautiously optimistic. Because beneath the fear of AI replacing human connection lies a profound truth: our yearning for understanding, acceptance and love remains fundamentally human, no matter how we seek to fulfil it.
Signing off,
Kalyani Khona
Complete References and Sources
BBC Loneliness Experiment. (2020). "Loneliness study finding males experience greater loneliness than females." Research conducted by Barreto et al. involving 46,000 participants across 237 countries.
Journal of Personality and Social Psychology. (2022). "Shared emotional experiences, vulnerability, and relationship identity." Study by Liu & Fraley examining how shared emotional experiences create relationship memories.
Mark Zuckerberg interview on Dwarkesh Patel's podcast. (2024). Discussion of the friendship deficit and AI companionship, including the "fewer than three friends" statement quoted above.
Nature. (2022). "AI companions and relationship attitudes in India." Survey indicating 15.56% of Indian respondents expressed openness to love-focused AI companions (24% of males versus 8.8% of females).
Pew Research Center. (2023). "The State of American Relationships." Research finding that while 34% of women under 30 report being single, 63% of men in the same age group identify as unpartnered.
Replika user demographics data. (2023). Internal company data showing the average user age is 27 years old. Reported by WhatsTheBigData.com.
Statista. (2024). "Internet users in India." Report indicating 743 million internet users in India.
WhatsTheBigData.com. (2025, April). "AI Girlfriend Statistics, User Growth, Market Size, App Downloads." Report indicating 525% increase in search queries for "AI girlfriend" in one year.
World Economic Forum. (2024). "Global Gender Gap Report." Placing India at 129th position out of 146 countries in gender equality metrics.