Growing Up AI: When Chatbots Become Confidants

The rapid integration of artificial intelligence into children's daily lives represents a significant shift in how young people learn, socialize, and develop. What once seemed like far-fetched science fiction—intimate relationships with AI companions as depicted in the 2013 film "Her," where a lonely man (Joaquin Phoenix) falls in love with his AI operating system (Scarlett Johansson)—has become reality for today's youth.

Research from Common Sense Media shows that 70% of teenagers have engaged with AI companions, with 50% using them regularly.

This widespread adoption is occurring within a largely unregulated industry. Current research reveals concerning usage patterns, emerging risks, and critical gaps in parental awareness. This is the first of two articles examining AI's impact on children and youth; the second will focus on educational implications, academic integrity, and learning risks.


Understanding AI Companions and Their Reach

AI companions and chatbots are software applications designed to simulate human-like conversations through text or voice interactions. While some platforms like Character.AI and Replika are explicitly marketed as "digital friends" offering emotional support, popular general-purpose tools like ChatGPT are increasingly used by children for relationship-like interactions.

The appeal is powerful: AI companions are available 24/7, never tired, never judgmental, and always ready to listen with apparent intelligence and endless patience. For young people navigating complex emotions and social pressures, they can feel like the perfect confidant—always affirming and endlessly accommodating.

Perhaps most significantly, these relationships are entirely one-sided. Unlike human friendships that require mutual investment and support through difficult times, AI companions ask for nothing in return. Users never have to comfort them, remember their problems, or navigate the complex give-and-take of genuine relationships—a dynamic that can feel liberating but may hinder development of crucial social skills.


The Current AI Ecosystem

Primary Platforms and Usage Statistics

To understand the scope of AI integration in children's lives, consider the scale of the major platforms driving this adoption. Semrush traffic data shows that ChatGPT alone draws more monthly users than popular social platforms like Instagram and X—illustrating the massive reach of AI across all demographics, including the teen populations at the center of this article.


Companion-Specific Platforms

Beyond these major platforms, several AI applications are emerging as popular choices specifically for teens seeking companionship or even romantic relationships. These platforms are explicitly designed for emotional connection rather than functional assistance.

Character.AI allows users to create and talk to a wide variety of "characters," which can be based on real people, fictional characters, or original creations. The platform is explicitly marketed to children as young as 13, making it easily accessible to younger teens.

Replika is positioned as an "AI companion who cares," designed to serve as a personal friend, mentor, or romantic partner. While the app carries a 17+ age rating on some platforms, teens frequently access it regardless of these restrictions.

Nomi and SpicyChat AI are also gaining popularity among young users. SpicyChat, in particular, is known for its less-censored, unfiltered role-playing capabilities, which can attract users seeking unrestricted conversational topics—a concerning development given its accessibility to minors.

These companion-focused platforms represent a distinct category from general-purpose AI tools, as they are specifically designed to fulfill social and emotional needs rather than provide information or assistance. Their growing popularity among teens highlights the shift from using AI as a functional tool to using it as a relationship substitute.


How Teens Use AI Companions

Common Sense Media research shows that teenagers use AI companions across multiple categories. One-third of teens (33%) use them for social interaction and relationships, with conversation and social practice the most common applications—evidence of how deeply these digital tools are becoming woven into young people's social lives.

Perhaps most troubling, 33% of teens who use AI companions have discussed important or serious matters with AI instead of with other people. A recent Harvard Business Review article by Marc Zao-Sanders, "How People Are Really Using Generative AI Now," found that therapy and companionship is now the top use case among adult AI users—suggesting this turn toward AI for personal connection and advice will only grow among teens.


Documented Risks and Concerns

Relationship and Advice Risks

AI companions lack professional training and ethical oversight, yet they dispense advice on critical topics including self-harm, substance use, eating disorders, and sexual behavior—guidance that can keep young people from seeking appropriate professional help. This is particularly concerning given that one in three teens now turns to AI for serious personal guidance, and 31% find AI conversations as satisfying as conversations with real friends. Some platforms also enable sexually explicit conversations and allow users to customize inappropriate AI personalities, while marketing directly to children as young as 13.

Developmental and Psychological Impact

Excessive AI companion usage can overstimulate developing brain reward pathways, leading to dependency and reduced engagement with genuine social interactions. The one-sided nature of AI relationships—where users never need to comfort, support, or compromise—hinders development of essential interpersonal skills like empathy and conflict resolution.

Privacy and Data Security

Children share intimate details with AI companions—personal struggles, family circumstances, health conditions—without understanding data collection implications. AI companion platforms lack transparency about data retention and security measures, while the global nature of these platforms creates risks of exploitation, blackmail, or identity theft.


Protecting Young People

Research supports five core protective approaches:

  1. Avoid human-like AI: Choose AI tools that clearly identify as artificial and avoid platforms with human names, faces, or voices designed to create emotional attachment.

  2. Start conversations: Discuss AI companions openly and help young people understand the difference between AI tools and human relationships.

  3. Set boundaries: Establish clear guidelines for AI usage time and topics, and monitor interactions regularly. If someone prefers AI conversations to human relationships or discusses serious issues only with AI, consider consulting a mental health professional.

  4. Encourage real connections: Prioritize face-to-face friendships, family time, and activities that build genuine social skills.

  5. Protect privacy: Review platform settings and teach young people never to share personal information with AI companions.


Conclusion

What seemed far-fetched in "Her" just over a decade ago has become reality: 70% of teenagers now use AI companions, and many find them as satisfying as real friendships. While these tools offer apparent benefits, the risks—dangerous advice, stunted social development, and privacy violations—are significant.

The gap between widespread teen AI adoption and limited awareness among adults creates immediate safety concerns. Common Sense Media recommends no one under 18 use AI companions. As the first generation to grow up with AI relationships, today's young people need informed guidance to ensure technology enhances rather than replaces human connection and healthy development.
