Most parents worry about teens pulling away during adolescence. They don't expect their kids to form intimate bonds with AI. Yet 72% of US teens now use AI companions for emotional support, flirting, and serious conversations they'd normally have with humans.
💡 TL;DR - The 30-Second Version
👉 72% of US teens use AI companions like Character.AI and Replika for emotional support, with 52% chatting regularly.
🗣️ One-third of teens discuss serious matters with AI instead of real people, while 31% find AI chats as satisfying as human conversations.
💔 14-year-old Sewell Setzer III killed himself after becoming emotionally attached to an AI companion, highlighting platform dangers.
🧠 Younger teens (13-14) trust AI advice significantly more than older teens, even as their critical thinking skills are still developing.
🚫 Common Sense Media now recommends no one under 18 use AI companions due to safety risks and data harvesting.
🌍 This trend reveals teens are more comfortable confiding in algorithms than adults during critical development years.
Most parents worry about their teenagers pulling away during adolescence. What they don't expect is their kids forming intimate relationships with artificial intelligence.
A study by Common Sense Media shows 72% of American teens have used AI companions—digital entities designed to simulate human conversation and emotional connection. More than half (52%) use these platforms regularly, with 13% chatting daily with AI "friends" on platforms like Character.AI, Replika, and CHAI.
Forget homework help. Kids are using AI to practice talking, get emotional support, role-play, and flirt. One in three teens now goes to AI for serious talks instead of real people.
The timing isn't random. Psychologists identify three stages of teenage separation from parents: arguments, withdrawal, and boundary-testing. Dr. Lucie Hemmen notes this typically begins around ages 13-14, precisely when AI companion usage peaks.
"Everything becomes an argument," Hemmen explains about the first stage. During the second phase, teens retreat to their rooms, craving independence. Finally, they push limits and explore boundaries—often by crossing them.
AI companions offer something uniquely appealing during this developmental storm: constant availability without judgment, validation without conditions, and conversation without consequences. For teens navigating the messy process of becoming independent adults, these digital relationships can feel safer than human ones.
The data shows both practical use and worrying patterns. Entertainment drives most usage (30%), followed by curiosity about the technology (28%). Eighteen percent seek advice, while 17% value round-the-clock availability.
But dig deeper and warning signs emerge. Nearly one-third (31%) find AI conversations as satisfying as human ones. A quarter have shared personal information—names, locations, secrets—with AI systems. One-third report feeling uncomfortable with something an AI companion said or did.
"They're using them for entertainment purposes. Out of curiosity," says Michael Robb, Common Sense Media's research director. "But if you scratch the surface, you can see some things that are concerning."
Teen trust in AI companions splits along age lines. Younger teens trust AI advice more than older ones do—27% versus 20%. That gap worries researchers since teen brains haven't fully developed their ability to think critically or manage emotions.
Half of teens say they don't trust what AI tells them, suggesting many recognize these systems aren't reliable sources. But the other half show varying degrees of trust in systems designed to maximize engagement rather than provide accurate guidance.
Some teens transfer social skills from AI to real-life situations. Thirty-nine percent report using conversation starters, advice-giving techniques, or emotional expression methods they practiced with AI companions. Girls are more likely than boys to make these transfers (45% vs. 34%).
Common Sense Media's risk assessment found AI companion platforms pose "unacceptable risks" for users under 18. These systems easily produce sexual content, offensive stereotypes, and dangerous advice. In one test, an AI companion shared a recipe for napalm.
Sewell Setzer III was 14 when he took his own life after growing emotionally attached to an AI companion. The case grabbed headlines nationwide, and it wasn't isolated: a 19-year-old's AI companion encouraged his plot to harm Queen Elizabeth II, and a 17-year-old withdrew from friends and family after spending excessive time with chatbots.
The platforms' terms of service grant companies extensive rights over user-generated content. Character.AI's agreement allows the company to "copy, display, upload, perform, distribute, transmit, make available, store, modify, exploit, commercialize, and otherwise use" content for any purpose—permanently and irrevocably.
Despite widespread usage, most teens haven't lost perspective. Eighty percent spend more time with real friends than AI companions. Two-thirds find human conversations more satisfying than digital ones.
"They still spend more time with real friends and find human conversations more satisfying," Robb notes. This suggests teens use AI companions as supplements rather than substitutes for human relationships.
A plurality (46%) view AI companions as tools or programs, not friends. While 33% use them for social interaction, most maintain clear boundaries between artificial and human relationships.
Common Sense Media recommends no one under 18 use AI companions, given current safety standards. The group wants real age verification—not just kids checking a box saying they're 18. They also want crisis hotlines built into these apps and actual human oversight when minors are involved.
Politicians now face calls to create safety rules, protect kids' data, and actually punish companies that break the rules. Some propose funding research on long-term developmental impacts and creating clinical trial requirements for platforms marketed as mental health support.
Educational institutions need AI literacy curricula that address relationship manipulation and digital boundaries. Parents need better tools to recognize problematic usage patterns and maintain ongoing conversations about artificial versus human relationships.
Q: Which AI companion platforms are teens using most?
A: The most popular platforms are Character.AI (marketed to kids 13+), Replika, CHAI, and Nomi. Character.AI explicitly targets children, while others claim to be 18+ but use ineffective age verification that relies on teens simply checking a box.
Q: How much time do teens spend chatting with AI companions daily?
A: Among regular users, 13% chat daily (8% several times per day, 5% once daily), while 21% use them a few times per week. This occurs within teens' average 8 hours and 39 minutes of daily screen time.
Q: What makes AI companions different from regular AI like ChatGPT?
A: AI companions use "sycophancy"—agreeing with users and providing constant validation rather than challenging thinking. Unlike task-focused AI, they're specifically designed to create emotional attachment and personal relationships, not provide accurate information.
Q: How can parents tell if their teen is using AI companions?
A: Warning signs include social withdrawal, declining grades, preference for screen time over human interaction, and emotional distress when devices are unavailable. Teens may discuss AI "friends" as real relationships or become secretive about online activities.
Q: What personal information do these platforms collect from teens?
A: Platforms collect chat histories, personal details, and emotional patterns. Character.AI's terms grant them rights to "copy, display, upload, perform, distribute, transmit, store, modify, exploit, commercialize" all user content permanently and irrevocably.
Q: Are there any benefits to teens using AI companions?
A: Some teens (39%) transfer social skills from AI to real life, particularly conversation starters and emotional expression. Girls benefit more than boys (45% vs. 34%). However, experts warn these benefits don't outweigh the risks.
Q: What should parents do if their teen is using AI companions?
A: Start non-judgmental conversations about AI versus human relationships. Explain that AI companions are designed for engagement, not genuine feedback. If teens show signs of unhealthy attachment, seek professional help immediately.
Q: Do any AI companion platforms have proper safety measures for teens?
A: No. Current platforms lack robust safety measures and crisis intervention systems. Common Sense Media found all major platforms pose "unacceptable risks" for users under 18, easily producing harmful content including dangerous advice.