Teenagers are turning to AI companions for sexual and romantic role-play more than any other purpose, according to a report published Tuesday by a parental monitoring company.
Boston-based Aura, which makes a parental monitoring app, found that 36.4% of users ages 13 to 17 devoted their artificial intelligence conversations to sexual or romantic role-playing in the past six months, making it the most common topic.
AI companions are software programs designed to simulate human relationships through chatbots or digital avatars that adopt imaginary names and personalities.
Aura said an additional 23.2% of the teens its app tracked relied on the programs for creative make-believe, while only 13.1% asked the bots for help with homework.
The remaining users tapped AI companions for emotional or mental health support (11.1%), advice or friendship (10.1%) and personal information (6.1%).
Lead researcher Scott Kollins, Aura’s chief medical officer, said the findings highlight how chatbot interactions are adding to digital stress and addiction risks as teens spend more time talking to them than to peers or adults.
“It’s not an easy problem to wrangle,” Mr. Kollins, a clinical psychologist, said in an email. “Parents were just starting to wrap their heads around social media, and now they’re facing tech that doesn’t just watch their kids, it talks to them.”
The study found that adolescents averaged 163.1 words per message to PolyBuzz, an AI-powered chatbot that sent them sexually suggestive notes late into the night.
By contrast, they averaged just 12.6 words per text message and 11.1 words per Snapchat message to real-life family and friends.
In interactions with ChatGPT, a less romantically focused AI chatbot, they averaged 34.7 words per message.
The report, which tracked the digital habits of 10,000 children and teens ages 8 to 17, is the first to flesh out concerns that health and dating experts have raised about AI companions.
Aura found that nearly 20% of children under 13 reported spending more than four hours daily on social media applications, despite age checks and parental notification policies meant to limit their exposure to addictive programs at vulnerable ages.
The company noted that an October 2024 report from the Centers for Disease Control and Prevention linked that much daily screen time to higher rates of fatigue, anxiety and depression symptoms among teens.
Some experts not affiliated with the Aura report said the findings underscore the need for adults to establish healthy boundaries around AI romance and set a good example for children.
“AI will never surprise you with a dinner party for your birthday or turn washing the car into a water fight, so it can never be as spontaneous, hilarious, and satisfying to our souls as a real relationship,” said Amber Brooks, Florida-based editor of DatingAdvice.com, in an email. “That’s the message we need to impart to our kids to help them make choices that will lead them out into the world and away from the addiction of screen time.”
Ms. Brooks pointed to an August survey from DatingAdvice.com and the Kinsey Institute at Indiana University that found 61% of adult singles consider sexting or falling in love with an AI companion to be “cheating.”
Laura DeCook, the California-based founder of LDC Wellbeing, which leads mental health workshops for families, said AI companions pose the biggest risk to isolated or socially awkward teens, who may find the bots easier to confide in than risking real-life pain or rejection.
“There have already been youth suicides linked to AI ‘friends,’ including cases where the AI encouraged harmful behavior,” Ms. DeCook said Tuesday. “This context shows just how high the stakes are.”
The Aura report also included self-reported data from 300 children ages 8 to 17 whose parents agreed to answer a series of questions with them.
It found they spent hours on social media and began checking their phones in compulsive bursts at 7 a.m. each day, mirroring the stressed-out habits of cigarette smokers.
“Analysis also revealed distinct shifts in tone and depth, suggesting that AI may be filling gaps in communication, addressing questions or feelings kids don’t share with parents or peers,” Aura said in a summary of the findings.
• Sean Salai can be reached at ssalai@washingtontimes.com.