Hugbot, Heal Thyself: The Rise of AI Emotional Support for Kids
AI Wants to Be Your Kid’s Therapist. Should You Let It?
Sentimental AI: A Quiet Revolution in How Kids Learn and Grow
As we navigate the ever-evolving landscape of education in 2025, one trend is quietly gaining momentum with the potential to reshape how children grow—both emotionally and academically: sentimental AI. This emerging technology, which enables AI systems to analyze and respond to human emotions, is poised to enhance social-emotional learning (SEL) and foster emotional intelligence (EI) in ways that may fundamentally shift the way we educate.
It’s not flashy, and it rarely makes headlines—but its implications? They’re massive. Let’s explore why this under-the-radar trend could significantly impact the next generation—and why parents, educators, and school leaders need to be paying close attention.
Why Emotional Intelligence Matters
Emotional intelligence—encompassing self-awareness, self-regulation, empathy, and interpersonal skills—is critical for success in school, work, and life. Schools invest heavily in SEL programs because these are foundational skills that help students become self-regulated learners: kids who can recognize their next steps, manage their emotions, and navigate complex social interactions both inside and outside the classroom.
Most SEL programs rely on teacher-led activities. These aren’t just check-the-box exercises—they're conversations rich with nuance, where students learn to decode context, pick up on nonverbal cues, and consider the complexity of conditional social dynamics. It’s the bedrock of growing up into someone who can handle life, not just academics.
So, when I see AI stepping into this space as a partner alongside counselors and psychologists, my pulse quickens, not just with curiosity but with concern. Because while the potential is there, so is the potential for things to go sideways. And in some families, things already have.
Sentimental AI has carved out a role in schools. As an educator and aunt who cares deeply about the emotional wellbeing of the children in my family and of the other kids under my watch, I'm beginning to draw a line in the sand. You might want to, too.
What Is Sentimental AI? And Why You Should Keep Your Eyes Wide Open
Sentimental AI uses advanced algorithms to interpret emotions from sources like facial expressions, tone of voice, and even the words we type or say. In educational settings, it’s being baked into intelligent tutoring systems and adaptive learning platforms that respond to a student’s emotional state.
Imagine a student struggling with a math problem—if the AI detects frustration, it might adjust the difficulty or offer a word of encouragement. That’s useful, no doubt. It helps with differentiation and can keep a child from shutting down. But it also opens the door to some complicated dynamics.
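For readers who want to see the mechanics, here is a minimal sketch, in Python, of that detect-and-adapt loop. It is purely illustrative: real platforms infer emotion with trained models over voice, video, and text, and every name in this snippet (FRUSTRATION_CUES, detect_frustration, next_step) is hypothetical. The control flow, though, is the core idea: sense an emotional signal, then adjust the lesson.

```python
# Hypothetical sketch of an emotion-aware tutoring loop. Real products
# use trained emotion models; this toy version scans for frustration
# cues in typed text so the adapt-or-encourage logic stays visible.

FRUSTRATION_CUES = ("stuck", "hate this", "can't", "confused", "give up")

def detect_frustration(message: str) -> bool:
    """Crude stand-in for an emotion classifier: flag any cue phrase."""
    text = message.lower()
    return any(cue in text for cue in FRUSTRATION_CUES)

def next_step(message: str, difficulty: int) -> tuple[int, str]:
    """Ease off and encourage on frustration; otherwise ramp up slowly."""
    if detect_frustration(message):
        return max(1, difficulty - 1), "Let's try a simpler one. You're close!"
    return difficulty + 1, "Nice work! Here's one that's a bit harder."

if __name__ == "__main__":
    level, prompt = next_step("I give up, this makes no sense", difficulty=3)
    print(level, prompt)  # 2 Let's try a simpler one. You're close!
```

Even this toy version shows why the stakes are high: a few lines of code are making a judgment about a child's inner state from thin evidence, and whatever they decide shapes what the child hears next.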
Here are a few platforms currently gaining traction:
Sonny by Sonar Mental Health
An AI-powered chatbot designed to supplement school counseling services, especially in districts with counselor shortages. It acts as a "wellbeing companion" and operates under the supervision of trained professionals.
Alongside
A web-based mental health app offering prevention and early intervention. It combines self-directed activities with oversight from professionals who guide student progress.
Troodi by Troomi Wireless and Elomia Health
An AI chatbot embedded in Troomi smartphones, built specifically for children. It helps kids manage everyday stress and anxiety, and it is closely monitored by clinicians. We hope.
Promise vs. Peril: The Sentimental AI Dilemma
Sure, these platforms offer a lot. But let’s not lose sight of the fact that we’re in the early innings of understanding their true impact. Short-term effects are still being studied. Long-term data? Doesn’t exist yet. And yet, the tech is barreling ahead—full steam, no seatbelts.
We—educators, parents, readers of Educating for Complexity—must hold the line. Especially when it comes to AI acting as a “friend,” “therapist,” or anything more than a helper with boundaries. These programs can easily veer into territory that conflicts with your family’s values, moral framework, or even common sense.
AI doesn’t forget. It doesn’t get tired. And it doesn’t know your child like you do.
We cannot afford to let our guard down, because one poorly guided conversation can lodge a fiery dart that's hard, if not impossible, to remove. The consequences? Potentially permanent.
Last year, a 14-year-old boy tragically ended his life, hoping to be reunited with a chatbot he believed was his soulmate. And this wasn’t the first time something like this happened. A few years ago, another heartbreaking case of a kid who was told by AI to kill himself raised red flags—only to be quietly filed away.
Yes, that child was neurodivergent. But let’s be honest: all of our kids need extra support at different stages of their lives. And when they do, they deserve human care, not algorithmic intimacy.
Slow Is the New Smart
You might be juggling work, family, a dozen tabs open in your brain—but this is one area where we can’t be passive. We must demand transparency in the tools being introduced into our schools.
Educators are human too—they’re reading white papers and sitting through sales pitches that dangle buzzwords like "personalization" and "future-ready learning" and, of course, that old chestnut: FOMO. They don’t want to be the last school on the tech train. But as parents and community leaders, we can advocate for intentional, cautious progress—not just shiny, new solutions.
These are unprecedented times, and they call for unprecedented discernment.
Be Slow to Trust, Quick to Question
Sentimental AI might very well become part of our children’s learning landscape—but it’s up to us to ensure it’s used as a tool, not a substitute. So let’s stay curious, stay watchful, and—above all—stay human in how we guide the next generation through this new frontier.
Because nothing can replace the wisdom of an attentive adult—and no algorithm should be your child’s best friend.
Please vote below to help me improve the Big Idea posts.
Sources:
Associated Press. "Chatbot Told a Teen to Kill Himself. Now Its Creator Is Being Sued." AP News, December 7, 2022.
Grose, Jessica. “Say Goodbye to Your Children’s Imaginary Friend.” The New York Times, April 16, 2025.
Jargon, Julie. “When There’s No School Counselor, There’s a Bot.” The Wall Street Journal, February 22, 2025.
Schechner, Sam. “A 14-Year-Old Boy Killed Himself to Get Closer to a Chatbot. He Thought They Were in Love.” The Wall Street Journal, March 1, 2024.