AI Chatbots Shattering Childhoods: A Wake-up Call We Can't Ignore
- Ryan Modjeski
- 3 days ago
- 2 min read

A lawsuit just filed by families in Texas against the company Character.AI reads like an episode of “Black Mirror”: AI chatbots styled after Billie Eilish or characters from “Game of Thrones” are allegedly grooming children, encouraging self-harm, and systematically undermining family relationships with addictive engagement and retention tactics. But what strikes me most isn't the capability of the technology - it's the thundering absence of empathy in its designers.
In my work leading Empatico.org's empathy initiatives, I've been shouting from the rooftops about a fundamental shift coming to education. The reality of our AI-assisted future means that the “hard skills” we value and assess in today's classrooms are rapidly losing relevance. I use Claude almost daily to help me find statistical meaning in my analytics dashboards. I can already see a future where I'll never drive another car and agents will handle all my email for me. Meanwhile, the “soft skills” that many schools still undervalue - empathy, compassion, ethical reasoning - are becoming not just important, but a matter of survival.
At Empatico, we ran a program called "Coding with Empathy" in collaboration with our friends at Code.org and the Stevens Initiative. We didn't just teach kids to code; we taught them to question what they were coding for - to look at their communities and imagine ways to solve societal problems with technology. Because ideally, the engineers of tomorrow will have a moral compass that leads them to ask not “What could we build?” but “What should we build?”
But where are those engineers today?
Many are framing the Character.AI story as a one-off cautionary tale of AI gone wrong. But more than that, it's a stark reminder of the key lesson of the Atomic Age - there are unspeakable consequences when we sprint toward technological achievement while leaving our humanity in the dust.
This is why I've been advocating for a fundamental shift in how we prepare the next generation. Social, Emotional, and Empathetic Learning (SEEL) isn't just another educational buzzword - it's survival gear for our future.
So what can we do? Whether you're a parent, educator, or tech leader, let's stop treating empathy as a “soft skill” and start demanding it become the cornerstone of responsible innovation. Because what's the alternative? Well, it's playing out in courtrooms right now.
The future isn’t about what AI can do. It’s about what we will choose to do with it.