19 Mar
If AI can do the work, what is school actually for?
Space of Mind went to SXSW EDU to make noise, but it didn’t take long to realize the noise AI is making was already deafening. This year, Artificial Intelligence wasn’t a side conversation or a futuristic thread tucked into a few panels. It was the main stage, the hallway chatter, the subtext of almost every session, and it felt like the overachiever who showed up late and immediately took over the group project.
AI has officially entered the classroom, and it is not waiting politely to be integrated. It has kicked the door in and started answering questions no one asked it yet. It is already reshaping how we think about learning, teaching and what it even means to be “smart.”
What stood out most wasn’t just the volume of AI-focused sessions, but the tone. We’ve moved past the excitement of possibility and into the complexity of responsibility. The question is no longer whether AI belongs in education. That ship has sailed. The question now is how to use it without quietly outsourcing the human experience of learning in the process.
That tension came into sharp focus in a keynote featuring Common Sense Media’s Head of AI, Bruce Reed, and Yale professor Laurie Santos, where the conversation shifted from tools and innovation to something far more urgent: mental health. Not as a side effect of technology, but as one of its most significant – and increasingly visible – consequences.
Reed outlined how increased screen time is already impacting child development in measurable ways. Attention, sleep, emotional regulation and overall mental wellbeing are all being affected – and not in subtle ways. Santos expanded on this by emphasizing something that felt both obvious and increasingly neglected: young people need real, in-person human connection to develop properly. Not curated interaction. Not filtered communication. Actual, messy, unpredictable, face-to-face relationships – and those don’t come with a mute button.
And then came the stats that should make every educator and parent pause mid-scroll:
According to Common Sense Media, one in three teens is now using AI companions for social interaction – not just for help with homework, but for emotional support, friendship and even romantic connection. Nearly half still describe these tools as “just tools,” but a significant number are engaging with them as if they are real relationships. Even more unsettling, about one-third of teens report that conversations with AI are as satisfying – or more satisfying – than conversations with other humans. The same report Reed referenced found that around 12% admit they are sharing things with AI that they would not tell friends or family.
Let that sit for a second.
We are not just integrating AI into learning. We are integrating it into identity, attachment and connection.
I saw this play out in real time when I toured Alpha School in Austin. One student created a stuffed teddy bear embedded with an AI chatbot so kids could talk to it as a confidante. Not a toy. Not a novelty. A companion. He’s already splitting his time between his high school campus and a venture capital firm in Silicon Valley that’s investing in his work, which is both wildly impressive and just a little disorienting.
Underneath the innovation, though, he tapped into something important:
Kids aren’t necessarily looking to talk to humans. They’re looking for response.
AI is always available, always responsive, never distracted, never judging and never telling them to do something they’d rather not do.
The concern isn’t just that kids are spending more time on screens. It’s that screens are quietly replacing the very experiences that build resilience. When boredom disappears, creativity often goes with it. When discomfort is avoided, emotional regulation doesn’t fully develop. When connection is mediated through devices, loneliness doesn’t necessarily decrease. In many cases, it intensifies because we’ve replaced connection with something that responds but doesn’t necessarily relate.
What made the session particularly compelling was that it didn’t just diagnose the problem; it pushed toward action. The message to educators and parents was clear: we cannot be passive observers of this shift. We have to actively create boundaries around technology use, not as a rejection of innovation, but as a protection of development. Kids need time offline to think, to struggle, to imagine and to interact without a screen buffering the experience – or turning it into content.
There was also a strong emphasis on conversation in many conference sessions. Not lectures, not restrictions handed down without context, but ongoing dialogue with young people about how they are experiencing technology – what they are using, how it makes them feel, where it helps and where it doesn’t. That kind of awareness is critical in a world where the tools are evolving faster than our ability to fully understand their impact – and definitely faster than most school policies and parenting strategies can keep up.
Perhaps most notably, the conversation extended beyond classrooms and homes and into policy. There was a call for lawmakers to take a more active role in creating guardrails that prioritize children’s mental health in digital spaces. Because while schools and families play a critical role, they are operating within a broader ecosystem that is currently being shaped far more by tech companies than by developmental science. This should make all of us just a little bit uncomfortable.
Zooming back out, the larger undercurrent of SXSW EDU 2026 wasn’t just excitement around AI’s potential to personalize learning, increase access and support teachers. It was an equally urgent need to slow down and ask what we might be losing in the process. Efficiency is not the same as growth. Access is not the same as understanding. Faster is not always better when it comes to developing minds – especially the ones we’re still building.
In another keynote, Adeel Khan, the founder of MagicSchool – a leading AI tool used by students, educators, schools and districts in over 130 countries – reinforced the idea that teachers must remain at the center of this shift, not as operators of AI systems, but as the human layer that gives learning meaning. If AI becomes the engine of education, teachers are still the ones who help students interpret, question and apply what they encounter – and occasionally remind them that Googling an answer is not the same as understanding it.
Khan referenced a quote by Mark Cuban that sums up the dilemma educators face in the age of AI: “There are generally two types of LLM users – those that use it to learn everything, and those that use it so they don’t have to learn anything.” And this, my friends, is the precipice of the moment. How do we inspire learning for learning’s sake when we are learning to cut corners at every opportunity?
By the end of the conference, one thing felt increasingly clear: AI is not the thing that will fix education. If anything, it is the thing that is forcing us to finally confront what education is supposed to do in the first place. If machines can generate answers, then school cannot be about answers alone. It has to be about thinking, relating, adapting and becoming. And none of that can be outsourced (at least not yet).
My team and I arrived in Austin with Space of Mind’s simple message: Fix school. Not kids. What SXSW EDU made clear is that this message is no longer just about rethinking systems. It is about protecting something more fundamental. In a world where technology can do more and more of the cognitive heavy lifting, the responsibility to develop thoughtful, emotionally grounded, fully human beings doesn’t shrink.
It becomes the entire point.