What Teachers Really Think About AI in Education: Insights from the 21st Impact Workshop
When we talk about AI in education, it’s easy to focus on the tools: chatbots, assistants, adaptive platforms.
But during our workshop at the EfVET Annual Conference 2025 in Portugal, what truly emerged from the voices of teachers across Europe was not the technology itself. It was the question of what it means to remain human in an age increasingly shaped by algorithms.
The AI & Education Teacher Survey, completed by eighteen educators from several countries, reveals both enthusiasm and unease.
Teachers see AI as a catalyst for creativity and personalized learning, yet also as a force that can erode critical thinking if left unchecked.
Hope and hesitation
66.7% see AI as a path toward personalized learning.
50% mention creativity and time-saving as major benefits.
But 83.3% worry that students may lose their critical-thinking habits.
66.7% are concerned about AI-driven misinformation.
Optimism and caution coexist. The message is clear: teachers are not rejecting AI; they are asking for ethical, guided integration that preserves the essence of education.
Prepared, but not fully equipped
Most participants rated their preparedness to integrate AI at 4 out of 5, yet one in five admitted feeling underprepared.
That tension reveals a crucial gap: educators need time, community, and training. What they are asking for is not more tools, but deeper understanding.
Educating the human algorithm
When asked what “educating the human algorithm” means, teachers spoke about nurturing students’ critical thinking, empathy, and ethical judgment: the skills that machines cannot replicate.
In other words, our challenge is not to make education more technological, but more profoundly human.
Why this matters
AI can personalize content, but only humans can personalize meaning.
The findings remind us that digital transformation must go hand in hand with moral and emotional literacy. If we want students to navigate intelligent systems wisely, we must first help them understand themselves: their values, emotions, and cognitive limits.
That also means teaching them to recognize the invisible architectures behind technology: how algorithms learn, where bias comes from, and why transparency and regulation matter.
AI is not neutral. It reflects the data it’s trained on, and that data carries our collective prejudices, gaps, and assumptions.
By making these mechanisms visible, we help students move from passive users to conscious citizens of the digital world.
There’s also a mental and emotional dimension. Constant comparison, cognitive overload, and algorithmic pressure can quietly shape attention, self-esteem, and wellbeing.
Addressing this requires more than technical skills: it calls for emotional literacy, ethics, and a healthy digital culture in schools.
At the same time, the benefits are undeniable. When used with intention, AI can personalize learning, enhance creativity, and support inclusion for students with different needs.
The challenge is not to choose between risk and opportunity, but to equip young people to see both, and to act with clarity, empathy, and responsibility in the spaces where humans and machines meet.
Final reflection
Teachers today stand on the frontline of the AI revolution.
Their voices reflect both courage and care: a commitment to use technology without losing humanity.
Because, as I often say: we’re not preparing students for jobs; we’re preparing them to inhabit a complex, digital, and deeply human world.