Across Europe, K12 education is changing fast. Schools face declining student performance, widening gaps between learner groups, teacher shortages, increasingly diverse classrooms, rising wellbeing concerns, and growing involvement from parents and society. In the middle of all this, the future of education we believe in is clear: teachers stay in the lead, while technology strengthens their ability to support every student.
 
That is why we see AI as part of our learning solutions, not as a replacement for teaching, nor as a shortcut around pedagogy. When AI is thoughtfully and responsibly integrated with trusted learning materials and aligned with learning goals, it can make classrooms more responsive, more inclusive, and more effective, while preserving human guidance and professional judgement. 
 
 
The future of K12 education we believe in: human-led, blended, and adaptive 

In the coming decade, education will bring tradition and innovation, print and digital, more closely together. The best outcomes will come from integrated, blended learning, where proven educational materials and teaching practices are complemented by digital tools that adapt to learners’ needs. 
 
In this future, classrooms will remain teacher-led. Teachers will have more pedagogical flexibility, selecting approaches that fit the topic and the students: sometimes direct instruction, at other times project-based learning or flipped learning. 
 
What evolves is the learning method itself: AI becomes an increasingly important component of how learning is designed and delivered. Not “one more tool” added on top, but a capability built into the learning flow, supporting planning, practice, feedback, and differentiation in ways that fit everyday classroom reality. When this AI layer is well connected to learning goals, core content, and sound pedagogy, it can help education systems respond to today’s pressures without compromising quality. 
 
What AI should do in practice: save teachers’ time and improve learning outcomes through personalisation 
 
Saving time for teachers—so they can focus on teaching 
 
Teachers’ time is one of the scarcest resources in K12 education. Purpose-built AI can reduce workload by supporting recurring, time-consuming tasks without taking decisions away from teachers. 
 
In our view, AI should help teachers:

- Differentiate learning materials more easily, for example by creating practice at the right level for different students 
- Support assessment workflows, for example, by assisting with reviewing open answers or suggesting feedback that teachers can adapt 
- Prepare and adapt lessons faster, while staying aligned with curriculum and classroom context 
 
The goal is simple: more time for high‑impact teaching. With AI tools that make recurring tasks more efficient, teachers can focus more on what they do best: guidance, motivation, classroom interaction, and support for individual learners. 
 
Driving better learning outcomes through meaningful personalisation 
 
Personalisation is one of the most important shifts ahead. With AI embedded into learning materials, personalisation becomes faster and more precise: content, pace, and support can adjust based on how the student is progressing week by week, and increasingly even in real time. 
 
Students benefit when AI can provide:

- Targeted practice and explanations aligned with what they’re learning 
- Support at home with more engaging homework and tutoring-like help 
- Feedback loops that help learners understand mistakes and build mastery 
 
This is not about replacing the teacher-student relationship. Instead, this approach complements the teacher’s role, strengthening their ability to meet individual learners’ needs while they keep oversight and make the final pedagogical choices. 
 
Generic AI vs. Educational AI: why the difference matters 
 
Today, many teachers and students already use generic AI tools—such as open chatbots—for lesson ideas, writing support, or quick explanations. That reality is not going away. But what kind of AI belongs inside everyday teaching and learning at scale, in real classrooms? 
 
This is more than a practical consideration for schools; it is also where current research and policy thinking are heading. Recent OECD (Organisation for Economic Co-operation and Development) work on generative AI in education highlights that effective use requires clear guardrails, and that purpose-built solutions designed for education are more likely to support real learning than “one-size-fits-all” tools. 
 
The OECD draws the distinction between Generic AI and Educational AI as follows: 
 
- Generic AI is broad and general-purpose. It can be helpful, but it is not designed around a national curriculum, classroom workflows, or didactic principles. It also often comes with uncertainty around privacy, transparency, and educational reliability. 
- Educational AI, by contrast, is purpose-built for learning and teaching. It is integrated into learning methods and built around what schools actually need. 
 
At Sanoma Learning, we support this view and focus our efforts on Educational AI. Our approach is grounded in three pillars: 
 
Pillar 1: Pedagogy-first 
Every feature is grounded in national curricula, trusted learning content, and Sanoma’s didactic models—not generic prompts.

Pillar 2: Co-created with teachers 
Educational AI should reflect real classroom needs. That means developing together with teachers and schools across our markets, shaped by real workflows and constraints.

Pillar 3: Trusted European AI
Educational AI must be safe, compliant, and trustworthy, fully aligned with EU privacy and AI regulations, and backed by a European education partner that understands how schools work. 
 
These pillars also shape how we think about new AI solutions such as Teacher and Student Assistants: multi-purpose Educational AI integrated into learning methods, designed to save teacher time and strengthen learning outcomes, while remaining grounded in curriculum, pedagogy, and trusted content. 
 
A closer look at Responsible AI: trust, compliance, and transparency by design 
 
Responsibility is built into every stage of how we develop and deploy AI.

At Sanoma Learning, we embed responsibility into the AI lifecycle through Privacy & Security by Design (PSbD). From the moment an AI concept is considered, we evaluate how it would work in real educational contexts and what effects it could have on students and teachers. We run AI Impact Assessments to confirm educational value and identify potential risks early, and then mitigate those risks through redesign, safeguards and guardrails, stronger human oversight, or limits on use cases. This helps ensure alignment with the EU AI Act and other applicable requirements.  
 
Protecting personal data is equally essential. In supporting schools in their educational mission, we are committed to lawful, fair, and transparent processing in line with GDPR, with strong data governance practices such as data minimisation, purpose limitation, defined retention, and controlled access. We also prioritise transparency and user enablement: explaining what an AI system does, how it supports teaching and learning, and where its boundaries are, so teachers and students can use AI confidently and correctly, and in line with educational needs. 
 
Moving forward together with educators

AI is already playing a meaningful role in education, but its impact in the future depends entirely on how it is designed and integrated into learning. Our focus is on purpose‑built Educational AI that is embedded within trusted learning methods—supporting teacher workflows, strengthening personalisation, and improving learning outcomes—while keeping teachers at the centre and trust at the foundation.