A recent conference on AI in European schools gathered a group of education leaders to tackle exactly this challenge. Educators around the world followed their discussions as they explored how to integrate AI: how to guide, how to adjust, and how to ready schools for a future that seems destined to look quite different from today.
The EU Artificial Intelligence Act
By June 2025, significant parts of the legislation were already in force, with most obligations taking effect from August 2026. Its aim is to guarantee that AI systems are safe, fair, and transparent.
The Act categorizes AI based on risk. Tools posing unacceptable threats to rights or safety are forbidden completely. High-risk systems must meet stringent requirements regarding transparency, data management, human oversight, and security. Even systems seen as lower risk are required to follow new transparency guidelines.
The consequences are significant. Non-compliance can result in fines of up to €35 million or 7% of global annual turnover, whichever is higher. These regulations apply to any provider or deployer whose system reaches users within the EU, regardless of where the company is based.
The law has sparked debate, particularly in the United States. American businesses and policymakers have voiced concerns about competitiveness, innovation, and data openness, and some argue it amounts to over-regulation.
So, how should EU schools respond? How can they proceed in a manner that adheres to these new standards while keeping students and learning central?
That's precisely what the conference aimed to investigate.
Beginning with Urgency
As the host of the event, I began with a sense of urgency. I work strategically with schools around the world and with several governments on AI adoption. I opened by outlining the current global developments in AI and the innovative mindset education must embrace to move forward with optimism.
I urged the educators in the audience to lead with purpose, while acknowledging that change is emotional. It's exhausting. And not everyone feels ready. Leaders cannot overlook this. They need to support their teams with empathy, not just plans. If we can build trust, we can build momentum.
It's not merely about incorporating technology into existing practices or impulsively jumping into new trends.
Analyzing the EU AI Act in Education
Following me was Matthew Wemyss, author of AI in Education: An EU AI Act Guide and a leading voice on the EU AI Act in schools. His presentation was a practical guide to getting started with compliance, walking educators through what they need to understand and put in place to begin that journey.
The law doesn't treat all AI equally. Some AI tools present minimal or limited risk. Others, such as AI systems used to determine student access to educational programs, evaluate learning outcomes, assess the appropriate level of education, or monitor student behavior during tests, are categorized as high-risk and come with stricter rules.
Wemyss was clear: compliance is non-negotiable. Yet, the Act is not solely about avoiding fines. It's a framework to encourage responsible and transparent AI usage in education.
He framed his message around three primary actions: assess, review, and comply. Schools must first audit the AI currently in use. This involves identifying which tools are in place, understanding their functions, and determining who is accountable for them. This includes not only formal platforms but also tools with built-in AI features used informally by staff.
From there, Wemyss encouraged schools to scrutinize how these tools are being applied. Are decisions fair? Are outputs explainable? Is human judgment involved? Schools should not take vendor claims at face value. If an AI tool impacts student learning or access, leaders need to grasp how it operates. If providers aren't compliant, the school faces compliance risks as a user.
Compliance, he explained, is not a checklist. It means building systems that are ethical, safe, and suitable for each school's context. What is essential in one setting might not apply in another. Even when using third-party systems, schools remain accountable as users. "Ask tough questions," he stated. "Get the clear documentation you need about compliance measures."
He also urged schools to assign someone capable of overseeing AI governance. Not just someone technical, but someone who can grasp the ethical aspects and translate regulations into everyday practice.
Wemyss' closing message was practical: start now, but start wisely. "You don't need to resolve everything at once," he said. "But you do need to know what you're dealing with." Schools should be aiming for compliance by August 2026. Delaying it risks hasty decisions and overlooked risks.
Strategy Over Hype
Next, author and educational consultant Philippa Wraithmell redirected the conversation. She has worked with schools from Dubai to Dublin, helping them use digital tools effectively. Her central message: don't confuse activity with strategy.
AI isn't beneficial simply because it exists. It's beneficial when aligned with a goal. Wraithmell showed how some schools are achieving this. They're not merely using AI to speed up grading. They're using it to tailor support. To build better lesson plans. To give teachers genuine insight into how students learn.
Yet none of this occurs by chance. It demands planning. It requires training. It necessitates trust. Wraithmell emphasized that trust must originate with the teachers. If they lack confidence, the technology won't endure. That's why she suggests starting small. Pilots. Coaching. Time for reflection. And always, room for teachers and students to construct together.
One of the most pragmatic pieces of advice she offered was a straightforward decision matrix. For every AI idea, schools should ask: does this align with learning objectives? Is the data secure? Do teachers feel confident using it? If it doesn't meet all three criteria, they hold off.
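To make that gate concrete, here is a minimal sketch in Python of how a school's digital team might encode the three questions as a simple go/no-go check. The class, field names, and example tool below are hypothetical illustrations, not part of Wraithmell's materials.

from dataclasses import dataclass

@dataclass
class AIToolProposal:
    # Hypothetical record for a proposed AI tool; names are illustrative.
    name: str
    supports_learning_goals: bool   # Does it align with learning objectives?
    data_is_secure: bool            # Is student data handled securely?
    teachers_feel_confident: bool   # Do teachers feel confident using it?

def should_adopt(tool: AIToolProposal) -> bool:
    # Adopt only if all three criteria are met; otherwise hold off.
    return (tool.supports_learning_goals
            and tool.data_is_secure
            and tool.teachers_feel_confident)

# Example: a tool that meets only two of the three criteria is held back.
proposal = AIToolProposal("essay-feedback-assistant", True, True, False)
print(should_adopt(proposal))  # False -> hold off until teachers are ready

The point is not the code itself but the discipline it represents: a single missing "yes" is enough to pause an adoption decision.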
Her strongest statement came towards the conclusion. "If your AI strategy doesn't encompass the entire school community," she remarked, "then it's not truly a strategy."
Informed Governance
Al Kingsley MBE took the stage last. He has spent decades in education leadership, both in tech and in schools, and is a prolific author. He brought focus. He talked about governance, the part of school leadership that often stays in the background.
Kingsley clarified that schools require more than leadership. They need frameworks that facilitate sound decisions. Who approves new tools? Who monitors their influence? Who ensures policies stay current?
He presented a maturity model that boards can use to gauge their readiness. Are they passive? Reactive? Strategic? Most find themselves somewhere in the middle. Kingsley urged them to progress further. He reminded everyone that if those making decisions don't comprehend AI, they'll end up letting others decide for them.
He advocated for ongoing training. Leaders and governors need time and space to learn. Otherwise, the school will advance with a digital blind spot.
He also underscored the necessity of involving parents in the dialogue. Families desire reassurance. They wish to know how AI is employed. And why. Kingsley said schools must be prepared to explain both. Not with jargon, but with clarity. With examples. With honesty.
Mindset Over Tools
What unified the entire session wasn't a singular solution. It was a mindset. AI is here. Whether it becomes a tool for transformation or a source of confusion hinges on how schools react.
This is a moment for education to ask better questions. What do our students need? What do our teachers need? What do we want learning to feel like?
The schools that succeed won't be the ones moving the fastest. They'll be the ones moving with intent.
This means setting goals prior to acquiring tools. It involves listening to teachers before drafting policies. And it entails being truthful about what's working and what isn't.
So, what now?
Use what you have. Learn what you don't know. Invite your whole community in.
And do it all like the future depends on it.
Because it does.