AI Policy in Action: From Paper to Practice
The first two parts of this series laid the foundation: governance defines who decides, and principles provide values and direction. But a policy document that sits in a drawer helps no one.
Policy only gains meaning when it becomes visible in the classroom, administration, and communication with parents. This third part shows how schools can put AI into practice, with special attention to the AI tools that teachers and pupils already use on a daily basis.

TL;DR Summary
AI policy only works when it provides guidance in practice. Successful schools:
- Develop an AI protocol with clear boundaries.
- Test AI through small-scale pilots and measure the effects.
- Translate principles into practical rules for widely used AI tools.
- Review the policy annually and adjust it accordingly.
The AI Protocol: From Framework to Rules
An AI protocol translates structure and values into rules that everyone can understand.
Application Areas and Risks
Not all AI applications carry the same level of risk. A practical framework can distinguish between:
- Teaching support (relatively low risk): e.g. idea generation, differentiated exercises, producing practice material.
- Assessment & grading (high risk): e.g. grade advice, pupil selection, plagiarism detection.
- Administration (medium risk): e.g. timetabling, reporting, parent communication.
💡 This is not a legal classification but a practical tool for prioritising policy.
“Policy must live, not sit in a drawer.”
Boundaries for AI
AI tools (such as chatbots and copilots) are powerful but also risky. A protocol might state:
- Permitted: lesson ideas, text rewriting, brainstorming, personalised explanations.
- Not permitted: summative grading or handling personal data.
- Condition: AI output must always be reviewed and adapted by a teacher or staff member.
AI in Practice
AI tools are often the first point of contact for teachers and pupils. They offer great opportunities but also risks.
Opportunities
- Faster lesson preparation: ideas for assignments, alternative explanations.
- Support for differentiation: materials tailored to different ability levels.
- Language and writing support: summaries, translations, feedback.
Risks
- Unreliable output: AI can produce errors or bias.
- Privacy concerns: data entered may be stored outside the EU or used for training. Schools should require that data is processed within the EEA or with equivalent safeguards in line with the GDPR.
- Overreliance: dependence on a single tool may reduce creativity and critical thinking.
💡 Practice rule: AI is a tool, not a replacement. All output must be critically reviewed, adapted, and used transparently.
Pilots and Controlled Experiments
Rolling out AI tools across a whole school at once is risky. Start small.
- Choose one application (e.g. a writing assistant in one subject area).
- Define measurable goals (e.g. reduced marking time, more varied lesson material).
- Involve teachers and pupils actively in the evaluation.
- Adjust policy based on the findings.
Example: After a pilot using an AI tool for lesson preparation, teachers reported saving several hours per week, but pupils found some explanations too superficial. Result: the tool stayed, but teachers now always add their own depth.
💡 Figures on time savings or learning gains are illustrative and vary widely depending on subject, tool, and context.
Practical Rules as Mantras
Keep rules simple and memorable:
- AI supports, it does not replace the teacher.
- AI output is always checked by a human.
- Pupils use AI only with permission and proper attribution.
Annual Review and Updates
AI evolves rapidly. Policy must keep pace. Hold an annual review with staff, ICT, and governing bodies:
- Which AI tools are really in use (formally and informally)?
- Have there been any incidents (e.g. misleading advice, misuse, data breaches)?
- Does the policy still fit technological developments and legal frameworks (GDPR, the EU AI Act)?
Conclusion: Policy That Grows With Practice
Effective AI policy is a living document that grows with practice. Successful schools combine:
- Governance: roles and responsibilities (Part 1)
- Principles: values as compass (Part 2)
- Practice: rules, pilots, and boundaries for AI (Part 3, this article)
The result: trust - in technology, in the school, and in each other.