Exploring, Designing and Improving AI Use in Education

Double Diamond Across AI Domains

AI is developing faster than traditional quality processes can keep up with. New tools create opportunities, yet not every application fits every class, subject or school. That’s why schools — alongside PDCA — also need a design-oriented approach that allows for exploration, testing and learning.

The Double Diamond provides exactly that: a clear, human-centred process for discovering, defining, developing and testing AI applications before scaling them across the school.


17 November 2025 5-minute read

TL;DR Summary

The Double Diamond helps schools experiment, innovate and iteratively develop better AI solutions in a safe, learner-centred way — always starting from real needs.

Left diamond: understand the problem; Right diamond: design, test and refine new solutions.

How the Double Diamond Works

The Double Diamond is a simple, human-centred design model with four phases grouped into two diamonds:

  • Discover → Define: broad exploration → focused problem framing
  • Develop → Deliver: idea development → small-scale testing
Figure 1. The Double Diamond model visualises how teams move from identifying a challenge to delivering a solution through four stages: discover, define, develop and deliver.

For AI in education, the model helps teams first understand the real problem, then design safe, appropriate AI solutions — and pilot them on a small scale before wider implementation. Below, we show how each phase works across six AI domains in education.

1. Lesson Planning & Learning Material Development

Discover & Define — Identify the Real Challenge

  • Observe how teachers currently use AI for explanations, quizzes and activities.
  • Map pain points: time pressure, quality variation, inconsistent tools.
  • Define a clear problem: “How do we speed up lesson preparation without losing quality?”
  • Set success criteria: curriculum alignment, transparency, clarity.

Develop & Deliver — Design and Pilot

  • Create 2–3 example prompts for explanations, tasks and differentiation.
  • Design quality checks (rubrics, review guidelines).
  • Test one unit or lesson with a small teacher group.
  • Select the best-working approach → ready for PDCA.

2. Assessment, Feedback & Evaluation

Discover & Define

  • Explore where AI already helps (formative feedback) and where errors occur.
  • Collect examples of bias, vague feedback or misinterpretations.
  • Core question: “How can we use AI for better feedback without losing teacher judgement?”
  • Success norms: accuracy, safety, clarity.

Develop & Deliver

  • Design several feedback workflows (AI → teacher → learner).
  • Prototype rubrics, check questions and safe prompt patterns.
  • Test with a small sample (10–15 responses).
  • Evaluate reliability and time saved → select best workflow.

3. Differentiation & Personalised Learning

Discover & Define

  • Investigate differences in level, pace and learner needs.
  • Analyse how AI currently personalises — intended or not.
  • Key question: “How do we use AI for fair differentiation without increasing inequality?”
  • Criteria: bias-free, accurate, transparent for learners.

Develop & Deliver

  • Create three variants of explanations/questions (basic, core, advanced).
  • Prototype learning routes, hint strategies and diagnostic prompts.
  • Test with a small class or subgroup.
  • Evaluate understanding, motivation and fairness → select what works.

4. Learner Support & Didactic Assistance

Discover & Define

  • Explore how learners use AI for explanations and assignments.
  • Identify risks: incorrect answers, over-trust, privacy issues.
  • Define the challenge: “How do we use AI as an additional explanation source that strengthens critical thinking?”
  • Criteria: safety, accessible language, human-in-the-loop.

Develop & Deliver

  • Design several AI support scenarios:
    • Alternative explanations
    • Step-by-step hints
    • Diagnostic questions
  • Test with 5–10 learners of varying levels.
  • Measure comprehension and safety → choose the strongest model.

5. Communication, Organisation & Administration

Discover & Define

  • Investigate where AI currently saves time and where risks appear (tone, privacy).
  • Collect examples of errors or potential data leaks.
  • Problem definition: “How can AI support our admin tasks without creating integrity risks?”
  • Criteria: data minimisation, correct tone, full human oversight.

Develop & Deliver

  • Design 2–3 safe workflows, such as:
    • AI draft → human review
    • AI summary → checklist review
  • Prototype templates and review lists.
  • Test with a few letters, reports or schedules.
  • Select the workflow that is both safe and feasible.

6. School Development, Policy & Safety

Discover & Define

  • Investigate key policy questions: risks, innovation, governance.
  • Analyse incidents, signals and knowledge gaps.
  • Core question: “How do we develop AI policy that adapts to technological and pedagogical change?”
  • Criteria: clarity, scalability, privacy, human-centred.

Develop & Deliver

  • Design policy options such as:
    • Alternative oversight models
    • Different approaches to dashboards
    • Incident response structures
  • Involve ICT, privacy officers, leadership and teachers.
  • Test one variant in a pilot team.
  • Refine → then embed using PDCA.

Meta Insight: Design First, Then Safeguard

The Double Diamond is ideal for exploring, designing and small-scale testing of AI applications — but it is less suited to situations that demand speed, tight risk management, cross-domain coordination or working under high workload.

That’s why it works best together with PDCA:

💡 First discover what works, then implement safely and professionally.

Practical Tips for a Small, Effective Start

  • Start with one domain or one lesson/task.
  • Use a core team of 3–5 people for fast progress.
  • Base Discover on real classroom examples, not assumptions.
  • Set success criteria before prototyping.
  • Keep prototypes simple — a prompt + checklist is enough.
  • Test with a small, diverse learner group.
  • Document only what works — refine later in PDCA.
  • Apply PDCA after a successful pilot — design first, then safeguard.

Discover What Truly Works For Your School

If you'd like to explore, design and test AI scenarios with your team, we support schools with design sessions, AI prototyping and safe pilots aligned with classroom practice and school-wide quality assurance. A short, no-obligation intake is always possible.
