Learning outcomes:
Understand the six principles that underpin excellent AI teaching and why they matter
Recognise the hallmarks of effective AI use in the classroom
Evaluate whether AI is strengthening or weakening teaching practice
Learn how to keep teacher judgment central to all high-stakes decisions
Build your confidence in leading an AI-literate school
Free resource for module 3
AI Lesson Observation Framework
Meet the host:
Laura Knight:
Teacher, Digital Education and AI Specialist, TechWomen100 Award Winner 2025
Laura is an experienced teacher and leading voice on AI in education. She combines classroom expertise with deep technical knowledge to help school leaders navigate AI adoption thoughtfully. Laura has trained thousands of educators across the UK and internationally on responsible AI use, always grounding her work in what actually works for teachers and pupils.
More info on this module:
What excellent AI-enabled teaching looks like
Welcome to module three, where we'll address the question: what does excellent AI-enabled teaching actually look like?
In this module, you'll learn the six principles that underpin excellent AI teaching, why they matter, and what they look like in practice. You'll also get a practical framework that you can use to evaluate any AI lesson in your school.
By the end of this session, you'll know how to lead AI use that strengthens evidence-based teaching and builds literacy for staff, students, and your school community. You'll also be able to walk into any classroom and evaluate whether AI is strengthening the teaching or weakening it.
Grounding AI in purpose
AI tools are never neutral. Every system reflects decisions about what to prioritise, what to measure, and what to ignore. These decisions shape how learning unfolds.
For example, a writing assistant might correct grammar but completely ignore originality. A quiz tool might reward recall but not credit reasoning. An analytics dashboard could highlight procedural patterns but miss conceptual misunderstandings.
None of these features is wrong exactly, but none is pedagogically complete. That is why integration must be intentional.
If you're not clear on the learning purpose, the tool's design will quietly take over that role. You'll end up shaping the pedagogy around the functionality of the tool instead of around your intent.
Take Skye, Third Space Learning's AI maths tutor. Every lesson that Skye teaches starts with a teacher-written learning objective aligned to the national curriculum. The AI then delivers that objective adaptively, adjusting pace, adding scaffolding, providing hints – but importantly, it never decides what the students should learn. The pedagogy drives the technology, not the other way around.
Action step:
- Ask your team to name one priority learning goal for the next unit
- Define a targeted use of AI that clearly serves that goal
- Share both with staff across the team
Keeping teacher judgment central
AI can assist, but it cannot discern. It can go through the motions, but it doesn't have empathy, compassion, or contextual understanding of our schools, our teachers, our students, and our curriculum.
It can't look a child in the eye and know what they need next.
Teaching expertise develops over years through careful observation, responsive planning, and professional reflection. Those are not tasks you can delegate to algorithms.
This means the teacher must remain in control of all high-stakes decisions: which students need support and when, what learning sequence serves their needs, how to interpret what the data is showing, when to intervene or adjust.
As an example, with Third Space Learning's AI tutor Skye, teachers choose the programme for each student and can reorder any lesson to align with classroom teaching. They receive detailed session logs showing exactly what happened, and they can observe sessions in real time or review recordings afterwards. The AI delivers the tutoring; the teacher directs the strategy.
Research shows that when teachers are trusted to exercise judgment, they are more engaged, more reflective, and more open to innovation.
The risk of deskilling
When work is automated completely, teachers lose capability. If they never plan differentiated tasks because AI always does it, they lose that skill. If they never check AI-generated explanations for accuracy, they might stop noticing errors.
As a leader, you play a key role in enabling professional dialogue, protecting time for reflection, and reinforcing that technology should serve teaching, not shape it.
Choose carefully what to automate. The goal is to free teacher time for higher-value work, not to create dependence on systems teachers cannot interpret or question.
Action step:
- In your next senior leadership meeting, choose a teaching workflow that already uses or could use AI
- Mark each task as human-only, human-with-AI, or fully automated
- Note areas of tension or disagreement
- Aim for clarity with a strong rationale for decisions
Instructional design and learning science
AI has to serve the science of learning, not commandeer it.
Rosenshine’s principles suggest that new learning should be introduced in small steps, with explicit modelling, guided practice, and regular review. If these elements are missing, the AI may be steering the pedagogy.
The lesson structure for Skye was designed by teachers who understand cognitive load. After an initial diagnostic, lessons move from modelling to guided and independent practice with only the scaffolding needed. The sequence comes from educational expertise.
Action step:
- Run a five-minute Rosenshine check on one AI-supported lesson
- Tick where modelling, guided practice, retrieval, and independent practice appear
- Identify where alignment might be missing or could be strengthened
Feedback and formative assessment
AI transforms assessment not only by automating grading, but by changing timing, precision, and dialogue.
Timing: Feedback has greater value when delivered immediately. AI can do this before errors become habits.
Precision: Adaptive AI tools provide personalised feedback at scale.
Dialogue: AI can prompt reflection, peer review, and feedback literacy.
Leaders must ensure ethical guardrails: privacy, bias, and avoiding depersonalised evaluation.
A quick test: ask a student, “What did you learn from that feedback?” If they can answer clearly, the system is supporting learning.
Adaptive teaching and inclusion
AI can personalise learning in ways impossible at whole-class scale. It can identify barriers early, adjust pace, adapt sequence, and tailor support.
Evidence from Stanford and UNESCO shows this can close attainment gaps.
With Skye, adaptation includes skipping mastered content, using concrete/pictorial representations, and staggered slide reveal to manage cognitive load.
When observing adaptive teaching, ask:
- Can the teacher explain how the lesson was adjusted?
- Would the student have struggled more or been under-challenged without adaptation?
- Does the AI enhance inclusion or create new barriers?
Leaders should view AI as adaptive intelligence that helps teachers intervene earlier and more effectively.
Action step:
- Take a one-size-fits-all resource and model how AI can adapt it
- Adjust reading age, build scaffolds, chunk content, or add scenario explanations
- Show how adaptations can support specific SEND needs
Students as active partners
Students are not passive recipients of AI; they are co-designers of its future.
When students understand how AI processes their work or provides support, they gain agency. They can choose challenges, revisit topics, and control support levels.
When students test models, discuss implications, and co-create protocols, they develop deeper understanding of AI systems and their assumptions.
Building critical AI literacy
Students can critique AI: what helped, what confused, what limitations exist.
This builds metacognition and maker mindsets — confidence to design, experiment, iterate.
Leaders should embed student voice in pilot evaluations and decisions. Encourage questions: Who designed this? What data was used? Who benefits? What might it get wrong?
Action step:
- Create authentic projects where students use AI to solve real problems
- Have them present their work and reflect on the process
Six hallmarks of excellent AI-enabled teaching
These six hallmarks are your practical lens. When observing a classroom, ask whether these are visible:
Hallmark 1: Clear learning purpose
The tool serves a specific curricular goal — not novelty. Look for alignment between goal and tool function.
Hallmark 2: Teacher judgment remains central
The teacher can articulate why the tool was chosen and the value it adds.
Hallmark 3: Evidence-based instruction
Rosenshine’s principles are visible. The AI supports — not replaces — learning science.
Hallmark 4: Formative assessment driving learning
Teachers can describe specific learning taking place, not just engagement.
Hallmark 5: Adaptation serves individual needs
The AI personalises learning and the teacher uses these insights meaningfully.
Hallmark 6: Students have genuine agency
Students are reasoning, choosing, thinking — not just receiving automated answers.
Closing reflections
You now have a framework for recognising excellent AI-enabled teaching. Consider:
- Where do these hallmarks already appear in your school?
- Can staff explain why an AI tool serves learning rather than convenience?
- Which routines ensure teacher judgment and formative assessment stay central?
In our next module, we'll explore risks and limitations of AI in education: accuracy, integrity, equity, and overreliance.