Learning outcomes:
Define your school's AI vision grounded in your existing values
Create conditions where staff can experiment safely with AI
Use staff concerns to build trust and improve implementation
Map and evaluate current AI tools for coherence and purpose
Demonstrate professional AI use that builds staff confidence
Resources for module 2
AI experiments framework | AI scenarios for staff meetings
Meet the host:
Laura Knight:
Teacher, Digital Education and AI Specialist, TechWomen100 Award Winner 2025
Laura is an experienced teacher and a leading voice on AI in education. She combines classroom expertise with deep technical knowledge to help school leaders navigate AI adoption thoughtfully. Laura has trained thousands of educators across the UK and internationally on responsible AI use, always grounding her work in what actually works for teachers and pupils.
More info on this module:
Moving from understanding to intentional leadership
Welcome to this second session, which focuses on leadership, vision and culture.
AI literacy for school leaders means understanding not just what AI can do, but how to guide its use with purpose, ethics and trust.
In this session, you'll learn how to shape your school's direction, create a culture where staff feel confident to explore, and lead conversations that keep values at the centre of innovation.
In Module 1, we looked at the six leadership shifts schools need to navigate. This module builds on that by showing how to grow the shared understanding and professional confidence that make those shifts possible.
Our aim today is to move from understanding to intentional leadership. We'll look closely at how schools shape their direction, build the right culture for innovation, and support educators to lead with clarity and confidence at a time of very rapid change.
Reconnecting vision to purpose
When schools start to use AI, it can be easy to get drawn into tools, features and quick wins.
But leadership should begin with purpose, not get lost in the details.
Your vision for AI should reflect the opportunities you want to create, the problems you want to solve, the learning you want to deepen, the relationships you want to protect, and the values you refuse to compromise.
That means creating time for dialogue – whether it's a scheduled slot for a town hall discussion in a staff meeting, building in conversations to department or team meeting agendas, or engaging colleagues in a feedback exercise after INSET.
Listen carefully. Invite uncomfortable questions.
Let your staff and students help shape the story of what it means to use this technology well in your context.
Vision doesn't need to be a single bold statement.
It can take the form of shared principles, consistent conversations, and confident decisions that stay true to the mission of the school, even as tools change.
Example of a good AI vision
Our school will use AI to improve learning, reduce unnecessary workload, and protect pupils. Human judgement stays in charge. We will check important claims, monitor for bias, and be transparent with families and staff. Every adoption must show clear curriculum value, clear data practice, and a simple way to review impact after four weeks.
Example of a weak AI vision
We will adopt AI across the school to drive innovation and efficiency. Staff are encouraged to use approved tools to enhance outcomes. We aim to be at the cutting edge and will scale what works.
What makes the good one better
- Purpose not hype – it names the outcomes that matter in schools, not vague innovation.
- The role of people – it states that teacher judgement leads, which sets a professional tone.
- Safety and ethics – it mentions bias checks, transparency and pupil protection.
- Adoption testing – it requires curriculum value and clear data practice before use.
- Review built in – it includes a simple post-adoption review so learning sticks.
When Third Space Learning developed Skye, the problem they were looking to solve was not “how can we use AI” but “how can we give consistent, high-quality one-to-one maths tutoring that's affordable enough to reach every child who needs it”. That purpose drove every design decision. Technology came last, not first.
Creating conditions for responsible innovation
Innovation is as emotional as it is practical.
Teachers don't just need to know what works – they need to feel safe enough to try, to get it wrong, and to learn from that process.
A leader could model this by sharing their own small experiment publicly. For example, they might use AI to draft a parent letter, then show the unedited version in briefing and explain what they changed, what didn't work, and why.
They could invite staff to spot the weaknesses together and laugh about the errors before showing the final version.
By being open about imperfection and the process of co-creation, the leader signals that trying, adjusting and learning in public is not failure – it's part of responsible innovation.
That safety also comes from boundaries. Clear ethical expectations. Transparent goals. Space to ask questions and surface dilemmas.
When schools treat digital pilots as collaborative learning processes rather than rollouts to tick off, they create the conditions for long-term growth.
This work becomes even more powerful when it includes students.
Co-writing ethical guidelines or digital charters with pupils gives everyone a stake in how tools are used and helps keep discussions grounded in human dignity and care.
Action step:
- Build a shared digital charter
- Run a 30-minute workshop with a mixed group of staff and pupils to agree three principles for responsible AI use in your school
- Keep them short, human and visible – for example, we check outputs, we stay transparent about use, we protect privacy and dignity
- Post the draft centrally for comments, then finalise and display it in classrooms and briefing slides
Listening to resistance without judgement
Every school has staff who hesitate. New tools bring friction and insecurity – concerns about workload, impact, values and trust.
This is not something to fix or ignore. It's something to learn from.
In some cases, resistance reflects professionalism. It shows that teachers are thinking critically about their practice and protecting the craft of teaching.
When leaders listen carefully to sceptical voices, they often uncover insights that others have missed.
Rather than trying to convince or persuade, invite questions. Bring dissenting views into early-stage conversations.
Pair those who are unsure with those who are already experimenting. Use stories, especially ones with humour, to create connection and reduce fear.
Where there's misalignment, get curious – what's behind it? What's the origin?
Leadership is not about ploughing blindly through resistance. It's about understanding it, respecting it, and creating the conditions in which people feel safe enough to move forward together.
Action step:
- Hold a listening round
- Gather a small, mixed group of staff and ask one question: “What worries or frustrates you most about AI in school?”
- Listen without defending or solving. Note patterns, not names
- Close by thanking everyone and sharing one line back to the wider staff: “Here's what we heard, and what we'll explore together next”
- The goal is not persuasion – it's to make people feel heard and safe before change begins
Takeaway: resistance is information. Use it to build trust, not to create division.
Stewarding a coherent digital ecosystem
Leadership is about holding responsibility for a system, not just approving products.
As more platforms and tools enter schools, that responsibility grows.
Your school's digital environment is not a collection of individual apps. It's a system that shapes the rhythm of work, the flow of information, and the experience of learners.
When that system becomes fragmented, it erodes trust and creates inefficiencies. When it's coherent, it supports clarity, confidence and equity.
Action step:
- Hit pause here, and take the time to map your tools and platforms
- Where are they overlapping? What values do they reflect? How well are they serving staff and pupils?
- Involve your IT teams, safeguarding leads and teaching staff in these reviews – not just at procurement, but throughout implementation and beyond
- What is really happening with your digital strategy and its adoption?
A well-stewarded digital ecosystem is one that stays accountable to your mission and remains adaptable as needs change.
Modelling confidence through professional voice
AI has unsettled many educators. We've talked about how important it is to listen to concerns and resistance without judgement.
But it's not enough just to listen or to talk only in theoretical terms. Leaders must also participate, demonstrate, model good practice, and lead from the front.
This is a moment to defend what matters: judgement, creativity, interpretation and care.
These are the foundations of the profession, and they remain essential, even in a more automated landscape.
Teachers are looking to their leadership teams to model what thoughtful engagement with AI looks like.
If leaders stay on the sidelines, it sends an unhelpful message – that this is something optional, experimental, or best left to the technical experts.
If, instead, leaders engage visibly through asking questions, experimenting, and sharing what they're learning alongside their colleagues, it normalises uncertainty and builds collective confidence.
Crucially, this work cannot sit with one department. AI is not the responsibility of computing or IT alone.
It cuts across every subject, every leadership role, every aspect of school life. It affects safeguarding, curriculum, professional development and inclusion.
The wider the engagement, the stronger the culture of shared responsibility becomes.
Action step:
- Choose one routine task – writing a newsletter, drafting CPD notes, or summarising a report – and use AI to create a first draft
- Share the raw output with staff, talk through what you kept, what you changed, and why
- Explain where human judgement improved it
- This shows staff that AI use is open, reflective, and guided by professionalism rather than fear
Culture is the multiplier
Every school already has a digital culture. The only question is whether it's being shaped with intention.
Culture often feels hard to define, but its effects are easy to spot.
It appears in the tone of meetings, the language of feedback, and the way staff respond to change. It's reflected in how digital tools are introduced, questioned or quietly resisted.
It shows up in how students observe risk-taking, and in whether reflection is treated as a habit or something extra.
Artificial intelligence does not override that culture. It amplifies it.
In schools where professional trust, curiosity and shared purpose are already part of the norm, AI can strengthen what works.
But in schools shaped by compliance, fatigue or fear, the same technologies are more likely to create noise, confusion or unintended consequences.
This is why culture must be a leadership priority – built with care and consistency, not left to develop organically while hoping for the best.
Start by noticing the signals. Are your school's digital habits aligned with your values? Are teachers experimenting openly, quietly working around new systems, or hiding their approach completely? Are decisions being shaped by purpose or by convenience?
Set expectations for how digital tools are introduced and evaluated. Facilitate conversations across roles about what's working and what's not.
Revisit your AI vision and test it against daily practice. Embed digital ethics and reflective inquiry into your professional development offer, your departmental reviews, your appraisals and your learning walks.
Make the invisible visible.
Because digital maturity is about more than choosing better tools. It comes from building a culture that is ready to use them well.
Action step:
- Run a five-minute culture check at your next leadership meeting
- Ask each person to share one quick observation that shows how your school's digital culture currently feels – for example, “people share experiments openly,” or “staff use tools quietly without discussion”
- Collect these on a slide using a tool like Mentimeter or Slido and look for patterns
- End by asking, “What would we want these examples to show a year from now?”
- This simple reflection makes culture visible and turns it into a shared priority
Closing reflections
To close, consider these questions with your team:
- What kind of digital culture is emerging in your school? What is visible, and what might be under the surface?
- How do your decisions reflect the values you want to uphold?
- What still needs attention, and what should be celebrated?
In our next module, we'll be thinking about AI literacy in the context of teaching and learning, and looking at the six principles that underpin excellence in the classroom. See you there.