AI Policy for Schools: Free Template and Step-by-Step Guide
As AI tutoring and AI tools become increasingly valuable assets in the classroom, school leaders need clear guidelines that balance innovation with responsibility. A comprehensive AI policy ensures schools implement these technologies safely, ethically and effectively.
AI-powered lesson tools and even AI tutoring are already part of daily school life. Yet, without a clear AI policy, the risks – from data breaches to safeguarding failures – can quickly outweigh the gains.
This guide explains how to draft the AI policy schools need to introduce new technologies safely, ethically and in line with UK legislation.
It is based on the work done by Third Space Learning’s Designated Safeguarding Lead, who has developed the AI tutoring policy we are offering free to any school that embarks on AI maths tutoring with Skye or any other provider.
Download our free, editable AI tutoring policy template now or find out what you need to create your own AI policy below:
AI Tutoring Policy for Schools
A practical framework to help you integrate AI tutoring and meet data protection and safeguarding requirements – includes an editable policy template
Download Free Now!

What an AI policy in schools should achieve
An AI policy in schools is now as much a requirement as a safeguarding or data-protection policy. It puts guardrails around every use of AI, including:
- Large language models;
- Generative AI tools;
- Conversational AI tutoring in school.
An AI policy should cover every AI use case in your setting, from lesson-plan drafting to spoken one-to-one support, and every AI tool used.
A well-written AI policy should set out how it:
Protects students and staff
A clear AI policy keeps everyone safe. It should set out how conversational AI maths tutoring, such as Skye, is scheduled, supervised and recorded, and define the Designated Safeguarding Lead’s role in responding to any automated red flags.
It should also detail data privacy rules, such as what student data is stored, where it is stored and for how long. This helps prevent accidental breaches and gives parents confidence that artificial intelligence safeguarding in education is managed with the same care as any other safeguarding procedure.
Supports teachers’ professional judgement
AI systems should lighten teachers’ workload, not replace professional judgment. Your policy should confirm that staff decide which AI-generated content, examples or assessment questions make it into lessons. If you are using text-based or conversational AI tutoring, it should sit alongside, not instead of, classroom teaching.
By explicitly stating the acceptable uses of AI – from drafting lesson slides and generating quick-fire quizzes to running AI tutoring such as Skye for small-group numeracy practice – the policy helps teachers implement AI technologies that save time without compromising quality.
Meets legal duties
Schools must show governors and inspectors that every AI tool complies with UK GDPR, the Data Protection Act and the Department for Education’s filtering and monitoring guidance, plus any other legal requirements that apply.
An AI policy provides that evidence and lists:
- The lawful basis for processing data;
- Encryption standards;
- Breach-report timelines everyone must follow.
With those points documented, leaders can demonstrate that AI maths, both tools and tutoring, sit comfortably inside existing statutory frameworks.
Enables responsible progress
Generative AI is moving quickly, and new AI models arrive almost monthly. An up-to-date policy gives your school a safe path to pilot AI innovations while keeping parents and governors informed.
The AI policy sets review cycles, impact-measurement criteria and a simple approval route for the introduction of any new AI system. This structure means you can embrace AI in education with confidence and control while expanding individual support and reducing repetitive tasks.
Key considerations for creating an AI policy in schools
Safeguarding and security
As stated, every AI tool must keep students safe. Your policy should set rules for close supervision and ensure AI sessions are recorded and stored on secure servers.
Any AI policy document should list exactly which personal data each AI system collects and the lawful basis for processing under UK GDPR.
Intellectual property and copyright
Staff and students need guidance on using AI-generated text, images and explanations. Your policy should confirm that anything produced by an AI tool must be checked for copyright before sharing, and that sources must be credited to avoid plagiarism.
Ethical use and fairness
AI outputs can contain bias. The policy should set out your school’s commitment to regular bias checks and embed clear ethical considerations for fair treatment across gender, ethnicity and economic background.
Also, outline whether the AI policy bans prompts or training data that could lead to discrimination.
Transparency and labelling
When generative AI has shaped a lesson, resource, plan or report, readers should know. The policy might ask staff to add a short note such as “created with generative AI assistance” so governors, parents and colleagues understand where the content came from.
Workload reduction
One aim of using AI in education is to remove repetitive jobs such as marking, routine checks for understanding, and drafting worksheet questions or interventions. Using AI in this way helps teachers focus on explanation and feedback. Where possible, link each authorised tool to a workload benefit in your AI policy so governors can see what that benefit is.
Accessibility and equity
All young people across all key stages must be able to use approved systems. Set out in your AI policy how the AI tool supports:
- Screen readers;
- High-contrast slides;
- Simplified language.
Monitoring and evaluation
Impact matters. Leaders should agree on the data they will collect and set a yearly review, including:
- Attendance at AI tutoring;
- Progress on post-tests;
- Teacher feedback.
If a tool is not improving learning or reducing workload, it should go back to the approval stage.
Staff training and awareness
Teachers and support staff should receive annual CPD covering:
- The benefits and challenges of artificial intelligence;
- Risks;
- Data protection;
- Safe classroom practice.
Governors should receive a short briefing, and students must learn how to use AI responsibly.
You may want to state this in your AI policy so the AI education of staff and students is clear to parents and outside bodies such as Ofsted.
Collaboration and communication
Updating the policy is a collaborative task. It involves governors, school councils and parent forums. Regular newsletters and web updates help keep families informed about new AI tools and safety measures.
Your AI policy should also align with wider school plans and relevant policies.
Approved AI applications
Your policy should include an appendix of AI tools that have passed vetting, so all readers and staff know which tools are acceptable. For example:
| AI tool | Company | Use of AI |
| --- | --- | --- |
| Skye, conversational AI maths tutor | Third Space Learning | One-to-one maths tutoring |
| Aila | Oak National Academy | Lesson planning |
State who any new AI tool request must go through – for example, the AI/EdTech lead, DSL and DPO.
Reporting concerns and accountability
Finally, the document should explain how staff, students or parents can raise worries about bias, data leaks or misuse. Link misuse to behaviour, data-breach and disciplinary procedures so everyone understands what happens next. Create simple channels for raising issues and outline clear responsibility pathways.
About the AI tutoring policy template
When moving from traditional online tutoring to an AI-driven model, our DSLs and teachers knew schools would need the same clarity as with any other external tutoring provider on:
- Data privacy;
- Safeguarding;
- Curriculum alignment.
The template sets out our recommendations for best-practice adherence to this policy, using the example of Skye, the conversational AI maths tutor.
Adapting the template to your school
Start by adding your school name, then:
- Define scope – list any other AI tools you already use alongside Skye.
- Confirm roles – insert the names of your DSL, DPO and AI/EdTech lead.
- Set supervision details – specify where sessions take place and who monitors them.
- Adjust data-retention periods – align recordings with your wider data policy and other relevant policies.
- Add review dates – choose a termly impact check and an annual governor review of artificial intelligence benefits and challenges.
Because the structure is already aligned with UK GDPR and DfE filtering guidance, most schools can complete the first draft quickly.
Once approved, the policy gives every stakeholder confidence that conversational AI tutoring is being used safely, ethically and to best effect for students in both primary and secondary phases.
Questions to answer when adapting the template
- How will each AI tool fit into the existing curriculum? Map Skye or any other system to specific units and programmes of study so teachers see exactly where it supports learning.
- Which students will use AI tutoring, and at what times? Decide year groups, group sizes and timetable slots in advance to avoid last-minute logistics.
- Who will supervise every session? Name the staff member, state the supervision ratio and confirm the room; conversational AI must run under direct adult oversight.
- What personal data will the tool collect, and how will it be protected? List data fields, lawful basis, retention period and encryption standards; check that servers are located in the UK or EU.
- What is the step-by-step response when a safeguarding flag is raised? Outline the alert → DSL review → action flow, including an out-of-hours contact.
- What training do staff need before launch and each year thereafter? Schedule hands-on CPD covering acceptable use, data protection and how to interpret transcripts.
- Which metrics will prove the AI intervention is working? Agree attendance targets, post-test progress measures, student voice surveys and workload logs, and review them half-termly.
- How will accessibility and equity be guaranteed? List features such as simplified language and high-contrast slides, and set plans for students without reliable devices or quiet spaces.
- How will parents and carers be kept informed? Draft newsletter text and a website summary explaining the purpose of AI use, privacy measures and reporting routes.
- Who owns the policy review, and when will it happen? Assign the AI/EdTech lead to gather feedback and set an annual governor sign-off date to keep the document current.
Addressing these questions turns the generic template into a policy that fits your school’s phase, size and context.
AI tutoring in education with Skye
Skye is Third Space Learning’s conversational AI maths tutor: a voice-based system trained on more than 2 million one-to-one lessons delivered to over 170,000 students in 4,000+ UK schools since 2013.
It speaks, listens and responds just like a traditional human tutor. Yet every session sits inside a human-in-the-loop AI framework. Maths experts and teachers have written every lesson and set of instructions for Skye, so it never deviates from the lesson content.
Continuous transcript reviews and automated safeguarding flags mean professional judgement remains at the heart of every lesson – nothing is left to chance.
Find out more about online one-to-one AI maths tutoring with Skye:
Moving from principles to practice
AI tools and systems are moving quickly, but a carefully drafted AI policy gives schools the confidence to adopt them without compromising safety, fairness or educational integrity. By setting out clear safeguards, ownership and review cycles, leaders create the space for conversational AI maths tutors such as Skye to raise attainment, cut workload and widen access to individual support.
Third Space Learning’s AI policy template shows how those safeguards work in real lessons, combining clear data protection rules, teacher oversight and established pedagogy. Adapt it line by line to match your phase and context, and tick off the ten key questions so that you have a document governors can sign with confidence.
With the policy in place, your school can adopt AI tutoring and AI tools safely and effectively.
If schools decide students can use AI tools and AI tutoring, they need to publish an AI policy. It must comply with statutory guidance and sit in accordance with your other policies.

Schools can use AI, but it requires close supervision: ensuring students and staff use tools safely, enabling filtering features, and making sure learners stick to age restrictions.

AI supports educators by automating repetitive admin tasks such as creating lesson plans, generating questions and adapting materials for diverse learners, and by enabling low-cost personalised tutoring.