AI Policy for Schools: Step-by-Step Guide To Developing An Effective AI Policy

As AI tutoring and AI tools become increasingly valuable assets in the classroom, school administrators need clear guidelines that balance innovation with responsibility. Schools need comprehensive AI policies to ensure they implement these technologies safely, ethically and effectively.

AI-powered lesson tools and even AI tutoring are already part of daily school life. Yet, without a clear AI policy, the risks – from data breaches to student safety – can quickly outweigh the gains.

This guide explains how to draft the AI policy schools need to introduce new educational technologies safely, ethically and in line with US legislation.

It is based on the work done by Third Space Learning’s school safety coordinator, who has developed the AI tutoring policy we use with our AI math tutor, Skye.


What an AI policy in schools should achieve

An AI policy in schools is now as much a requirement as a student safety or data privacy policy. It puts guardrails around every use of AI, including:

  • Large language models;
  • Generative AI tools;
  • Conversational AI tutoring in school.

An AI policy should cover every AI use case in your setting, from lesson-plan drafting to spoken one-to-one support, and cover every AI tool used.


A well-written AI policy should set out how it:

Protects students and staff

A clear AI policy keeps everyone safe. It should set out how conversational AI math tutoring, such as Skye, is scheduled, supervised and recorded, and define the school safety coordinator’s role in responding to any automated red flags.

It should also detail data privacy rules, such as what student data is stored, where and for how long. The policy helps prevent accidental breaches and gives parents confidence that student protection in education is managed with the same care as any other student safety procedure.

Supports teachers’ professional judgment

AI systems should lighten teachers’ workload, not replace professional judgment. Your policy should confirm that staff decide which AI-generated content, examples or assessment questions make it into lessons. If you are using text-based or conversational AI tutoring, it should sit alongside, not instead of, classroom teaching.

By explicitly stating the acceptable uses of AI – from drafting lesson slides and generating quick-fire quizzes to running AI tutoring like Skye for small-group numeracy practice – the policy helps teachers implement AI technologies that save time without compromising quality.

Demonstrates legal compliance

Schools must show school board members and inspectors that every AI tool complies with FERPA, COPPA and the U.S. Department of Education’s filtering and monitoring guidance, plus any other legal requirements that apply.

An AI policy provides that evidence and lists:

  • The lawful basis for processing data;
  • Encryption standards;
  • Breach-report timelines everyone must follow.

With those points documented, leaders can demonstrate that AI math tools and tutoring sit comfortably inside existing statutory frameworks.

Enables responsible progress

Generative AI is moving quickly, and new AI models arrive almost monthly. An up-to-date policy gives your school a safe path to pilot AI innovations while keeping parents and school board members informed.

The AI policy sets review cycles, impact-measurement criteria and a simple approval route for the introduction of any new AI system. This structure means you can embrace AI in education with confidence and control while expanding individual support and reducing repetitive tasks.

Key considerations for creating an AI policy in schools

Student safety and security

As stated, every AI tool must keep students safe. Your policy should set rules for close supervision and ensure AI sessions are recorded and stored on secure servers.

Any AI policy document should list exactly which personal data each AI system collects and the lawful basis for processing it under US data privacy laws.

Copyright and plagiarism

Staff and students need guidance on using AI-generated text, images and explanations. Your policy should confirm that anything produced by an AI tool is checked for copyright before sharing, and that sources are credited to avoid plagiarism.

Ethical use and fairness

AI outputs can contain bias. The policy should state your school’s commitment to regular bias checks and embed clear ethical standards for fair treatment across gender, ethnicity and economic background.

It should also state whether the policy bans prompts or training data that could lead to discrimination.

Transparency and labeling

When generative AI has shaped a lesson, resource, plan or report, readers should know. The policy might ask staff to add a short note such as “created with generative AI assistance” so school board members, parents and colleagues understand where the content came from.

Workload reduction

One aim of using AI in education is to remove repetitive jobs such as marking, routine checks for understanding, drafting worksheet questions or math interventions. Using AI in this way helps teachers focus on explanations and feedback. Where possible, link each authorized tool in your AI policy to a specific workload benefit so school board members understand its value.

AI can help reduce teacher workload so teachers can focus on teaching and feedback.

Accessibility and equity

All young people across all grade levels must be able to use approved systems. Set out in your AI policy how each AI tool supports:

  • Screen readers;
  • Accessible presentation formats;
  • Simplified language.

Monitoring and evaluation

Impact matters. Leaders should agree on the data they will collect and set a yearly review, including:

  • Attendance at AI tutoring;
  • Progress on post-tests;
  • Teacher feedback.

If a tool is not improving learning or reducing workload, it should go back to the approval stage.

Staff training and awareness

Teachers and support staff should receive annual professional development (PD) covering:

  • The benefits and challenges of artificial intelligence;
  • Risks;
  • Data protection;
  • Safe classroom practice;
  • Best AI tools for different topics and subjects, for example the best AI for math.

School board members should receive a short briefing, and students must learn how to use AI responsibly.

You may want to state this in your AI policy so the AI education of staff and students is clear to parents and outside bodies such as inspectors.

Collaboration and communication

Updating the policy is a collaborative task. It should involve school board members, the student council and parents. Regular newsletters and web updates help keep families informed about new AI tools and safety measures.

Your AI policy should also align with wider school plans and related policies.

Approved AI applications

Your policy should include an appendix of AI tools that have passed vetting, so all readers and staff know exactly which tools are acceptable. For example:

AI tool | Company | Use of AI
Skye, conversational AI math tutor | Third Space Learning | One-to-one math tutoring
Magic School AI | Magic School | Lesson planning

State the approval route every new AI tool request must go through – for example, the AI/EdTech lead, school safety coordinator and data privacy officer (DPO).

Reporting concerns and accountability

Finally, the document should explain how staff, students or parents can raise worries about bias, data leaks or misuse. Link misuse to behavior, data-breach and disciplinary procedures so everyone understands what happens next. Create simple channels for raising issues and outline clear responsibility pathways.

Questions to ask when developing an AI policy

1. Who owns the policy review, and when will it happen?
Assign the AI/EdTech lead to gather feedback and set an annual school board sign-off date to keep the document current.

2. How will each AI tool fit into the existing curriculum?
Map Skye or any other system to specific units and programs of study so teachers see exactly where it supports learning.

3. Which students will use AI tutoring, and at what times?
Decide year groups, group sizes and timetable slots in advance to avoid last-minute logistics.

4. Who will supervise every session?
Name the staff member, state the supervision ratio and confirm the room; conversational AI must run under direct adult oversight.

5. What personal data will the tool collect, and how will it be protected?
List data fields, lawful basis, retention period and encryption standards; check where servers are located.

6. What is the step-by-step response when a safeguarding flag is raised?
Outline the alert → School safety coordinator review → action flow, including an after-hours contact.

7. What training do staff need before launch and each year thereafter?
Schedule hands-on PD covering acceptable use, data protection and how to interpret transcripts.

8. Which metrics will prove the AI intervention is working?
Agree attendance targets, post-test progress measures, student voice surveys and workload logs, and review them at regular intervals.

9. How will accessibility and equity be guaranteed?
List features such as simplified language and high-contrast slides, and set plans for students without reliable devices or quiet spaces.

10. How will parents and carers be kept informed?
Draft newsletter text and a website summary explaining the purpose of AI use, privacy measures and reporting routes.

AI tutoring in education with Skye

Skye is Third Space Learning’s conversational AI math tutor: a voice-based system trained on more than 2 million one-to-one lessons delivered to over 170,000 students in 4,000+ schools since 2013.

It speaks, listens and responds just like a traditional human tutor. Yet every session sits inside a human-in-the-loop AI framework. Math experts and teachers have written every lesson and set of instructions for Skye, so it never deviates from the class content.

Continuous transcript reviews and automated student safety flags mean professional judgement remains at the heart of every lesson – nothing is left to chance.

Moving from principles to practice

AI tools and systems are moving quickly, but a carefully drafted AI policy gives schools the confidence to adopt them without compromising safety, fairness or educational integrity. By setting out clear safety processes, ownership and review cycles, leaders create the space for conversational AI math tutors such as Skye to raise attainment, cut workload and widen access to individual support.

With a detailed AI policy in place, your school can adopt AI tutoring and AI tools safely and effectively.

AI policies should include sections for safeguarding, including raising red flags.
Do schools need an AI policy?

If schools decide students can use AI tools and AI tutoring, they need to publish an AI policy. It must comply with relevant guidance and align with your other school policies.

Is AI allowed in schools?

Schools can use AI, but it requires close supervision: ensuring students and staff use tools with safety and filtering features, and making sure learners observe age restrictions.

How do we apply AI in education?

AI supports educators by automating repetitive admin tasks like creating lesson plans, generating questions, adapting materials for diverse learners and enabling low-cost personalized tutoring. 
