Francis Barber PRU – Acceptable AI Use Policy
Purpose
This policy sets out expectations for the safe, ethical, and professional use of artificial intelligence (AI) tools by staff at Francis Barber PRU. AI should be used to enhance educational practice—not to replace professional judgment. All use must prioritise pupil safety, anonymity, data protection, and equity, in alignment with our trauma-informed approach and safeguarding responsibilities.
1. Pupil Data Protection & Safeguarding
No identifying information about pupils may be shared with any AI platform. This includes names, contact details, addresses, and medical, safeguarding, SEND, or attendance records.
Staff must not upload documents that contain:
- Pupil initials, names, UPNs, or codes that can be linked to individual students.
- Behaviour logs, SPOD entries, or personal education plans.
- Any sensitive, protected, or confidential information relating to students, families, or staff.
All data shared with AI tools must be fully anonymised and stripped of any identifying details. The principle of 'privacy by design' must be followed at all times.
2. Anonymisation Guidelines
- Use broad, de-identified terms such as 'Pupil A', 'a Year 10 student with SEMH needs', or 'KS3 learner'.
- Double-check all text before uploading, especially for hidden data in headers, footers, or metadata.
- When in doubt, do not upload. Instead, speak to the Data Protection Officer (DPO) or a member of the Heads’ Team for guidance.
3. Permitted Uses of AI
- Drafting or refining lesson plans, schemes of work, and activity ideas.
- Generating accessible resources or differentiated materials to support inclusion and SEND learners.
- Writing or rewording general documents (e.g. newsletters, posters, trip letters).
- Summarising professional guidance or anonymised notes.
- Writing anonymised SPODs or reports (these must be reviewed by staff before final use).
- Adapting materials to be trauma-informed, accessible, or inclusive.
4. AI Is Not Permitted For
- Uploading or generating any data relating to individual pupil behaviours, safeguarding incidents, attendance, exclusions, or medical records.
- Creating documents that will be used to make or inform decisions about individual pupils.
- Any content where anonymity cannot be confidently maintained.
- Evaluating or labelling a pupil’s ability, diagnosis, or progress.
- Knowingly reproducing or rewriting copyrighted material without permission and appropriate referencing.
5. Trauma-Informed & Ethical Use
AI should enhance, not replace, the relational and human-centred approach that underpins our work. Staff should be mindful of bias in AI outputs and ensure that content is culturally sensitive, inclusive, and non-discriminatory.
Outputs used in educational settings must reflect the values of nurture, equity, and respect central to Francis Barber.
6. Staff Training & Support
All staff will receive guidance on ethical AI use, with specific reference to:
- GDPR and UK Data Protection law
- Current Keeping Children Safe in Education (KCSIE) statutory guidance
- Francis Barber Inclusion, Safeguarding, and SEND Policies

Questions or concerns about AI should be directed to the Data Protection Officer (DPO) or a member of the Heads’ Team.
7. Monitoring & Accountability
AI usage may be monitored as part of ongoing data protection and safeguarding audits. This policy will be reviewed annually or in response to updates in legislation or technology. Any breach of this policy will be dealt with under the school’s Data Protection, Safeguarding, or Disciplinary procedures, as appropriate.
Quick Recap
🚫 No names.
🚫 No personal data.
✅ Always anonymise.
❓ When in doubt—don’t upload.
👥 Ask a member of the Heads’ Team or the Data Protection Officer.