Implementing an AI policy in your school is about more than just writing rules; it’s about making sure those rules work in practice. This means assigning clear oversight, setting boundaries for safe and ethical use, and ensuring everyone in the school community understands both the benefits and the risks of AI.
Why AI governance matters in schools
AI is already part of daily life in many schools. Staff use tools like ChatGPT, Copilot, and even Photoshop’s AI features for everything from lesson planning and resource creation to writing emails and generating quizzes. Students are also using AI for research, drafting assignments, and even summarising textbooks.
While these tools can save time and spark creativity, they also raise important questions:
- Are students learning to think critically, or just relying on AI to do the work?
- How do we protect student privacy and data?
- Are staff and students using AI responsibly and ethically?
- What happens if AI-generated content is inaccurate, biased, or inappropriate?
- How do we help students benefit from AI rather than simply blocking it altogether?
These are not hypothetical concerns. Teachers consistently raise issues such as plagiarism, loss of originality, over-reliance on AI for lesson planning, and the risk of students using AI to bypass genuine learning.
There are also worries about privacy, data security, and the lack of clear policies or detection tools for AI-generated work.
Key frameworks for AI governance in Australian schools
If your school is serious about managing AI use, several frameworks offer a solid starting point, from the Australian Framework for Generative Artificial Intelligence in Schools through to state department policies and international standards. Between them, these frameworks:
- Cover teaching and learning, wellbeing, transparency, fairness, accountability, and privacy.
- Set out rules for consent, data protection, and academic integrity.
- Help you define your school's position, set guardrails, and communicate with the whole community.
- Offer practical steps for adopting AI safely and building a positive culture around its use.
- Address the unique challenges AI poses, such as ethical considerations, transparency, and continuous learning.
Turning school AI policy into practice
Having an AI policy is just the beginning. Here’s how to make it work in your school.
1. Appoint oversight and accountability
- Assign clear roles: for example, the principal approves new AI tools, IT audits data flows, and a committee reviews classroom practice.
- Ensure decisions about AI remain in human hands.
2. Audit current AI use and data flows
- Find out which AI tools staff and students are already using; teachers commonly report a wide range of tools for tasks like lesson planning, email drafting, and resource creation (a quick tally sketch follows this list).
- Check how student data is being handled and whether privacy is protected.
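The audit itself can start small. Below is a minimal sketch, assuming a hypothetical staff survey exported to CSV with columns `tool` and `enters_student_data`; it tallies which tools are in use and flags any that reportedly receive student or personal data so they can be prioritised for privacy review.

```python
# Minimal sketch: tally AI tools reported in a staff survey and flag
# any that reportedly receive student or personal data.
# The file name and column names are hypothetical examples.
import csv
from collections import Counter

usage = Counter()   # how many respondents use each tool
data_risk = set()   # tools that reportedly receive personal data

with open("ai_use_survey.csv", newline="") as f:
    for row in csv.DictReader(f):
        tool = row["tool"].strip()
        usage[tool] += 1
        if (row.get("enters_student_data") or "").strip().lower() == "yes":
            data_risk.add(tool)

print("Tools in use (most common first):")
for tool, count in usage.most_common():
    flag = "  <- review data handling" if tool in data_risk else ""
    print(f"  {tool}: {count}{flag}")
```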
3. Create enforceable policy documents
- Use the frameworks above as a base, but tailor them to your school’s needs.
- Define which tools are approved, what counts as personal data, and how staff and students are trained.
- Policies should explicitly address the use of generative AI in assessment tasks, including academic integrity, disclosure, and ethical use.
4. Build capability and understanding
- Start with the 'why': share examples of how staff uploading confidential information to ChatGPT has led to reportable data breaches.
- Provide training for staff and students on what AI can and can’t do.
- Use real scenarios (e.g., ‘If an assessment is written this way, what will stop a student just using AI to answer it?’) to build practical understanding.
For example, the NSW DET offers a GenAI Foundations course and professional learning modules to help staff understand the principles of safe and ethical AI use in education.
5. Monitor, measure and evolve governance
- Regularly review how AI is being used and its impact on learning and wellbeing.
- Continuously review GenAI tools and features to be used on school-owned devices, restricting those that do not meet safety requirements.
- Track metrics like the number of approved tools, staff training completed, assessment tasks adjusted, and any incidents or breaches (a simple register sketch follows this list).
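One lightweight way to keep this review loop honest is to hold the approved-tool list and a few governance metrics in a single record that the committee updates each cycle. The sketch below is illustrative only; the field names and values are placeholders, not recommendations.

```python
# Illustrative sketch: a simple governance register a committee could
# update each review cycle. All values below are placeholders.
from dataclasses import dataclass, field

@dataclass
class GovernanceRegister:
    approved_tools: list[str] = field(default_factory=list)
    staff_trained_pct: float = 0.0       # % of staff who completed training
    assessments_adjusted: int = 0        # tasks redesigned with GenAI in mind
    incidents: list[str] = field(default_factory=list)  # brief incident notes

    def summary(self) -> str:
        return (f"{len(self.approved_tools)} approved tools | "
                f"{self.staff_trained_pct:.0f}% staff trained | "
                f"{self.assessments_adjusted} assessments adjusted | "
                f"{len(self.incidents)} incidents")

register = GovernanceRegister(
    approved_tools=["Tool A", "Tool B"],   # placeholder names
    staff_trained_pct=72.0,
    assessments_adjusted=14,
    incidents=["Example: personal data pasted into an unapproved tool"],
)
print(register.summary())
```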
Safe and ethical use: practical guidance
Staff responsibilities include:
- Never enter personal, sensitive, or confidential information into free-to-use GenAI tools (a basic redaction sketch appears after this list).
- Always review AI outputs for quality, accuracy, hallucinations, and bias before sharing with students or the community.
- Be aware of copyright status and terms of use for any AI tool.
- Use only approved tools for student work and follow school guidelines for privacy and data protection.
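Even with approved tools, a simple pre-check can help staff keep personal details out of prompts. The sketch below uses basic regular expressions to mask emails and phone-number-like strings before text is pasted into a GenAI tool; it is a rough illustration of prompt hygiene, not a substitute for judgement or for your school's data-handling policy, and the patterns shown are assumptions rather than a complete solution.

```python
# Rough illustration: mask obvious personal identifiers (emails,
# phone-like numbers) before text is sent to a GenAI tool.
# Regex masking will miss names and context, so treat this as a
# prompt-hygiene aid, not a guarantee.
import re

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b(?:\+?61|0)[\d\s-]{8,12}\b"),  # rough AU format
}

def mask_personal_info(text: str) -> str:
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

draft = "Please email jane.citizen@example.edu.au or call 0412 345 678."
print(mask_personal_info(draft))
# -> "Please email [email removed] or call [phone removed]."
```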
For students:
- Assessment policies should state whether AI tools can be used, and students should disclose and cite any use of AI in their work.
- Generative AI plagiarism detection tools are not recommended due to reliability concerns.
- Students should be encouraged to critically analyse AI-generated content and rewrite it in their own words.
Risks and responsibilities
When using GenAI in education, key risks include:
- Privacy and data security: AI tools can potentially re-identify individuals from anonymised data, may retain personal data longer than necessary, and may not comply with local privacy laws.
- Accuracy and bias: AI tools may generate inaccurate, outdated, or biased information, and may not reflect Australian values or curriculum.
- Accountability: Many third-party AI providers shift liability onto users and lack transparency in decision-making.
- Copyright: AI-generated content may infringe on copyright; always check terms and label AI-generated work appropriately.
To mitigate these risks:
- Remove personal, sensitive, or confidential information from AI inputs.
- Critically review all AI outputs for errors, bias, and safety before use.
- Comply with school privacy and security policies.
- Maintain oversight and use AI only for tasks you can critically assess.
- Respect community expectations and use AI in ways that align with your school’s values.
What does good AI governance look like?
Imagine a school where every new AI tool is checked for privacy and bias before use. Teachers feel confident, students understand how to use AI critically, and parents know what’s happening. This builds trust and helps your school stay ahead of rising expectations and regulations.
Need help?
You don’t have to do it alone. Digipro IT can help your school assess its current state, build customised policies, train staff, and monitor AI use, so your policy is not just on paper, but working in practice. Contact Digipro IT to find out more about how we can help.
