Featured User Guides

The process of developing guidance will look slightly different for a system leader, a local administrator/principal, or a classroom teacher, and you can find roadmaps for using the toolkit tailored to each role below.  No matter your role, the overarching goal remains the same: fostering the safe, effective, and equitable use of artificial intelligence in education. 

A Note About Education System Terminology

While terminology varies across countries and regions, “education system” refers to a district, regional, state, or national governing body, agency, or authority. Each entity must thoughtfully consider its own unique role in developing appropriate AI guidance and policies.

Roadmaps for using this toolkit are available for your specific role.

Education System Leaders

Education system leaders, such as school board members, local and national education leaders (e.g., ministries of education, superintendents), and directors of technology, can use this toolkit to inform the development of a vision statement, set of principles and beliefs, or a responsible use policy.

Principals and Local School Administrators

Principals and local school administrators, such as academic officers or staff development specialists, can use this toolkit to inform instructional guidance and professional development.

Teachers

Teachers can use sections of the toolkit to inform their use of AI in instruction and assessment and to clarify how their students should or should not use AI when completing assignments.

Guide for Education System Leaders

Objective: Develop or update systemwide AI guidance and/or policy that aligns with educational goals, addresses compliance, and sets the tone for AI use in multiple districts or schools.

  • Develop an overall vision (Refer to A Framework for Incorporating AI in an Education System)
    • Action: Review the document’s explanation of how AI guidance, organizational learning, and transformation interrelate.
    • Outcome: Define high-level goals for AI use (improving student outcomes, supporting teacher well-being, ensuring equity, etc.).
    • Why: This lays the foundation for any forthcoming guidance or policy, rooting AI use in your system’s existing strategic priorities.
  • Use the TeachAI Principles and Sample Guidance to determine the guidance and policy you need to develop
    • Action: Examine each principle—Purpose, Compliance, Knowledge, Balance, Integrity, Agency, Evaluation—and determine the issues that need to be addressed in systemwide policy.
    • Outcome: Adopt or adapt these guiding principles into official documents.
    • Why: Clear principles help unify diverse stakeholders around shared beliefs about AI’s role.
  • Assess current policies and gaps (Refer to Sample Considerations for Existing Policies)
    • Action: Compare existing technology, privacy, and academic integrity policies with the toolkit’s sample language around AI.
    • Outcome: Identify policy gaps (e.g., no mention of generative AI, inconsistent privacy stipulations, insufficient student protections) and draft updates or new sections.
    • Why: This ensures your overarching policy infrastructure is coherent, up to date, and compliant with relevant laws.
  • Facilitate stakeholder input (Refer to Engage Parents, Staff, and Students)
    • Action: Use the toolkit’s suggestions on engaging families, educators, and students early. Consider hosting listening sessions or surveys. Make use of the adaptable AI in Education Presentation.
    • Outcome: Gather and address real-world concerns and ideas from the field to refine your policy, building broad-based support.
    • Why: Early engagement fosters trust, surfaces issues leaders may overlook, and increases policy adoption. It is also a useful first step in building general AI Literacy within your system’s community.
  • Plan for capacity-building and cultivate AI Literacy (Refer to Principle 3. Knowledge: Promote AI Literacy)
    • Action: Budget for statewide or districtwide training, referencing the toolkit’s emphasis on AI literacy and professional development.
    • Outcome: Develop training plans and other supports so resources are available for teachers, staff, and students.
    • Why: Even a well-written AI policy will fail if end users lack the knowledge or resources to implement it responsibly across subjects.
  • Establish Ongoing Evaluation (Refer to Principle 7. Evaluation: Regularly Assess the Impacts of AI)
    • Action: Create a mechanism to track AI’s efficacy and potential unintended consequences.
    • Outcome: Policies remain agile as technology evolves; updates are based on evidence rather than reactive bans.
    • Why: AI tools, regulations, and best practices will rapidly change, and your system must remain proactive.

Guide for Principals and Local School Administrators

Objective: Create practical, localized guidelines and professional development plans so that staff and students have clarity on how to use AI tools within the school or region.

    • Align with systemwide framework (Refer to A Framework for Incorporating AI in an Education System)
      • Action: If a larger education authority has set guidelines, map your local strategy to their overarching framework. If not, use the toolkit to define a vision aligned with your own key frameworks and principles (e.g., instructional vision, portrait of a graduate).
      • Outcome: A vision or AI mission statement for your organization that respects existing regulations and emphasizes local priorities.
      • Why: Alignment ensures that you address immediate community needs and that AI deployment advances rather than frustrates broader system goals.
    • Draft local guidance (Refer to TeachAI Principles and Sample Guidance)
      • Action: Use the sample guidance sections—Purpose, Scope, Principles, Responsible Use, Prohibited Use—to craft a succinct document.
      • Outcome: A clearly articulated document or handbook for staff, students, and families.
      • Why: Clearly defined expectations reduce classroom confusion and inconsistencies, offering teachers the confidence to innovate rather than become overly risk-averse.
    • Review and Update Existing Policies (Refer to Sample Considerations for Existing Policies)
      • Action: Work with your legal, IT, and instructional teams to integrate AI references into your responsible use policies, privacy statements, academic integrity rules, and other relevant guidance.
      • Outcome: Updated policies that directly mention generative AI, specify how teachers and students should cite AI-generated work, and outline restrictions (e.g., never sharing personally identifiable information (PII) or other sensitive data with AI tools).
      • Why: In many schools, general “acceptable use” policies need explicit AI-focused language to address privacy, plagiarism, and misuse within the context that generative AI creates.
    • Engage Your Community (Refer to Engage Parents, Staff, and Students)
      • Action: Host workshops or Q&A sessions that highlight how AI might be used for tutoring, creativity, or administrative tasks—and address concerns such as privacy or cheating.
      • Outcome: Greater transparency and support from families and the broader community.
      • Why: AI adoption can spark misunderstandings and fears; proactive dialogue fosters trust and clarifies benefits and boundaries.
    • Plan and deliver professional development and training (Refer to Principle 3. Knowledge: Promote AI Literacy)
      • Action: Engage and communicate with teachers and staff. Schedule AI Literacy training covering AI basics, best practices, ethical considerations, and the specific guidelines you’ve set. Consider using the AI in Education Presentation available in the toolkit.
      • Outcome: Improved staff confidence in using AI to create materials, differentiate lessons, assess student work, and remain alert to potential harmful biases.
      • Why: Teachers in all subjects need both theoretical and hands-on exposure to generative AI tools to integrate them responsibly and effectively.
    • Implement a Feedback & Improvement Cycle (Refer to Principle 7. Evaluation: Regularly Assess the Impacts of AI)
      • Action: Collect feedback from teachers, students, and families about AI usage in classrooms. Monitor changes in academic integrity incidents, privacy issues, or administrative efficiency.
      • Outcome: A regular improvement process that updates your local guidelines as technology, laws, and best practices evolve.
      • Why: AI changes quickly; ongoing evaluation ensures your policies stay relevant and effective.

Guide for Teachers

Objective: Use the toolkit’s recommendations to guide daily classroom practices, incorporate AI in ways that enrich student learning, and set appropriate boundaries.

  • Familiarize Yourself with the Basics (Refer to Summary of Key Messages and Principles, Principle 1. Purpose: Use AI to Help Students Achieve Educational Goals, Principle 3. Knowledge: Promote AI Literacy, and the AI in Education Presentation)
    • Action: Read or view the toolkit’s foundational materials on AI (what it is, how it works, typical pitfalls like misinformation, and ways to use it for creativity).
    • Outcome: A clear understanding of generative AI’s strengths and limitations (e.g., hallucinations, bias, harassment).
    • Why: A teacher’s ability to model responsible AI use starts with personal proficiency and awareness of both benefits and risks. (Refer to Principle 4. Balance)
  • Clarify Classroom AI Use with Students (Refer to Principle 5. Integrity: Advance Academic Integrity, Principle 4. Balance: Realize the Benefits of AI and Address the Risks, Sample Student Agreement on the Use of AI, and Sample Guidance)
    • Action: Explicitly state when AI is allowed, how it must be credited, and in which assignments or portions of assignments it is prohibited. Integrate this process across all subjects.
    • Outcome: Students understand academic integrity expectations—especially around originality versus AI-generated content.
    • Why: Transparency prevents misconduct and helps students see AI as a learning aid rather than a shortcut for plagiarism.
  • Design Lessons and Assessments Thoughtfully (Refer to Principle 3. Knowledge: Promote AI Literacy, Principle 5. Integrity: Advance Academic Integrity, Principle 4. Balance: Realize the Benefits of AI and Address the Risks)
    • Action: Rethink assignments so they focus on critical thinking, personal reflection, deliberation, debate, or in-class presentations—tasks that can’t be done solely by AI. This work will continue as AI evolves; agentic AI, for instance, will bring new capabilities and call for assignments designed for learning with agents.
    • Outcome: Students build deeper knowledge, leveraging AI as a learning tool, while simultaneously building the skills of voice and agency that the advent of AI will require of them throughout their educational and career pathways.
    • Why: Adapting lessons, assignments, and assessments to the AI era helps you maintain rigorous standards for creativity, original thought, content knowledge, and agency skills.
  • Use AI to Enhance Instruction (Refer to Principle 6. Agency: Maintain Human Decision-Making When Using AI, and the Sample Guidance sections Guiding Principles for AI Use and Responsible Use of AI Tools)
    • Action: Explore AI for differentiating instruction, creating tailored assessments, generating reading passages at different levels, or providing quick translations.
    • Outcome: Increased time to offer more personalized support to students.
    • Why: Generative AI can reduce time spent on repetitive tasks, freeing more time for relationships and high-value instructional activities.
  • Model Safe and Ethical Usage (Refer to Principle 4. Balance: Realize the Benefits of AI and Address the Risks)
    • Action: Demonstrate how to verify AI-generated information, talk about harmful biases, and show students how you incorporate AI responsibly in your workflow.
    • Outcome: Students develop digital literacy skills and a healthy skepticism toward automated outputs.
    • Why: Teachers shape student attitudes; modeling best practices fosters a culture of responsible, critical use of technology.
  • Provide Feedback and Adapt (Refer to Principle 7. Evaluation: Regularly Assess the Impacts of AI)
    • Action: Share classroom experiences with colleagues and leadership—both successes and pitfalls. Document changes in student engagement or achievement. Involve students in sharing their experiences and ideas for using AI to learn and deliberating over ethical and responsible practices in class.
    • Outcome: A professional learning community that refines local AI guidelines.
    • Why: Teacher experiences on the ground should inform how the region or system updates its AI strategies and policies over time.

Explore the rest of the toolkit:

  • Develop an Overall Vision: A Framework for Incorporating AI
  • Inform Your Guidance: Principles for AI Guidance
  • View Sample Guidance: Guidance on the Use of AI in Our Schools
  • Review Existing Policies: Sample Considerations for Existing Policies
  • Give a Presentation: The AI in Education Presentation
  • Engage: Communication with Parents, Staff, and Students

Suggested Citation: TeachAI (2025). AI Guidance for Schools Toolkit. Retrieved [date] from teachai.org/toolkit.

© 2025 TeachAI
