If you have questions about the use of Artificial Intelligence (AI) in education, you are not alone! As I deliver trainings about AI across my state and beyond, I hear lots of questions from educators, such as:
- What is AI?
- Why is it important for educators to engage in conversations about AI?
- How are educators using AI?
- What are some of the challenges and risks of using AI in school settings?
- Why do AI policies and regulations matter?
- How can we include educator, family, and student voices in conversations and decisions related to AI?
This article addresses these questions and offers recommended resources created by educators and organizations doing thoughtful, valuable work about AI. Whether you are just getting started or are looking for resources to share with colleagues, you are in the right place. Let's begin!
What is AI?
So, what is AI, anyway? For many, the mention of AI recalls robots that think and feel like humans, futuristic sentient machines like those in movies such as Terminator and The Matrix. The reality is that AI is a tool, already built into many apps and services you use every day. Alexa and Siri use AI. So do our smart watches, map apps, and security systems. Social media platforms rely heavily on AI as well. Do you get targeted ads in your social media feeds? If so, AI is analyzing your searches, purchases, and how long you view content in order to tailor the ads and content you see to your personal preferences. Netflix and other streaming platforms use AI, too, generating recommended viewing lists by analyzing your activity on the platform and finding similar content.
AI doesn't "think," although it may seem like it does. It's a technology that uses machines to do tasks typically accomplished by humans. What makes this possible is the way AI systems use statistical analysis on large data sets to make predictions.
- Predictive AI analyzes data to predict future outcomes. For example, a school can use predictive AI to forecast how students will perform on assessments and create plans proactively to address concerns and highlight successes. (See the brief sketch after this list.)
- Generative AI, used by tools such as ChatGPT, Google Gemini, and Microsoft Copilot, creates new content, or output, in seconds. This content is informed by existing data, information, and content already online. In education, one way generative AI is being used is to create differentiated materials based on student needs and progress. The more generative AI is used, the more refined the generated content becomes, making it more personalized for users, including educators and students. In turn, that new content feeds into and informs the content that generative AI will create in the future. There are also many education-focused AI tools that take on specific tasks for educators.
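For readers who want a peek under the hood, the short Python sketch below shows the basic idea behind predictive AI. Everything in it is an illustrative assumption: the numbers are made up, the two-benchmark setup is hypothetical, and the simple scikit-learn model stands in for the far larger data sets and more sophisticated models that real predictive tools use.

```python
# A minimal sketch of "predictive AI": fit a model to past data, then use it
# to estimate a future outcome. All numbers here are made up for illustration.
from sklearn.linear_model import LinearRegression

# Hypothetical data: each row holds one student's scores on two earlier
# benchmarks; the second list holds those students' end-of-year scores.
past_benchmarks = [[62, 68], [71, 75], [80, 78], [90, 93], [55, 60]]
end_of_year_scores = [65, 74, 81, 92, 58]

# "Training" is the statistical analysis step: the model learns the
# relationship between benchmark scores and final scores.
model = LinearRegression().fit(past_benchmarks, end_of_year_scores)

# Prediction: estimate a new student's end-of-year score from their
# benchmarks, so support can be planned before the assessment happens.
print(model.predict([[68, 70]]))
```

Real systems work the same way in spirit: historical data goes in, a statistical model finds patterns, and a prediction comes out that humans then interpret and act on.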
Note: When using generative AI, it is critical to review what is created. You may be able to refine the product by modifying your prompts, but it is also essential to check for accuracy. (You don't have to look far for examples of AI-generated errors, as seen in this BBC news story, Glue pizza and eat rocks: Google AI search errors go viral.)
Learn more
You can dig deeper with the following resources. For some brief videos and tutorials, we recommend Common Sense Media's AI Literacy Lessons.
- AI Resource Collection (Colorín Colorado)
- AI Literacy Lessons for Grades 6-12 (Common Sense Media)
- AI for Educators: Free Professional Development Courses (Common Sense Media)
- Generative AI: Articles for Educators (Edutopia)
Why Educators' Voices Matter
This brave new world of AI can be intimidating and disconcerting, and there are a lot of questions about how to use AI safely and appropriately, or whether to use it at all. Yet as tempting as it may be to keep AI at a distance, our students need us to engage in these important conversations. Here's why:
Teaching digital literacy
Students need to learn the skills to navigate the digital world they are growing up in, such as media literacy, digital citizenship, and critical thinking.
Preparing for the workplaces of the future
AI tools are increasingly part of workplaces, and many schools are following suit. Today's students will be expected to use AI tools in their future careers, and educators themselves may increasingly be asked to learn about and use AI as well.
Advocating for more teacher training
Educators need ongoing training around AI. The more familiar educators are with AI tools and their uses, the more input they can provide about which types of training are most relevant to their work.
Learning from our students and families
Many students and families are already using AI tools in a variety of ways, and there is a lot we can learn from them! The more educators know about how AI is used outside of school, the more effective those partnerships can be in harnessing its capabilities in constructive ways, while also establishing norms and guidelines for appropriate use.
Supporting students
Fundamentally, it's critical for educators to get comfortable talking about the use of AI while also articulating our own essential roles in schools. AI is not a replacement for educators or support staff, and this message cannot be overemphasized. It is especially important to talk through appropriate uses of AI as new uses are explored or adopted across districts.
More than ever, students need caring, nurturing school communities in which educators:
- Build relationships with students through interaction and connection
- Provide personalized learning and support
- Provide human insight, understanding, and warmth
- Teach critical thinking skills, problem solving, communication, collaboration, ethics, creativity, and responsibility
Learning about AI is a way to bring clarity to this conversation, and keeping "humans in the loop" (U.S. Department of Education, 2023) is the best way to ensure that educational values and principles are prioritized in schools as the use of AI and technology continues to grow.
How Districts and Educators Are Using AI
How is AI being used in classrooms and schools? Here are some of the most common uses:
Teaching and learning
Many educators are finding that, with well-crafted prompts, AI instructional tools can be comprehensive and helpful. These tools are proving to be a great jumping-off point for lesson planning and for creating assignments and activities.
- Lesson planning: Generative AI can be used to create prompts and activities; to differentiate tasks and texts; to create specific images and other kinds of media; and to create a wide range of materials with tools such as lesson/unit plan generators, writing checkers, quiz generators, and rubric creators, among other features.
- Real-time feedback and assessment: AI tools can provide quick feedback, which can help with instructional decision-making in the classroom. These tools can also help teachers build more formative assessment into instruction. AI tools are being used to provide instant feedback on writing tasks, as well as to analyze student assessment data and highlight strengths and areas for improvement for teachers and students.
- Personalized learning: AI can analyze learning patterns, adjust the pace and level of material, and provide on-demand assistance. This can be particularly helpful for students who need additional supports. For example, AI tools can be used to level texts as a form of differentiation. In addition, using AI tools can potentially free up time for more focused, personalized instruction for students.
Administrative tasks
Using AI for administrative tasks can save time and offer a chance to "dip your toes in the AI pool."
- Administrative assistance/automation of routine tasks: Generative AI can automate routine tasks such as data entry, report generation, and the writing of emails and newsletters, allowing educators to focus on more complex and nuanced work.
Student accessibility and engagement
- Language access: AI tools can break down language barriers in a variety of settings, and many teachers of ELLs are trying out various AI tools to determine which functions best support instruction. (See more about using AI with English learners in our forthcoming article.)
- Accessibility: AI also provides new accessibility options for students with disabilities and students who may need additional support. For example, some educators are exploring the use of IEP resources including goal creation and behavior intervention generators, which they stress are a starting point for creating a first draft that they can customize with personalized input. (This is an area in which transparency with parents would be critical.)
- Student engagement: The interactive nature of AI engages students. AI tools also allow students to explore curated resource collections and experiences, such as museum exhibits, that would otherwise be cost-prohibitive or available only in person. Educators are creating interactive multimedia lessons and bringing these experiences into the classroom, fostering interest, exploration, and discovery.
In addition, AI is being introduced in arenas where the implications are more complex, such as tutoring, observing instruction, district "customer service," and counseling. This is another reason why educators' voices matter in decisions related to AI as new tools and possibilities emerge.
AI in the Classroom: Challenges and Risks
As we begin to incorporate AI into classrooms and schools, we also need to understand the challenges and risks of AI. There are many concerns regarding AI that are giving professionals across all sectors, including education, pause. These include:
- Bias and Discrimination: AI systems can (and most likely will) perpetuate biases present in their training data, potentially leading to discriminatory outcomes. The AI tool is only as good as the data used to train it. For example, if the data set only includes examples of people from one gender or ethnicity, the tool does not know to include more diversity; the generated results will only reflect the data that was used for training. Educator Tammi Sisk shares an example in a Common Sense Media webinar on ChatGPT that reflects gender bias. She submitted a blog post to an AI tool for feedback under the student name "Alice." She then submitted the same post under the name "Bob." While the feedback was similar, there were notable differences, including an emphasis on the emotional dimensions of Alice's response, in contrast with praise of Bob's analytical strengths. (A brief sketch of how this kind of bias arises appears after this list.)
- Privacy Concerns: Generative AI requires large datasets, which may include sensitive personal information. When a person inputs personal information into an AI tool or other online source, that information can become part of the data an AI tool uses to generate new material. Ensuring privacy and compliance with regulations is critical. New York's Education Law 2-d, the California Privacy Rights Act, and the Colorado Privacy Act are examples of how states are providing additional privacy protections to residents.
- Security Risks: Many school employees handle sensitive information, and AI systems could be targeted by cyberattacks, leading to data breaches or manipulation of information. AI is also being used to carry out data breaches and collect sensitive information about internet users.
- Job Displacement: As AI takes over increasingly complex tasks, there is a risk that jobs in some sectors could be displaced, requiring a shift in workforce skills. Call center positions, for example, are already being replaced by chatbots. And although some jobs may be displaced, new jobs such as AI engineers and AI solutions analysts will be created as well, according to CNET (Lacy, 2024). At the same time, it's important to recognize that many jobs are incorporating the use of AI, so candidates who are familiar with AI will have a competitive edge.
- Intellectual Property and Copyright Infringement: Content generated by generative AI can resemble other people's work. The way generative AI gathers information raises questions about who owns AI-generated content, as well as copyright questions about the material it culls to create new content. Some universities are building their own AI tools to address these ethical and equity issues, and these proprietary tools are also a way to protect intellectual property. (Some AI tools, such as Perplexity, also indicate sources and citations for their content.)
- Accessibility: There are a number of questions related to digital equity. Do students have access to the devices, tools, and internet connections needed to use AI? This gap was highlighted by the digital access challenges of the COVID-19 pandemic. And do all students have opportunities to learn how to use these tools? (If not, how will this disadvantage be compounded over time in their future education and careers?) In addition, many AI tools are subscription-based, which also creates digital equity issues. There is also the question of access to AI training for educators. According to a 2024 research brief from RAND, "Urban districts were the least likely to deliver such training."
- Hallucinations: AI can perceive patterns or objects that are nonexistent, creating inaccurate outputs. Because of the way generative AI works, this false information is presented as fact and can be very convincing.
- Fraud and Cyberbullying: Scams have increased with the use of AI (e.g., account takeover attacks and password scams) and are becoming more sophisticated and difficult to detect. AI-powered tools have also made it easier to target people online, creating convincing fake content such as altered voice recordings and pictures. This is another reason why media literacy is such an essential skill for students to learn.
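To make the bias example above concrete, here is a small, hypothetical Python sketch. It does not reproduce how ChatGPT or any commercial tool works internally; it simply trains a tiny text classifier (using scikit-learn, on made-up sentences) with deliberately skewed examples to show the general pattern: if a name only ever appears with one kind of label in the training data, the model learns to treat the name itself as a signal.

```python
# A toy demonstration of training-data bias, using made-up sentences.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical, deliberately skewed training data: "analytical" labels only
# ever appear with the name Bob, "emotional" labels only with Alice.
texts = [
    "Bob argues the evidence step by step",
    "Bob compares the data carefully",
    "Alice writes with feeling about the topic",
    "Alice shares a heartfelt personal story",
]
labels = ["analytical", "analytical", "emotional", "emotional"]

vectorizer = CountVectorizer()
features = vectorizer.fit_transform(texts)
model = MultinomialNB().fit(features, labels)

# The same sentence, with only the name changed, is labeled differently,
# because the names became predictive features in the skewed training data.
sentence = "{} analyzes the causes of the conflict in detail"
for name in ["Alice", "Bob"]:
    prediction = model.predict(vectorizer.transform([sentence.format(name)]))[0]
    print(name, "->", prediction)
```

Large generative models are far more complex, but the underlying lesson is the same: patterns in the training data, including patterns we would rather not perpetuate, show up in the output.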
Academic Integrity
In addition, one of the biggest challenges of generative AI is academic integrity. Both students and educators are concerned about this issue; research featured by TeachAI indicates that many students want clear guidelines about the use of AI. In addition, some educators have reported that their students, including ELLs, have been accused of using AI tools to create work that they in fact created themselves. These challenges point to a clear need for discussions about the ethical use of AI, expectations for students, and clear explanations of school policies on AI. Students will also benefit from a deeper understanding of what constitutes plagiarism and what the possible consequences are. (See more in this article from eCampusNews.)
At the same time, there are important questions about what the use of AI means for students' skill development and critical thinking, particularly in the areas of writing, research, and project planning. Educators in middle and high school are grappling with students' regular use of these tools, even though schools may not yet have formal AI policies or guidance in place.
AI detection tools have also become an area of interest for educators. Many such tools promise to identify AI-generated content, and educators are using them to check student work with mixed success. Many of these tools cannot keep up with increasingly sophisticated generated material and the ever-improving nature of AI tools, which makes it difficult to rely on them consistently for student work. Running student work through AI detection tools can also add to teachers' already full plates, taking time away from the tasks that most benefit students.
The Role of AI Policies and Regulations
To address the many questions and concerns related to this fast-moving technology, the U.S. Department of Education has developed an AI toolkit that addresses the inclusion of AI in classrooms, along with releasing the National Educational Technology Plan in February 2024. The National Institute of Standards and Technology is holding discussions to develop federal standards for responsible AI systems, creating a framework similar to the European Union's AI Act. Additionally, the American Federation of Teachers has released "Common Sense Guardrails for Using Advanced Technology in Schools," a resource created by educators with the aim of providing guidelines for minimizing the harm associated with AI while maximizing the benefits.
Much of the work around AI regulation and compliance is happening at the state level. Although each state has approached the question of AI regulation differently, the 2024 legislative session showed that AI is at the forefront of legislators' minds. According to the National Conference of State Legislatures, "at least forty-five states…introduced bills, and thirty-one states…adopted resolutions or enacted legislation" (National Conference of State Legislatures, 2024). Twelve of those states have enacted laws that authorize government bodies and organizations to expand AI expertise, study potential impacts, and submit policy recommendations related to employment, healthcare, education, and elections. Other states have specifically targeted discriminatory AI in hiring practices, data privacy protections, and manipulative media such as deepfakes (Brennan Center for Justice, 2023).
School district AI policies vary as widely as state policies do. Some districts have embraced AI fully; Georgia's Gwinnett County Public Schools, for example, created an AI-themed cluster of elementary, middle, and high schools around Seckinger High School to prepare students for the future. Other districts have done little to address AI use with employees or students through policy or guidance, perhaps waiting for state and federal guidance first. The RAND research brief cited above notes that 60 percent of districts planned to offer AI training by the end of the 2023–2024 school year.
Engaging Educator, Family, and Student Voices
As districts begin discussing AI policies and incorporation into classrooms, educators, families, and students must all be part of the conversation. As a district leader notes in a 2024 report on AI from the Wallace Foundation, "It's going to take humans staying in the loop, being part of the development, and being very thoughtful and engaged to make sure that AI is used in a way that's beneficial."
Some steps that educators can take to foster a collaborative conversation include:
- Looking for opportunities to actively engage with school leadership in discussing AI — the potential uses, ethics, professional development, and specific tools to be used in classrooms — and advocating for those opportunities if they are not yet available
- Ensuring that district technology committees include a range of voices and roles, including educators who work with different ages and diverse student populations
- Advocating for ongoing professional learning and training around AI
- Engaging with students and families, either through technology or other channels, to learn from them about how they are using AI, as well as to address their questions and concerns
- Inviting the school-wide community into the conversation so that diverse voices are heard and the narrative stays focused on best practices, responsible use, and keeping human connection at the forefront
The more educators' and families' voices are included in these conversations, the more influence they can have on how these tools are used and even developed.
What Teachers, Students, and Families Need to Use AI Successfully
As AI permeates classrooms, teachers, students, and families need guardrails that maximize AI's educational possibilities while providing protection from potential harm. Some key recommendations follow:
- First and foremost, educators need upskilling to understand the positives, negatives, and ethical questions that come with using AI tools.
- Educators must learn how to harness AI's potential so they can teach effectively and help students navigate the world of AI.
- Becoming familiar with the many toolkits, webinars, and tools that are available, as well as specific training in best practices and practical implementation, will help educators take the reins of AI more successfully.
- Families need information and transparency regarding the use of AI in their child's education, including privacy protections that are in place.
- Students and families also need guidance in understanding the ethics and concerns surrounding AI and how to use this tool responsibly.
- District administration must emphasize the appropriate use of AI and advocate for responsible use, while centering educators' essential roles in providing personalized instruction and support for students.
There is no denying that AI's impact on our society and in education will be transformative, and it is here to stay. With thoughtful implementation and ongoing dialogue, schools can channel AI's potential while reducing its risks and affirming the vital work that educators do in our school communities every day.
About the Author
Christina Patterson is an Assistant in Research and Educational Services at New York State United Teachers (NYSUT), where she specializes in technology, math, science, STEM, and AI. With a deep passion for education, she brings 27 years of experience as a Special Education teacher to her role. Outside of work, Christina enjoys traveling, reading, gardening, and spending time with her family.