• Using AI to write IEPs can lead to a loss of personalization and cookie-cutter IEPs that do not adequately address the student’s needs.
  • There are ethical considerations involved in using AI to write IEPs, including the risk of perpetuating biases and devaluing professional expertise.
  • While AI may offer some benefits in terms of streamlining the IEP writing process, educators should carefully consider the potential drawbacks before implementing this technology.

Artificial intelligence (AI) has been rapidly advancing in recent years, and many industries are turning to it as a tool to streamline processes and reduce workload. The education sector is no exception, with some schools and districts exploring the use of AI to write Individualized Education Programs (IEPs) for students with disabilities.

While this may seem like a promising solution to the time-consuming and complex process of IEP writing, there are several reasons why teachers should think twice before relying on AI for this task.


First, I get it. I’ve been lobbying and testifying before Senate committees for over a decade to improve school funding. I am aware of the lack of resources in schools. It’s a really bad time to be a teacher. But as it stands right now, AI can only make it worse. For the short term anyway.


I invite you to read all three essays on this topic.

What are AI and OpenAI?

AI stands for Artificial Intelligence. It refers to the simulation of human intelligence processes by machines, especially computer systems. These processes include learning (the acquisition of information and rules for using the information), reasoning (using rules to reach approximate or definite conclusions), and self-correction.

AI is utilized in various applications such as speech recognition, natural language processing, problem-solving, and decision-making.

OpenAI is an artificial intelligence research laboratory consisting of both for-profit and non-profit entities. It was founded in December 2015 by Elon Musk, Sam Altman, Greg Brockman, Ilya Sutskever, Wojciech Zaremba, and John Schulman. They claim that the organization’s mission is to ensure that artificial general intelligence (AGI) benefits all of humanity. It’s worth noting that its founders are all wealthy tech figures, so I’m not convinced that they are doing this for “the greater good.”

But notice that keyword: OPEN. It’s open. That means that, by default, information a human puts into AI to generate an answer can be added to the database. More on that in a bit.


Over the past several years, various big online entities have been building these databases by scraping every bit of information on the entire internet. Anything you clicked, anything you read, said, commented on, or purchased… it was all added to this giant database.

“Hey! I didn’t give permission for that!” Actually, you did. We all did. It’s been in everyone’s terms and conditions forever; it’s just that no one ever reads the terms and conditions on websites.

Do you have Alexa, Siri, Ring, or anything else like that? Then yep, that’s where some of the data comes from. You gave permission. (As an aside: not my online chat forums, my online training, or any of my sites. I have not left those open; they are private and not being scraped.)


How AI Works

Here’s a very brief summary of how AI works. It needs two main things to work: information and algorithms.

  1. First, the AI platform has to collect data. It has to get its information from somewhere. This has happened as a result of various online entities basically scraping the entire internet. Any public site you used? Yes, they gathered that information. Your Ring cameras, any time you ask Alexa or Siri a question: that question and your activities after that question all go into a database.
  2. Then AI takes that information, organizes it, and puts it through its algorithms. Essentially, the algorithms are what make the AI work. So when the human asks AI to do something, it takes existing information and puts it through the algorithms to give the human an answer.
  3. This all goes through repetition, evaluation, and correction, like… forever. The AI platforms repeat steps 1 and 2 over and over, and based on how the humans interact with AI, it gets refined. You wanna know why Amazon is so easy to use? Because they’ve been collecting all this information and using AI to refine their systems.

By the way, if you work with/for an AI entity, I know that I am oversimplifying this. You don’t have to write me and tell me. I am working under the assumption that many IEP parents don’t really know what AI is.
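
To make those three steps a little more concrete, here is a deliberately tiny, made-up sketch in Python of the collect-organize-refine cycle described above. None of this is how any real AI platform is actually built; the names, data, and logic are invented purely for illustration.

```python
# A deliberately tiny, made-up sketch of the cycle described above.
# Real AI platforms are vastly more complex; names and data here are invented.

def collect_data(database, new_interactions):
    """Step 1: everything users type, click, or ask gets added to the pile."""
    database.extend(new_interactions)
    return database

def answer(database, question):
    """Step 2: a stand-in 'algorithm' that can only answer from existing data."""
    matches = [item for item in database if question.lower() in item.lower()]
    return matches[0] if matches else "No answer found in the existing data."

def refine(database, rejected):
    """Step 3: content users react badly to gets dropped (or down-weighted)."""
    return [item for item in database if item not in rejected]

db = []
db = collect_data(db, ["reading fluency strategies", "transition planning basics"])
print(answer(db, "reading"))   # answers come only from what was collected
db = refine(db, rejected=[])   # and the cycle repeats, over and over
```

The point of the toy is simple: whatever gets collected is all the system has to answer with, and the loop just keeps repeating.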

One of the primary concerns with using AI to write IEPs is the loss of personalization. IEPs are meant to be tailored to each individual student’s needs, strengths, and challenges.

It is also important to note that AI can and does make mistakes.

I have spoken with many parents who feel that using an IEP Goal Bank is unethical! Can you imagine if those same parents found out their IEP was written by AI?! For the record, I don’t think using an IEP goal bank is unethical. When used properly, it’s a huge time saver, which is why I keep one on this site.

Reasons to NOT Use AI to Write IEPs

Here are some of the many reasons to not use AI to write an IEP. Sure, this may change over time.

  1. Devaluing Professional Expertise: Relying on AI to write IEPs could lead to a devaluation of the professional expertise of special education teachers and other professionals involved in the IEP process. School district solicitors really need to pay attention to this one. How are you going to defend an IEP in Due Process that was written by AI? Are you an IEP coordinator, or do you hold another position at your school? We know AI is going to replace jobs. I’d surely think twice before helping it replace my own.
  2. Loss of Personalization: AI can generate IEPs quickly, but it may not take into account individual needs. Each student has unique strengths, weaknesses, and learning styles that need to be considered when creating an IEP. Teachers and special education professionals have the experience and knowledge to understand these nuances and create a plan that is tailored to the student. AI, on the other hand, relies on algorithms and pre-programmed data to generate the plan.
  3. Lack of Human Touch: Another concern is the lack of human touch. AI-generated IEPs may lack the empathy and understanding that come with human interaction. Students with disabilities often need more than just a plan; they need support, encouragement, and understanding. Teachers and special education professionals are better equipped to provide this type of support than a machine.
  4. You could lose your job: Many consider AI to be cheating. Whether or not you do is up to you. But did your employer say you could use AI to write an IEP? Your job description may say that IEP development is one of your duties. Is using AI to do that legal and within employment guidelines? I guess we’re going to find out, via a test case in the courts at some point. Do you want to be that test case?

Ethical Considerations of Using AI to Write IEPs

  1. Bias in AI: One of the biggest ethical concerns with using AI in writing an IEP is the potential for bias. AI systems are only as unbiased as the data they are trained on. If the data used to train an AI system is biased in any way, the system will also be biased. This can lead to unfair outcomes for students, particularly those from underrepresented groups. How does society treat our kids? What kind of ableist, racist, and other biases exist in society and online? So where do you think AI got its information? (See the toy sketch after this list.)
  2. Data Privacy Issues: Another ethical concern with using AI in writing an IEP is data privacy. AI systems often rely on large amounts of data to make accurate predictions. However, this data can contain sensitive information about students, such as their academic performance, behavior, and medical history. Even if a school has a “closed” AI platform, a child’s data is put into the platform to generate more information and for the platform to learn and evolve.
  3. Legal Implications: Schools are required to comply with special education laws, such as the Individuals with Disabilities Education Act (IDEA) and Section 504 of the Rehabilitation Act. These laws require schools to provide students with disabilities with a free and appropriate public education (FAPE) and to create an IEP that is tailored to the individual needs of each student. AI offers no guarantee that a generated IEP covers every requirement of IDEA.
  4. Liability for Errors: Another legal consideration is liability for errors. If an IEP created with the help of AI contains errors or does not meet the legal requirements for a student’s disability, the school district could be held liable. This could result in legal action, financial penalties, and damage to the reputation of the school district.
  5. AI Misinterpretations: One of the biggest concerns with using AI to write an IEP is the potential for misinterpretations. AI is only as good as the data it is trained on, and if there are errors or biases in the data, the AI may make incorrect assumptions or decisions. This can be especially problematic when it comes to writing an IEP, which requires a nuanced understanding of a student’s strengths, weaknesses, and needs. If the AI misinterprets any of this information, it could lead to an ineffective or inappropriate IEP. If you’re doing a ton of editing, rework and rewriting, are you really saving time?
  6. Dependence on Technology: Another concern with using AI to write an IEP is the potential for dependence on technology. While technology can be helpful, it should never replace human judgement and critical thinking. If schools become too reliant on AI to write IEPs, they may start to overlook important details or fail to consider alternative solutions. Additionally, if the technology fails or malfunctions, it could lead to delays or errors in the IEP writing process.
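
To illustrate the bias point from item 1 above, here is a minimal, hypothetical sketch. The “training data” and group names are completely made up; the only point is that a model trained on skewed records will happily reproduce that skew.

```python
from collections import Counter

# Hypothetical, invented "training data": past goals tagged by student group.
# The historical records systematically set lower expectations for group_b.
historical_goals = [
    ("group_a", "read at grade level"),
    ("group_a", "read at grade level"),
    ("group_a", "read at grade level"),
    ("group_b", "read below grade level"),
    ("group_b", "read below grade level"),
    ("group_b", "read below grade level"),
]

def suggest_goal(group):
    """A toy 'model' that just suggests the most common past goal for a group."""
    counts = Counter(goal for g, goal in historical_goals if g == group)
    return counts.most_common(1)[0][0]

# The model faithfully reproduces whatever bias was baked into its data.
print(suggest_goal("group_a"))  # read at grade level
print(suggest_goal("group_b"))  # read below grade level
```

Nothing in the toy model is malicious; it simply mirrors the expectations already baked into its data, which is exactly the concern with historical educational records.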

How AI IEPs Could Affect Students

  1. Hindering Critical Thinking: One of the most significant drawbacks of using AI to write IEPs is that it can hinder critical thinking. Critical thinking is a vital skill that students need to develop in order to succeed in school and in life. When teachers rely on AI-generated IEPs, they may be less likely to engage in critical thinking themselves. This can result in a lack of creativity and innovation in the classroom, which can ultimately harm student learning.
  2. Reducing Student Engagement: Another issue with using AI to write IEPs is that it can reduce student engagement. When teachers rely on AI-generated IEPs, they may be less likely to take the time to get to know their students and understand their unique needs. This can result in IEPs that are not tailored to the specific needs of individual students, which can lead to a lack of engagement and motivation.
[Image: one teacher’s opinion on this.]

Diminishing Value of Your Professional Expertise

Using AI to write an IEP can be problematic because it can undermine educators’ judgement and overlook their experience.

Undermining Educators’ Judgement

Educators have a wealth of experience and knowledge that they bring to the table when writing an IEP. This expertise allows them to tailor the plan to the unique needs of each student. When AI is used to write an IEP, it may not take into account all the nuances of a student’s situation. This can result in an IEP that is not as effective as it could be.

Overlooking Educator Experience

Another issue with using AI to write an IEP is that it can overlook the experience of the educator. Teachers have a deep understanding of the student and their needs, which is not something that can be replicated by a machine. By using AI to write an IEP, educators may feel like their expertise is being disregarded.

Community Response to AI IEPs

Since 2010, I have talked with literally thousands of IEP parents.

While some parents may embrace this idea (no pun intended), I can tell you that as a whole, the parental community is not there yet. Parents are not yet comfortable with a computer writing their IEP.

This relationship, as we all know, is already contentious to begin with. If parents learn that their IEP was computer-generated, it’s only going to get worse.

Do we really want that? I don’t. I work my butt off every day to repair relationships between parents and their IEP team.


Parental and Staff Pushback

While some may see AI as a solution to the time-consuming process of writing Individualized Education Programs (IEPs), many parents and staff members have expressed concerns and pushback against the use of AI in this context.

One major concern is the potential lack of personalization in an AI-generated IEP. Parents and staff worry that an AI may not fully understand a student’s unique needs and may not be able to take into account important factors such as family history, cultural background, and individual preferences.

Staff members may feel that an AI-generated IEP does not allow for collaboration and discussion among the team members involved in the IEP process.

Another concern is the ethical implications of using AI in special education. Some worry that relying on AI to write IEPs may lead to a dehumanization of the process and a lack of empathy for the students and families involved. There are also concerns about data privacy and security, as AI may require access to sensitive information about students and families.

Despite these concerns, some proponents of AI argue that it can be used as a tool to assist in the IEP writing process, rather than a replacement for human input. They suggest that AI can be used to streamline certain aspects of the process, such as data collection and organization, allowing staff members to focus on the more personal and collaborative aspects of the IEP process.

Did you read a sentence or paragraph in this essay and think, “That sounded weird?” Yes! I used AI to assist me. So, if it read or sounded awkward to you, imagine how a parent is going to feel if it’s their IEP.

Frequently Asked Questions

What are the potential risks of relying on AI for creating Individualized Education Programs (IEPs)?

AI-generated IEPs may not capture the nuances of a student’s learning needs. They may also lead to a lack of flexibility in the educational plan, which could negatively impact the student’s academic progress.

How might using AI to draft IEPs impact the quality of personalized learning plans for students with special needs?

AI-generated IEPs may not be able to provide the same level of personalization as a human-generated IEP. This could result in a less effective educational plan that does not adequately address the student’s unique learning needs.

Could the use of AI in writing IEPs lead to a lack of human insight and understanding in special education?

Yes, it could. Writing an IEP depends on a human’s insight into a particular child, and a machine cannot replicate that understanding. The likely result is a less effective plan that does not adequately address the student’s unique learning needs.

What are the ethical considerations of using artificial intelligence to develop educational plans for students?

The use of AI in developing educational plans for students raises ethical concerns, including issues related to data privacy and security, as well as concerns about the potential for bias in the AI algorithms used to generate the plans.

How could AI-generated IEPs affect the involvement of educators and parents in the special education process?

AI-generated IEPs could reduce the involvement of educators and parents in the special education process, weakening the collaboration the IEP process is built on and producing a plan that is less tailored to the student.

Are there concerns about data privacy and security when using AI to create IEPs for students?

Yes, there are concerns about data privacy and security when using AI to create IEPs for students. It is important to carefully assess the specific AI tool’s privacy and security policies, as well as to ensure that the use of AI tools aligns with established protocols for data privacy and security in educational settings.