40 Reading Comprehension + Fluency IEP Goals: Examples and What to Target Next

Reading comprehension goals show up in IEPs once a student can decode text with enough accuracy and fluency to access meaning. At that point, sounding out words is no longer the primary barrier. Understanding what was read is.

This post is a working list of reading comprehension IEP goals commonly used by special education teachers, reading specialists, and IEP teams. These goals focus on skills like identifying main ideas, making inferences, answering questions about text, and using strategies to understand grade-level material with appropriate supports.

Reading comprehension IEP goals focus on understanding text, not just decoding words.

Reading matters. But in IEP writing, sequence matters too.

Students should not be expected to demonstrate comprehension skills before they have the underlying decoding and phonemic awareness skills in place. When decoding is still fragile, comprehension data is unreliable, and goals miss the mark. Accurate assessment comes first. Then targeted, measurable reading comprehension goals can be written to reflect what the student actually needs next.

IEP Goals for Reading Comprehension

  1. Reading comprehension of grade-level literary text: By [date], [student] will read a grade-level literary text (stories, legends, poems) and demonstrate comprehension by [describe task: answer questions/retell/identify elements] in [number] out of [number] opportunities, as measured by [method], with [accuracy/criteria].
  2. Functional vocabulary comprehension using total communication: By [date], [student] will use total communication (AAC device, PECS, verbalization) to demonstrate comprehension of at least [#] new functional vocabulary words and related short phrases in the context of [vocational activities/school environment], as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  3. Reading comprehension of print texts with minimal assistance: By [date], [student] will demonstrate comprehension of print text at the instructional level by [describe task], as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with [minimum assistance].
  4. Using context clues to determine word meaning: By [date], when given instructional-level text, [student] will use context clues to determine the meaning of unfamiliar words, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  5. Reading and using vocational phrases: By [date], [student] will read and verbalize short phrases related to vocational activities and complete the related functional tasks within the school environment, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  6. Identifying main idea: By [date], after reading or viewing a simple story, [student] will identify the main idea, as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  7. Confirming predictions using text evidence: By [date], when given an instructional-level passage, [student] will confirm or revise an initial prediction using information from the text, as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  8. Sequencing events: By [date], after reading a story, [student] will explain the sequence of events, as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  9. Answering WH questions: By [date], when given a short text (up to [length]), [student] will answer rotating who/what/where questions, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with [minimum assistance].
  10. Answering how/why questions: By [date], when given a short text (up to [length]), [student] will answer rotating how and why questions, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with [minimum assistance].
  11. Identifying main idea and supporting details: By [date], after reading an instructional-level text, [student] will identify the main idea and [#] supporting details, as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  12. Answering comprehension questions after silent reading: By [date], after reading a story silently, [student] will answer comprehension questions (how, why, what-if), as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  13. Peer-to-peer question asking and answering about text: By [date], after reading a grade-level story, [student] will ask a peer [#] text-based questions and answer [#] questions asked by a peer, as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  14. Using synonyms, antonyms, and homonyms: By [date], when given words from instruction or text, [student] will identify synonyms, antonyms, and homonyms and use them correctly in sentences, as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  15. Identifying cause and effect: By [date], after reading a story, [student] will identify the effect of a character’s action or event, as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  16. Matching pictures to words for functional vocabulary: By [date], [student] will match pictures to words and words to pictures for at least [#] functional vocabulary words, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with [visual supports/minimum assistance].
  17. Citing evidence in the text: By [date], after reading a passage and answering questions, [student] will locate and point to or quote information in the text that supports answers, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  18. Matching pictures to sentences for functional vocabulary: By [date], [student] will match pictures to sentences and sentences to pictures for at least [#] functional vocabulary words, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with [visual supports/minimum assistance].
  19. Answering questions using implied meaning: By [date], after reading a short passage, [student] will answer comprehension questions using implied meaning (not directly stated), as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  20. Answering “where” questions with visual supports: By [date], when given short text with visual supports (color coding/highlighting), [student] will answer “where” questions, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with minimal gestural cues.
  21. Identifying fiction vs nonfiction statements: By [date], when given statements or short texts, [student] will identify whether information is fiction or nonfiction, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  22. Following two-step written directions: By [date], when given two-step written directions, [student] will complete the directions and answer related who/what/where questions, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with [minimum assistance].
  23. Distinguishing fact from opinion: By [date], when given statements or short texts, [student] will distinguish fact from opinion, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  24. Answering structure questions from short text: By [date], when given a short text (up to [#] sentences), [student] will answer comprehension “structure questions” (who/what/where/when/how/why), as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with minimal gestural cues.
  25. Using text organizers to locate and categorize information: By [date], when provided printed material, [student] will use a text organizer to locate and categorize information, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  26. Answering inferential questions: By [date], after reading print text, [student] will answer inferential questions (e.g., feelings, motives, implied meaning), as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with [minimum assistance].
  27. Recognizing figurative language: By [date], after reading a passage, [student] will identify examples of figurative language and explain intended meaning, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  28. Overall reading comprehension increase: By [date], [student] will increase comprehension of printed materials to [level/score], as measured by [assessment/tool], with [accuracy/criteria].
  29. Identifying mood: By [date], after reading a passage, [student] will identify the mood of the selection and provide [evidence/example], as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  30. Answering “what” questions with visual supports: By [date], when given short text with visual supports (color coding/highlighting), [student] will answer “what” questions, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with minimal gestural cues.
  31. Identifying key plot details: By [date], after reading a story, [student] will identify details most important to the plot, as measured by [method], with [accuracy/criteria] in [#] out of [#] trials.
  32. Functional vocational and safety vocabulary comprehension: By [date], [student] will demonstrate comprehension of at least [#] new functional words (vocational and safety vocabulary) across school settings, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities with minimal gestural cues.
  33. Comprehension across genres and cultures: By [date], [student] will increase the ability to understand and respond to literature from various genres and geo-cultural groups to [level/score], as measured by [rubric/assessment], with [accuracy/criteria].
  34. Using questioning strategies: By [date], when given a reading passage, [student] will use questioning strategies (generate and answer questions) to increase comprehension, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  35. Analyzing tone, character, point of view, and theme: By [date], after reading a passage, [student] will identify and describe the tone, character, point of view, and theme, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  36. Identifying author’s purpose: By [date], after reading a passage, [student] will identify the author’s purpose and provide [evidence/example], as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  37. Recognizing fact vs opinion in text: By [date], after reading a passage, [student] will identify statements of fact and opinion, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  38. Predicting outcomes: By [date], when given a passage, [student] will predict the outcome and support the prediction with text evidence, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  39. Identifying cause of a situation: By [date], after reading a passage, [student] will identify the cause of a situation/event and support the answer with text evidence, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.
  40. Predicting the main problem: By [date], after reading a passage, [student] will predict the main problem and support the prediction with text evidence, as measured by [method], with [accuracy/criteria] in [#] out of [#] opportunities.

When This Doesn’t Work

If a student is not making progress with well-written reading comprehension goals, it does not mean the parent, teacher, or student has failed. It usually means the goal is targeting the outcome without addressing the underlying barrier, such as language load, working memory, or regulation. When that happens, the next step is not more practice, but a closer look at what is interfering with access to meaning.


There is a commonly misunderstood layer that explains why comprehension goals sometimes fail: cognitive load and regulation, often influenced by anxiety, executive functioning, or masking. This does not require diagnostic language. For some students, reading comprehension breaks down not because they lack skill, but because managing attention, anxiety, or mental effort uses up the cognitive energy needed to make meaning from text. In those cases, progress may require adjustments to pacing, supports, or expectations rather than new comprehension goals.

Reading Fluency IEP Goals

Here are six example IEP goals for reading fluency:

  1. Oral reading fluency (words correct per minute): By [date], when given a grade-level passage, [student] will increase oral reading fluency from [current WCPM] to [target WCPM], as measured by three consecutive oral reading fluency probes administered every [timeframe], with [accuracy/criteria].
  2. Reading expression and prosody: By [date], during oral reading tasks, [student] will demonstrate appropriate expression and prosody (intonation, stress, and rhythm), as measured by a fluency rubric, with a score of at least [percentage]% across [number] of assessments.
  3. Reading rate with accuracy: By [date], [student] will read grade-level text at a rate of at least [target WPM] while maintaining [percentage]% accuracy, as measured by weekly progress monitoring probes.
  4. Sight word recognition and automaticity: By [date], when reading grade-level text, [student] will correctly identify [number] out of [total number] high-frequency sight words within one-minute timed readings, as measured by weekly fluency assessments.
  5. Phrasing and chunking of text: By [date], during oral reading, [student] will read in meaningful phrases and pause appropriately at punctuation, as measured by a fluency rubric, with a score of at least [percentage]% across [number] of assessments.
  6. Fluency participation and self-monitoring: By [date], [student] will participate in at least [number] fluency-building activities per week and complete a self-assessment of fluency using a checklist, as measured by teacher-recorded participation data in [number] out of [number] opportunities.
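The WCPM and accuracy figures in the goals above come from simple arithmetic: words correct per minute is the number of words read minus errors, scaled to one minute, and accuracy is the percentage of attempted words read correctly. As a minimal sketch of that calculation (the function names here are illustrative, not from any assessment tool):

```python
def words_correct_per_minute(total_words_read: int, errors: int, seconds: float) -> float:
    """WCPM: words read correctly, scaled to a one-minute rate."""
    return (total_words_read - errors) * 60.0 / seconds

def accuracy_percent(total_words_read: int, errors: int) -> float:
    """Percentage of attempted words read correctly."""
    return 100.0 * (total_words_read - errors) / total_words_read

# Example: a student reads 142 words in a one-minute probe with 6 errors.
print(words_correct_per_minute(142, 6, 60))   # 136 WCPM
print(round(accuracy_percent(142, 6), 1))     # 95.8% accuracy
```

Computing both numbers from the same probe matters: a student can post a rising WCPM while accuracy drops, which is speed without comprehension-ready reading, not true fluency growth.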

Reading Comprehension as a Skill

Before jumping into lists of reading comprehension IEP goals, it helps to pause and define what teams are actually measuring. Many disputes around reading goals are not about ambition or expectations, but about whether the skill being targeted is developmentally and instructionally appropriate for the student.

Reading comprehension goals should only be written after decoding, phonemic awareness, and basic fluency are sufficiently established. When those foundations are weak, comprehension data becomes unreliable. A student may appear to “not understand” when the real issue is cognitive load: all of their mental energy is being used to get through the words, leaving nothing left for meaning.

This is where IEP teams often go wrong. Comprehension is treated as a single skill, when in reality it is a collection of processes that rely on language, memory, background knowledge, and executive functioning. Two students can score the same on a comprehension assessment and need entirely different goals.

Reading comprehension also overlaps heavily with skills that are rarely addressed explicitly in IEPs, such as inference and working memory. If a student cannot hold information in mind long enough to connect ideas across a sentence or paragraph, comprehension goals will stall. Likewise, students who struggle with inference may decode accurately but misinterpret meaning because they lack context or flexible thinking.

Comprehension is not simply “understanding what you read.” It is the ability to extract meaning from text using vocabulary knowledge, prior experience, language skills, and cognitive flexibility. A child can read every word correctly and still misunderstand the message. That disconnect is often missed in goal writing.

A simple example illustrates this. A student may accurately read a sign or label but misinterpret it because they lack background knowledge or context. The words were decoded correctly, but meaning was not constructed. That distinction matters when selecting goals and interventions.

Well-written reading comprehension IEP goals are not generic. They reflect how the student processes text, what breaks down during reading, and what type of instruction actually helps. Some students need explicit, structured instruction with guided practice. Others need language support, vocabulary pre-teaching, or visual organizers. An unmotivated or disengaged student often needs more direct teaching, not more independent discovery.

The goal of this section is not to teach reading comprehension strategies, but to help IEP teams write goals that align with how comprehension actually works. When goals match the student’s cognitive profile and instructional readiness, progress becomes measurable instead of frustrating.

Tests to Evaluate Reading Comprehension

Reading comprehension IEP goals are only as good as the data behind them. One of the most common problems I see is that teams assume comprehension has been evaluated when, in reality, the assessment focused on decoding, fluency, or a broad reading composite score.

For IEP evaluations, schools are required to assess in all areas of suspected disability. If reading comprehension is a concern, it must be evaluated directly. Many commonly used “reading” assessments do not isolate comprehension skills or provide enough detail to explain why a student struggles to understand text. As a result, goals are written without clear targets, and progress is difficult to measure.

It is also important to look closely at the evaluation protocols, not just the summary pages. Test publishers vary widely in what they measure, the age ranges they cover, and how comprehension is defined. Two students can receive similar scores while needing very different types of instruction and support.

When evaluation data does not clearly explain a student’s comprehension needs, goal writing becomes guesswork. If you disagree with the findings or believe comprehension was not adequately assessed, this is where understanding your right to an independent educational evaluation matters. Clear assessment leads to precise goals, and precise goals lead to meaningful progress.

This post focuses specifically on reading comprehension IEP goals. If a student is still working on foundational skills such as decoding, phonemic awareness, or basic fluency, separate goal banks and guidance are available for those areas.
