The state of special education in the U.S.: an exploratory study on the workflows and needs of special education teachers

 

February 5, 2024

 

 

Why a research initiative?

In the 2020-2021 school year, Frenalytics launched a pilot with a special education classroom in California, using our digital platform, originally built to personalize cognitive therapy for people with brain injuries (like stroke), to reinforce and assess personalized skills with its students. As we continued to grow in this space, the significant gap in digital platforms for special education (SpEd) became clear.

 

After talking with dozens of SpEd teachers, we learned that current digital platforms were not designed for the real workflows of SpEd teachers, and that no digital platform was designed for the needs of older SpEd students. Through these conversations with SpEd teachers and initial research before the start of the VITAL Prize Challenge, we identified a need to:

  • Reduce the time teachers spend tracking, managing, and analyzing data
  • Remove inconsistencies in the accommodations, phrases, and language used to accurately identify present levels (i.e. did the student answer incorrectly because they do not know the answer, do not understand the question, or do not understand the format of the question?)
  • Improve student interest, focus, and confidence in learning skills at their grade level by using age-appropriate visual aesthetics
  • Encourage student independence
  • Involve and educate parents of SpEd students and improve communication

 

With the support of the National Science Foundation and Digital Promise, we launched a 10-month exploratory research initiative with 249 participants to understand the workflows and needs of schools & districts with special education and English Language Learner populations. Our research team was led by Caitlyn Hutchison and Matt Giovanniello and supported by Stephanie Lance and Juan Alvarado.

 

 

Improving Workflows for Special Education Teachers

Our primary objective was to understand whether automated, digitized assessment opportunities could improve an educator’s ability to consistently and accurately track progress on academic IEP goals.

 

Our research initiative ran from May 2023 to February 2024. Our research questions fell into four key categories:

  • What data is collected on IEP goals and who are the people involved in collecting it?
    • What data types are collected?
    • How is this data useful and meaningful?
    • How is the data managed and used?
    • What technologies and solutions are being used?
    • Is there data that’s not being collected that should be?
    • What assignments, worksheets, and assessments are relevant?
  • What are the benefits and barriers for shared data?
    • What problems could be solved by sharing data?
    • What barriers exist that make it challenging to collect, analyze, and share data, and what barriers make it difficult to collaborate?
    • When and where is data being shared and what motivates teachers to share data?
  • What is the role of Common Core and state standards in special education?
    • Are all IEP goals associated with a standard?
    • How are standards used throughout the school year in special education?
    • What is the value that standards provide in special education?
  • What are the frustrations for individuals across the IEP ecosystem?
    • Where are the knowledge gaps for individuals in the IEP ecosystem?
    • What isn’t being addressed?
    • What are the processes, expectations, technologies, and solutions that aren’t working?

 

 

Methods

 

Research Methods

The research team developed five semi-structured interview guides; given the exploratory nature of the research, we tailored our interview guides to each participant.

The research team conducted interviews with 249 participants across the special education ecosystem, including:

  • 117 educators: inclusion, general education, and special education teachers; speech-language pathologists (SLPs); counselors; and coaches
  • 102 school & district leaders: superintendents, SpEd directors, principals
  • 30 students & parents: general education students and students with IEPs

 

Interviews were typically 30-60 minutes in length and were conducted in-person or remotely over video or phone. In many instances, participants referred us to others who could further contribute to our research.

 

During our planning, we mapped out the interactions of key individuals in the special education ecosystem.

 

During our research initiative, we identified individuals in this ecosystem that we previously didn’t know existed, like the Special Education Coach, a role employed by a district to support SpEd teachers at multiple schools, as well as individuals who are unintentionally yet commonly excluded from the design process, like entire technology departments.

 

Analysis Methods

Our research team conducted a thematic analysis, using the approach by Virginia Braun and Victoria Clarke, to identify common patterns in the qualitative data.

From our notes, audio recordings, and interview transcripts, we used a digital whiteboard to analyze each interview using descriptive and interpretive codes. We then re-evaluated the initial analysis for themes and corresponding groupings and wrote a narrative from the insights.
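To make the coding step concrete, the snippet below sketches one way coded interview excerpts can be grouped into candidate themes. It is a minimal illustration only; the excerpts, codes, and groupings are hypothetical, and this is not the whiteboard tooling our team actually used.

    from collections import defaultdict

    # Hypothetical coded excerpts: (participant, excerpt, assigned descriptive/interpretive codes)
    coded_excerpts = [
        ("T01", "I keep a clipboard with a sheet for each student.", ["paper data sheets", "self-made tools"]),
        ("T02", "Nobody ever looks at the graphs in the IEP software.", ["distrust of graphs", "data goes unused"]),
        ("T03", "My aide takes notes when she pushes in.", ["aides collect data"]),
    ]

    # Group excerpts by code; reviewing these groupings is the starting point for naming themes.
    excerpts_by_code = defaultdict(list)
    for participant, excerpt, codes in coded_excerpts:
        for code in codes:
            excerpts_by_code[code].append((participant, excerpt))

    for code, excerpts in excerpts_by_code.items():
        print(f"{code}: {len(excerpts)} excerpt(s)")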

Our research team also conducted task analyses of various workflows for special education teachers.

 

Considerations

Our research team addressed the following challenges:

  • Contradictory information within the interview. This occurred when a participant’s answers to direct questions were later contradicted when they were asked to describe the steps they take to complete a task (e.g. a participant answering “no” when asked “is there anyone else that helps you to collect data,” but later in the interview, when describing the role of aides in her classroom, saying that “there’s a couple that will collect data, and then there’s one that thinks that’s my job.”) In response, our research team prioritized asking questions that would require a participant to describe steps in a task.
  • Generalized responses (e.g. a participant answering “I think it depends on the teacher and depends on the student. It depends on their access to if they have another adult in there to help with them or not” when asked about collecting data from general education teachers.) In response, our research team started interviews by asking participants to share examples that could be used as a reference throughout the interview and asked follow-up questions.
  • Response bias. This cognitive bias occurs when participants consciously or subconsciously share inaccurate information (e.g. misremembering information, leaving out important details, or changing responses to portray a certain self-image). Response bias is often unavoidable when participants are self-reporting information in interviews. In response, our research team asked similar questions throughout the interview that should result in the same answer.
  • Social desirability bias. This cognitive bias occurs when participants share information that they believe to be socially desirable and avoid sharing what they believe to not be socially desirable. During our interviews we heard participants preface information with “I don’t know if I want this recorded” before continuing to say something negative or controversial. Since IEPs are legally binding documents, we recognized this bias at the start and our research team took time to make participants both feel comfortable and informed that their interview would be anonymized.

 

 

Key Findings Narrative

 

What data is being collected?

Our analysis showed no set of data points that was consistently collected by every educator. Educators self-reported collecting additional data for IEP goals on:

  • Mastery criteria (e.g. fluency words per minute, rate of improvement, percentage correct, X/Y)
  • Whether the session or assignment was completed
  • Date of the session
  • Absences
  • Accommodations used
  • Class engagement
  • Environmental conditions
  • Interventions used
  • Prompting level and type of prompt
  • Strategies used
  • Student’s mood and actions
  • Time of the session
  • Time on task
  • Tools used (e.g. name of book, worksheet, physical objects)
  • What was difficult and easy for the student
  • Which answers were correct/incorrect

Our analysis also found that this inconsistency in which additional data points are collected stems from how the IEP goal is written, the educator’s level of experience, and the school’s processes and procedures (or lack thereof).
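To illustrate the breadth of data points educators described, the sketch below models a single progress-monitoring entry as a structured record. The field names are hypothetical, drawn from the list above rather than from any particular IEP compliance software.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List, Optional

    @dataclass
    class ProgressEntry:
        """One hypothetical data point recorded against an academic IEP goal."""
        goal_id: str
        session_date: date
        completed: bool = False
        score_correct: Optional[int] = None      # the X in "X/Y"
        score_total: Optional[int] = None        # the Y in "X/Y"
        fluency_wpm: Optional[float] = None      # words per minute, if the goal tracks fluency
        prompting_level: Optional[str] = None    # e.g. "independent", "verbal", "gestural"
        accommodations: List[str] = field(default_factory=list)
        tools_used: List[str] = field(default_factory=list)
        notes: str = ""                          # mood, environment, what was easy or difficult

    # Example entry for a hypothetical reading-fluency goal.
    entry = ProgressEntry(
        goal_id="reading-fluency-1",
        session_date=date(2024, 1, 16),
        completed=True,
        score_correct=8,
        score_total=10,
        fluency_wpm=92.0,
        prompting_level="verbal",
        accommodations=["extended time"],
        tools_used=["leveled reader"],
        notes="Focused; struggled with multisyllabic words.",
    )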

 

What is the source of the data?

Our analysis showed no uniform definition for “data” among educators. Educators self-reported using:

  • Worksheets
  • Activities with physical objects
  • Student work samples
  • State testing
  • Scores from digital learning tools
  • Diagnostic assessments from digital learning tools
  • Notes from observations
  • Opinions

 

How is data being collected?

Our analysis showed no standardized process for collecting data. We identified six approaches used by educators as their process for collecting progress data and inferred the frequency and preference of each approach from information the participant repeatedly spoke about during the interview:

  • Most took notes throughout the day based on observations
  • Most pulled information from other assignments the student completed
  • Many informally pulled students aside when appropriate
  • Many modified assignments when appropriate
  • Some pulled students aside to complete personalized assignments in groups or individually
  • Some made inferences based on “just knowing” where the student is at

 

Many educators referred to both the frequency and necessary steps for collecting data as overwhelming, saying it took time away from teaching.

Some educators also shared that data collected by other educators sometimes appeared to be made up.

 

How is data being organized?

We identified four approaches used by educators to keep track of notes and information on progress data. Many educators used more than one approach:

  • Most educators used binders, clipboards, and paper data sheets that they created themselves
  • Many used Post-it notes
  • Some used a notes app on their phone, like Apple Notes
  • Some used Google Forms

 

All educators also had to transcribe their data into the school’s IEP compliance software (e.g. IEP Direct, EasyIEP, EmbraceIEP, SpedTrack) for progress reports. Additionally, some educators also supplemented the reports by transcribing pen and paper data into:

  • Spreadsheets
  • Google Docs
  • Regularly scheduled email threads soliciting data from other educators

 

How is data being analyzed?

Many educators voiced frustration that nothing is done with the data. Because schools require data to be collected, educators assume the schools should be using it.

 

All educators used data to identify whether a goal has been mastered; however, few teachers shared concrete ways they used data to make informed instructional decisions. Those who did used the data to identify cause and effect in student outcomes:

  • Which steps were difficult for the student
  • What helped the student (e.g. support, strategy, environment)
  • At what time of day the student performed better
  • What happened that caused an outlier data point (e.g. environmental irregularities, student mood or actions)
  • What happened that caused a sustained drop in scores (e.g. ADHD medication was unavailable, the student is experiencing medical complications)

 

Because most educators in the classroom don’t receive professional training on how to analyze progress monitoring data, many educators preferred to reference their original pen and paper data, which was easier for them to understand. Digital graphs were harder for educators to read, and some educators also didn’t trust the accuracy of the graphs.

 

Who is collecting the data?

While a few educators referred to their aides as their “eyes and ears,” most SpEd teachers were fully responsible for collecting data. During our interviews, we identified five reasons why SpEd teachers were alone in completing this task:

  1. SpEd teachers have no way to hold aides or general education teachers accountable when they share only partially collected data
  2. SpEd teachers categorize data collection as a key function of their job and responsibilities
  3. General education teachers are not responsive when asked for data
  4. Aides refuse to collect data if the task is not written as a job requirement
  5. Aides are not allowed to track data, due to:
    • The employment relationship to the school or district
    • The terms of their contract
    • Their (lack of) training, and/or
    • HIPAA/FERPA concerns that the school has in following “minimum necessary” provisions

 

In addition to SpEd teachers, and depending on the school or district, three other educators may be involved in collecting data:

  • Special educator, via a “push in” or “pull out” model used by the school
  • Interventionist, via a “push in” or “pull out” model used by the school
  • Resource room educator

 

In almost all of our interviews, if a student received additional services, the service provider or specialist would collect and share their own relevant data – but how service providers shared that data with other educators varied. Many educators voiced their belief in better student outcomes if there was more collaboration and shared data.

 

What purpose do goals serve?

There is a divide among educators. Some referred to goals as a way to create a complete picture of the student’s strengths and weaknesses; these educators defined goals as either maintaining a student’s current level or pushing a student’s progress. Others defined goals as only pushing a student’s progress.

Almost all educators felt pressured by the school to get their students to grade level, even if that goal was not appropriate. Some educators voiced doubt about the way goals had to be written and believed the requirements were preventing better student outcomes.

 

How are IEP goals written?

Training and support for writing IEP goals, both as part of Master’s courses and for teaching certification, as well as in ongoing professional development provided by schools, is highly inconsistent across districts and states. Some educators, including special education teachers, report receiving no training at all on how to write goals or where to access resources; other educators report being trained on incomplete or inaccurate processes, forcing them to “learn by doing” in a new role or at a new school.

 

Our analysis from interviews with SpEd coaches, experienced SpEd teachers, and parents showed six elements that create strong IEP goals:

1. Achievable outcomes:

In schools that employ Special Education Coaches, these coaches ensured goals were both clear in their expectations and achievable for that student. These coaches shared that they’re most often helping educators break standards into foundational skills and evaluating if those skills are appropriate for that student.


2. Identified weakness:

Writing achievable goals ensures that the student’s weakness has been properly identified, so that the appropriate scaffolds or academic resources can be provided (e.g. on the surface a student has trouble finding the greatest common factor, but further information shows that this student also hasn’t mastered factors of numbers).


3. References to shared knowledge:

Educators preferred when IEP goals referenced standards; however, only some educators reference the specific state standard code(s) correlating to each IEP goal. This practice depends on their district’s policies, the state they teach in, and the IEP compliance software they use. IEP goals that referenced non-standardized checklists, objectives, and rubrics were almost always missing the documents other educators needed to properly interpret them.

All educators referred to standards as common knowledge that they could reference for clarity when needed. Some educators also shared that standards were a resource to visualize the common learning path.

Educators also shared that accommodations did not have common definitions (e.g. if a student is provided notes, how exactly is this to be done?).


4. Clarity and context:

All educators voiced frustration with deciphering how and why goals were written when they inherited an IEP from another educator, school, district, or state. Educators wanted to know five key pieces of information:

  • Why this specific goal and not another goal?
  • What was the baseline and how was it collected?
  • What are the mastery criteria?
  • What is the deciding factor behind how success is defined for this student?
  • How will this goal impact the student in the future?


5. Clear mastery criteria:

Many educators were also uncertain about what type of data the goal was intended to track (e.g. rate of improvement, percentile, or probe).
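As a hypothetical illustration of what a clear criterion can look like, the sketch below spells out a rate-of-improvement criterion for an oral reading fluency goal so that any educator reading the goal can reproduce the calculation. The numbers are invented for the example, not drawn from our interviews.

    # Hypothetical oral reading fluency goal measured by rate of improvement (ROI).
    baseline_wpm = 60            # words correct per minute at baseline
    target_wpm = 90              # words correct per minute expected at the annual review
    weeks_of_instruction = 30

    expected_roi = (target_wpm - baseline_wpm) / weeks_of_instruction  # 1.0 wpm gained per week

    # A probe partway through the year can then be judged against the expected trajectory.
    week = 10
    observed_wpm = 72
    on_track = observed_wpm >= baseline_wpm + expected_roi * week
    print(f"Expected ROI: {expected_roi:.1f} wpm/week; on track at week {week}: {on_track}")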


6. Relevancy throughout the year:

Many parents were frustrated to learn that many of their child’s IEP goals couldn’t be worked on all year. Some educators wrote IEP goals as skills that could be applied to different topics.

 

What are the requirements for IEP goals?

Our analysis showed inconsistencies in how educators self-reported the use of standards when writing IEP goals:

  • Some educators were required to align goals to standards within the grade band appropriate to that student’s age (e.g. a student who is 11 years old must have goals that align to 5th grade standards)
  • Some educators were expected to align goals to a standard in any grade band regardless of a student’s age (e.g. a student who is 11 years old can have goals that align to 1st or 2nd grade standards)
  • Some educators did not appear to have any requirements when writing IEP goals

 

Our analysis also showed that some schools prioritized having fewer goals with a focus on foundational skills or a student’s weakest skills, while other schools prioritized goals for each and every need. The number of goals for a student was influenced by the teacher’s approach to writing IEP goals, as well as school leadership’s philosophy on measuring progress towards goals and whether the student’s family had an advocate.

 

How do parents understand IEP goals?

Many parents assumed positive intent or blindly trusted the school’s recommendations, not having the resources to form their own informed opinions. Some parents held opinions the school would not agree to, and many shared regret for not pushing for more.

 

All parents voiced frustration in:

  • Not seeing how IEP goals build from one to the next
  • Not seeing outcomes for all the time and money spent on achieving IEP goals

 

Parents we interviewed also had the time and money to dedicate resources to help their child achieve their IEP goals. One interview participant shared that they “want to see what the outcomes have been for all the time spent on the goals.” These parents preferred IEP goals that could be worked on throughout the entire year and that would show clear improvement and progress.

 

Parents wanted direct recommendations on:

  • What’s working for their child
  • What’s not working for their child
  • What they can do with their child at home
  • What’s coming up that they can prepare their child for ahead of time

 

All parents wanted goals that would help their child build independence, learn core competencies for early education, and be able to demonstrate mastery of skills in educational, home, and community settings.

 

 

Solutions

We will be detailing our solutions derived from this research study in the coming weeks.

 

Glossary

  • SpEd: Abbreviation for ‘special education’. In the United States, federal and state laws guarantee a free and appropriate education for all children, including those who qualify for special education services.
  • IEP: Individualized Education Plan (or Program). An IEP is a plan or program developed to ensure that a school-age child with an identified disability receives specialized instruction and related services. The IEP is developed by a team, typically composed of multiple educators, the child with a disability, and the child’s family. An IEP typically includes the child’s present levels of educational performance, their level of involvement and progress in a general education classroom, the services they qualify for, the necessary accommodations for them to be successful in school, and measurable annual goals and objectives for them to achieve throughout each school year. (Credit: University of Washington)
  • 504 plan: Enshrined in Section 504 of the U.S. Rehabilitation Act of 1973, a 504 plan is developed to ensure that a school-age child with an identified disability receives the necessary accommodations that ensure their academic success. Unlike an IEP, a 504 plan is more suitable for students with disabilities who do not require specialized instruction but still need a plan to ensure equal access to public education and services. (Credit: University of Washington)
  • Aide / para / teacher’s aide / teaching assistant: Synonymous titles for a school employee who serves as an assistant to a licensed classroom teacher. Their primary role is to provide instructional support and non-instructional services to students who may require individualized support to reach their full potential. (credit: Grand Canyon University)
  • Co-teacher: Refers to a teaching environment where there are two teachers in the same classroom working together with groups of students. In some classrooms, co-teachers will share the planning, organization, delivery, and assessment of instruction, as well as the physical classroom. Sometimes, a co-teaching model (also referred to as integrated co-teaching (ICT) or inclusion) will feature one licensed general education teacher and one licensed special education teacher to meet the needs of their students with IEPs or 504 plans. (credit: University of Minnesota)
  • Goal: Part of a student’s IEP, an IEP goal is a statement that describes the knowledge, skills and/or behaviors a student is expected to achieve within the year the IEP will be in effect. By law, IEPs must include measurable annual goals consistent with the student’s needs and abilities, as identified in the student’s present levels of performance. (credit: Frontline Education)
  • Progress monitoring: A process required by federal law for administering IEPs that refers to the ongoing, frequent collection and use of formal data in order to assess a student’s performance, quantify a student’s rate of improvement based on the services they’re receiving in and out of the classroom, and evaluate the effectiveness of the instruction and intervention they’re receiving. Educators will use measures that are appropriate for the student’s grade and/or skill level, and progress monitoring is most often carried out by special education teachers. (credit: American Institutes for Research)
  • IEP compliance software: Software that helps schools and educators ensure each student’s IEP is being implemented as documented. Compliance involves adhering to the legal requirements set forth by the Individuals with Disabilities Education Act (IDEA) and ensuring that each student receives the services, accommodations, and supports specified in their IEP. (credit: Everyday Speech)
  • Interventions: Targeted instruction that is used to help a student improve a specific skill based on evidence-based strategies and techniques. Interventions are based on a child’s needs and supplement the general education curriculum being taught in their school. (credit: Understood.org)
  • Modifications: A change in what a student is taught or expected to learn. This is documented in both IEPs and 504s. (credit: Understood.org)
  • Accommodations: A change to how a child is learning or showing knowledge that is made in teaching or testing, with the goal of removing barriers and providing equal access to learning. Unlike a modification, it doesn’t change what a child is learning. (credit: Understood.org)
  • HIPAA: Health Insurance Portability and Accountability Act of 1996. HIPAA is a federal law that required the creation of national standards to protect sensitive patient health information (PHI) from being disclosed without a patient’s consent or knowledge. (credit: U.S. CDC)
  • FERPA: Family Educational Rights and Privacy Act. FERPA is a federal law that gives parents the right to have access to their children’s education records, the right to seek to have the records amended, and the right to have control over the disclosure of personally identifiable information (PII) from the education records. (credit: U.S. Department of Education)

 

 

Acknowledgements

This research was supported by the Visionary Interdisciplinary Teams Advancing Learning (VITAL) Prize Challenge, an interdisciplinary research & development (R&D) competition for education technology companies governed and funded by the U.S. National Science Foundation, its partner foundations (the Bill & Melinda Gates Foundation, Schmidt Futures, and the Walton Family Foundation), and Digital Promise.

The authors would like to thank the listed and anonymous peer reviewers for their insightful contributions to this report.

In addition, the Frenalytics team would like to extend a special thanks to all of the educators, parents, students, and subject matter experts who graciously offered their time and insights to make this report a success.

 

 

Footnotes

This report discusses the independent analysis of data collected exclusively by the authors and does not necessarily represent the views of Frenalytics or its partners, nor of the VITAL Prize Challenge, Digital Promise, the National Science Foundation, the Bill & Melinda Gates Foundation, Schmidt Futures, or the Walton Family Foundation. The indirect funders of this research had no role in study design, data collection and analysis, decision to publish, or preparation of the report.
