June 10, 2025

By Dr. Sarah Brown, MTSS consultant and author

The goal of intervention is to accelerate K–12 student learning. In schools across the country, teams gather regularly to discuss intervention data. Educators pull up progress monitoring graphs, review each student’s trajectory, and consider next steps.

This individualized approach, while well-intentioned, often leaves teams feeling overwhelmed and frustrated. Educators and leaders work incredibly hard to provide intervention, only to find that—more often than not—student learning gaps are not closed after that intervention.

There’s a more efficient, effective way to approach this work within a multi-tiered system of support (MTSS) framework. What if, instead of focusing exclusively on individual student progress, we zoomed out to examine how our interventions are working at the system level?

This shift from a student-by-student process to a systems lens can transform not only how we measure intervention success, but also how we scale what works and address what doesn’t.

This is a point that Stephanie Stollar and I explore in our new book, MTSS for Reading Improvement: A Leader’s Tool Kit for Schoolwide Success. In this blog, I’ll share several key ideas from the book, focusing on a system-level approach to intervention at each instructional tier.

How system-level planning improves MTSS intervention success

Most schools approach intervention data through an individual lens, with data teams following a standard protocol:

  1. Teams review progress monitoring data for each student in intervention.
  2. Decisions are made one student at a time, often by answering questions such as “Is this student making enough progress?” “Should we change the intervention?” and “Do we suspect the student may have a disability?”
  3. Success is measured by individual student gains, often without considering the broader effectiveness of the intervention itself.

This approach, while necessary for individual student support, can mask systemic issues. Because teams take a student-focused approach, next steps often point back to the student: perhaps the goal expected too much growth, or perhaps the student has a learning disability.

Without asking system-level questions, teams may miss critical system-level barriers to student success.

When teams also ask questions like “Which of our interventions are most effective overall?” and “Are there patterns in who is and who isn’t responding to the intervention?” the conversation shifts to the barriers to success that are within our control.

Additionally, teams find opportunities to improve outcomes for many students at once, instead of having to find solutions one student at a time.

The power of a systems lens for effective interventions

A systems approach shifts the perspective and dialogue. Teams analyze aggregate data, looking at how groups of students respond to specific interventions. Success is measured by the percentage of students who are on track to meet their goals.

If large numbers of students are not accelerating with the intervention, the team modifies the intervention as a whole rather than adjusting plans student by student. The focus shifts to improving overall system success, so that interventions become more effective.

This shift doesn’t mean abandoning individual student intervention intensification. Instead, it means that by strengthening the system, we create conditions where more students succeed and fewer need individualized, intensive support.

Let’s consider ways to improve intervention effectiveness in MTSS, first at Tier 1 and then with Tier 2 and Tier 3 supports.


Effective Tier 1 intervention in MTSS

We often think of interventions as being provided only to some students. Many times, however, the system produces more at-risk students than supplemental intervention can serve, which requires a system-level solution.

This is the first place to shift to a system-level perspective when using data. To make this shift, ask yourself, “Do we have more students who need intervention than we can reasonably provide?”

For many schools, when more than 20 percent of students are scoring below grade-level targets on reading and math screening measures, such as Star Assessments or FastBridge, the intervention system can quickly become overwhelmed. When that happens, intervention intensity and effectiveness decrease.

In these cases, Tier 1 intervention is necessary and results in exceptional growth in a short period of time (Fuchs & Fuchs, 2005; VanDerHeyden et al., 2012). It’s also an equitable practice that has demonstrated effects with multilingual learners (McMaster et al., 2008).

What does Tier 1 intervention involve? Instead of trying to serve too many students within intervention systems, schools provide intervention to all students in a class or grade level, within the classroom setting. This makes the intervention easier to implement and effective for all students.

For example, one school in Iowa used MTSS and a focus on Tier 1 intervention to increase the percentage of second graders meeting screening targets on FastBridge CBMreading from 29 percent to 61 percent in a single year (Duncan & Brown, 2024).

Implementing Tier 1 interventions in the classroom

To implement a successful Tier 1 intervention, schools:

  1. Review aggregate screening data as a team to spot trends across grades.
  2. Identify grades that have too many students at risk to close gaps through supplemental and intensive intervention.
  3. Provide classwide intervention to all students.
  4. Review aggregate data to decide when classwide intervention is no longer needed.
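For teams that export screening results to a spreadsheet or script, the grade-level check in steps 1 and 2 above can be sketched in a few lines of Python. The record format and the 20 percent threshold (drawn from the guideline mentioned earlier) are illustrative assumptions, not features of any Renaissance tool:

```python
# Sketch: flag grades where classwide (Tier 1) intervention may be needed.
# The record shape and the 20% threshold are illustrative assumptions.

def grades_needing_classwide_support(records, threshold=0.20):
    """Return {grade: at_risk_rate} for grades whose at-risk rate
    exceeds the threshold.

    records: list of dicts like {"grade": 2, "at_risk": True}
    """
    totals, at_risk = {}, {}
    for r in records:
        g = r["grade"]
        totals[g] = totals.get(g, 0) + 1
        at_risk[g] = at_risk.get(g, 0) + (1 if r["at_risk"] else 0)
    flagged = {}
    for g, n in totals.items():
        rate = at_risk[g] / n
        if rate > threshold:
            flagged[g] = round(rate, 2)
    return flagged

# Hypothetical screening export: grade 2 has 3 of 4 students at risk,
# grade 3 has 0 of 4.
screening = [
    {"grade": 2, "at_risk": True}, {"grade": 2, "at_risk": True},
    {"grade": 2, "at_risk": False}, {"grade": 2, "at_risk": True},
    {"grade": 3, "at_risk": False}, {"grade": 3, "at_risk": False},
    {"grade": 3, "at_risk": False}, {"grade": 3, "at_risk": False},
]
print(grades_needing_classwide_support(screening))  # → {2: 0.75}
```

With 75 percent of second graders below target, this hypothetical team would plan a classwide intervention for grade 2 rather than routing dozens of students into supplemental groups.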

The following example from Renaissance’s eduCLIMBER data tracking platform helps to illustrate this process. A second-grade team identified a phonics need due to a large percentage of students scoring below targets on Star Phonics screening. They can set a goal, decide on a classwide intervention, and then review screening data later in the year to identify when this intervention is no longer needed:

Classwide data comparison

System-level planning and action maximize supplemental and intensive intervention resources for the students who need the most support. They also ensure that those interventions can be aligned and effective.

Effective Tier 2 and 3 intervention in MTSS

When providing supplemental and intensive interventions, system-level data are also helpful. Often, team meetings about interventions focus on each student, with teams first discussing one student’s progress, then another student’s, and so on.

As noted earlier, one important way to improve the success of interventions in MTSS is to evaluate interventions from a system perspective. This involves three key steps.

#1: Aggregate your assessment data

In order to review intervention success from a systems lens, teams need to look at data differently. Both screening and progress monitoring data can be used for these analyses.

When using screening data, isolate data for those students who received interventions. You may also find benefit in identifying which intervention(s) each student accessed. Following is a sample report from eduCLIMBER showing screening data for students receiving Tier 2 reading intervention. The team can see that the percentage of students who made enough progress to become “on track” grew significantly from fall to winter:

Screening data for students receiving intervention

When using progress monitoring data, the data also need to be aggregated, ideally identifying the specific intervention(s) that students were provided. Teams using the eduCLIMBER Goal Progress Report for this purpose can easily compare the effectiveness of different interventions:

Progress monitoring data aggregated by intervention

Identifying effective instruction and intervention is critical for data-driven decisions, and having access to aggregate data is the first step in accomplishing this.
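The same aggregation logic applies to teams working outside a platform like eduCLIMBER. As a rough sketch, assuming a simple export where each progress monitoring record names the student’s intervention and whether they are on track, the percent-on-track comparison looks like this:

```python
# Sketch: aggregate progress monitoring records by intervention and report
# the percentage of students on track. Field names are illustrative
# assumptions about an exported data file, not a real eduCLIMBER format.

def percent_on_track_by_intervention(records):
    """records: list of dicts like
    {"intervention": "Intervention A", "on_track": True}.
    Returns {intervention_name: percent_on_track (rounded)}."""
    counts = {}
    for r in records:
        name = r["intervention"]
        total, on = counts.get(name, (0, 0))
        counts[name] = (total + 1, on + (1 if r["on_track"] else 0))
    return {name: round(100 * on / total)
            for name, (total, on) in counts.items()}

# Hypothetical monitoring data: 2 of 3 students on track in A, 1 of 4 in B.
monitoring = [
    {"intervention": "Intervention A", "on_track": True},
    {"intervention": "Intervention A", "on_track": True},
    {"intervention": "Intervention A", "on_track": False},
    {"intervention": "Intervention B", "on_track": False},
    {"intervention": "Intervention B", "on_track": False},
    {"intervention": "Intervention B", "on_track": True},
    {"intervention": "Intervention B", "on_track": False},
]
print(percent_on_track_by_intervention(monitoring))
# → {'Intervention A': 67, 'Intervention B': 25}
```

A table like this makes the system-level question immediate: Intervention A is working for most students, while Intervention B warrants a look at implementation and fit before any individual plans are changed.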

#2: Schedule time to review aggregate data

With aggregate screening data, teams need time immediately after each screening window to analyze and act on these data. Using aggregate data, teams can answer key questions, including:

  • What percentage of students who received intervention are on track to meet their goals?
  • Which interventions resulted in most students meeting screening targets?
  • Which interventions resulted in the fewest students meeting screening targets?

Using the eduCLIMBER Interventions Module, teams can see if an intervention is successful. On a single page, a team can see information for an intervention, including how many students are receiving the intervention and what percentage are on track to meet their goals:

eduCLIMBER Interventions Module

Focusing on progress monitoring data

With frequent progress monitoring data, system-level questions can be asked more often. On a regular schedule of approximately every six weeks, teams can review the progress of all students in each intervention group, answering the following questions:

  • What percentage of students are on track to meet their goals?
  • Which interventions have the most students on track to meet their goals?
  • Which interventions have the fewest students on track to meet their goals?

For instance, compare the effectiveness of these two interventions. One shows that most students are on track to meet their goals, while the other has most students not on track:

eduCLIMBER Goal Progress Report

For the first, we’ll want to continue the intervention as we’re currently using it. Intervention plans for individual students who aren’t making enough progress can be intensified, but overall, this intervention is successful for most students. Teams may even consider if its use should be expanded to other grade levels.

However, for the second intervention, the team will need to consider whether the intervention can be implemented as intended. If so, then the team can plan intensifications to the intervention itself to achieve accelerated progress.

#3: Take action to increase intervention effectiveness

With system-level intervention data, teams can make more efficient and effective decisions about next steps. For example, if we notice that most students in an intervention aren’t on track to meet their goals, teams shouldn’t spend time intensifying the intervention for individual students. In this case, the intervention itself needs consideration.

Instead of taking actions like…

  • Changing the intervention program
  • Suspecting a disability
  • Moving a student to a different intervention group
  • Teaching the student in a smaller group

…teams can instead explore:

  • Whether the intervention was implemented as often as expected
  • Whether the intervention matches the skill needs of the group of students
  • Whether the intervention has a strong enough dosage to see needed growth

And, for highly effective interventions, teams can look to expand their use with more students and/or additional grade levels, as appropriate.

Building on system-level intervention success

The real measure of intervention success isn’t just the number of students who show some growth—it’s the percentage who actually catch up to grade-level expectations.

When teams see significant gains, it’s essential to pause and reflect:

  • Ask yourself: What did we do that worked? Did we implement new routines, increase instructional time, or better align interventions with core instruction?
  • Acknowledge successes: Publicly recognize successful interventions and invite teams to share specific supports that impacted student success.
  • Share stories: Highlight collective gains in staff meetings and newsletters to build collective efficacy—the belief that, together, educators can make a difference for all students. Celebrating these wins isn’t just about feeling good; research shows that when teachers believe in their collective power, student achievement rises (Norris, 2018).

Improving learning outcomes schoolwide

The journey from overwhelmed to effective intervention is possible when schools move beyond isolated fixes and embrace system-level planning. Shifting from a student-by-student approach to a system-level focus isn’t just more efficient; it’s also more effective.

By regularly reviewing aggregate data, identifying patterns, and making strategic changes, schools can ensure that interventions work for all students who need them. This shift also supports overwhelmed teams and maximizes MTSS resources to ensure the hard work educators do to provide intervention has the impact necessary to result in lasting reading and math success.

As you prepare for your next data meeting, challenge your team to start with a system-level question, such as:

  1. Do we have more students who need intervention than we can reasonably provide?
  2. Which of our interventions is creating the greatest impact, and how can we scale this success for every student in every classroom?

For more MTSS best practices, particularly related to Science of Reading implementation, I invite you to explore my new book. Also, reach out to Renaissance if you’d like to learn more about eduCLIMBER, Star Assessments, or other solutions that support data-driven MTSS.

References

Brown, S., & Stollar, S. (2025). MTSS for reading improvement: A leader’s tool kit for schoolwide success. Bloomington, IN: Solution Tree.

Duncan, A., & Brown, S. (2024, June 26). What’s possible in a year? One school’s growth using FastBridge to drive SoR instruction and intervention. Presentation at the 2024 Iowa Reading Conference.

Fuchs, D., & Fuchs, L. (2005). Peer-assisted learning strategies: Promoting word recognition, fluency, and reading comprehension in young children. Journal of Special Education, 39(1), 34–44. https://doi.org/10.1177/00224669050390010401

McMaster, K., Kung, S., Han, I., & Cao, M. (2008). Peer-assisted learning strategies: A “Tier 1” approach to promoting English learners’ Response to Intervention. Exceptional Children, 74(2), 194–214. https://doi.org/10.1177/001440290807400204

Norris, B. D. (2018). The relationship between collective teacher efficacy and school-level reading and mathematics achievement: A meta-regression using robust variance estimation. [Dissertation, The University at Buffalo, State University of New York].

VanDerHeyden, A., McLaughlin, T., Algina, J., & Snyder, P. (2012). Randomized evaluation of a supplemental grade-wide mathematics intervention. American Educational Research Journal, 49(6), 1251–1284. https://doi.org/10.3102/0002831212462736

