Even in a normal school year, carving out time to do a midyear progress check using data can feel overwhelming. With all the challenges and changes districts are still navigating on a daily basis, a midyear data check might feel almost impossible.
There can also be some trepidation around what the data might show us. Many teams are navigating questions like:
- What if some things aren't working?
- What if students are still behind?
- What if our Tier 1 intensifications aren't working?
- What if groups of students aren't being served?
- What if some practices aren't as effective as others, including ones we were excited about?
- How do we reach students who aren't engaged or making progress?
But these questions are precisely why a midyear data check is so powerful: there is still time in the year to adjust and improve outcomes.
Moreover, it provides a chance to reflect on what is working, so that those efforts can be celebrated and retained. There may be some effective practices that can be amplified to support additional students throughout the year.
Blueprint for Midyear Checks

But where to start? How can teams realistically go about doing this work?
Here is a five-step blueprint outlining how you and your team can approach this work and still complete your midyear check if you haven't already.
(1) Start with Students – First, it's important to understand, "Who are our students?" Start by analyzing demographic data by different groupings of students, such as gender, ethnicity, disability code, and socioeconomic status. During this time, you can also factor in groups related to COVID-19 (e.g., those who have access to the internet, those supporting younger siblings at home, etc.). Starting with these questions creates a baseline for the rest of your analysis: it helps you determine, down the road, whether results are proportionate to the student population.
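If your student information system can export a roster, even a short script can generate this baseline. The sketch below uses Python with pandas; the file name and column names are hypothetical and would need to match your own export.

```python
import pandas as pd

# Hypothetical roster export; the file name and column names are illustrative only.
roster = pd.read_csv("student_roster.csv")
# assumed columns: student_id, grade, gender, ethnicity, disability_code,
# ses_status, internet_access

# Baseline: what share of the student population falls into each group?
for column in ["gender", "ethnicity", "disability_code", "ses_status", "internet_access"]:
    share = roster[column].value_counts(normalize=True).mul(100).round(1)
    print(f"\n{column} (% of all students)")
    print(share.to_string())
```

Keeping these percentages handy makes it easy to compare them against intervention rosters or proficiency results later in the process.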
(2) Dig into Data – Next, analyze your current data to get a baseline understanding of where students are. For this work, I recommend looking at risk data (as measured by a universal screening assessment) and then proficiency data (as measured by a standards-based interim assessment).
When analyzing your universal screening data for levels of risk, consider asking the following (a sample analysis sketch follows the list):
- Are at least 80% of students meeting low-risk targets?
- On which specific academic and/or SEB skills are more than 20% of students at risk?
- What is the average score? How does it compare to the national norm?
- Do we see different results for different groups of students (e.g., by ethnicity, gender, socio-economic status, teaching modality, device and internet access)?
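To make the first two questions concrete, here is a minimal sketch of how a team might check the 80% and 20% thresholds against a screening export. The file name, column names, and risk labels are assumptions; your vendor's reports may already surface these summaries directly.

```python
import pandas as pd

# Hypothetical screening export in long format; columns and labels are illustrative.
# Each row: one student's result on one screened measure (plus an overall composite).
screening = pd.read_csv("winter_screening.csv")
# assumed columns: student_id, measure, risk_level ("low"/"some"/"high"), score

# Question 1: are at least 80% of students meeting the low-risk target overall?
composite = screening[screening["measure"] == "composite"]
pct_low = (composite["risk_level"] == "low").mean() * 100
print(f"Low risk on composite: {pct_low:.1f}% (goal: at least 80%)")

# Question 2: on which specific skills are more than 20% of students at risk?
skills = screening[screening["measure"] != "composite"]
at_risk = skills["risk_level"].isin(["some", "high"])
pct_at_risk = at_risk.groupby(skills["measure"]).mean().mul(100).round(1)
print("Skills above the 20% at-risk threshold:")
print(pct_at_risk[pct_at_risk > 20].to_string())
```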
When analyzing your standards-based interim data for proficiency, consider asking (a similar sketch follows the list):
- What percentage of students are achieving proficiency in math? In ELA?
- Are there areas of especially high proficiency? Especially low?
- Do we see different results for different groups of students (e.g., by ethnicity, gender, socio-economic status, teaching modality, device and internet access)?
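The same kind of quick summary works for interim proficiency data, including the disaggregation question that appears in both lists. Again, the file name, column names, and group labels below are hypothetical.

```python
import pandas as pd

# Hypothetical interim-assessment export; columns are illustrative.
interim = pd.read_csv("winter_interim.csv")
# assumed columns: student_id, subject ("math"/"ela"),
# proficient (1 = met benchmark, 0 = not yet),
# ethnicity, gender, ses_status, modality, internet_access

# Percent proficient by subject
print(interim.groupby("subject")["proficient"].mean().mul(100).round(1))

# Disaggregated: do we see different results for different groups of students?
for group in ["ethnicity", "gender", "ses_status", "modality", "internet_access"]:
    table = interim.groupby(["subject", group])["proficient"].mean().mul(100).round(1)
    print(f"\nPercent proficient by subject and {group}")
    print(table.to_string())
```

Comparing these percentages with the demographic baseline from step 1 is what reveals whether results are proportionate to the student population.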
(3) Examine Growth – How much academic or SEB progress are students making between two points in time? That is essentially the concept of growth: from one point to another, how much have students improved?
Average growth is the typical growth seen for all learners in a specific grade over a period of time. Catch-up growth happens more rapidly than average growth; it occurs when learning is accelerated to help a student who is further behind "catch up" to a goal or benchmark. When students start significantly behind a goal, average growth won't be enough for them to reach the goal in time, and thus catch-up growth is required.
A good way to measure this is by tracking growth percentiles, or what is often referred to as a student growth percentile (SGP). This describes student growth or change over time compared to other students with similar prior test scores. In other words, how are students growing compared to students who started with a similar achievement score?
SGPs span from 1 to 99; average growth means that the student is at the 50th percentile, while catch-up growth typically ranges from the 50th to the 85th percentile. Many assessment providers, such as FastBridge, calculate the SGP as part of their reporting, and different assessments may use different calculation methods.
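Because those calculation methods vary (and typically involve more sophisticated statistical models than what follows), the sketch below is only a simplified illustration of the idea: compare each student's fall-to-winter change against peers who started with similar fall scores. The file name, column names, and band-based approach are assumptions for illustration, not any vendor's actual methodology.

```python
import pandas as pd

# Hypothetical export of fall and winter scores; a rough illustration of the
# SGP idea, NOT the statistical model assessment vendors actually use.
scores = pd.read_csv("fall_winter_scores.csv")  # assumed columns: student_id, fall, winter

scores["growth"] = scores["winter"] - scores["fall"]

# Group students into bands of similar fall (prior) scores, then rank each
# student's growth within their band as a percentile from 1 to 99.
scores["prior_band"] = pd.qcut(scores["fall"], q=10, labels=False, duplicates="drop")
scores["growth_percentile"] = (
    scores.groupby("prior_band")["growth"]
    .rank(pct=True)
    .mul(99)
    .round()
    .clip(1, 99)
)

# Catch-up growth: growth above the 50th percentile relative to similar-scoring peers.
pct_catch_up = (scores["growth_percentile"] > 50).mean() * 100
print(f"Median growth percentile: {scores['growth_percentile'].median():.0f}")
print(f"Students showing catch-up growth: {pct_catch_up:.1f}%")
```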
When examining growth, your team may consider asking questions such as the following (a sample fall-to-winter comparison follows the list):
- Are risk levels decreasing between fall and winter?
- Is the average score increasing between fall and winter?
- What is the average growth percentile? Is it above the 50th percentile (indicating catch-up growth)?
- What percentage of students are achieving catch-up growth?
- Is the percentage of students who are proficient increasing or decreasing between fall and winter for math? Reading?
- What percentage of students are proficient but not making average growth?
- Which learning targets saw the most growth for math? Reading?
- Which learning targets saw the least growth for math? Reading?
- Do we see different results for different groups of students?
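Several of these questions boil down to the same fall-versus-winter comparison. Here is one more hedged sketch, again with a hypothetical file layout and column names:

```python
import pandas as pd

# Hypothetical combined export of fall and winter results; columns are illustrative.
# assumed columns: student_id, window ("fall"/"winter"), score,
# risk_level ("low"/"some"/"high"), proficient (1 = met benchmark, 0 = not yet)
data = pd.read_csv("fall_winter_results.csv")

summary = data.groupby("window").agg(
    pct_low_risk=("risk_level", lambda s: (s == "low").mean() * 100),
    avg_score=("score", "mean"),
    pct_proficient=("proficient", lambda s: s.mean() * 100),
).round(1)

# Are risk levels decreasing, average scores increasing, and proficiency rising
# between fall and winter?
print(summary.loc[["fall", "winter"]])
```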
(4) Set Goals & Create a Plan – Now, after you’ve analyzed the data, you will likely find areas of success that deserve celebration. At the same time, you may have identified areas where current instruction or intervention is not helping students catch up or is not yielding equitable results. If so, the next step is to work as a team to articulate the area of need and create a plan for change.
Consider the what, when, where, and how. What are your goals? When will you accomplish them? Where are you now, and where do you want to be in the spring? How will you measure whether the goals are met? Here are sample questions you can ask your team (a simple tracking sketch follows the list):
- What specific actions will we take to meet our goal?
- Who is responsible for which action items (or will be leading the work)?
- By when will items be done?
- How will we track our progress in the meantime?
- When will we meet to review our progress?
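For teams that want a concrete way to handle the measurement piece, even a tiny calculation like the one below can live alongside the action plan. The goal, numbers, and number of checkpoints here are entirely made up for illustration.

```python
# A minimal, hypothetical goal tracker; every number below is made up.
goal = {
    "metric": "percent of grade 3 students at low risk in reading",
    "winter_actual": 62.0,   # where we are now
    "spring_target": 80.0,   # where we want to be in the spring
    "checkpoints_left": 3,   # progress checks remaining before spring screening
}

gap = goal["spring_target"] - goal["winter_actual"]
per_checkpoint = gap / goal["checkpoints_left"]

print(f"Goal: {goal['metric']}")
print(f"Gap to close by spring: {gap:.1f} percentage points")
print(f"Roughly {per_checkpoint:.1f} points of improvement needed per checkpoint")
```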
Remember, don’t let perfect be the enemy of good, and don’t let this process be debilitating. Small changes can lead to big outcomes, and any positive change is worth pursuing. Action planning simply keeps everybody on the same page and makes progress toward the goals measurable.
(5) Implement & Evaluate – Once you have implemented the plan, set aside time to evaluate your progress throughout the rest of the year. Get the team together and discuss how the results fared in comparison with the goals. What worked and what didn’t? What are the underlying causes behind the results? Decide how to move forward in light of the data, and schedule a future date to re-evaluate.