Supporting Math MTSS through SpringMath: FAQs

Evidence-base FAQs 

Is SpringMath an evidence-based tool? 

Yes. Two randomized controlled trials studying the efficacy of SpringMath were submitted to the National Center on Intensive Intervention (NCII) for rating on its Academic Intervention Tools Chart. NCII technical experts evaluate studies of academic intervention programs for quality of design and results, quality of other indicators, intensity, and additional research. One SpringMath study earned the highest possible rating; the second earned one point below the highest possible rating. Additionally, SpringMath is listed as an evidence-based intervention by the University of Missouri Evidence-Based Intervention Network and as an evidence-based resource by the Arizona Department of Education.

There are different levels of evidence, and some sources set forth by commercial vendors as “evidence” would not be considered evidence by our SpringMath team. We prioritize peer-reviewed research published in top-tier academic journals using rigorous experimental designs, and a program of research that addresses novel questions and conducts systematic replication. SpringMath meets the most rigorous standards for evidence, including a transparent theory-of-change model in which each claim has been evaluated. The assessments and decision rules have met the most rigorous standards for reliability and decision accuracy. Intervention efficacy has been evaluated in three randomized controlled trials, one of which was conducted by an independent research team not associated with the author or publisher of SpringMath. The first two RCTs were published in three articles in top-tier, peer-reviewed academic journals. The third RCT was completed in 2021 and has not yet been published, but SpringMath earned the strongest effect size on math achievement among the intervention tools included in that study. Additional published research has demonstrated an extremely low cost-to-benefit ratio (stated another way, a strong return on investment), reduced risk, systemic learning gains, and closure of opportunity (or equity) gaps via SpringMath. Drs. VanDerHeyden and Muyskens maintain an ongoing program of research aimed at continually evaluating and improving SpringMath effects in schools.

What effects can I expect when I use SpringMath in my school?

Systems using SpringMath classwide and individual interventions with fidelity typically demonstrate accelerated student achievement in math for all student groups. These gains are evidenced by an improved percent of students meeting the not-at-risk target across screening occasions, strong upward growth patterns during classwide intervention, improved year-end state test scores for all students, and closing of opportunity gaps. Across years, many systems see that children who experienced SpringMath the preceding year demonstrate greater readiness for the next grade level’s instruction.

[Figure: FAQ_Beginning_year_acadience_percent_proficient]

COVID provided a natural experiment in how districts struggling with math achievement can accelerate learning.

In districts using SpringMath before COVID closures, we have seen rapid recovery of learning loss. For example, these data are from all second graders in a district in Michigan. One can see that scores were lower at winter screening following COVID school closures in the preceding spring. But one can also see that, post-classwide-math-intervention, the percent proficient was comparable to the years preceding COVID, which is powerful evidence of learning loss recovery.


As another example, one district that was using SpringMath in all nine of its elementary schools before COVID closures had seven schools continue using SpringMath through closure and the subsequent disrupted learning, whereas two schools paused their use altogether. This natural variation in use allowed us to look at what happened in schools that continued versus schools that paused. The pattern we will describe was apparent at all grade levels, but we highlight one skill measure from grade 3 to illustrate the point. In the schools that did not use SpringMath in 2020-21, the percentage of students scoring in the not-at-risk range on the fall screening measures declined by 19%. In the schools that continued to use SpringMath, the percentage of students scoring not at risk declined by only 3%. Schools that continued to use SpringMath also experienced faster mastery of the fall screening skills.

[Figure: FAQ_Percent_proficient_example]
[Figure: FAQ_Fact_Families_Add_Subtract]

Because SpringMath is designed to facilitate data-team decision making, we expect that you will be able to conduct more efficient data team meetings, make more efficient and accurate decisions, and reach easier eligibility decisions when using RTI to determine specific learning disability (SLD). One early-implementing district found that after implementing classwide math intervention, the percent of students eligible for special education declined, achievement improved, fewer evaluations were conducted, and the percent of evaluations that qualified students for special education increased from a baseline of 50% to near 100% (VanDerHeyden et al., 2007). Systems tell us that using SpringMath brought MTSS to life and produced aspirational culture shifts in their districts; they use words like “transformation,” “growth,” and “excitement” to describe the results they bring about through effective MTSS implementation in mathematics using SpringMath.

Implementation FAQs

Does SpringMath work well with special populations?

(e.g., students receiving special education, students who are struggling in math, advanced students)

Fortunately, two randomized controlled trials included students at risk for math difficulties and disabilities. Analyses examined effects for students in the intervention groups who met criteria for being at risk for math difficulties (e.g., scoring 1 SD below the mean on the preceding year’s state test, scoring 2 SDs below the mean) and for students receiving special education services. In general, students with greater risk to start experience a much stronger risk reduction when randomly assigned to SpringMath intervention, because they had more risk to reduce. Cost-benefit analyses indicate that as risk increases, the return on investment of implementing SpringMath intervention actually improves (i.e., lower incremental cost-effectiveness ratios, or ICERs). The National Center on Intensive Intervention (NCII) requires reporting of intervention effects for students scoring below the 20th percentile on an external measure (i.e., a population considered to be at risk or in need of supplemental instruction). On the standard curriculum-based measures, effect sizes ranged from 0.56 to 1.11 for students scoring below the 20th percentile on the preceding year’s state test and from 0.19 to 0.73 for students receiving special education services under any eligibility category. On the year-end test for grade 4 students, the effect size was 0.79 for students scoring below the 20th percentile on the preceding year’s test and 0.35 for students receiving special education services.
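For readers interpreting the effect sizes above: a standardized mean difference such as Cohen’s d expresses the treatment-control gap in pooled standard deviation units. A minimal sketch of the computation (the sample values below are invented for illustration, not data from the SpringMath trials):

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Invented example values, not study data:
d = cohens_d(mean_t=55, sd_t=10, n_t=50, mean_c=48, sd_c=10, n_c=50)
print(round(d, 2))  # 0.7
```

By common benchmarks, a d of 0.2 is considered small, 0.5 moderate, and 0.8 large, which is why the 0.56 to 1.11 range reported above represents substantial gains.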

For advanced students, we recommend using the support portal to access more challenging lessons, practice materials, games, and word problems to enrich students’ experiences during core math instruction or during the supplemental enrichment period. 

A case example from a second-grade student is provided to the right. This child was receiving services under the category of autism. He scored well below same-grade peers on math measures. He began SpringMath individual intervention and showed dramatic growth with intervention. He acquired sums to 6 in three intervention sessions. The intervention adjusted to fluency building and his growth accelerated. He reached mastery of sums to 6 in only four more intervention sessions.

The intervention adjusted to sums to 12, which he acquired in one intervention session, after which the intervention adjusted to fluency building. Because he had trouble writing his answers, the intervention was conducted in verbal and written format for the first two sessions, at which point he reached mastery via verbal responding. Intervention continued with fluency building for written responses for sums to 12, and he reached mastery after only three sessions.

The intervention adjusted to his goal skill of sums to 20, and evidence of generalization was observed on that skill. During his diagnostic assessment in the first week of November, he scored zero answers correct on sums to 20, which is a grade-level expected skill. In late December, after mastering sums to 6 and sums to 12, he scored 14 answers correct in two minutes on sums to 20. The intervention for sums to 20 began with fluency building and was interrupted by the holidays. He grew from 14 to 23 answers correct on equivalent measures of sums to 20 over six intervention sessions, which exceeded the mastery target. The next target goal skill for him will be subtraction 0-20. Based on his progress to date, he can be expected to master grade-level skills this year.

[Figure: FAQ_Sums_to_12]

How can educators use SpringMath data to write IEP goals and objectives?

Ideally, students receiving special education services would have routine access to the most intensive instruction available in a school setting. That ideal is rarely the reality: most schools have to make decisions about how to deploy limited resources toward meeting the needs of all learners. Still, when planning instructional calendars and daily schedules, it is worth a team’s consideration to make the most intensive instruction available to the students with the greatest demonstrated needs. Systems can use SpringMath to deliver intensive intervention to students on IEPs daily in a highly efficient manner. Students on IEPs can participate in core instruction in the general education setting if that is their usual plan and can participate in classwide math intervention with their core instructional grouping. If they are identified as needing more intensified instruction, SpringMath will identify and recommend them for diagnostic assessment and intervention. If SpringMath does not recommend them but you want to provide individual intervention anyway, anyone with coach-level access can schedule any student in the school for individual intervention by clicking the dots next to the student’s name in the Students tab.

Diagnostic assessment begins with grade-level screening skills (we call these the Goal skills) and samples back through incrementally prerequisite skills, assessing each skill systematically, to identify which skills should be targeted for intervention. We call the targeted skill the Intervention skill. Weekly assessment is conducted on the Intervention skill and the Goal skill. For IEP goals, we suggest targeting instructional-range performance on the current Goal skill (and on the next year’s Goal skill if the IEP crosses years). Logical objectives for IEPs include mastery of the Intervention skills. Drs. VanDerHeyden and Muyskens are always happy to advise systems on using SpringMath for IEP planning and implementation.
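The sampling-back logic can be illustrated with a hypothetical sketch. The skill names, scores, and mastery cutoff below are invented for illustration; SpringMath’s actual skill sequences and decision rules are its own:

```python
# Hypothetical sketch of "sampling back" from a Goal skill through
# prerequisite skills to locate an Intervention skill. Skill names,
# scores, and the cutoff are illustrative only.

# Skills ordered from the grade-level Goal skill back through
# incrementally easier prerequisites.
skill_sequence = ["sums to 20", "sums to 12", "sums to 6"]

def find_intervention_skill(scores: dict, mastery_cutoff: int) -> str:
    """Return the most advanced skill the student has not yet mastered,
    sampling back until a mastered prerequisite is found."""
    target = skill_sequence[0]  # default: the Goal skill itself
    for skill in skill_sequence:
        if scores[skill] >= mastery_cutoff:
            break           # prerequisite mastered; stop sampling back
        target = skill      # not yet mastered; keep sampling back
    return target

scores = {"sums to 20": 0, "sums to 12": 8, "sums to 6": 25}
print(find_intervention_skill(scores, mastery_cutoff=20))  # sums to 12
```

In this sketch, the student has mastered sums to 6 but not sums to 12, so sums to 12 becomes the Intervention skill while sums to 20 remains the Goal skill monitored weekly.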

What accommodations are allowed?

Several accommodations are permitted, including adjusting response formats (students can respond orally, in writing, or in some cases, by selecting responses). Problems can be read aloud to students. Instructions can be provided to students in their native languages. Teachers can work one on one to deliver assessment and/or intervention. Assessments can be repeated under optimal conditions (using rewards, quiet space, warm-up activities). Timings can be adjusted, with scores prorated to obtain equivalent answers-correct-per-unit-of-time scores. For example, a child can work for one minute and the score can be doubled to estimate what the performance would have been over two minutes. If a child seems anxious about being stopped after two minutes, the teacher can mark the stopping point at two minutes and allow the child to keep working, reporting the score as the number of answers correct completed at the two-minute mark.
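The prorating described above is simple proportional scaling. A minimal sketch (the function name and interface are illustrative, not part of SpringMath):

```python
def prorate_score(answers_correct: int, observed_minutes: float,
                  standard_minutes: float = 2.0) -> float:
    """Scale a score from a shortened timing to the standard timing.

    E.g., a child who works for 1 minute and answers 12 problems
    correctly is estimated at 24 correct over a standard 2-minute timing.
    """
    if observed_minutes <= 0:
        raise ValueError("observed_minutes must be positive")
    return answers_correct * (standard_minutes / observed_minutes)

print(prorate_score(12, 1))  # 24.0
```

The same formula handles any shortened or extended timing (e.g., 30 correct in 1.5 minutes prorates to 40 over 2 minutes), so teams can compare accommodated scores against the standard targets.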

Assessment-related FAQs

Security FAQs

References

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing. Washington, DC: AERA.

Boaler, J. (2012). Commentary: Timed tests and the development of math anxiety: Research links “torturous” timed testing to underachievement in math. Education Week. https://www.edweek.org/ew/articles/2012/07/03/36boaler.h31.html

Burns, M. K., Aguilar, L. N., Young, H., Preast, J. L., Taylor, C. N., & Walsh, A. D. (2019). Comparing the effects of incremental rehearsal and traditional drill on retention of mathematics facts and predicting the effects with memory. School Psychology, 34(5), 521–530. https://doi.org/10.1037/spq0000312

Codding, R., VanDerHeyden, A. M., Martin, R. J., & Perrault, L. (2016). Manipulating treatment dose: Evaluating the frequency of a small group intervention targeting whole number operations. Learning Disabilities Research & Practice, 31, 208–220.

Compton, D. L., Fuchs, L. S., Fuchs, D., Lambert, W., & Hamlett, C. (2012). The cognitive and academic profiles of reading and mathematics learning disabilities. Journal of Learning Disabilities, 45(1), 79–95. https://doi.org/10.1177/0022219410393012

Duhon, G. J., Poncy, B. C., Krawiec, C. F., Davis, R. E., Ellis-Hervey, N., & Skinner, C. H. (2020). Toward a more comprehensive evaluation of interventions: A dose-response curve analysis of an explicit timing intervention. School Psychology Review. https://doi.org/10.1080/2372966X.2020.1789435

Fuchs, L. S., Fuchs, D., & Malone, A. S. (2018). The taxonomy of intervention intensity. Teaching Exceptional Children, 50(4), 194–202. https://doi.org/10.1177/0040059918758166

Grays, S., Rhymer, K., & Swartzmiller, M. (2017). Moderating effects of mathematics anxiety on the effectiveness of explicit timing. Journal of Behavioral Education, 26(2), 188–200. https://doi.org/10.1007/s10864-016-9251-6

Gunderson, E. A., Park, D., Maloney, E. A., Beilock, S. L., & Levine, S. C. (2018). Reciprocal relations among motivational frameworks, math anxiety, and math achievement in early elementary school. Journal of Cognition and Development, 19, 21–46. https://doi.org/10.1080/15248372.2017.1421538

Hart, S. A., & Ganley, C. M. (2019). The nature of math anxiety in adults: Prevalence and correlates. Journal of Numerical Cognition, 5, 122–139.

Morrison, J. Q., & Harms, A. L. (2018). Advancing evidence-based practice through program evaluation: A practical guide for school-based professionals. Oxford University Press.

Namkung, J. M., Peng, P., & Lin, X. (2019). The relation between mathematics anxiety and mathematics performance among school-aged students: A meta-analysis. Review of Educational Research, 89(3), 459–496. https://doi.org/10.3102/0034654319843494

Powell, S. R., & Fuchs, L. S. (2015). Intensive intervention in mathematics. Learning Disabilities Research & Practice, 30(4), 182–192. https://doi.org/10.1111/ldrp.12087

Schutte, G., Duhon, G., Solomon, B., Poncy, B., Moore, K., & Story, B. (2015). A comparative analysis of massed vs. distributed practice on basic math fact fluency growth rates. Journal of School Psychology, 53, 149–159. https://doi.org/10.1016/j.jsp.2014.12.003

Solomon, B. G., Poncy, B. C., Battista, C., & Campaña, K. V. (2020). A review of common rates of improvement when implementing whole-number math interventions. School Psychology, 35, 353–362.

Tsui, J. M., & Mazzocco, M. M. M. (2006). Effects of math anxiety and perfectionism on timed versus untimed math testing in mathematically gifted sixth graders. Roeper Review, 29(2), 132–139. https://doi.org/10.1080/02783190709554397

VanDerHeyden, A. M., Burns, M. K., Peltier, C., & Codding, R. S. (2022). The science of math: The importance of mastery measures and the quest for a general outcome measure. Communiqué, 51(1).

VanDerHeyden, A. M., Witt, J. C., & Gilbertson, D. A. (2007). Multi-year evaluation of the effects of a response to intervention (RTI) model on identification of children for special education. Journal of School Psychology, 45, 225–256. https://doi.org/10.1016/j.jsp.2006.11.004

Ready to add SpringMath to your school or district?

Check Us Out