The big story in education today seems to be that K-12 students are doing far better than expected in how much they are learning during Fall 2020, even showing little to no decline in reading compared to Fall 2019. All of this is based on a new NWEA study.
From the NPR article we see the common theme:
A sweeping new review of national test data suggests the pandemic-driven jump to online learning has had little impact on children’s reading growth and has only somewhat slowed gains in math. That positive news comes from the testing nonprofit NWEA and covers nearly 4.4 million U.S. students in grades three through eight. But the report also includes a worrying caveat: Many of the nation’s most vulnerable students are missing from the data.
While it is encouraging to see this caveat explicitly mentioned in most media reports, there is still good reason to be skeptical about the core findings, or at least about how to interpret them, if you look at other data sources. To take two examples: an analysis from Fairfax County Public Schools in Virginia and a preliminary study from Illuminate both indicate that K-12 students are learning less this fall than in prior years – in math, in reading, and perhaps in all subjects.
The caveat in question is highlighted in the actual NWEA brief, which describes how the fall 2019 data included 5.2 million students while the fall 2020 data had only 4.4 million [emphasis added]:
As educators work hard to support students in this challenging time, student achievement data are critical to needs assessment and instructional planning. Teachers need to know their students’ academic skills to tailor instruction, and policymakers need data to plan programs and interventions to catch up the students who have fallen behind. However, fall 2020 assessment results, administered in person or remotely, may not be capturing a significant portion of the student population. Many schools are not administering assessments at all due to technological and other challenges. Within schools that are testing, individual students are absent from school and/or opting out of testing for economic, health, technological, or other reasons unknown to educators and researchers.
Missing assessment data can lead to erroneous conclusions and decisions when students who are not assessed are systematically different from students who are assessed. COVID-19 learning losses and achievement gaps estimated from the data with such missingness will not reflect the larger student population. The most concerning scenario is that students not testing in fall 2020 are disproportionately from disadvantaged backgrounds. Not accounting for these students would produce underestimated learning loss and achievement gaps, potentially resulting in under-provision of support and services to the neediest students. Our supplemental attrition study used MAP Growth assessment data of nearly 5.2 million students who attended any grade between kindergarten and seventh grade in fall 2019 to examine the patterns of missing data in fall 2020.
It is good that media coverage mentions the caveat, but what if the actual conclusions would be different without that 25% loss of students? That is what other data sources suggest: reading achievement did suffer.
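To see why this matters, here is a minimal sketch of the attrition problem, using simulated scores I invented for illustration (these are not NWEA's numbers). If every student's score drops but the lowest-scoring quarter of students never tests, the observed average can come out flat or even higher than the year before:

```python
import random

random.seed(42)

# Hypothetical illustration only -- simulated scores, not NWEA data.
# Fall 2019: a full population of 10,000 students, mean score around 200.
fall_2019 = [random.gauss(200, 15) for _ in range(10_000)]

# Fall 2020: suppose every single student's score dropped by 5 points...
fall_2020_true = [score - 5 for score in fall_2019]

# ...but the lowest-scoring 25% of students are missing from testing.
fall_2020_tested = sorted(fall_2020_true)[len(fall_2020_true) // 4:]

def mean(xs):
    return sum(xs) / len(xs)

print(f"Fall 2019 mean (all students):      {mean(fall_2019):.1f}")
print(f"Fall 2020 true mean (all students): {mean(fall_2020_true):.1f}")
print(f"Fall 2020 observed mean (tested):   {mean(fall_2020_tested):.1f}")
```

In this toy setup, the observed fall 2020 mean actually lands above the fall 2019 mean, even though every student declined. Reality is messier than dropping exactly the bottom quarter, but this is the direction of bias NWEA itself warns about.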
Alternative Data and Conclusions
Just one week ago, Fairfax County Public Schools (the largest school system in Virginia) released a study of learning as measured by grades across all schools. In other words, this study did not have the same missing 25% of students issue. 1 As reported in the Washington Post [emphasis added]:
Online learning is causing a serious drop in academic achievement in Virginia’s largest school system, according to a Fairfax County Public Schools study, and the most vulnerable students — those with disabilities and English-language learners — are struggling the most.
Between the last academic year and this one, which for most students is taking place remotely, the percentage of F’s earned by middle school and high school students jumped from 6 percent of all grades to 11 percent — representing an overall increase of 83 percent from 2019 to 2020. Younger students were more seriously affected than older ones: Middle-schoolers reported an overall 300 percent increase in F’s, while high-schoolers reported a 50 percent increase. [snip]
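The Post's figures mix percentage points and relative change, which is easy to misread. A quick check shows they are consistent: a jump from 6 percent to 11 percent of all grades is a 5 percentage-point rise, but an 83 percent relative increase:

```python
# Figures as reported by the Washington Post from the Fairfax study.
pct_f_2019 = 6   # percent of all grades that were F's in 2019
pct_f_2020 = 11  # percent of all grades that were F's in 2020

# A 5 percentage-point jump, but roughly an 83% relative increase:
relative_increase = (pct_f_2020 - pct_f_2019) / pct_f_2019 * 100
print(f"{relative_increase:.0f}%")  # prints "83%"
```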
“The pattern was pervasive across all student groups, grade levels, and content areas,” says the Fairfax study, published online this week. “The trend of more failing marks is concerning across the board but is especially concerning for the groups that showed the biggest unpredicted increases … namely our English learner students and students with disabilities.”
Illuminate Education is another assessment company that released a study comparing Fall 2020 achievement with previous years. While they did not have as many students in the study as NWEA did (1 million vs. 4.4 million), they also did not have the 25% loss-of-students issue. In their study:
Learning losses in reading and math were determined by comparing annual growth from fall 2019 to fall 2020 to the average annual growth rate in prior years. Using data from a national sample of students who completed FastBridge’s adaptive reading and math assessments, results revealed modest reading losses across Grades K-8, modest math losses in early elementary grades, and substantial math losses across Grades 4–8.
These losses are generally in line with earlier predictions (Bielinski et al., 2020) that COVID-19 school disruptions would result in lower average fall screening scores.
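Illuminate's loss metric can be read as a difference in growth rates: observed fall-to-fall growth versus the typical growth of prior years. A minimal sketch of that calculation, with invented numbers (the values below are mine for illustration, not Illuminate's):

```python
# Illustrative numbers only -- invented for this sketch, not from Illuminate.
fall_2019_mean = 210.0        # a cohort's mean scale score, fall 2019
fall_2020_mean = 213.0        # same cohort a year later: +3 points observed
typical_annual_growth = 8.0   # average fall-to-fall growth in prior years

observed_growth = fall_2020_mean - fall_2019_mean
loss_in_points = typical_annual_growth - observed_growth

# Expressing the loss in months, assuming growth accrues over a 9-month year:
months_lost = loss_in_points / (typical_annual_growth / 9)
print(f"Loss: {loss_in_points:.1f} points, about {months_lost:.1f} months")
```

Converting points to months depends on assumptions about how growth is distributed across the school year, which is one reason such "months of learning lost" figures should be read as rough estimates.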
Read the study (full results are expected in January, but the early results are descriptive) to understand the methodology, but the key chart below shows estimated learning loss in terms of months for both math and reading across K-8 grades.
None of these studies is conclusive, and there is no perfect method to measure learning. But I do not think it is reasonable to go with the headline that reading results are not suffering in K-12 during the pandemic. There is too large a caveat in the NWEA data, and there is too much contradictory evidence from other sources.
1 I’m sure there were some differences in the student population between Fall 2019 and Fall 2020, but the grades-based method is broader in nature and likely less affected than the assessment sample used by NWEA.