COMMENTARY: Some surprising lessons from COVID learning loss

By Michael Hicks | Ball State University

Last month, my colleague Dagney Faulk and I published a study on COVID-related learning loss in Indiana schools (available at https://projects.cberdata.org). The results were surprising and largely positive — or, at least, more hopeful than I expected. The purpose of this work was to better understand what factors contributed to learning loss. What we know so far has mostly been limited to simple descriptive statistics about changes in test scores. That is a good start, but it cannot speak to correlation, much less causation about learning loss. To do so requires more math.

In a perfect world, we’d have detailed student data over time. Absent that, school-level data provides a pretty good basis for evaluating the effects of COVID and school-level responses to the pandemic on learning loss. Our focus was an examination of how kids of different age groups did on the same standardized test before and after COVID. This approach, along with some statistical modeling, dodges most of the well-known criticisms of standardized tests.

We looked at every public school for grades 3-8 in both 2019 and 2021, the years spanning the big COVID interruption. During this time, the median school saw pass rates on standardized tests of math and English drop by more than 10%. Some schools actually did better through COVID, but the vast majority did not. A handful of schools even experienced a 50% decline in pass rates.
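For readers who want to see the mechanics, here is a minimal sketch of how a school-level learning-loss measure of this kind can be computed. The data are synthetic and the column names (pass_rate_2019, pass_rate_2021) are hypothetical; this is not the study's actual dataset or code, only an illustration of the bookkeeping.

```python
# Illustrative only: synthetic school-level data standing in for the
# Indiana grade 3-8 pass rates discussed above. All names are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_schools = 500

schools = pd.DataFrame({
    "school_id": range(n_schools),
    "pass_rate_2019": rng.uniform(0.2, 0.8, n_schools),
})

# Most schools decline, a few improve, a handful fall sharply.
shock = rng.normal(-0.10, 0.08, n_schools)
schools["pass_rate_2021"] = (schools["pass_rate_2019"] + shock).clip(0, 1)

# Learning loss measured as the change in the share of students passing.
schools["change"] = schools["pass_rate_2021"] - schools["pass_rate_2019"]

print("Median change in pass rate:", round(schools["change"].median(), 3))
print("Share of schools that improved:", round((schools["change"] > 0).mean(), 3))
```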

Even in the best-performing schools, most kids didn't pass both the math and English tests. In the worst, about half of classrooms would have no student who passed both tests. This is frightful data that potentially affects long-term educational attainment and economic growth. Just so no one casually dismisses the last 18 months, I am willing to make a $1,000 wager that the learning loss of this age cohort will still be evident in the 2060 Census. The only question is how big that loss will be and what compensating factors, like resilience and grit, will take the place of classroom proficiency in these students.

Our statistical models of learning loss allow us to measure the effect of each variable jointly. In this way, we control for multiple differences at the same time. For example, in the raw comparisons offered by the Department of Education last summer, African-American students experienced more learning loss, as did children in poverty. However, when we controlled for both race and poverty at once, the statistical significance of race disappeared.

Another way to explain this is that two schools with different racial mixes but the same level of poverty experienced the same level of learning loss. Our study couldn't say what aspect of poverty caused the learning loss, but there are many potential factors, such as the lack of broadband access for remote learning. That should be a fertile area of research for years to come.
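To illustrate the "control for both at once" idea, here is a minimal sketch using synthetic data. The variable names (poverty_share, black_share) are hypothetical and the specification is only a stand-in for the models in the actual study; it simply shows how a race variable that proxies for poverty can look significant on its own and lose significance once poverty is included.

```python
# A minimal sketch of controlling for race and poverty jointly, using
# synthetic school-level data. Variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
poverty_share = rng.uniform(0, 1, n)
# Racial composition correlated with poverty, as in the raw comparisons.
black_share = np.clip(0.5 * poverty_share + rng.normal(0, 0.15, n), 0, 1)
# In this synthetic world, learning loss is driven by poverty only, plus noise.
learning_loss = 0.15 * poverty_share + rng.normal(0, 0.03, n)

df = pd.DataFrame({"learning_loss": learning_loss,
                   "poverty_share": poverty_share,
                   "black_share": black_share})

# Raw comparison: race alone looks significant because it proxies for poverty.
print(smf.ols("learning_loss ~ black_share", df).fit().summary().tables[1])

# Joint model: with poverty included, the race coefficient should sit near
# zero and lose statistical significance.
print(smf.ols("learning_loss ~ poverty_share + black_share",
              df).fit().summary().tables[1])
```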

Our second big finding was that schools that did better on the standardized tests in 2019 saw greater learning loss over COVID. We believe this is due to specialized programs in better-performing schools that could not be readily delivered during COVID. There are many other plausible contributing factors, so it may be some time before the causes are nailed down.

The rest of our findings were really “non-findings” in that most of the differences across schools we could measure were not correlated with learning loss. Race and ethnicity did not play a role in learning loss, nor did the share of English Language Learners. The type of school didn’t matter, whether elementary, middle or combined grades. The size of the school didn’t matter, nor did absenteeism through the year. There was some evidence that declining enrollment increased learning loss, but it is a small effect.

The big surprise was that the mix of instruction (in-person, online or hybrid) had no effect on learning loss. This differs from the raw numbers shared by the Department of Education, but again, without controlling for other factors, those comparisons offer no useful interpretation. I think the explanation for this outcome is pretty straightforward.

Hoosier schools, like many around the nation, struggled with scheduling and educational decisions through the 2020-2021 school year. No doubt every superintendent and school board wrestled to balance several priorities, such as health, learning and enrollment. But, in the end, most of the decisions could be reduced to a single trade-off: learning loss from online instruction or learning loss from quarantine and isolation. Here's how this works.

Suppose that schools that chose to go wholly online minimized disease spread in the school but maximized learning loss due to online instruction. Alternatively, schools that accepted COVID risk and went fully in-person minimized learning loss due to online instruction. However, in so doing, they would have experienced more learning loss due to individual student and staff quarantine and isolation. Either way, there is a risk of learning loss. The trade-off was not between health and learning, but between two different types of learning loss.

Here's where statistical modeling of this type is so necessary to understand how these policies affected learning. If, on average, Indiana schools misjudged this trade-off and spent too much time in online instruction, or too much time in person, that would appear in our statistical model. But if, on average, schools balanced instructional settings effectively across the year, then no particular instructional mode would be correlated with learning loss.
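Continuing the hypothetical setup above, this sketch shows how an instructional-mode variable would, or would not, show up in such a model. The mode variable and its categories are illustrative names, not the study's own; the point is simply that if mode adds nothing beyond the controls, its coefficients sit near zero and a joint test fails to reject.

```python
# Hypothetical illustration: synthetic data in which poverty, not
# instructional mode, drives learning loss. All variable names are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
n = 500
poverty_share = rng.uniform(0, 1, n)
mode = rng.choice(["in_person", "hybrid", "online"], size=n)
# Under the "balanced trade-off" story, mode adds nothing beyond poverty.
learning_loss = 0.15 * poverty_share + rng.normal(0, 0.03, n)

df = pd.DataFrame({"learning_loss": learning_loss,
                   "poverty_share": poverty_share,
                   "mode": mode})

with_mode = smf.ols("learning_loss ~ poverty_share + C(mode)", df).fit()
without_mode = smf.ols("learning_loss ~ poverty_share", df).fit()

print(with_mode.summary().tables[1])      # mode coefficients near zero
print(anova_lm(without_mode, with_mode))  # joint F-test of the mode dummies
```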

Of the many dozens of statistical tests we performed, none indicated a correlation between learning loss and instructional setting. In other words, we found no support for the hypothesis that instructional mode (in-person, online or hybrid) was correlated with learning loss in Indiana.

This is an important finding for schools, policymakers and taxpayers. The COVID pandemic was a difficult time for schools. While I believe the Holcomb Administration provided clear and consistent guidance, the federal CDC communications could hardly have been more confusing. The federal failures helped fuel mistrust and frustration that surely made instructional decisions very difficult for school boards and superintendents.

No doubt there are many lessons to be learned from COVID, and some schools did better than others. But with the available data and analysis about learning loss, Indiana schools appear to have done about as well as was possible. That should give the rest of us a great deal of confidence that they will attack the problem of learning loss with the same good judgment.

Michael J. Hicks, PhD, is the director of the Center for Business and Economic Research and the George and Frances Ball distinguished professor of economics in the Miller College of Business at Ball State University.
