Is summer learning loss real? Research sends mixed signals.
This story originally appeared on HeyTutor and was produced and distributed in partnership with Stacker Studio.
Summertime can mean many things: vacations, lazy days at home, and ice-cold sweet treats, but definitely not schoolwork.
Many kids eagerly await summer during the school year, hungry for an extended break. Parents might feel the same, excited about extra time with their children, but not without worrying that all the free time could lead to the dreaded summer slide, also called summer learning loss or summer setback. "What if my child forgets everything they learned in school the previous year and shows up to the first day unprepared?"
In reality, summer learning loss is controversial, with educators and academic researchers often disagreeing on whether it is even worth worrying about. Even though American students have been taking summer vacations since the late 19th century, when there was a push to standardize school calendars across urban and rural areas, research on the topic dates back only to 1906, with an uptick in interest in the 1970s.
For that reason, HeyTutor, an in-person and online tutoring provider, analyzed academic research to see how much summer breaks affected student learning. The results were mixed.
While many studies showed the summer slide was real, some proposed this decline in academic ability was only temporary. Others found students did forget much of what they learned the previous year, with mathematics in particular showing a steep drop-off. Still other studies found minimal evidence of a summer slide at all.
Demographics matter during the summer
The fear behind summer learning loss is simple: students leave the school year full of new knowledge but enter the next having forgotten important foundational information, unprepared to tackle the next level of material.
When comparing test scores collected across four different studies, the Northwest Evaluation Association (NWEA), an academic assessment company, found that scores tend to flatten or drop during the summer, with larger drops in math than in reading. Score drop-offs also tended to be more significant for younger students: in NWEA's calculations, based on data from i-Ready, MAP Growth, ECLS-K:2010-11, and STAR, students in kindergarten through fifth grade showed larger declines from their prior spring scores than students in sixth through eighth grade.
There have also been noted differences in test scores across demographics, with some studies suggesting that students in wealthier households show less summer slide than students in lower-income households. Authors of the Baltimore School Study, which began in 1982, suggest it's because parents with higher incomes can provide more guidance on how to navigate the school system and also better shoulder expenses like books, games, and computers.
However, a 2019 study in the journal Sociological Science that analyzed MAP Growth test score data found that summer declines for students in high-poverty schools were not statistically different from those of students in low-poverty schools, suggesting that summer break may play a smaller role in test score gaps across poverty levels than often assumed.
Other studies found a higher rate of summer learning loss among Black and Latine students than white students.
It's possible that some of the differences in summer learning loss across demographic groups can be attributed to access to school-based summer programs. While these programs consistently help to raise test scores on average, they are also quite expensive and require regular participation and attendance to demonstrate positive effects—all of which might be more difficult for families with lower incomes.
Unfortunately, it's difficult to quantify just how much summer learning loss truly affects students, because studies on the subject often reach opposing conclusions. There are also concerns about the data underlying much of this research.
Doubts about data change views on summer setback
In 2019, Paul von Hippel, a professor at the University of Texas at Austin known for his research on summer learning, published an article challenging previous research on summer learning loss, pointing out that he and his colleagues could not replicate earlier test score findings.
Additionally, many researchers rely on dated test scores, some from as far back as 1982, drawn only from children who attended public schools in a single city and finished eighth grade at the same time, in 1990. More recent data from a wider range of locations would make for a stronger evidence base. Scoring methods have also changed over the years, and newer test score data shows much smaller gaps.
Von Hippel also noted that the gaps researchers identified between low- and high-income students emerged before children turned 5 and remained consistent as they progressed through school, which led him to question whether summer vacations and the supposed learning loss that accompanies them could really be the culprit behind those gaps.
Despite von Hippel's skepticism about the validity of studies demonstrating summer learning loss, other researchers insist the phenomenon is real, even if there are discrepancies in the data. When Brookings compiled summer learning loss data from multiple sources, it found that a test score drop-off showed up consistently, even if the severity of the issue remains contested.
It makes sense, too: many adults admit they don't remember much of what they learned in school, according to a study published in American Educator. Even if the summer slide is slight, working to maintain children's knowledge over their vacation can only improve their school experience. Research from the RAND Corporation's six-year National Summer Learning Project indicates that effective summer programs encourage regular attendance and participation and run for at least five weeks of the summer, three hours or more per day, improving students' performance and setting them up for future success.
Story editing by Carren Jao. Additional editing by Kelly Glass. Copy editing by Paris Close. Photo selection by Ania Antecka.