Dive Brief:
- State-level assessment officials and testing experts have expressed concerns that results from screening assessments used to gauge learning loss from the "COVID slide" could have been skewed by a number of factors, including parental help.
- Curriculum Associates, maker of the i-Ready test adopted by some school districts, found that students who took its diagnostic exams remotely showed score improvements compared to previous years. "More so than a beneficial impact of home-based schooling, the data likely reflect well-documented concerns about testing at home, even for low-stakes, diagnostic assessments," according to a report summarizing national data.
- However, whether and why results may be skewed are difficult questions to answer when students are remote, said testing experts, who cautioned against relying too heavily on any single assessment result.
Dive Insight:
Districts were looking to fall assessment data to gauge learning loss or gains that occurred during remote learning in the spring following COVID-19 closures, as well as over the summer break that followed. Previously, testing experts had warned against misusing results.
"When there is a lack of standardization, assessment results must be considered with an abundance of caution," said Jeremy Heneger, director of statewide assessment at the Nebraska Department of Education.
Jadi Miller, director of assessment for Nebraska's Elkhorn Public Schools, said she has heard from parents who assisted their K-1 children on remote assessments. The district discovered its younger students were being helped when children who would usually need audio support to understand and answer K-1 questions were instead answering 2nd-grade-level questions without it.
"I don’t think that parent was trying to do anything other than solve a technology problem as they perceived it," Miller said. "They were not trying to game the system or have an inaccurate score or cheat."
The district will be offering remote students a chance to take winter assessments in person.
While Miller's district was able to pick up on irregularities, other districts may find pinpointing them more difficult and "really hard to substantiate," Juan D'Brot, senior associate at the Center for Assessment, told Education Dive in an email.
D'Brot said focused and non-threatening surveys could help districts determine whether parents assisted students, suggesting, "Something like a two-parter: 'Did your child need any assistance to successfully complete this test?' and 'If your child needed help to take this test, were you able to assist them successfully?'"
In the meantime, Miller warned against relying too heavily on any single score. "Our perception is that those scores are not necessarily reflective of what students could do. But they could be," she said.
Instead, she suggested reading the diagnostic scores as part of a larger trend. "It’s another piece of data, but if it really feels like it doesn’t fit, then that’s when you look at the whole picture," she said. "If it does fit into the whole picture, then it’s just another thing to confirm that."
"I think we as a whole need to be really careful about making high-stakes decisions from assessment this year, in part due to districts that are operating remotely," said NWEA's senior research scientist, Megan Kuhfeld. "It can introduce new variables into the testing environment such as students who may be taking assessments on their kitchen table surrounded by siblings or caregivers who may be inadvertently assisting or distracting."
Kuhfeld added that some districts are flagging students showing unusual test patterns, but scores can still serve as an effective gauge of learning and inform instruction.
When students fare worse, Miller said, "We don’t ever want to sound alarm bells just because of one score."