Test score changes

To err is human; corrections can be divine

How KDE, schools and districts work toward accountability accuracy

Kentucky School Advocate
January 2016

By Madelynn Coldiron
Staff Writer

For leaders in Mayfield Independent Schools, the announcement of 2015 K-PREP scores in the district was definitely sweeter the second time around.

That’s because the updated scores were sweeter. Mayfield Independent is one of five districts – and its high school one of 11 schools – that saw a change for the better in their original scores after the Oct. 1 public release of the data.

“It was a lot of excitement the second time around,” Superintendent Joe Henderson said. But, he acknowledged, the good news probably lacked some of the community impact it might have had if the improved scores had been available for the first announcement.

Mayfield Independent Schools Pupil Personnel Director Gavin Thompson and Title I secretary Lea Ann Crawford get balloons ready for a celebration of the district’s K-PREP results. The district-wide celebration featured a cookout and was held during a teacher professional development day. The district and its high school both received revised improved accountability labels after the public release of K-PREP results. (Photo provided by Mayfield Independent Schools)

During the transition when Henderson moved up from district assessment coordinator to the top job this year and the new DAC was getting established, the district did not catch that some of its college and career readiness students were not included in the 2015 K-PREP results.

“It’s totally on us,” Henderson said.

When scores were released statewide, Mayfield Independent’s news release noted that the district expected an adjustment would be forthcoming. It later turned out that with the updated data, the high school moved from proficient to distinguished, and added the progressing and high-performing labels on top of that. The district itself was bumped up from progressing to high progress.

With millions of pieces of data on tens of thousands of students tested each year, how do errors occur – or to look at it another way – why aren’t more mistakes made when crunching the information?

The process of calculating K-PREP results actually begins well in advance of the assessment window in the last two weeks of a school year, said Rhonda Sims, the Kentucky Department of Education’s associate commissioner of assessment and accountability. During the test roster process in early spring, KDE and schools begin reviewing student enrollment and demographic data submitted by schools to Infinite Campus, the state’s student data collection system, to ensure it is accurate in preparation for matching it up with individual student testing results that will come later. “Getting your rosters right” in this secure, Web-based application is a major focus with test administration training, Sims said.

After K-PREP scores come back from the testing vendor, the education department opens a “cleanup window,” usually the last two weeks of August, so schools get a final chance to check for problems before public release of results in the fall.

“Our goal has been, in a collaborative fashion, let’s have the data as clean as it can possibly be by the time we do our first public reporting,” Sims said.

After the late August check, KDE hosts what it calls a quality control day webcast in mid-September. While schools may spot errors at this point, the individual scores are already locked in – at some point there has to be a cut-off so the process can advance, Sims said – and can’t be changed at that stage. That’s the situation Mayfield Independent found itself in.

Because of the cut-off, the quality control day focuses on systems glitches that involve multiple districts, usually just small-scale issues, Sims explained. “If it’s something that’s a programming system problem, we correct it at that point, but we don’t make individual changes,” because those files are “locked.”

There is yet another window after the public release of K-PREP scores, this one via a longstanding state regulation that requires a 10-day period during which schools and districts can request data review changes. “The typical kind of thing you might have for data review changes is a student was not enrolled in your school or program for those 100 instructional days that we call a full academic year,” Sims said. “Every year we will get a few changes during that time but it’s really pretty minimal.”

2015 data changes
For perspective on the numbers, Sims said there were some 16,630 changes that schools and districts requested in the spring – half as many as the previous year – and an additional 6,918 in August before public reporting this year. “Probably the highest number of those come out of looking at which kids are signed up to take the different end-of-course tests,” along with changes made due to accommodations involving special education designations, she said.

Those end-of-course exams, and the various avenues to college and career readiness, create more “clean-up” at the high school level, Sims added. Other minor changes might be a missing test score for a child, or a student not being coded for free or reduced lunch.

On quality control day, KDE found just one problem that affected about 20 schools, and made the fix. Then, after the public reporting this year, there were 388 changes requested during the official data review, but just a fraction of those triggered a change in classification for schools and districts, and almost all of those were positive.

“If a school had a label change, like they came out of priority or they were no longer focus, we will reach out by email and confirm all that with the DAC and we’ll also call the district,” Sims said. “If there’s anything that’s negative, we’re going to reach out with a phone call and follow that with an email.”

This is what turned up during the 10-day review period following public release of scores this year:

• 12 schools and five districts saw changes that affected their labels.

• Four of the five districts became high-progress districts.

• One district added the progressing label.

• One school is no longer in priority status.

• Two schools moved out of focus status.

• Three schools moved from needs improvement to proficient.

• Three schools were changed from proficient to distinguished.

• Two schools added the progressing label.

• One school dropped from distinguished to proficient.

Sims stressed that KDE has tried to work in partnership with district assessment coordinators and looks for ways to save time for schools in this process.

“It really helps everybody to have ownership in the data being accurate because we know there are consequences tied to it, both publicly as well as parent questions, and federal and state accountability rules,” she said. 