SALT LAKE CITY — State-level RISE testing results are strikingly similar to previous years’ SAGE test data, according to state and national assessment experts, but local officials point to “red flags” in some school and district-level data.

The Utah State Board of Education voted unanimously Thursday to direct its staff to ask the Utah Legislature for “flexibility from the state accountability system,” which could mean no assignment of letter grades on upcoming school report cards.

Current Utah law requires that state education officials assign letter grades to schools on state report cards, and Democratic-led efforts to eliminate letter grades as part of the state accountability system have failed the past two legislative sessions.

However, SB220, Student Assessment and School Accountability Amendments, passed by the Utah Legislature in 2017, may give the State School Board latitude to forgo letter grades for the 2018-19 academic year.

The legislation says, in part, that in a school year in which the State School Board determines it is necessary to establish a new testing baseline to determine student growth due to a transition to a new assessment, “the board is not required to assign an overall rating ... to a school to which the new baseline applies.”

The state board also directed staff to add a disclaimer on the Utah School Report Card website that notes interruptions in RISE testing last spring, along with links to reports on three analyses of the data.

RISE stands for Readiness, Improvement, Success and Empowerment. The State School Board selected RISE as a replacement for SAGE testing, short for Student Assessment of Growth and Excellence.

While individual classrooms and schools experience some challenges with assessment each year, the state experienced “systemwide, high visibility interruptions of service” in RISE testing’s inaugural year.

Assistant State Superintendent of Student Learning Darin Nielsen presented the statewide results to the board, along with an explanation of three separate analyses of the data and a recommendation that the State School Board move ahead with accountability calculations for 2018-19.

He acknowledged school districts and charter schools had expressed “an uneasy feeling” regarding the test results after testing interruptions in the spring.

But multiple analyses indicated the statewide data was highly similar to SAGE testing data from previous years. One important difference was that fewer students opted out of RISE testing compared to SAGE. Statewide, there was nearly 95% participation in RISE testing, Nielsen said.

“The more data we have in [the] system, the more opportunity for the validity of the data,” Nielsen said.

Scott Marion, executive director of the New Hampshire-based Center for Assessment, which reviewed the RISE results under contract with the State School Board, told the board that its analysis found little difference in mean outcomes between students whose test experiences were interrupted and those who experienced no disruptions.

Marion said growth percentiles between SAGE and RISE testing were also “quite stable.”

One notable difference was that the RISE test had no writing assessment, which typically means higher language arts scores for male students because girls tend to be stronger writers, he said.

Otherwise, there were few perceptible changes in the mean or variability, he said.

Marion was complimentary of Nielsen and his staff’s efforts to conduct multiple analyses of the test data, which included asking assessment officials in Jordan and Salt Lake City school districts to review their district-level results for anomalies.

Marion likened the Center for Assessment’s review to buying a used car.

“We kicked the tires. We looked under the hood. We took it for a test drive on bumpy roads, on smooth roads. We just kept looking and couldn’t find it,” he said, referring to vast differences in test data between RISE and SAGE.

State Superintendent of Public Instruction Sydnee Dickson, in a statement, acknowledged that testing interruptions posed difficulties for students and teachers.

“The good news is, this analysis shows consistency between this spring’s tests and previous years, giving us a chance to acknowledge increased student performance and a chance to offer help where needed,” Dickson said.

Board member Cindy Davis said while the state board acknowledges “the macro-level, quantitative data, we want our boots on the ground to know we hear the qualitative data (stories of how the service outages affected individual schools and students) that you are bringing to us. That qualitative data is harder to quantify. We are on the same team.”

Ogden City School District Superintendent Rich Nye said he has examined the test data for Ogden schools and the district has “identified some red flags that indicate we have areas of concern, areas we know had testing interruptions, so we’re exploring that a little bit further.”

Nye said he and other educators across the state “own our students’ results. We want to see them do well. When they don’t, we make improvements where we need to so they will. Where this comes into play, though, it’s hard to own results that statewide there’s a question of confidence as to their validity.”

Despite the assurances about the consistency of statewide test data, Terry Shoemaker, who represents Utah school boards and school superintendents, said schools’ confidence in the district and school-level data “is fairly negative.”

“We don’t feel this has been a fair session for our students and I think that’s been a challenge for us,” Shoemaker told the board.

“We lack confidence that this really is going to do what we want it to do in terms of accountability,” he said.

The State School Board had a multiyear contract with RISE vendor Questar but voted in June to terminate it after technical issues and several missed deadlines.


Two months later, the board resumed its relationship with American Institutes for Research, maker of the SAGE test, entering into a three-year contract for $21.6 million.

Earlier in the meeting, Utah Education Association President Heidi Matthews urged the State School Board to ensure no student, teacher or school is harmed by flawed data.

“I’m a teacher and we invented tests. We love tests because they tell us what our students are learning and they inform our instruction so that we’re able in the moment in our classrooms to be able to make the necessary adjustment for their learning. Tests are not designed to be a tool for blame, shame or punishment,” Matthews said.

Instead, she challenged the board to shift its focus to “assessment for meaningful academic improvement and supporting initiatives that generate sustainable improvement through comprehensive metrics.”
