Editor's Note: One of the hallmarks of Oregon's "good" and "open" government tradition has been the integrity and transparency of state agencies, from state employees to their elected CEOs.
However, the Department of Education (DOE) under State Superintendent Susan Castillo seems to be steering through very troubled waters when it comes to correctly scoring the test results required under NCLB.
Upon review, DOE has had to adjust assessment scores for 2006-07; earlier in the year it reported a failure of the computer-based testing system, which forced schools to move to paper-and-pencil tests.
A year or so ago there was also a story about DOE's failure to monitor a local school district's spending of state funds, a lapse that continued for several years and cost Oregon taxpayers millions of dollars.
RAD has opined many times in this blog that "high stakes testing" is not good education and often leads to cheating by students, teachers and schools. Now we find DOE may be part of the problem too.
RAD thinks it's time for Secretary of State Bill Bradbury to do an audit of DOE to examine these two incidents and to see if other management issues exist there. As they say, where there is smoke, there is fire!
On September 24, 2007 the Oregon Department of Education sent out the following news release:
SALEM – State Schools Superintendent Susan Castillo announced mixed results today for Oregon students' performance on the 2006-07 assessment tests in reading, writing and math. Reading test scores showed modest improvement in every grade level, with 6th and 7th grades showing the most improvement. Math scores went down for 3rd, 4th and 5th grades and showed modest improvement or stayed the same in 6th through 10th grade. Writing scores were up slightly for 4th and 7th grades and stayed the same for 10th grade.
This was the first year that assessments were administered under the new achievement standards (cut scores) for math and reading adopted by the State Board of Education earlier this year. For purposes of comparison, the 2005-06 data was recalculated using the new standards. The combination of raising the achievement standards in the same year we experienced the unanticipated switch to a paper-and-pencil assessment makes it difficult to identify what caused the elementary decline.
“This is a significant year for our students and schools,” said Castillo. “The benchmark for achievement has been set. I have the utmost confidence that our students, educators and parents will rise to meet these expectations. We know that as schools fine tune their focus on reading and math as part of their work on implementing the new high school diploma requirements, students will make sustained improvements in these areas. I am committed to working with schools as we move forward to implement the new diploma requirements.”
“We can be confident that the scores provide schools with a reliable tool as they continue to take strides in providing a rigorous and relevant education for each and every student,” said Castillo. “Our schools faced considerable hardship this year due to the last minute switch to paper-and-pencil. I am very proud of the hard work done everyday in our classrooms across the state. Oregon’s teachers have incredible commitment to their students and can take pride in the impact they have on student achievement.”
Oregon tests students at grades 3-8 and tests high school students at grade 10 for math and reading. This year, students were not required to test in science due to the switch to the shortened paper-and-pencil test.
Then on October 2, 2007 the Department sent out this press release:
Oregon Department of Education to Revise School Report Card Formula to Ensure Fairness, Consistency
SALEM – State Schools Superintendent Susan Castillo announced today that the Oregon Department of Education will adjust the report card formula by 2 points [lower] in the performance index scale and by 1 point [lower] in the improvement index scale to account for the impact of the use of a shorter test with fewer test items administered to students only one time. With many students having fewer opportunities to test last year and the state using a shorter test that measured very high and very low performing students less precisely, the Department chose to make a one-time adjustment to ensure that the report card ratings are fair and consistent.
There were many changes in the assessment system during the 2006-07 school year, including a change in the achievement standards (cut scores), implementation of a new extended assessment, changes in the participation rules and a temporary switch to a short paper-and-pencil assessment. In spring 2007, the Oregon Department of Education notified school districts that it would consider a revision to the Report Card formula if a review of the data revealed a systematic issue [i.e. reporting errors]. Three circumstances are significantly different in this test administration from prior administrations:
* Some students took a pencil and paper test for the first time.
* Many students had only one opportunity to take the test.
* The test provided 30 items (short version) to allow administration during one test period.
This number of items is statistically sufficient to yield an overall valid score but does not allow the same number of test items at the high and low ends of the performance scale.
RAD: What these press releases obfuscated is that not only was there a counting error, attributed to a computer coding error, but that statewide results were neither "reliable," despite Castillo's assurance above, nor as accurate or as high as initially reported. Why is RAD not surprised?
Oregonian reporter Betsy Hammond's article "Oregon gets downgraded for scoring tests wrong" in the October 3rd Metro section reported that "The Oregon Department of Education flunked its own accuracy test last week." Hammond's article notes that the "...biggest error came in third-grade reading..." which saw scores dip by 2 points.
But the reporting error turns out to be more serious, because at "...nearly 100 elementary and middle schools, passing rates on state reading tests were actually 5 or even 10 percentage points lower than the state reported. Sixty schools scored at least 5 percentage points worse than first reported in math or writing or both..."
While the pointy heads in DOE have taken responsibility for the error, what if this is just the tip of the iceberg? In computer lingo, what if this is another case of "garbage in, garbage out" - bad data leading to false analysis? Yes, it's time for Secretary of State Bill Bradbury to see who's minding the store in Oregon's DOE!