Last year, Measured Progress leadership and contracted staff scored more than 35 million student responses to open-response questions on statewide assessments.
For each contract, qualified scorers evaluate student work in English language arts, mathematics, science, and social studies in grades three through eight and high school. The scoring team also evaluates alternate assessment portfolios and performance tasks.
Most state general assessments include two types of test items: multiple-choice and open-response. The process of scoring the two types of questions differs significantly.
Multiple-choice items are scored electronically using scanning processes.
Open-response items—assessment questions that require students to write a short or long answer—are also scanned, but must be scored by human scorers.
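The split described above can be sketched in code. This is a hypothetical illustration only, not Measured Progress's actual system: the item fields and function names are assumptions made for the example.

```python
# Illustrative sketch: route multiple-choice items to machine scoring and
# open-response items to a human scoring queue. All names are hypothetical.

HUMAN_QUEUE = []  # open responses waiting for trained scorers

def score_electronically(item):
    # A scanned bubble sheet reduces to a selected option; compare to the key.
    return 1 if item["selected"] == item["key"] else 0

def queue_for_human_scoring(item):
    # Open responses are scanned, then held until a qualified scorer reads them.
    HUMAN_QUEUE.append(item)
    return None

def route_item(item):
    """Send each item down the scoring path appropriate to its type."""
    if item["type"] == "multiple-choice":
        return score_electronically(item)
    return queue_for_human_scoring(item)

print(route_item({"type": "multiple-choice", "selected": "B", "key": "B"}))  # 1
route_item({"type": "open-response", "response": "The water cycle begins..."})
print(len(HUMAN_QUEUE))  # 1
```

The key point the sketch makes: the machine path ends with a score immediately, while the human path only ends with an item in a queue; everything after that is the scoring process described below.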
A Glimpse into the Scoring Process
Scoring open responses is a disciplined process, not an art. It is an objective practice subject to ongoing evaluation by both scoring leadership and our scoring system, iScore.
Measured Progress scoring leadership has considerable experience in scoring short and long student responses and in both holistic and analytic scoring. The leadership team, along with contracted scorers, scores student portfolios, performance tasks, and alternate assessments.
Scorer accuracy is monitored on an ongoing basis. We recruit scorers who represent a diverse range of educational, professional, and ethnic backgrounds.
Measured Progress's Web-based, distributed scoring system, iScore, manages workflow, measures score reliability, monitors progress, and captures data. iScore also enables scoring leadership to evaluate each scorer's accuracy in real time.
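One common way systems of this kind measure score reliability is an exact-agreement rate: the percentage of responses on which a scorer's score matches an expert-assigned score. The sketch below is an assumption about how such a check could be computed; it is not iScore's actual implementation, and the function name and sample scores are invented for illustration.

```python
# Hypothetical reliability check: exact agreement between a scorer's scores
# and expert "check" scores on the same set of student responses.

def exact_agreement(scorer_scores, expert_scores):
    """Return the percentage of responses where the two score lists match."""
    matches = sum(s == e for s, e in zip(scorer_scores, expert_scores))
    return 100.0 * matches / len(scorer_scores)

# Example: eight responses, scored on a 0-4 scale (invented data).
scorer = [3, 2, 4, 2, 1, 3, 4, 2]
expert = [3, 2, 4, 3, 1, 3, 4, 2]

rate = exact_agreement(scorer, expert)
print(f"{rate:.1f}% exact agreement")  # 87.5% exact agreement
```

A real-time version of this check lets scoring leadership flag a scorer whose agreement rate drifts below a threshold and retrain or remove them before many responses are affected.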