Another bite at tests and the common core

Since there was a lot of interest in my posting on Florida’s reaction to the dramatic drop in writing test scores after the tests were revised to include grammar and punctuation standards, I wanted to share an Education Week article that reports similar scenarios emerging around the country.

For example, in Michigan:

The Michigan Merit Exam for high school students (which, like Kentucky’s test, incorporates the ACT) has been redesigned this year so that students who score at or above the “proficient” level in a subject should be able to get at least a B on the freshman-level college exam in that subject at a public university in Michigan.

Cutoff scores for proficiency on the Michigan Educational Assessment Program, or MEAP, given in grades 3-9 each fall, also increased significantly this school year. Students needed to get 65 percent of answers correct to pass, instead of the previous standard of 39 percent.

Based on that new cutoff—not because of a change in the test itself—math proficiency rates statewide on MEAP in all grades dropped by roughly 35 percentage points from 2010 to 2011, when the new standards went into effect, said Joseph Martineau, the director of the office of educational assessment and accountability at the Michigan education department.

In terms of common-core readiness, “We feel like we are a little bit ahead of the game, and that will serve us well when we’re going into this situation, when we’re taking this test that is more rigorous,” Mr. Martineau said.

To help the public understand the impact of the new cutoff scores, the department last November released information illustrating how much MEAP scores in each of the past four years would have dropped if the new standards and scoring had applied retroactively. For example, applying the new cutoff scores to 2010 results, only 35 percent of 3rd graders would have scored proficient, instead of the 95 percent deemed so currently.

http://www.edweek.org/ew/articles/2012/06/06/33testing_ep.h31.html?tkn=VMQFOW6AUFfktFmY804YeCMtvF34EOkC5UmF&cmp=clp-edweek

Since I think we often set the “proficiency” bar ridiculously low, I generally applaud state moves to raise standards. But the poor job most states have done explaining the common core makes me nervous about the next phase.

3 comments

  1. Carolyn Sharette

    This is exciting news! States raising standards, even though they are completely aware that their proficiency numbers will go down dramatically. This is what is needed and it will signal the beginning of improved instruction. Why?

    Because teachers really DO want to be successful, and they want their students to be successful! States have been communicating to teachers that their students are proficient when they achieve only 40, 50, or 60 percent on state exams. But they don’t clearly communicate that that is what they are doing – they just give the students a “substantial” or “sufficient” score, and teachers often don’t even see what that actually equates to!

    When we educated our teachers on how low a “sufficient” score may actually be, they started asking to see the raw scores, learning about cutoffs, etc. Teachers who understand what is going on are the biggest advocates for raising standards! They want students to be TRULY successful, and they don’t consider 50% on an exam a success!

    Once parents are also educated, we will see a drive upward in instruction and expectations, and achievement levels will rise. Let’s encourage Utah to adopt the 1-2 punch:

    1. Communicate to parents, teachers and the public – clearly – what “passing” means on our state tests – what the percentage cutoffs are – instead of giving a 1, 2, 3 or 4 “score” that masks the true performance of the student.

    2. Raise the cutoffs – not gradually and over time – just do it immediately, and the shock factor of so many “failing” students will energize the teachers, parents and students to “get real” and step it up to true academic achievement. It would be exciting to see how quickly our achievement levels would rise when people really knew how low their student was performing.

    • Mary McConnell

      Thank you, Carolyn!

      I think a lot of teacher resistance to performance data reflects a fear that this data will be used only to punish or humiliate. This is not an unreasonable fear, but smart school districts (and legislatures) could counter it by demonstrating that data will be used to inform and drive improvement. This is one reason why I’ve come around to the view that individual teachers’ value-added scores shouldn’t be published in the newspaper. (I’m a lot more willing to make them available to parents at the school.)

      I know that I pored over the AP test statistics for my students every year, hunting for clues about where I could improve. The College Board broke down multiple-choice questions by topic, for example, so I’d know right away if my students missed an inordinate number of questions on, say, the Constitution or the Industrial Revolution. I could also see what kind of essay scores they were receiving on each question. The news wasn’t always encouraging, but it was ALWAYS useful. I made some changes every year based on this data, and it still informs my online essay instruction.
