Education, Standardized Testing — May 11, 2017 at 5:00 pm

The media’s infatuation with “data”, and why it’s so wrong

The Hartford Courant recently published a stunningly uninformed editorial on the Connecticut State Board of Education's decision to eliminate the use of student test scores in teacher evaluations, a move widely praised across the education community.

The Courant’s position was clear–and a blend of equal parts ignorance and arrogance:

The state Board of Education recently agreed not to include student performance on state standardized tests as part of teacher evaluations, making a broken system even worse by removing almost all accountability. Now individual school districts can continue to evaluate teachers almost completely on squishy criteria that don’t need to be based on clear data about whether they are doing their jobs.

The editorial then tries to present its case for the power of “data” as a sort of “silver bullet” with respect to uncovering “the truth” about teacher effectiveness. I’ve taken the liberty of responding to each of their points below…

“Data, for all their limitations, are real.”

Except when they are not–as in using student test scores in math and reading to evaluate teachers of social studies and science.

“Data present a clear picture.”

A clear picture of unrelated events: in-school factors, like student test scores, account for between 0 and 7% of the differences in student learning; out-of-school factors, such as socio-economic status, parents' income levels, and access to the internet, account for more than 90%.

“Data are impartial.”

Except for Value-Added Measures, which the American Statistical Association has found to be invalid and unreliable when used to evaluate teachers, which is to say that the data derived from student test scores are not measuring anything related to teacher effectiveness.

“Data are not interested in protecting anyone.”

Except when data are used to target veteran (read: more expensive) teachers for termination, as happened 221 times in Houston's schools last year alone. Then they're used against the lowest-paid staff and those without the protection of tenure, who often voice their opinions about ineffective education policies.

“Data get to the truth.”

Except when it comes to evaluating teachers, where the data are narrowly defined as test scores, and aren’t telling “the truth” about anything. As my friend Ann Cronin writes:

In addition to student test scores not being solely the product of a single teacher, the test itself is not a good way to measure student performance. The editorial stated that SBAC, the standardized test Connecticut uses, has been “painstakingly designed to provide objective and uniform data about whether the students are learning their lessons”. But what lessons would those be?

The lessons of an English language arts teacher that promote literacy are lessons for students in using writing as a tool for learning, lessons in learning to write to express narrative or argumentative thinking or to explore a question, lessons in expanding and refining their thinking by revising their writing, lessons in learning to collaborate (to listen and speak to one another in order to deepen and broaden their individual thinking), lessons in learning how to question in increasingly deep and complex ways, lessons in creating meaning as they read, and lessons in exploring multiple interpretations of what they read. And none of that is on a standardized test. If the English language arts teacher teaches lessons that match the test, that teacher is teaching test prep, not literacy.

“Data are not stories concocted by administrators to ‘nurture’ ineffective teachers. A collection of stories does not add up to data.”

Except for qualitative, narrative, case-study, and program-evaluation research, all of which rely on anecdotes, vignettes, interviews, and observations (in other words, stories) to describe the rich, interesting, and varied actions and behaviors of real, living teachers and students engaged in learning in school communities.

Kind of like what good journalists do daily to help us understand complex, complicated human interactions in their neighborhoods and communities, except when they use those “silver bullets” to shoot themselves in their metaphorical feet, writing about “data” in ignorant, uninformed ways.

As I’ve written here before, the use of student test scores to rank and sort teachers is an invalid and unreliable use of the “data” derived from those scores. What’s more, as we’ve seen in New York State, where veteran fourth-grade teacher Sheri Lederman won her lawsuit over the use of student test scores in determining her evaluation rating, it’s becoming clear that the practice is not legally defensible, either.

Congratulations to Connecticut’s State Board of Education for realizing that fact, and changing their policy on teacher evaluation.

And shame on the Hartford Courant for perpetuating old, uninformed notions of the use of “data” in teacher evaluation systems. Now, go put some bandages on those feet. You’re going to need them.

  • Jeff Gaynor

    It’s urgent that we take a critical look at how learning is being affected by what is termed “data-driven instruction.” This is not just about teacher evaluation (though that is serious, as it drives instruction) but about the fundamental experience students have throughout school.

    And yes, congratulations to Connecticut’s State Board of Education.
