Detroit, Education — March 4, 2014 at 6:49 am

UPDATED: In-depth analysis of MEAP data for the EAA shows a striking lack of “achievement” & significant declines in proficiency

by

Several days ago, I wrote about how the MEAP test scores of students in the Education Achievement Authority experiment in Detroit fail to support claims that students in the system are actually making progress. Further analysis of the “cohort data,” which track individual students from one grade level to the next, shows that the situation is actually much worse than the broad mean proficiency scores suggest.

Dr. Thomas Pedroni, Director of the Leonard Kaplan Education Center for Critical Urban Studies and Associate Professor for Curriculum Studies at Wayne State University, analyzed the raw cohort data and his findings are staggering. He published his analysis in a blockbuster piece titled, “MEAP cohort data reveal stagnation and decline in EAA student test achievement”.

Because the cohort data enable MDE [Michigan Department of Education] to track individual student progress from year to year, they provide us with the most reliable picture of student test performance and test score growth over time. Unlike proficiency scores that tell us the proportion of students who met MDE’s proficiency cut score, cohort data use students’ mean scale scores to chart student growth. Since mean scale scores are based on students’ raw test scores, they give us a picture of student achievement test growth even if students have not yet obtained proficiency.

The cohort data are especially important because, as the EAA has rightly maintained, students who start out so far behind might take a few years to reach the proficiency cut score, even if they are making steady progress from year to year. Thus, while proficiency rates are not a good measure of whether or not the EAA’s students are progressing on tested curriculum, the cohort data are.
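To make Pedroni’s distinction concrete, here is a minimal Python sketch. The cut score and scale scores are made-up numbers for illustration, not actual MEAP values: a group of students can stay below the proficiency cut, leaving the proficiency rate flat at zero, even while their mean scale score registers real growth.

    # Illustrative only: invented cut score and scale scores, not MEAP data.
    CUT_SCORE = 700

    scores_2012 = [640, 655, 670]   # all below the cut
    scores_2013 = [665, 680, 695]   # still below the cut, but higher

    def proficiency_rate(scores, cut=CUT_SCORE):
        """Share of students at or above the proficiency cut score."""
        return sum(s >= cut for s in scores) / len(scores)

    def mean_scale_score(scores):
        return sum(scores) / len(scores)

    # The proficiency rate is 0% in both years and hides the improvement...
    print(proficiency_rate(scores_2012), proficiency_rate(scores_2013))  # 0.0 0.0

    # ...while the change in mean scale score captures the growth.
    print(mean_scale_score(scores_2013) - mean_scale_score(scores_2012))  # 25.0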

So what do the cohort data tell us?

In all, MDE successfully matched 1,377 students from their 2012 math MEAP performance to their 2013 math MEAP performance, and 1,400 students from their 2012 reading MEAP performance to their 2013 reading MEAP performance. The matched math and reading cohorts, according to MDE, constituted 86.8% and 87.7%, respectively, of all 2013 EAA testers on those two tests.

The tracking of those students shows us, convincingly (see Tables I and II below), that the majority of EAA students failed to demonstrate even marginal progress toward proficiency on the State’s MEAP exams in math and reading. Among students testing this year who did not demonstrate proficiency on the MEAP math exam last year, 78.3% showed either no progress toward proficiency (44.1%) or actual declines (34.2%). In reading, 58.5% showed either no progress toward proficiency (27.3%) or actual declines (31.2%).
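Those figures are internally consistent. A quick Python sanity check, using only the counts and percentages quoted above (the implied totals on the last two lines are my own rough back-division, approximate because the reported percentages are rounded):

    # All inputs are the MDE figures quoted above.
    math_matched, math_rate = 1377, 0.868
    reading_matched, reading_rate = 1400, 0.877

    # "No progress" plus "declines" should sum to the combined figures.
    print(round(44.1 + 34.2, 1))  # 78.3 (math)
    print(round(27.3 + 31.2, 1))  # 58.5 (reading)

    # Implied number of all 2013 EAA testers on each exam (approximate).
    print(round(math_matched / math_rate))        # ~1586 math testers
    print(round(reading_matched / reading_rate))  # ~1596 reading testers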

That’s the data for the kids who weren’t proficient to begin with. For the handful of kids who were proficient the previous year, the results are tragically worse:

The portrait is even grimmer for the small number of students who had entered the EAA already demonstrating proficiency on the MEAP.

During the 2013 administration of the MEAP math test, there were a total of only 56 test-takers who had scored proficient the year before. Of those 56 students, only 10 stayed at the same level of proficiency or improved (see Table I). That means that 46 of those 56 previously proficient students actually declined; they became less proficient. 26 of those 56 had what the MDE terms significant declines. Another way of saying this is that of those 2013 test takers who had scored proficient the year before, 82.1% declined in proficiency in just one year with the EAA. Only 7.1% increased in proficiency, while 10.7% stayed the same.

In fact, only 19 of the 56 students who tested as proficient in math in 2012 remain proficient now.
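Taking the reported counts at face value, the numbers hang together, as a short Python check shows. Note that the 6-versus-4 split between “stayed the same” and “improved,” and the two inferences at the end, are my own reading of the reported percentages, not published figures:

    # Counts as reported above; the 6/4 split and the final two lines are
    # inferred from the reported percentages, not published by MDE.
    previously_proficient = 56
    declined = 46        # includes the 26 "significant declines"
    stayed_same = 6      # 6/56 = 10.7%
    improved = 4         # 4/56 = 7.1%
    still_proficient = 19

    assert declined + stayed_same + improved == previously_proficient

    print(round(declined / previously_proficient * 100, 1))     # 82.1
    print(round(stayed_same / previously_proficient * 100, 1))  # 10.7
    print(round(improved / previously_proficient * 100, 1))     # 7.1

    # If only 10 students held steady or improved but 19 remain proficient,
    # then about 9 students declined yet stayed above the cut score, and
    # 56 - 19 = 37 previously proficient students are no longer proficient.
    print(still_proficient - (stayed_same + improved))  # 9
    print(previously_proficient - still_proficient)     # 37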

Much more analysis along with multiple charts to help explain the results can be found in Dr. Pedroni’s post HERE.

As Pedroni points out in his startling piece, the data not only call into question the inflated claims of EAA administrators like John Covington and Mary Esselman and of champions of EAA expansion like House Education Committee Chair Lisa Lyons; they show that the actual results are the opposite of the claims being made. Also, given that these administrators had access to the data well before they were released, the findings support my contention of several weeks ago that the mad rush to pass EAA expansion legislation was an intentional effort to have it signed into law before these results became public.

In the coming days and weeks, as the implications of these MEAP test results begin to sink in, Rep. Lyons and others will continue their effort to downplay and even delegitimize the MEAP test itself, as Lyons herself did in an interview with WKAR radio. The fact is, the MEAP is the test that determines which schools are put into the EAA in the first place. If it’s legitimate enough for that, these scores are certainly a legitimate yardstick by which to measure the EAA itself.

One more thing: in his piece, Pedroni talks about what can only now be seen as the false promotion of the EAA by Excellent Schools Detroit, a supposedly objective group that publishes a report card on schools in Detroit. ESD was smitten with the EAA, saying, “The Education Achievement Authority is showing early promise against that high standard, especially with K8s.” One wonders how they could make this statement in light of the facts revealed by Pedroni’s analysis. Dr. Pedroni himself provides the answer:

Although Excellent Schools Detroit advertises itself to Detroit’s families as an objective source of school data compiled to help families in pinpointing and selecting quality schools for their children, the connections between ESD and the EAA suggest a much closer relationship. The current Chair of both EAA administrative boards, Carol Goss, incubated and funded Excellent Schools Detroit in her capacity as CEO and President of the Skillman Foundation. Excellent Schools Detroit furthermore administers the Michigan Education Excellence Foundation (MEEF), through which funds are funneled to the EAA from national venture philanthropies with which the Skillman Foundation works, including the Broad Foundation. Four current ESD board members (including Chairwoman Goss and EAA Chancellor Covington) and one former ESD Board Member currently serve or have recently served on the EAA board.

What is clear from the reporting I have been doing is that Governor Snyder’s education experiment with the students of Detroit, the Education Achievement Authority, has so far failed to demonstrate that it is in any way an authority on educational achievement. Not only is the EAA failing, on the whole, to help the students in its schools progress; it is also lying about its results and presenting a false picture of what is going on in those schools. Teacher after teacher who has come forward speaks of the “dog and pony show” put on for government officials, lawmakers, and wealthy benefactors when they visit the EAA. Time and again, Covington and Esselman have publicly stated that their students are making incredible progress. Although they have not presented data to support this, their claims have been largely accepted as fact by legislators and the media.

I know of nobody who wants to return to the situation Detroit Public Schools were in two years ago. Nobody questions the fact that we must “do something.” What is important is that, in the process of “doing something,” we do the RIGHT thing. Dr. Pedroni’s work is an important part of the EAA discussion, and it suggests that this “something” IS the wrong thing and is actually making matters worse for the students it impacts. It’s time to end this failed experiment and put together a comprehensive “educational surge” in Detroit that doesn’t try to educate students on the cheap, for a profit, with unqualified, inexperienced teachers and administrators who simply do not know what they are doing.

UPDATE: I received an email from a former EAA teacher this morning who had this to say:

When I worked at Nolan in 2012 during the MEAP we were constantly told that the scores of 2012 wouldn’t matter because they didn’t reflect us and that a good measure would be the 2013 scores.

I’m betting EAA administrators are regretting that statement now.

[CC image credit: amboo who? | Flickr]
