Wednesday, September 17, 2008

What Works

Since its creation in 2002, the What Works Clearinghouse (WWC), housed at the Institute of Education Sciences, has been quietly putting out reports on the efficacy of education policy programs. Every couple of days I get a new evaluation in my inbox, most often telling me that program X showed "no discernible effects" or that the studies of intervention Y fail to meet "WWC evidence standards."**

This is unquestionably a good thing, as we move the field of education toward a more empirical science. It makes sense to have an unbiased resource where principals and superintendents can find objective analyses of programs. They have neither the time nor the inclination to sort through long research reports on individual programs, let alone to pick the best from all that are available. Instead of relying on this process, or on research peddled by textbook or program developers (who have a strong vested interest in their products), we have an outside body reviewing the research and demanding high-quality experimental designs. We're introducing rigor into our analysis.

Of course, we're moving at a snail's pace. Pick any of the topic areas on the WWC website, and you'll see mostly "no studies identified" or "no studies meeting evidence standards." Of 74 interventions listed on the elementary school math page, only five passed the WWC screens to even merit a review. Of those five, four were found to have no discernible effects on mathematics achievement. One and only one program, Everyday Mathematics, is able to demonstrate potentially positive effects. Teachers, principals, and district administrators should all be out buying it. Its developers and publishers should be citing this distinction on their homepages and in all their sales materials. But the news that it is the only rigorously evaluated and proven mathematics curriculum is nowhere to be found.

Implementation of what works is likely to be slow. The affiliated Doing What Works site will help, but getting the right research into the hands of decision-makers will inevitably take time. But we're moving in the right direction, and I'm always happy to see a new WWC review. Keep them coming.

**Last week a popular literacy textbook published by Houghton Mifflin earned such a rating. Although nine studies had been conducted on the textbook, none met WWC standards for experimental design. Education Week story ($) here.

Update: Catherine, in her effervescent post about "new math," made me realize I forgot to point out that we're not living in a policy bubble here. Everyday Math is used in 175,000 classrooms by 2.8 million children nationwide. That includes the District of Columbia. DC Teacher Chic has the scoop on how it plays out in District classrooms.

8 comments:

Unknown said...

The only problem with your post is that Everyday math is a bad program. It has been reviewed here:
http://www.hoover.org/publications/ednext/3220616.html

But you knew this already.

Maybe education research doesn't lend itself to empiricism?

AldeBeer said...

The "study" you reference was not empirical in the slightest.

In their review, the WWC threw out 57/61 studies conducted on Everyday Mathematics. It is the only elementary mathematics program out of 74 to survive WWC's withering review.

DC Teacher Chic said...

Thanks for the mention! You don't suppose I could be added to the blog roll? It would be huge to be linked by Quick and the Ed.

Kevin Carey said...

Glad to. Your blog is really interesting.

Anonymous said...

Interesting that the only "intervention" that "worked" didn't even earn the highest ranking. In plain English, it seems the ranking it received means:

1. Sometimes helps more students than it hurts
2. Less often, hurts as many students as it helps
3. Never hurt more students than it helped

This is not unqualified success, which is why the WWC has a higher ranking. The higher ranking seems to mean it always helps more students than it hurts.

Since so many of the interventions have no valid studies, it also seems it would be hard to make rational comparisons between Everyday Mathematics and other programs. So it doesn't surprise me that people would rely on their own common sense when making a decision rather than rush out and buy Everyday Mathematics.

Your recommendation seems to be more of a promotion of the WWC system than the actual intervention. Is there a better way to do that?

Anonymous said...

The Department of Education's "What Works Clearinghouse," which evaluates research on the various math programs, reviewed 61 Everyday Math studies. The findings: of those 61 studies, none met evidence standards, 4 met evidence standards with reservations, and 57 did not meet evidence screens. Of those four, the WWC found Everyday Mathematics to have potentially positive effects on math achievement based on one study alone: the 2001 Riordan & Noyce study. Just so everyone is on the same page, Pendred Noyce has a vested interest in Everyday Math in that she has formed associations with several reform math initiatives, at least one dedicated to implementation of Everyday Math: COMAP, for which she serves on the Board of Directors.

SteveH said...

"Of those five, four were found to have no discernible effects on mathematics achievement. One and only one program, Everyday Mathematics, is able to demonstrate potentially positive effects."

So, you have an allowable sample size of just one, from what range of data? And this allows you to make any conclusion?

Anonymous said...

"So, you have an allowable sample size of just one, from what range of data? And this allows you to make any conclusion?"


And that one study is hardly what anyone would call independent.