Wednesday, July 29, 2009

Shoddy Academic Study Denounces Media for Non-Citation of Shoddy Academic Studies

A couple of days ago, I received an email from the teachers union-funded "Great Lakes Center for Education Research and Practice" touting a new study written by Holly Yettick of the University of Colorado at Boulder, allegedly uncovering rampant pro-think tank bias in the mainstream media. As the policy director of a think tank, I was naturally interested--we're always looking for new ideas when it comes to prosecuting our nefarious media-manipulation plans. Alas, I was disappointed. In an analysis of 864 articles published in the New York Times, Washington Post, and Education Week, the author found that:

Although university and government sources were cited more often, a higher percentage of reports produced by advocacy-oriented think tanks were cited by both types of publications. Universities produce 14 to 16 times more research than think tanks, but the three publications only mentioned their studies twice as often as think tank reports. As a result, any given think tank report was substantially more likely to be cited than any given study studies [sic] produced by a university.


That's not a bad way of counting press hits, although I probably would have added the AP, Wall Street Journal, and USA Today. (Note also the K-12 bias -- Ed Week but no Chronicle or InsideHigherEd). It's the denominator that really throws these numbers out of whack. Presumably, nearly every one of the think tank studies in question was written with the hope of garnering some media coverage. The universe of academic studies, by contrast, was calculated in two ways: the total number of papers accepted at the 2008 meeting of AERA (8,064), and the total number of articles published in 2007 in 176 peer-reviewed journals (7,172).
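
For what it's worth, the arithmetic behind the "substantially more likely" claim is simple division. Here's a back-of-the-envelope sketch in Python, purely illustrative--the report's raw think tank output count isn't quoted above, so everything is expressed as ratios, and the variable names are mine:

# Back-of-the-envelope version of the per-study citation claim,
# using only the ratios quoted above.
university_output_multiples = [14, 16]  # universities produce 14-16x more studies
citation_multiple = 2                   # but their studies are cited only ~2x as often

for output in university_output_multiples:
    # Per-study likelihood of citation, think tank report relative
    # to university study: output ratio divided by citation ratio.
    per_study_advantage = output / citation_multiple
    print(f"At {output}x the output, each think tank report is "
          f"~{per_study_advantage:.0f}x more likely to be cited.")

Seven or eight to one, in other words--a ratio that only looks scandalous if you assume every one of those 8,064 AERA papers was competing for newspaper coverage in the first place.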

Now, maybe third-rate journalism is at the root of the Washington Post's failure to provide A1 coverage to articles like "Still jumping on the balance beam: continued use of perceptual motor programs in Australian schools," from the April 2007 edition of the Australian Journal of Education, one of the peer-reviewed journals in question. Ditto "Contributions and challenges to vocational psychology from other disciplines: examples from narrative and narratology," from the International Journal for Educational and Vocational Guidance. And maybe Ed Week needs to take a long, hard look at its standards and practices after failing to cover "Complicating Swedish Feminist Pedagogy" and "Complexity Theories at the Intersection of Hermeneutics and Phenomenology" from the 2008 AERA meeting.

Then again, maybe not.

The article also alleges a conservative bias in news coverage, as evidenced by the fact that newspapers tend to cite studies from notorious right-wing outfits like...Education Sector, where I work. Without going into the political and professional histories of our staff at length, let me assure you that this view is completely absurd. If we're on the "right" side of the spectrum and "centrist-libertarian," why is the Cato Institute always insisting I'm wrong? 

What accounts for the relatively high think tank batting average? In announcing the paper, the Great Lakes Center said, "Yettick indicates that this is likely due, at least in part, to the skill and resources think tanks devote to publicity for their reports, using sophisticated marketing campaigns targeting journalists, policy makers and the public for which university professors generally lack the resources and motivation to do."

You hear this a lot. Well, I've worked at three of the think tanks covered in the report--the Center on Budget and Policy Priorities and The Education Trust are the other two--so I have a pretty good sense of how they operate. And I probably shouldn't be revealing the sophisticated marketing secrets that allow us to crowd out allegedly more-rigorous university-based research with our "ideologically driven" work. But what the heck. Here's my secret recipe:

1) Before a report is released, send an email to editors and reporters at publications where you'd like it to be covered. Describe the findings, briefly, and explain why they might make a good story.

2) Give them a copy of the report, for free.

3) Include your email address and phone number, in case they have any questions. Check your messages. If they email or call back and say "I'm on deadline for five o'clock," respond before five o'clock.

4) Be succinct. Don't, for example, write "It is, in fact, true that advocacy-oriented think tanks rarely have their research peer reviewed and repeatedly have been found to engage in research practices that cast suspicion on the validity of the findings..." If something is, in fact, true, then it's true. Moreover, as a reader, I assume you're not deliberately lying to me; if you say it, I assume you believe it's true. So the sentence should begin "Advocacy-oriented think tanks..." and go from there. These things matter! See Strunk and White for further advice.

Also, proofread. "study studies"?

In the end I think the marketplace of ideas is quite a bit more efficient than Yettick believes. Reporters aren't all idiots and think tanks don't succeed through P.R. witchcraft. If the media isn't covering your research, it's probably not my fault. 

7 comments:

Daniel L. Bennett said...

Great commentary, Kevin. It should also be noted that think tank reports are generally written in language intended for a broad audience and generally involve topics that the public is interested in, rather than, as you suggest, highly obscure topics.

And yes, think tanks do engage in PR because media coverage is a measurable outcome and an effective way to spread ideas.

Jacob said...

great post

Kristen said...

Nice post. I think, as you point out, the biggest difference here isn't the "PR machine" but the topics of the research. Think tanks tend to research things that are hot policy debates or things related to the public interest in education. On the other hand, if you flip through the AERA program, you see many hundreds of papers that, while they may be important in the long view, hold very little interest for the public at large.

I blog about peer-reviewed education research and some days it is hard to find a paper that would be interesting for a general education audience!

john thompson said...

Kevin,

What is your response to the following?

“The President of the United States and his Secretary of Education are violating one of the most fundamental principles concerning test use: Tests should be used only for the purpose for which they were developed. If they are to be used for some other purpose, then careful attention must be paid to whether or not this purpose is appropriate. This position was developed jointly by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education in their document ‘The Standards for Educational and Psychological Testing.’”

Academics wouldn’t ignore that principle. Neither would they allow a study that defines a school with a 50% poverty rate as high-poverty, as the Ed Trust does. And when will your old shop retract its statement that there are 1,200 or so schools that beat the odds, when the actual number was 23, if I recall correctly?

I’ve been asking when the TNTP will apologize for the inaccuracy on the Toledo numbers in its last study and its incomplete numbers in its last two studies. When will the Citizens Committee on Civil Rights retract its unfootnoted statement on Comparability, or at least explain that it was poor proofreading?

I’ve been thinking about the following all week after hearing James Heckman. The social science is overwhelming on the role of preschool, parenting, early education, and cognitive science. What’s the bang for the buck in fighting teachers as opposed to investments that we know would help poor kids more? Why did the Ed Trust ridicule Ed Week over publishing its Pew Trust data on readiness to learn? Of course, the answer is political. When academic social science studies make carefully worded statements on the role of parenting, they get condemned for blaming the victim or providing excuses for teachers.

We know that teachers alone are not responsible for what happens in their classrooms. We know that the Value Added models aren’t ready to be implemented. They haven’t even been tested in high school. Why do some “reformers” backed by think tanks want to move ahead even though the statistical tools don’t exist?

And what is the difference between a McKinsey Group report and an Onion parody? What academic would allow a McNugget like the claim that White students in six states (as I recall) perform lower than Hispanic students in Ohio?

I understand hardball politics. Still, we need institutions that respect the rules of evidence. We need falsifiable hypotheses. Yes, there are problems with teacher training programs, just as there is excellent social science in Ed Departments. In retrospect, wouldn’t we all be better off if “reformers” had worked to reform programs from within? At least in academia we have mores and practices that make for structured, collegial, and evidence-based conversations.

Kevin said...

Hi Kevin.

Notwithstanding the dismissive tone of your post, the work that Holly presents in this report is a nice piece of scholarship. (Please note that while The Great Lakes Center sent out a press release, the research was commissioned and published by our CU-Boulder EPIC policy center, not by the GLC. See http://epicpolicy.org/publication/research-that-reaches)

In the real world, researchers need to come up with workable metrics. Holly’s report carefully explains each choice, each categorization, each assumption, etc. so that you or anyone else could go back and replicate the analysis or change parameters to explore the sensitivity of her choices — their effects upon her findings and conclusions.

Moreover, Holly’s analysis was clearly presented as an opening attempt to address some very important questions. Without addressing Ed Sector specifically, I do think that it’s likely that advocacy-oriented think tanks in general devote a considerably larger portion of resources to packaging, marketing, media relations and the like than do university researchers — who focus almost entirely on the research itself. If you believe that to be incorrect, I'm very interested in learning more. I've never worked in an advocacy-oriented think tank and don't claim to have any inside information. So this is just my outsider's perspective.

Anyway, one can point out these distinctions and attempt to tease out the outcomes without engaging in name-calling or accusations of nefarious conduct, and I believe that is what Holly did.

You appear to be upset that your organization was labeled as “centrist-libertarian”. Holly does acknowledge that such categorizing required subjectivity. But she specifically addressed this point: “The category of ‘center-libertarian,’ which some also define as ‘neo-liberal,’ illustrates the difficulty in pinning down the political-spectrum labels. Center-libertarians often have a stated interest in equity issues, which arguably places them on the left, while their specific, market-oriented policy prescriptions place them on the right. In this study, the focus is on the policies being advocated.”

The primary point of your post is also addressed straightforwardly by Holly herself, on p. 13 of her report:
“One explanation for the disproportionate representation of think-tank research is that think tanks may focus on subjects that are of strong public interest, and thus potentially more likely to interest journalists. By contrast, scholarly research covers a broader array of subjects, including some that primarily interest education professionals or other scholars.”

Oh, and regarding the "study studies" error on page 2 of the report, mea culpa. I or our editors should have caught it.

Cheers,
Kevin

Alexander Russo said...

kevin --
an unusually unreflective post from you. what about the legitimate issues involved? sponsored research vs. independent? original research vs. repackaging?

Dean said...

Thank you for reviving attention to Strunk and White. Attention to their advice would improve our communication immensely.