Friday, May 30, 2008

Learning From Not Learning From Toyota

James Surowiecki's business column in The New Yorker this week is about Toyota, which is poised to end General Motors' three-quarters-of-a-century-long reign as the biggest automobile manufacturer in the world. The secret of Toyota's success is no secret: understanding what customers want, combined with a constant focus on steady, continuous improvement. Literally thousands of books have been written about the Toyota Production System, and the underlying ideas have long since passed into received wisdom bordering on cliché. Yet despite this transparency, Toyota continues to beat the pants off its competitors every year.

There is a lesson for education here. (No, not that schools are like automotive assembly lines.) There is a way of thinking about school reform that goes something like this: lack of success stems from lack of knowledge about how to be successful. So we need to locate the most successful schools, figure out how they work, and then communicate those "best practices" findings far and wide. This theory is appealing on several levels. First, it implicitly supports the work of researchers and analysts. Second, it's non-judgmental: a school's lack of success isn't because anyone purposely did anything wrong; it's because they simply didn't know how to be right.

I don't think we know all we need to know about good educational practices. As a rule, the more knowledge there is to use and the more models of success there are to study, the better. At the same time, a lot of the basic elements of successful schools have been understood for a long time, because they're the elements of any successful organization: leadership, resources, human capital, a functional organizational culture, an understanding of what your customers (students) need, and a constant focus on steady, continuous improvement.

In other words, the biggest policy challenges from an information perspective aren't supply-side, but demand-side: not creating more information about success but giving people better reasons to seek out and use the information about success that already exists. That's really the essence of accountability, or should be--not necessarily telling educators how to improve, but giving them more support and incentives to do so, in a way that makes sense for their communities and students.

G.M. could have embraced the Toyota Production Model decades ago, and if it had, it would still be the number one automaker in the world. But it didn't. I hope our schools, colleges, and universities make better choices.

Thursday, May 29, 2008

Another Look at Performance Assessments

Just returned from Providence, where I spent two days learning about Rhode Island's diploma system, which includes a number of performance-based assessment requirements. Today at Portsmouth High School I saw students present their senior projects to groups of teachers, classmates, and outside community judges. Beginning this year, all 200+ seniors at Portsmouth must complete a year-long senior project in order to graduate, consisting of the "4Ps"--a research paper, a tangible product, a process portfolio, and today's oral presentation. Students select their projects, submit a letter of intent, and work closely with a school or community mentor. And, the projects really are diverse. The first student I saw today presented the stage set she'd designed for the school production of "A Midsummer Night's Dream." Another student's project consisted of running a marathon and raising funds to support leukemia research.

The students were, of course, outstanding. But, what surprised me most were my conversations with the principal, teachers, and state officials about the cultural changes that were emerging from the senior project requirement. Roy Seitsinger, Director of RI High School Redesign, was emphatic that this work was "about transformative cultural change."

Portsmouth Principal Littlefield welcomed us by noting that over the past two days he'd felt an "energy he has not seen in this building." And, he compared the buzz and excitement/anxiousness around senior projects with the lethargy that plagued last year's senior class. Numerous teachers echoed his observation of increased student engagement. At the symposium I attended the day before, teachers from Coventry High School, also in Rhode Island, talked about how implementing a portfolio requirement had made teachers' work more transparent. Their work was "no longer self-contained" because each and every teacher saw other teachers' signatures on student work in portfolios. And, this transparency had created a level of peer pressure among teachers for rigor in assigning and grading student tasks. Likewise, I saw dozens of community members, parents, and business volunteers at Portsmouth serving as presentation judges. The students' senior projects provided a vehicle through which the entire school community--especially faculty--engaged together. And, they provided an opportunity to engage the surrounding community as mentors and judges.

Of course, performance assessments are not new. And, issues of reliability, validity, and seriousness plagued many states' efforts in the 1990s (see Jay Mathews's article in Education Next for a quick history lesson). These are important issues, and you'll see much more about accountability and student assessment from Education Sector over the next year. (FYI, Rhode Island students also participate in the standardized New England Common Assessment Program, but as of now it is not a high-stakes test.)

But, my main takeaway is that it's just as important, if not more so, for us to think about how these reforms can drive other critical goals such as student engagement. And, as technology helps broaden access to digital portfolios and presentations, public transparency and direct feedback on and involvement in students' work products can provide powerful mechanisms for new conceptions of accountability. I'm certain I'm not the only parent who would use these tools extensively to help select a high school for my child.

If you'd like to see what a performance exhibition looks like, the Coalition of Essential Schools is sponsoring a Webcast presentation on Friday, May 30th, at 2:00 p.m. Eastern. You can also check out Portsmouth High's senior project handbook and videos of student presentations.

New Test-Prep Data

A new report from the University of Chicago supports Danny's conclusions. Its title, "From High School to the Future: ACT Preparation--Too Little, Too Late," says it all.

It documents that teachers and students in the Chicago Public Schools believe test scores are mostly determined by test-taking skills; that almost all classes containing juniors focus on test-taking (even when significant percentages of the seniors in those classes have already taken the test); and that these phenomena are worse in schools with large concentrations of minority students.

There's a difference between teaching the basics and teaching the basics of test-taking. In a particularly illuminating passage, the report quotes a student discussing his or her expectations for and reaction to the ACT:
In March (before the ACT)
Interviewer: Do you know what score you're shooting for?
Student: At least the mid 20's.
Interviewer: Any reason?
Student: So I can pick my own colleges.... If I don't want to go to Daley [a community college], I don't have to go to Daley. I can go to, like I said, [University of Illinois] Champaign or even a better place.

In May (after the ACT)
Interviewer: Do you have a list of schools that you're going to apply to, that you're interested in?
Student: Well, right now I'm basically going to Daley for, like, the first year and a half, so I can get the general, basic classes, and then transfer them out to...IIT, I guess.
Interviewer: Do you think it's going to be hard to get into IIT?
Student: I have a 3.5, and I have a 25 percent [class rank]. The only problem will be the ACT, 'cause I got a 16 on it.
[The student needs a 21 to get into IIT program.]

Wednesday, May 28, 2008

TAKS-ing Work

Former Education Sector intern Danny Rosenthal weighs in on the test-prep debate from his Houston high school:

Nearly 900,000 Texas high school students recently took the Texas Assessment of Knowledge and Skills, or TAKS. All of the state’s students must pass the test’s math, English, science, and social studies sections to get a high school diploma.

But where I teach math, at Hastings High School in Houston, only 54 percent of students passed the math section last year. So Hastings, a typical urban school serving 3,200 mostly poor black and Latino students, puts intense effort into boosting its students’ scores. Along with other math teachers at Hastings, I did test prep with my students every day for the two months leading up to TAKS. As the test approached, that’s all we did in class. The Hastings math department also taught courses devoted entirely to TAKS prep. Some students were assigned to them for the full year. Others were moved into the classes closer to the test.

I’m OK with test prep. When standardized tests are well-crafted, as they are in my state, teachers should use tests to shape their classroom instruction. Done thoughtfully, “teaching to the test” is a good idea. But at my school, and others in Houston, we execute test prep so poorly that it ends up hurting students more than it helps them.

Our problems started early this year, at our first planning meeting two months before the TAKS administration. We began by identifying questions that more than half of our students missed on a diagnostic test. Then we matched the questions to topics covered on TAKS, putting colored stickers for each question on posters around the room. The idea was to focus our work on the skills where students needed the most help.

But neither teachers nor administrators tried very hard to draw meaningful conclusions from the posters when we were finished. The data itself was incomplete because some teachers didn’t bother to post most of their stickers. And we didn’t control for things like the number of times each topic was tested. As a result, the sticker exercise told us little about our students’ needs. Nonetheless, we used the posters to guide classroom study sessions, as well as the test-prep work students did in our computer labs and in weekend tutoring.

Our classroom preparation for TAKS aspired to be “drill and kill,” though even that term suggests a level of focus and thoroughness that didn’t exist in our work. Mostly, teachers made worksheets of questions, only loosely related to one another, taken from previous TAKS tests or, in some cases, from math textbooks that are largely unaligned with the TAKS test. Think panicked college students poring over Cliffs Notes for the wrong novel.

Sometimes, the school made all math teachers work off of the same worksheets, even though they taught different subjects. One day, my freshman Algebra class was expected to review quadratic equations, a complicated topic that my students had never seen before and lacked the background to understand. On another day, my Algebra II class was expected to review graphing lines, a topic they had already studied in depth for the previous two weeks. Almost every teacher uses the worksheets. It is how things are done.

Our test prep worksheets aim to review important skills. But oftentimes students have not learned these skills in the first place. And the worksheets don’t fix that. Most of the sheets require students to answer multiple-choice questions. Motivated students work through the problems. Others guess. After a few minutes, teachers show students the correct answer, focusing their explanation on the particulars of the problem instead of any broader concepts. Then, the class shuffles on to a new problem, usually unrelated to the first. In this haphazard process, there are few opportunities to make connections or think critically. And students don’t master basic skills either.

Perhaps because these worksheets are so ineffective, Hastings administrators encourage teachers to take their students to the school’s computer labs to use test-prep software. Some students are more engaged on a computer, and the software lets students work at their own pace.

But for others, computers are just a bigger distraction; they spend their time finding ways to bypass the school’s internet filter to get to YouTube. Busy checking their email, teachers often fail to notice. The computer software itself is designed to provide practice, not teach new skills. Students can read a short tutorial summary, but this summary is confusing and almost always ignored. So kids learn little of what they don’t already know.

Hastings also sponsors tutorials after school and on Saturdays to prepare students for TAKS. But only a few dozen kids out of 3,200 typically attend. At one Saturday session I attended, teachers outnumbered students. And the motivated students who do show up are not the ones who need the most help.

The larger problem is that most students just don’t care if they do well on the TAKS test. Last year, several confessed to me that they guessed on most of the questions. The school attempts to combat this indifference with simplistic incentives, offering students a chance to win an iPod just for showing up on TAKS test day and threatening those who fail the test with extra math classes.

But such incentives don’t work very well, and they miss the larger point: students choose not to try mostly because they think they have no chance to succeed. That’s not their fault. At Hastings, we are far too willing to substitute gimmicky test prep and other instructional shortcuts for real teaching.

Tuesday, May 27, 2008

Money-Sucking College Sports Programs

Per last week's ongoing discussion of higher education, one thing nobody seems to dispute is the assertion by "Professor X" that classes like his are "a substantial profit center" for most colleges. The math isn't hard; adjunct professors typically get paid only a few thousand dollars to teach a course that each student pays a few thousand dollars to take. Even taking into account the prorated costs of facilities and administrative overhead blah blah blah, the colleges are making a lot of money here. Which raises the question: what are they spending those profits on?
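To put rough numbers on it (these figures are purely illustrative, not drawn from the essay or any actual college budget): suppose an adjunct is paid $3,000 to teach a section of 25 students, and each student's tuition attributable to that course is $2,000. The section brings in $50,000 against $3,000 in direct instructional cost, and even if facilities and overhead ate half of what's left, the college would still clear more than $20,000 on that one class.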

Among other things: phenomenally expensive, money-losing sports programs. That's the conclusion of a new report from the NCAA (InsideHigherEd summary here). It found that median spending among the 119 NCAA Division I athletics programs that include football grew by 15 percent in 2006, to $35.8 million. Median revenues to support those programs, meanwhile, grew by only 9 percent, to $26.4 million. In other words, the typical D-I university loses millions of dollars per year on sports--the gap between those median figures is more than $9 million--and the deficit is getting bigger by the year.

And that's just the median. A handful of schools--19, to be exact--made money, while plenty of others lost $10 million, $20 million, or more. How does your alma mater stack up in all of this? There's no way to know--the NCAA won't release the numbers for individual colleges.

In its defense, the NCAA has put out a lot more information here than it ever has before, and it is quite candid about the fact that most of its members are essentially siphoning off huge amounts of money that could be used for education, research, need-based financial aid, etc., in order to subsidize sports. To be clear, not all of that money goes to support quasi-professional men's football and basketball teams; these numbers include all sports, men's and women's.

As with all spending decisions, this comes down to priorities: given the choice, most Division I colleges and universities would rather spend money on an activity whose benefits accrue substantially in the form of entertainment for non-students and the greater glory of the university, at the expense of the more mundane task of helping academically vulnerable students stay in college and earn a degree.

Research on Incentives

A new study released today by the Center for Research on Education Outcomes (CREDO) at Stanford examines the use of incentive programs in charter schools. Do incentives like field trips, t-shirts, and iPods--rewards with a short time frame between behavior and consequence--spur students to do better? It was an exploratory study based on a limited, non-random sample of charter schools, so, as with all research, its findings come with limitations. But the report, written by ES board member (disc.) Macke Raymond, suggests that when school staff are on board with and committed to the reward program, and the rewards are given consistently and continuously, these kinds of incentives can motivate students to improve.