How do you recognize a good school? Test scores tell us something, but not enough. We need instead an array of evidence that students are, or are not, learning the things that matter. And the public must join school people to understand and weigh that evidence, then plan together how to improve the work of our students.
A FEW MORE HARD-WON triumphs came this year to Hoover High School, a Coalition member school that serves the poorest students of all San Diego’s 161 schools, and half of whose students have limited proficiency in English. After five years of steady change efforts, the school has now more than doubled the percentage of its graduates who go on to post-secondary studies. All students are learning key academic skills through a groundbreaking program that links them to the world of work. And the U.S. Department of Education has named Hoover one of five exemplary urban high schools in the nation.
But that’s not what the San Diego school district wants to talk about just now. Based on its test scores and grade point averages, Hoover has just shown up on the district’s list of “low-performing” schools, risking a dramatic district intervention that could dismantle the school, scattering or firing its faculty.
As pretty much any Hoover student, parent, and teacher can show with their own “hard data,” this school prepares kids well for the future, despite the fact that 95 percent of its students qualify for free or reduced lunch, and a quarter of them will move in and out of the school during the course of a year.
Hoover students maintain digital portfolios, holding their work against school-wide standards that derive from the Secretary’s Commission on Achieving Necessary Skills (SCANS). The school’s efforts to integrate work-based learning with academics have drawn attention from university admissions officers, private business, educators, and the press. Education Week and Teacher magazine featured Hoover recently in a long, admiring article, and the National Association of Secondary School Principals named its principal, Doris Alvarez, as 1997 Principal of the Year.
The Hoover school community worked out a plan in 1992 for its own improvement. Since then it has kept track of its progress by collecting and analyzing a broad range of data: not only college entry and retention information, but student and teacher portfolios, comparative test scores over time, the records of students who stay at Hoover until graduation, student and parent surveys, and more.
Still, sorted against all the kids in San Diego, the test results here do look bad. The way norm-referenced tests work, somebody has to stand at the end of the line, and the kids from this very poor neighborhood south of Route 8 are those most folks might put there.
So is Hoover succeeding or is it failing?
“One-shot test scores and aggregate grades simply don’t measure the kind of performance Hoover students actually accomplish, nor do they come close to measuring the impact of the school on its students,” says Rob Riordan, who directs the New Urban High School, a joint project of the U.S. Department of Education and the Big Picture Company.
Ironically, he adds, “our project picked Hoover precisely because its methods of assessment place it firmly in the camp of those who argue for school accountability.”
But while this story unfolds, other Essential schools around the country are watching with dismay as more districts unleash similar instances of what Larry Rosenstock, the president of Price Family Charitable Fund in San Diego, has termed “the pedagogy of public humiliation.”
These interventions employ one-time, centralized measures that cannot get at certain critical information and that sort students by mean scores. They dictate the fix: curriculum, teaching method, more tests. And the chilling and punitive climate that results, teachers say, strips them of their right and responsibility to exercise professional judgment in class.
The results can be disastrous for student learning. One research group discovered this year that from January onward, Chicago teachers virtually abandon any instruction that does not directly relate to the city’s narrowly prescriptive yearly multiple-choice achievement tests. Though the results can close down their schools, teachers say, the tests do not measure the most important things they teach.
Nor do they measure anything with much accuracy, or even spur kids to learn more. In fact, the most recent report from the Commission on Teaching and America’s Future shows that 1990s test-based reforms in Georgia and South Carolina had no effect on student achievement, while states that invested in teaching had large increases in achievement.
But under increasing pressure to look good, school people say, they have learned to rig test results by teaching kids test-taking tricks (answer every question; wrong answers don’t affect the score), retaining students in lower grades so their scores will look better, or simply selecting out students who will not score well.
And many schools shy away from even collecting information about student learning patterns. If they reveal their trouble spots, they fear, the state or district will slap them with sanctions and interventions that take away their ability to improve.
A Better Body of Evidence
But collecting and revealing that information is exactly what Essential schools must do, contend Coalition leaders who aim to demonstrate the value and effectiveness of teaching to a different kind of “test,” one based on human judgment and measuring both essential knowledge and inquiring habits of mind.
To balance a political climate that puts its money on the numbers, Essential schools around the country are assembling an array of additional and better evidence (portfolios, public exhibitions, outside reviews by visiting educators, and more) that represents the real work they know their children accomplish.
They are cutting through the accounting-book rhetoric to take a closer look at student work and at the assignments that prompt it. They are charting the many factors that contribute to higher student achievement but are difficult to quantify: parent involvement, teacher reflectiveness, supportive connections with other schools, and more. They are launching long-term efforts to follow how their graduates do in college and the workplace.
And they are opening up the conversation about student performance to their own communities, coming to a deep and common understanding of what their students know and can do, the reasons for that performance, the best possible ways of improving it, and each person’s individual responsibilities in doing so.
In the San Francisco Bay Area, former Oceana High School teacher John Larmer works with Kate Jamentz at the Western Assessment Collaborative at WestEd to help schools and districts look closely at their standards and how teachers use them in assessing student work.
“Does the school say it wants kids to reflect on their own learning process and assess their own work?” Larmer asked a workshop group of Essential school teachers at the Coalition’s 1997 Fall Forum. “Let’s check out four pieces of student work. Is there evidence that these students know the standards on which they’re being assessed? Do they seem to have the habit of monitoring their own progress?”
“We want to see students who are actively engaged in meaningful work, who know important things and can use them, and who can tell us what they’re doing and why,” Kate Jamentz says. “Five years from now, kids will be talking about their work in terms of quality, not just in terms of ‘getting it done.’ They’ll have the habits of rehearsal and revision, and they’ll be able to ask teachers for the help they need.”
But to make that happen, teachers have to cultivate “the habit of relentlessness,” as Larmer and Jamentz call it: systematically designing class activities that get at their standards, then “reading” students’ work against those standards, not only to “score” it but to analyze what further help kids need.
To build and sustain those skills, they assert, school communities must look together at student work, both to define common standards and to examine whether students are getting the help and opportunities they need to reach them.
Examining the Assessments
In the San Mateo Foster City district, Jamentz gathered parents, teachers, administrators, and community members for one intense afternoon to talk about how their district was measuring children’s reading skills. Hunkered down intently over a language arts test, their pencils sometimes slippery with nervousness, the clock ticking away as they figured which answers of the multiple choices were likely to be right, these Californians got a close-up look at what tests could show about their kids-and what they could not.
“This is fine as far as it goes,” one parent said after several hours of trying out a standardized test, a directed writing assessment, a benchmarked reading task, and more. “But there’s a lot it doesn’t show. Are you getting any information, for example, on whether my child likes to read?”
Other schools invite their constituencies to help shape, review, and tune school standards in collaboration with teachers. Parents, students, and teachers at the Francis W. Parker Charter Essential School in Devens, Massachusetts, gather periodically to sort writing and math problem-solving samples at different levels into piles labeled “good enough,” “not good enough,” and “better than good enough” for their students.
At intervals, Parker also invites veteran teachers from comparable schools to review a sampling of its year-end portfolio assessments, giving important feedback on how reliable Parker ratings are compared to those of outsiders. Last year, the outsiders’ ratings agreed with the insiders’ ratings anywhere from 86 to 100 percent of the time, providing important corroboration for any who worried about objectivity in portfolio assessments.
Human Judgments Matter
At Fannie Lou Hamer Freedom High School in the Bronx, 16-year-old Evelyn Abreu stood in front of a graduation committee and in a lilting Spanish accent described her social studies research into the “English-only” movement in this country. “I talked to my own family about their immigration experience,” she said. “I followed the Congressional debate and the Supreme Court decision on a case in Arizona. I read the book Hold Your Tongue and an article by Senator Daniel Akaka, and I interviewed someone from the office of Jose Serrano here in New York.”
Before she graduates in June, Evelyn must successfully present and defend seven completed subject-area portfolios before a committee made up of two teachers, an outside adult (sometimes family), and an eleventh-grade student.
“It’s stressful,” she says. “You need to have confidence in yourself. But you gain this by concentrating on your work, and remembering that you have already done what you needed to-now you just need to explain your work.” Right now Evelyn is revising a science portfolio presentation on tuberculosis that she tried out last spring for her committee. “They told me what I needed to fix,” she says, “and I’m working on it. It’s useful, because we learn to use the habits of mind.”
Such graduation practices are spreading among Essential schools committed to knowing their students well and demonstrating knowledge and thoughtfulness through public exhibition. Though they may be more subject to human fallibility than a standardized test, advocates say, they demonstrate the very purpose of schooling in a democratic society: to teach and practice human judgment.
“In the name of objectivity and science, the testing industry has led teachers and parents to doubt their own judgments about their children,” declares Deborah Meier, who recently founded the new Mission Hill School in Roxbury, Massachusetts.
Personal judgments hold even more power, she argues, because the evidence they depend on comes from people close to the school’s experience. In small schools like Hamer, where students, teachers, and parents know each other well, it’s hard to fool them on quality.
“Kids start coming home and talking to parents about what they’re doing in school, or asking them about current issues, and a new level of intellectual exchange starts,” says Ann Cook, who heads the Urban Academy, a small school that, like Fannie Lou Hamer High School, is affiliated with the Center for Collaborative Education (CCE), a regional Coalition center in New York City. “Most high school parents aren’t in schools much, you know. They make judgments about us based on their own close-up looks at student learning.”
“Hard” and “standardized” data about student achievement also show up plainly through human observations, Deborah Meier points out. At her new Mission Hill School, teachers tape record all children twice a year to document how they handle written text and how they talk about books and language.
“It’s standardized, because all kids are interviewed in a common fashion,” she notes. “And it’s public. And interested parties can verify or contest its conclusions. Most important, students, parents, and teachers have immediate access to common information, unlike test results, which arrive months later in a format that makes it impossible to tell what kids got wrong or why.”
Multiple-choice tests that must remain secret to work properly, Meier points out, provide only indirect data about student performance. Portfolios or exhibitions provide direct data; so does information on what happens to kids after they graduate, which Central Park East Secondary School (also founded by Meier) has been gathering for several years now.
Assessment as Videotape
Human observations need not displace other information-gathering efforts; in fact, they can inform research, as schools explore the efficacy of their new designs and practices. Many Essential schools work with university-based researchers, investigating their own questions about the central issues of teaching and learning they face. In New York, CCE follows up such “action research” with events where small networks of schools develop, share, and critique their ideas and projects.
But because school change involves changing an entire culture of expectations and habits, its patterns emerge slowly and are often hard to chart. “We need a videotape, not a snapshot, to see the truth about student performance,” one Essential school teacher observes.
That videotape may come about when schools and districts begin to work with state offices to limit the extent and stakes of testing, and to develop supports for thoughtful standards-driven assessment by teachers in schools.
In Maine, for instance, the state’s new performance-based assessments may soon serve not as a graduation exam, as some states have it, but as a checking measure against locally devised and driven assessments, says David Ruff, who helps direct the Southern Maine Partnership, a regional CES center.
“Now that we have our state ‘Learning Results’ standards in place,” he says, “the state is hoping to have local schools take 90 percent of the responsibility for aligning their own curriculum, instruction, and assessments with them.” In a perfect world, Ruff observes, what the school knows through its portfolios or other assessments should match kids’ scores on the “extended-open-response-item” state tests they take in grades four, eight, and eleven.
“If some scores show drastic differences,” he says, “it could reveal a problem at the school level-or it might just mean the student was sick that day. The school gets a chance to defend the child,” who might well show very powerful learning in another time or form.
Such ownership of statewide standards by local communities is very powerful, Ruff says. “But it will take a mammoth professional development effort,” he cautions, “to come up with a system where local assessments are valid and reliable.”
Essential School principles have had a deep and lasting effect on state educational leaders in Maine, spurred by its early membership in the Re:Learning initiative of the Education Commission of the States. The results stand out on the national scene; the National Center for Fair and Open Testing (FairTest) praises Maine’s state assessment system as a “modest and exceptional approach.”
But in most other states, FairTest shows, tests are not even based on the new standards states have adopted. And their largely multiple-choice questions typically fail to provide a range of methods for students to demonstrate their learning. (See sidebar)
Moreover, test scores alone reveal more about student demographics than about student performance, critics assert. When Paul Harrington of Northeastern University’s Center for Labor Market Studies recently compared statewide student test results in Massachusetts against a list of communities rank-ordered by per-capita income, the ten highest-scoring communities were the ten wealthiest, and the bottom ten in test scores matched nine of the poorest communities.
High Stakes, Low Learning
The picture can get even bleaker when states attach high-stakes outcomes like high-school graduation or teachers’ careers to such tests, whether or not they reflect a community’s consensus on standards, or correlate to anything important.
In Louisville, teachers bristled recently when the state used test scores to slap an “in crisis” label on the J. Graham Brown School, long regarded as a model of Essential School practice from kindergarten through high school. It dispatched “distinguished educators” to coach teachers on how to improve, but the school climate had turned so tense and defensive that even good advice, observers say, largely fell on deaf ears.
Chicago holds back eighth-graders who score below grade level on the content-driven standardized tests given at every grade level every year. If a school’s scores are too low, it risks an “intervention” in which the school board conducts a public hearing, closes the school, and evaluates all the employees-who can lose their jobs without due process protections if they do not pass inspection.
Even if money then flows to a low-performing school to help improve its performance, such a climate can work at cross purposes, points out one Chicago principal.
“When your scores start getting better, the money that’s helping you dries right up,” she observes dryly. “That doesn’t exactly make schools want to come forward with data.”
But some Essential schools in states with high-stakes assessments are augmenting or even replacing their state’s measures with more authentic, and often more difficult, accountability plans and demonstrations of what their students can do.
Three Essential “pilot schools” in Boston, freed from many district regulations, will substitute for the district’s usual Comprehensive School Plan a three-year review cycle in which they identify key areas needing attention, assemble portfolios of evidence, and invite outsiders to review their progress. If it works well, the plan may end up informing district policy.
Ohio makes all students pass a standardized proficiency test to graduate. But juniors at Reynoldsburg High School must also demonstrate their understanding of history by completing a semester-long “junior thesis” course, in which they either carry out a community or political action project or defend a historical thesis centered on an essential question of their choice.
“For logistical reasons, we had to exempt from this requirement those students who were already taking a year-long Advanced Placement (AP) European history course,” says teacher Steve Shapiro, who developed the junior thesis seminar with his colleague Rob Sass. “It’s ironic: students find the thesis course so rigorous that our AP students have actually tripled in the year since.”
Because they accept only certain strictly labeled “A to F” high school courses for admission, California’s selective state universities make it hard for Essential schools to show student learning in courses that cross disciplines or mix students instead of tracking them. But at Homestead High School in Cupertino, Lauri Steel and other teachers have been developing an alternative transcript to document students’ learning in more authentic ways.
The new transcript does list courses grade by grade, but the “restructured” interdisciplinary courses do not appear with letter grades beside them. Instead, they show up on an attached “transcript of portfolios,” which scores student knowledge, communication skills, work habits, and habits of mind in relation to the school’s standards. (See sidebar)
College entry requirements also perpetuate the use of Carnegie units, which chill efforts to cross disciplinary boundaries in curriculum and teacher training and allocation. Some Essential schools have resorted to keeping an odd kind of double books on their transcripts, listing an integrated course called “The Craft of Science,” for instance, with an extra “physics” label for any who need their sciences neatly sorted.
In the best of cases, and usually when teachers help create them, new state standards and assessments actually line up well with the bold changes schools are making. Four years ago, Grass Lake High School in Michigan eliminated math tracking and adopted the Core Plus math program, developed at Western Michigan University. All students now take the three-year integrated math program, says teacher Larry Poertner, and despite detracking, the percentage who pass the state’s mandatory high school proficiency tests has not dropped.
“The test dovetails nicely with the program, in which kids can approach a problem from any method they want as long as they justify their methods,” he says. “I see all our kids becoming more rigorous thinkers, and working together more cooperatively.”
Those districts willing to face the expense can also turn to the tough New Standards assessments, created by Lauren Resnick’s group at the University of Pittsburgh and the National Center for Education and the Economy. Instead of ranking students as norm-referenced tests do, these aim to determine whether students have learned a set of tasks, concepts, and skills set out in accompanying standards for English language arts, mathematics, science, and “applied learning.” New Standards hopes its tests will be flexible enough to assess a range of school-crafted curriculum, though many Essential schools express worry about that. And the tests are quite hard; schools where classes are still driven by old-style tests will also have to scramble to do well on them.
Information Out in the Open
But creating richer assessments addresses only part of a school’s responsibility to publicly examine and analyze what it finds out about student learning. And it takes courage and trust, Essential school people agree, to have honest conversations with parents, teachers, and higher-ups about student performance and how to make it better.
Teachers may know when their students are not meeting the outcomes they set, says Linda Belarde, who heads a small Essential high school on a Zuni reservation in New Mexico. “But it often doesn’t feel safe to disclose that to their colleagues and the principal,” much less to outsiders.
The more schools make kids’ work public, through exhibitions and other community gatherings, the more they build a context in which the public knows students better and can participate in, not just judge, a school’s improvement.
“We’re aiming for our school to work along the model of an artists colony,” says Teri Schrader, who leads the Arts and Humanities faculty at the Parker School. “Teachers can share work in progress, even if it’s not yet successful, with the faith that the whole school community will be able to help bring it along.”
At a recent parents forum, for example, Schrader asked parents to help think through sensitive issues in an upcoming family history unit. And in twice-yearly conferences with teacher-advisers, Parker parents and students help create “personal learning plans” that spell out their strengths, goals, and strategies for each year.
Trust also grows when school people use plain language to talk about student performance. “I tell parents and kids straight out when the kids cannot read, write, and do the math they need to,” says Fran Vandiver, who at Florida’s Fort Lauderdale High School began a “prerequisite academy” to bring entering students to the point where they could handle the school’s expectations. “Then we have to decide how to address that problem over time, and how to measure and value their progress fairly.” Vandiver uses the Test of Adult Basic Education instead of conventional school tests, because “it tells us clearly when kids’ reading and math applications are up to speed.”

Demystifying standardized tests can also clarify the community’s goals, say Essential schools that have tried it. For example, when Massachusetts parents were shown samples of items on the new state exams their children will soon have to pass for graduation, even upper-middle-class families with high levels of education, the ones everyone assumed would support the measures, were taken aback. Must all tenth graders, they wondered, know the structure of DNA?
In fact, four very different curricula make up an American public education, Stanford professor of education Larry Cuban has noted: the standardized curriculum that appears in all the textbooks; the curriculum actually taught from day to day by teachers; the tested curriculum; and, finally, the learned curriculum, which students carry with them into the world.
“Essential schools aim for the learned curriculum,” declares Larry Rosenstock. “And their standards are actually much higher than the standards that show up in these laundry lists of content.”
Meanwhile, at Hoover High School in San Diego, teachers are on the phone tracking down students from the class of 1996, the ones who did so badly on the district’s tests that their alma mater is in trouble now. Of the 197 graduates they have reached so far, 129 are in two- or four-year colleges. Five are in trade schools; eight are in the armed services; 49 are working; and the rest are at home.
One of those HHS graduates is Manassa Abraham, who came here with his parents from a starving Somalia a few years ago. Now he is working toward a degree from San Diego State, with a full scholarship from Future Educators of America.
Back at Hoover, Lia Thao-who left Laos a decade ago with her non-literate Hmong family and now lives with 14 relatives in their San Diego apartment-watches his path, and wonders why the district thinks her school is failing. Her own future, she realizes, depends largely on whether Hoover succeeds in helping her learn what she needs to know.
Can the tests adequately measure that knowledge and those skills? Can they reveal what we need to know about Hoover? As Essential school educators take up the challenge to help their communities as well as their students learn in more effective and authentic ways, they persist in the hope that children, not tests, will soon move to the center of the public gaze.