Documenting Whole-School Change in Essential Schools

What actually changes in Essential schools? Reporting and reflecting on the answers can supply long-term data to guide new decisions. But to be helpful, such information must reveal the interrelated aspects of change, and provide many lenses through which to look for evidence of success.

If they just asked the right questions, students in the Research and Development class at Puyallup High School speculated, they could suggest a way to improve the school’s attendance rate. Working as a team with several teachers and parents, they investigated state regulations and policies from nearby schools, surveyed and interviewed classmates about what made them go to class or not, and finally presented their analysis to the staff.

“It was eye-opening,” says Linda Quinn, the principal of this 1,800-student Essential school near Tacoma, Washington. “They revealed some of our practices that contributed to poor attendance, and some ways that kids were beating the system. And then they recommended that we reorganize our impersonal central attendance office so that students instead could deal directly with classroom teachers who know them.” Taking the advice seriously, the faculty revised its procedures. And with attendance rising again, this unusual sociology class chose its next task: to analyze baffling fluctuations in Puyallup’s standardized test scores.

Though few invite students so directly into the act, almost every school bent on improvement must face the problem of backing up its decisions with meaningful information. But what kinds of questions should schools ask, and of whom? Who should be hearing the answers, and what will they do with them? Puyallup’s R&D course goes to the heart of these issues, with its unequivocal insistence that research should first shed light on whole-school problems, and then contribute to solving them. More, it demonstrates that a school can chart and analyze its progress from within, not just submit to outside “evaluation.” By helping frame the questions, collect the data, and assess the results, an entire school community can create as well as document a cycle of ongoing improvement.

A Decade of Demonstration

As the Coalition of Essential Schools enters its second decade, it has placed new emphasis on schools’ demonstrating and documenting their implementation of Theodore Sizer’s Nine Common Principles. A Futures Committee report urges CES members to help create “nationally shared but locally defined” measures combining objective, subjective, and performance-based data, in order to show how school initiatives support greater student achievement.

Such efforts at measuring progress serve a number of functions. With some 900 schools using its philosophy, both the Coalition and its public (whether funders or families) need assurance that its work makes a difference to student learning. State education departments want evidence that Essential schools meet their newly emerging standards, and districts want to compare student achievement with that in schools using different educational strategies. Parents and students keep an eye out for individual improvements in everything from kids’ attitudes toward learning to the kind of assignments they bring home. And teachers watch for signs that changes in their own practice are bearing fruit.

The questions one asks to satisfy each of these audiences may be very different, though at times they intersect. But to collect and organize that information into useful and meaningful forms can be a daunting and time-consuming affair. If schools want to look honestly at how they are doing, they must first identify and prioritize their goals, then select “indicators”: questions for which they can obtain reliable and valid data to follow across categories and over time. Taking the time early on to involve key stakeholders in making these matters explicit may, in fact, prove the critical step in the success of a school’s change effort.

Whose Voices to Hear?

Who chooses the questions and how they go about answering them inevitably affects the picture that a school constructs of its progress. At Philadelphia’s Academy for the Middle Years (AMY), for instance, a team of parents, teachers, and students worked for three years with researchers from the University of Pennsylvania and the nonprofit group Research for Action to define their central question and collect qualitative data that addressed it. “What we found caused us eventually to shift the question,” says ethnographer Jody Cohen. “First the school asked how well it prepared students for high school; but after we saw how little the conventional high schools they went on to reflected our own aims, it made more sense to ask how well it prepared kids for life.”

Coached by their university partners, AMY students, parents, and alumni conducted focus groups and interviews, analyzed transcripts, and shadowed students through the school day before they came up with their recommendations for action. This “rich, delicious” process of “participatory evaluation” at once empowers its participants and makes it more likely that their programs will work, argues City University of New York researcher Michelle Fine, who led a similar study at New York’s Crossroads School. It accustoms schools to a culture of inquiry in a “safe context,” and because feedback comes at various levels and in many voices and perspectives, it nurtures multiple constituencies for reform.

Peggy MacMullen, who has assembled and analyzed for the Coalition an array of 141 research studies that in some way involve Essential School reforms, calls such documentation efforts “invisible studies”; they are meant not for an external audience but rather to help the school staff improve its practice based on the answers to questions posed internally. Nonetheless, she suggests, giving the school community a sampling of the methods and results helps establish that serious self-study is under way.

Outside researchers conducting case studies in collaboration with the school offer another voice, providing evidence of change without stripping data of authenticity and context. Over the last five years more than two dozen case studies have described CES member schools (often concealing the school’s identity when the work sees print). Some report on individual schools; others analyze data across sites to draw more general conclusions.

Donna Muncey and Patrick McQuillan’s School Ethnography Project, for instance, documented the consequences of reform efforts in eight early Essential schools from 1986 to 1991 and analyzed their common problems and characteristics. In contrast, the Coalition’s School Change Study, led by researchers Patricia Wasley, Richard Clark, and Robert Hampel, produced 25 individual “snapshots” of five schools with which research teams had worked for three years. The Muncey and McQuillan study lent an outsider’s more distanced perspective on certain school reform issues; the Wasley process aimed both to study each school and to support its change efforts simultaneously.

Less formally, many teachers have begun to document and reflect on their own experiences in writing. The Annenberg Institute for School Reform is working with the National Writing Project and the Bread Loaf School of English to encourage teachers to form action research teams and write up their findings. And increasingly, teachers’ voices are showing up in regional and national publications affiliated with school reform.

Advances in communications technology, such as desktop publishing and cable TV, can also help schools publicize their work to their communities. The Coalition’s series called Performance, which features individual schools’ progress in 1,500-word articles intended for an audience of funders, the media, and the public at large, provides a useful model for such communication. Linda Quinn puts out a weekly “Direct Line” chronicling Puyallup High School’s progress, as well as publishing an impressive annual report that lays out the larger picture. Mt. Everett Regional School in Sheffield, Massachusetts, broadcasts a twice-weekly talk show on cable television to five surrounding communities. And the new Francis W. Parker Charter School in Fort Devens, Massachusetts, publishes a regular newsletter describing its efforts to implement a project-based integrated secondary curriculum.

To keep track of the perceptions and priorities of students, teachers, parents, and community members, many schools conduct periodic surveys of these groups. The National Study of School Evaluation in Schaumburg, Illinois, customizes such inventories for schools in both English and Spanish, as well as tabulating and analyzing their results; the National Association of Secondary School Principals offers its own surveys and tabulation services. And many schools, such as Noble High School in Berwick, Maine, work with nearby university partners to devise and tabulate surveys based on their particular concerns.

All these forms of documenting school change reflect a wide array of voices and standpoints. The more diverse such efforts, the deeper and richer a picture emerges of just what a school has accomplished in its change efforts, and where it has yet to go. Some means of documentation necessarily duplicate or overlap each other; some represent more distanced, objective perspectives than others. As a result, new questions arise, new ways to seek answers emerge, and new goals can reflect the information schools have acquired.

What to Measure?

It helps if schools throughout the Coalition use similar measures to keep track of what they are doing. Comparing progress along the same indicators used by large empirical studies gives Essential schools added credibility when they argue their effectiveness. And agreeing en masse to follow alternative indicators allows the Coalition to exert considerable leverage as to what kinds of information large studies collect.

Toward this end, CES researcher Molly Schen is preparing a list of “common” and “uncommon” measures that, taken together, can provide a multifaceted picture of school progress. (See below.) Before collecting any such data, Schen notes, schools should think about how they might someday want to sort it (by gender, race, ethnicity, grade level, teams, or other variables), so they can obtain this information from the start. It makes sense as well to establish a baseline and gather the same data from year to year, and to budget time and effort on someone’s part to collect it.
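To make Schen’s advice concrete, here is a minimal sketch in Python of one way a school might record each indicator observation with the sorting variables attached from the start; the field names, categories, and figures are invented for illustration, not drawn from any school’s actual data or from a prescribed CES format.

```python
from collections import defaultdict

# Each observation carries the sorting variables (year, grade, gender,
# team) from the start, so the same records can be disaggregated later
# without a second round of collection. All values are invented.
observations = [
    {"year": 1994, "grade": 9, "gender": "F", "team": "A",
     "days_attended": 168, "days_enrolled": 180},
    {"year": 1994, "grade": 9, "gender": "M", "team": "B",
     "days_attended": 172, "days_enrolled": 180},
    {"year": 1995, "grade": 10, "gender": "F", "team": "A",
     "days_attended": 175, "days_enrolled": 180},
]

def attendance_rate(rows, group_by):
    """Average attendance rate, grouped by any recorded variable."""
    attended = defaultdict(int)
    enrolled = defaultdict(int)
    for row in rows:
        attended[row[group_by]] += row["days_attended"]
        enrolled[row[group_by]] += row["days_enrolled"]
    return {key: round(attended[key] / enrolled[key], 3) for key in attended}

# Comparing years gives the baseline trend the text recommends...
print(attendance_rate(observations, "year"))    # {1994: 0.944, 1995: 0.972}
# ...and the same records can answer questions no one anticipated.
print(attendance_rate(observations, "gender"))  # {'F': 0.953, 'M': 0.956}
```

The point is less the tabulation than the habit it enforces: deciding up front which variables every record must carry, so that later questions do not require starting collection over.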

Rather than sampling successive cohorts of students for the same information, Schen and MacMullen recommend picking one group to follow throughout its Essential school experience. Unless student mobility is a major problem, this technique reveals more about how changes work out; Noble High School, for example, showed substantial gains across the board for its class of 1995, the first to experience restructuring for all four high school years. Other schools, like Baltimore’s Walbrook High School, also keep tabs on how students fare during the four years after graduation.

Essential schools have the particular task of charting progress by indicators that reflect their common beliefs. Ted Sizer’s Nine Common Principles call for a dual focus on intellectual development and a sense of community; so to document its effectiveness, an Essential school must sort out clear ways to show that such a focus is developing.

In Illinois, the Alliance of Essential Schools offers seven categories of information that reflect the Nine Common Principles, so member schools might keep track of their progress. (See: What Indicators Might an Essential School Follow?) And in a more empirical and scholarly setting, a list of indicators for “restructuring practices” came in 1995 from a set of longitudinal studies conducted by the U.S. Department of Education’s research center, which Fred Newmann directs at the University of Wisconsin.

That five-year research effort looked at the effects of restructuring on the achievement of a huge national sample of students through high school, and took a more intensive look at 24 schools (twelve of which were Coalition members). It identified specific areas of school organization that most affected student learning: in particular, the movement away from a bureaucratic or more traditional form of organization to a more communal one. It defined “authentic instruction” and linked it definitively to improved student achievement. And it demonstrated that the positive effects of restructuring show up equitably across lines of race, ethnicity, and socioeconomic status.

In setting out clear constructs by which to measure the elements of school restructuring, the Wisconsin Center added scholarly weight to the efforts of Essential schools whose practices can be similarly defined. Its studies defined and made measurable many things Essential schools care about; and its coldly empirical findings seem clearly to justify and support the effectiveness of Essential school principles.

Documenting Thoughtfulness

Intellectual quality in the classroom, Fred Newmann and his colleagues concluded, mattered even more to student achievement than innovative organizational structures, techniques, or procedures. But exactly how does one document thoughtful teaching and learning in a school that has declared intellectual focus to be a top priority? What “indicators” of thoughtfulness exist, and how can schools follow them?

Newmann answered by developing definitions and standards for “authentic” instruction, assessment, and student work, along with a scoring method with which faculty could document their presence in one another’s classrooms. His 1995 Guide to Authentic Instruction and Assessment helps teachers use these standards to assess and improve their schools’ intellectual focus.

Researchers Tom McGreal and Marci Dodds, working with the Illinois Alliance of Essential Schools, suggested such indicators as the ratio of paper-and-pencil tests to performances and exhibitions, the ratio of teacher talk to student talk in classrooms, and the ratio of heterogeneous to homogeneous groups. An increasing number of Essential schools are using the “tuning protocol” that Joseph McDonald and David Allen devised as a way of assessing the quality of student exhibitions, and Allen is publishing several other such protocols in a forthcoming book. “These measures may not lend themselves to large aggregated databases for comparative information,” says Peggy MacMullen, “but networks of schools could certainly use the same measures and study the results together.”
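By way of illustration, the short Python sketch below shows how one of these ratio indicators might be tabulated the same way each year, so that a faculty or a network of schools could study the trend together; the yearly counts are invented for the example, not reported results.

```python
# Hypothetical yearly tallies of assessment types, gathered the same
# way each year so the indicator stays comparable over time.
assessment_counts = {
    1993: {"paper_pencil": 120, "performance": 15},
    1994: {"paper_pencil": 104, "performance": 31},
    1995: {"paper_pencil": 88,  "performance": 52},
}

for year in sorted(assessment_counts):
    counts = assessment_counts[year]
    # The McGreal-Dodds-style indicator: paper-and-pencil tests
    # relative to performances and exhibitions; a falling value
    # suggests a shift toward performance-based assessment.
    ratio = counts["paper_pencil"] / counts["performance"]
    print(f"{year}: {ratio:.1f} paper-and-pencil tests per performance")
```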

Along the same lines, how can Essential schools document their growing sense of community? A shared commitment to common beliefs and principles leads unequivocally to higher student achievement, recent research by University of Chicago sociologist Anthony Bryk and others has shown. Keeping track of whether the adults in a school share values, a common agenda of activities, and a collegial pattern of relations can shed meaningful light on a school’s improvement. The presence of such “professional community,” as some researchers call it, can be followed through focus groups, “cognitive maps,” or surveys, as well as by tracking the kinds of practices that indicate it.

Making Research Live

Solid research puts steady legs under a school reform effort, but it can also breathe life into a school’s work. By broadening a community’s understanding, Bryk reminds us, ongoing research can catalyze new ideas, signal problem areas, offer conceptual frames in which to discuss issues, and provide useful information for brainstorming about possible solutions.

One intriguing approach that supports these aims involves compiling an “inventory of assets” that a school community commands in its quest for improvement. Conceived as a community development tool by researchers at Northwestern University, it is being tried out at the Memphis site of the Atlas Communities Project, a consortium funded by the New American Schools Development Corporation (NASDC) and jointly developed by the Coalition of Essential Schools, Harvard University’s Project Zero, Yale’s School Development Program, and Education Development Center.

“When we gather information on a school’s resources, we’re going beyond data collection,” says Ron Walker, a leadership coordinator for Memphis. “No matter how impoverished the community, everyone in it has skills and expertise that can support the work of the school. This replaces the deficit model of improving schools with information about our assets, so we can define and solve our problems together.”

With the same aim of folding accountability into action, California asks schools in its state restructuring initiative to participate in an ongoing self-review process. “We build the system around reviewing real kids and their work,” says Steven Jubb, a director at the state Center for School Restructuring. “Data like standardized test scores are just not rich enough to make the link between teaching strategies and the work kids do.” Schools also host visits from “critical friends” and prepare a portfolio that demonstrates their progress using a common performance rubric. (See below.)

New York’s New Compact for Learning has undertaken similar work in its School Quality Review Initiative, a two-part process of self-study and “external” review that takes place in a five-year cycle. An internal review team involves the whole faculty in a four-year assessment of teaching and learning, and prepares a school portfolio to document its collective perspective, questions, and expectations. And a team of teachers and administrators from other districts as well as parents and community members visits the school for an intensive week of observation, interviews, and looking at student work, and writes a report to the staff. The upshot is a faculty-generated plan of action aimed at continuous improvement and at a “culture of ongoing review.”

In the end, any documentation effort worth its salt will put good questions at its center, answering them with as many different kinds of evidence as possible. At the level closest to students, it will seek out evidence that kids are engaged in meaningful work and experiences. At the district and state levels, it will look for policies and spending decisions that support schools’ capacity to make changes and provide equitable opportunities for student learning. The synergy among these factors complicates the task of documenting school progress, but it also keeps it honest. The messy, living process of changing an organization may not submit to review in any less messy, living way.