
The U.S. Department of Education paused the testing requirements of the Every Student Succeeds Act (ESSA) during the pandemic, and the states followed suit. This assessment pause is an opportunity for school board leaders to reframe the concept of accountability and discuss how evidence collected for accountability purposes can inform policy development. Too often, data is used only in service to accountability systems, aiming for higher test scores from one year to the next for successive cohorts of students of the same age.

This is the chance for you, as school leaders, to think beyond that and take a fresh look at your students and teachers. It’s an opportunity to redefine which vital signs of your district matter most, and to build the evidence that will help you measure how healthy your district is. It is a time to ask different questions of your superintendent and research staff, questions that can better inform your governance discussions.

This pandemic-caused interruption also provides a moment to question how your district measures the progress of learning. Learning progress is so frequently mismeasured and misunderstood that a review of your district’s methods would be useful to leadership, staff, and the public.

Are you relying on the right test to estimate growth?

When results from state assessments and the National Assessment of Educational Progress (NAEP) were released in the fall, they showed lower scores for students whose schooling was interrupted. Lower scores were to be expected, as many of these students were in remote learning programs, or not in school at all, for a significant part of two years. Yet these results have been characterized as unprecedented “learning loss,” implying that students had lost skills and knowledge. What students actually lost was critical instructional time and learning opportunities.

What students “lose” or “gain” becomes clearer when tests designed to measure growth, like interim assessments, are given to students several times a year. Interim assessments provide results for each student that can be useful for teachers. State assessments are designed to show the degree to which students are meeting grade-level standards, and they yield useful results for classes, schools, and districts. Any discussion of your district’s data should make clear the distinct purposes of these different assessments.

The degree of imprecision and uncertainty also varies with the type of test. One state assessment given each year, with 45-60 questions covering some standards at each grade level, is unlikely to provide evidence as sound as interim assessments given three times a year, delivering a total of 150-200 questions.

Are you viewing growth from the right vantage point?

Some state departments of education report results longitudinally, comparing the same students from year to year. Others report results cross-sectionally, comparing, for example, the results of this year’s fifth graders to fifth graders in prior years. This is an important distinction. These two methods of measuring progress answer entirely different questions. By closely connecting your evidence to the question you’re attempting to answer, you’ll be more likely to keep interpretations of the evidence within the boundaries of reason.

To expand your board’s knowledge of measured progress beyond what state assessments have made visible, consider asking questions like these:

How does our district assess writing, and at what grade levels and within what subjects?

Do our district’s assessments measure emerging readers’ skills, including reading fundamentals like phonics? Can teachers see multiple measures of reading, for example, comprehension versus decoding?

Do we screen for dyslexia, and act on that information?

What is the distance or spread at each grade level and subject between the lowest- and highest-scoring students? What tools are we providing for teachers to address this challenge, and is it more difficult now because of the interruption of instruction?


Are you gathering data beyond academic achievement?

Along with results from state-mandated assessments and your interim assessments, governance teams should continue to explore what other data the district is collecting. How could that data be built into evidence you use to better understand your district’s vital signs? Is it used to inform recommendations to the board? This is particularly important now in the wake of two disrupted school years. Your district may, in fact, possess information that is only available for a short time and could answer new questions.

Will this kind of discussion be a new practice for your community? Exploring evidence to learn about your district’s vital signs may require some new habits of mind. The accountability era has led many boards to limit their concerns. But now is the moment to think big and think differently.

The interruption of schooling is an opportunity to measure factors other than the progress of learning. A potential treasure trove of information is available to you that can be the basis for deeper thinking and better policy decisions. One example is the growing commitment to measuring social-emotional learning, usually through student, teacher, administrator, and parent surveys.

Those surveys of attitudes and beliefs, for the most part, make it possible to analyze engagement. They allow you to anticipate students’ emotional needs. They provide clues to why parents may not reenroll their kids in your schools. They may enable you to gauge elements of teacher satisfaction and dissatisfaction. You may want to ask principals about the support they need. Exploring the more subtle and complex questions this kind of data suggests may also support more positive governance team relationships.

Some state accountability systems report factors like attendance rates, but perhaps your board wants to see your district’s attendance rates analyzed differently. You may want to look in more detail at excused and unexcused absences, going deeper than the catch-all category of chronic absenteeism. Graduation data, while notoriously difficult to compare and standardize, also will be very important as you analyze the impact of recent events. Clarifying what your board wants to know, and what your district’s leadership wants to be mindful of, should guide your data analysis requests. Staff time is precious. Pick your questions with care.

Board members could ask questions about the range of student responses to remote instruction. Evidence from students whose pace and depth of learning improved while learning from home (rare but important outliers) could reveal opportunities to more fully engage some students who prosper outside the classroom.

Questions like the following may enable your board to see opportunities to improve attendance and student learning, while affirming a previously unrecognized skill some teachers command.

Which teachers reported higher personal satisfaction when delivering instruction remotely?

For those students who favored online instruction, in which courses and with which teachers did they report better learning experiences?

Which students have continued in remote learning after we reopened schools, and why?

What data do we have to assess any differences in achievement between in-person and remote learning students?

Are there significant differences among the achievement of students in remote learning in different grades and subjects? What about the impact of various teachers and school leadership?

Many districts are deciding whether to continue to offer supervised remote learning. Before you get to the discussion of the costs and additional administrative supervision required, you will want to explore what benefits can be seen and measured. Of course, the board will need to respond to students’ and parents’ wishes and to think about all the complex issues that will arise.

Exploring questions like those listed above can provide governance teams with evidence on which to base a thoughtful and serious policy debate. This year following the pandemic is our opportunity to broaden and clarify our interpretation of education data.

Pursue research-practice partnerships

New questions open up research opportunities. Many universities and colleges have professors and graduate students seeking opportunities like those your governance discussions may generate. Partnerships with research institutions represent an opportunity to unearth new insights and change our ideas about what school is and can be. Research-practice partnerships supported by foundations can connect academic work with governance and program development. See what the William T. Grant Foundation (https://rpp.wtgrantfoundation.org/about) has to say about research-practice partnerships.

Most large urban school districts and county-level school districts have their own research and data analysis departments, and their staff likely will have the capacity to respond to many of the requests for information that arise from your governance team discussions. Even so, the research questions that emerge may offer a wealth of opportunities for educational researchers. Small and mid-size districts may be able to add analytic talent through these partnerships.

I hope that school boards will take this opportunity to deepen their knowledge of assessment and broaden their discussions of other indicators of the district’s vital signs. We have paid excessive attention to accountability data and too little to other evidence. We can change that by asking the right questions.

Jill Wynns (jillwynns@gmail.com) is a 24-year veteran of the San Francisco Board of Education and served as the president of the California School Boards Association. She is the co-author with Steve Rees of Mismeasuring Schools’ Vital Signs: How to Avoid Misunderstanding, Misinterpreting and Distorting Data, recently published by Routledge.