My Take on Dan Walters’ Take on the Legislative Analyst’s Take on Closing the K-12 Achievement Gap

You knew it was going to happen.  Dan Walters, formerly with the Sacramento Bee and now with CalMatters, has weighed in with his take on the recent report from California’s Legislative Analyst’s Office on closing the K-12 achievement gap.  His post on the CalMatters website is here.  The LAO report is here.  

After reviewing data showing the size and scope of the achievement gaps, the LAO proposes four options for legislative action:  (1) make achievement gap information more readily available, (2) monitor efforts to improve school leadership, (3) create standards for reviewing districts’ academic plans, and (4) establish an academic assistance program for the lowest-performing districts.  It is this fourth recommendation that Walters finds the most “intriguing.”

He starts with the premise that, as he puts it, “LCFF’s [Local Control Funding Formula] underlying assumption was—and is—that expanding spending would automatically produce better outcomes.”  This is patently untrue.  As one who was at the table when the LCFF was being negotiated, I can report that many hours were devoted to tying LCFF dollars to developing a system of local planning and accountability.  It was never assumed that improvement would automatically follow from the redistribution of education dollars.

The planning process must involve the entire school community and locally developed action plans must address eight state priorities.  District plans are subject to review by county offices of education.  A system of targeted support and assistance is being developed for districts that fail to meet performance benchmarks.  One can argue that the system has not worked as intended, but one cannot argue that the LCFF dollars were expected to “automatically” produce improved results.  

Nor can one argue that the LCFF resulted in “expanded” spending.  Rather, it redistributed among districts dollars that would have been provided anyway pursuant to the minimum funding requirements of Proposition 98.  True, it resulted in some districts getting more than they otherwise would have, but it did not increase total statewide funding.  In fact, it took several years after the LCFF was enacted for funding to return to pre-recession levels.  

These misperceptions underpin his focus on the proposal for an academic assistance program as the “most intriguing” of the four recommendations.  This, he states, would be a way to “crack down” on chronically underperforming districts.  This big stick approach has been at the core of federal and state accountability systems since the No Child Left Behind Act, and it’s still not working.  However, it does appeal to those who believe that bad schools shoulder the primary blame for poor student performance.

On this point, the LAO report includes an important caveat.  While acknowledging that achievement gaps are influenced by many factors outside as well as inside the education system, the report—pursuant to legislative direction—focuses only on the school factors.  Outside factors, according to the LAO, include “family characteristics, residential segregation, health disparities, and public safety issues.”  We could also add childhood trauma and food insecurity to this list.  I’m about to propose a theorem:  the desire to crack down harshly on underperforming schools is positively correlated with the lack of desire to seriously address the out-of-school influences on student achievement.

It must be acknowledged that Mr. Walters’ preference for recommendation #4 is not without reservation. Whereas the LAO recommends that state academic intervention be voluntary, he believes it should be mandatory.  He supports this opinion by drawing a comparison to the Fiscal Crisis and Management Assistance Team (FCMAT) and noting that few districts volunteer for FCMAT services: “School officials would be just as unlikely to self-report educational failures as they are to call attention to their financial shortcomings.”  In other words, not very likely.  Here again he subverts the truth.  It took me about 90 seconds on the Internet to learn that roughly 80% of FCMAT’s local assistance activities are at the request of the local education agency and only 20% are pursuant to direction from the Legislature or other agency.

For what it’s worth, I believe the most important recommendation from the LAO report is to monitor implementation of the California School Leadership Academy.  I would go further and say we need to increase investments in leadership training.  Among the many schools and school districts I have visited, the common thread among the top performers is strong, intelligent, dedicated, and even inspirational leadership.  Policy makers readily acknowledge this, but it is more often in the form of lip service than actual funding.  Making wise and significant investments in developing effective school leaders could reduce, if not eliminate, the number of districts we need to crack down on.  Isn’t that what we really want?

California’s Local Control Funding Formula Fails Moderate-Need Districts

The Public Policy Institute of California (PPIC) has just issued a report on how (or if) the Local Control Funding Formula (LCFF) has changed school funding in California (“School Resources and the Local Control Funding Formula”).  The report asks, “Is increased spending reaching high-need students?”  The answer is:  somewhat.

The report finds that spending per student increased by an average of $500 in high-need districts as compared to low-need districts, but that, on average, funding for the typical high-need student increased by only $350 relative to non-high-need students.  (A high-need district is defined as one with more than 55% high-need—or “unduplicated count”—students.  A low-need district has fewer than 30% high-need students.  And a moderate-need district has 30 to 55% high-need students.  I know, 30% and 55% fall into two different categories, but this is probably an inconsequential oversight.)

The disparity between the $500 increase for high-need districts and the $350 increase for high-need students is explained by two factors. One, the LCFF provides less additional funding for high-need students in low- and moderate-need districts (due to the concentration factor that is provided only to high-need districts); and two, districts may not necessarily spend all of their incremental LCFF dollars on the students who generate them.
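The first factor can be sketched in a few lines.  This is a simplified illustration, not the full formula (actual LCFF base rates vary by grade span, and the rates have been adjusted over time); the 20% supplemental and 50% concentration rates reflect the formula as originally enacted, and the $10,000 base grant is a hypothetical round number:

```python
def lcff_per_pupil(base, upp):
    """Simplified LCFF funding per pupil.

    base -- base grant per pupil (hypothetical; actual rates vary by grade span)
    upp  -- unduplicated pupil percentage, as a fraction (0.0 to 1.0)
    """
    supplemental = 0.20 * base * upp                    # 20% of base, scaled by high-need share
    concentration = 0.50 * base * max(0.0, upp - 0.55)  # applies only above the 55% threshold
    return base + supplemental + concentration

# The same high-need student generates more dollars in a 70% district
# than in a 40% district, because only the former draws concentration funds.
print(lcff_per_pupil(10_000, 0.70))  # 10,000 + 1,400 + 750 ≈ 12,150
print(lcff_per_pupil(10_000, 0.40))  # 10,000 + 800 + 0 ≈ 10,800
```

The threshold is what creates the cliff: a district at 54% unduplicated pupils gets no concentration money at all, while one at 56% does.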

This second explanation is troubling because, if true, it would mean that districts are not complying with the spirit of the LCFF.  It is also troubling because school-level spending data are not available, so we don’t really know.  As the author acknowledges, “Because it is drawn from district-level spending data, the average difference in spending between high- and low-need schools, and high- and low-need students is based on the spending levels in the district in which that school or student is located [italics in the original].”  Furthermore, the analysis “is predicated on the assumption that districts spend a roughly equal amount on each student.”  Accordingly, I believe the conclusion about per-student spending needs to be tempered with a great deal of caution.  

My own takeaway from this report is that the impact of the LCFF is a mixed bag.  The primary purpose of the new formula is to improve funding equity by allocating more dollars to districts with high-need students.  However, the report shows that high-need districts were already receiving a higher level of funding than other districts under the old revenue limit system—a trend that may have continued, or even increased, depending on funding for categorical programs like Economic Impact Aid.

Specifically, the report shows that, in 2003, high-need districts received 9% more revenue than low-need districts and 11% more than moderate-need districts, while in 2017 they received 12% more than low-need districts and 16% more than moderate-need districts.  So, high-need districts gained relative to both of the other types of districts.

Meanwhile, as compared to low-need districts, moderate-need districts received 2% less in 2003 and nearly 4% less in 2017.  In other words, high-need districts gained relative to low- and moderate-need districts, while moderate-need districts lost relative to high-need and low-need districts.  In fact, between 2003 and 2017, funding for moderate-need districts (12%) grew at a lower rate than funding for both high-need districts (17%) and low-need districts (14%).  This finding is counter-intuitive, given that low-need districts receive only the base grant and moderate-need districts receive the base grant plus supplemental grants.  This finding may have something to do with the fact that the LCFF was not fully funded in 2017 and with the inclusion in the formula of the economic recovery target, which ensures that nearly all districts will have their funding restored to pre-recession levels after accounting for inflation.  Still, it’s not what was expected.

(Note:  These calculations are done based on a line graph in the report.  Neither the report nor the technical appendix provides data points for the graph, so the percentages may not be exact.  However, the direction of the changes is accurate.)

For moderate-need districts, the impact of the LCFF has been the opposite of its intent, at least so far.  This is a serious issue that must be addressed. This concern is not mentioned in the PPIC report, which instead focuses on the school- and student-level distribution of LCFF dollars.  But fixing this problem would be a big step toward achieving the improved school- and student-level equity that the PPIC favors.

One solution would be to reduce or even eliminate funding for the concentration factor and use it to increase funding for the supplemental grants.  I know this would be politically difficult (to put it mildly), but so was the shift to the LCFF.  I remember having reams of computer printouts on my desk comparing the district-by-district impact of different versions of the LCFF with each other and the revenue limit system.  Any potential changes to the current LCFF formula will necessarily involve a similar exercise in balancing good policy with political realities.