The Demeaning Power of PISA

Last December PISA released the results of its 2018 international assessments.  PISA is the acronym for the Programme for International Student Assessment, which is developed by the Organisation for Economic Co-operation and Development (OECD) and administered to 15-year-old students once every three years.  In 2018 it was administered in 78 countries.  In the United States it was administered to 4,811 public and private school students from 162 schools.  Statistical techniques are used to extrapolate the scores of that sample to all of the more than 4 million 15-year-olds in the U.S.  Countries are ranked on the basis of standardized scores in the areas of science, math, and reading.  In 2018, the United States ranked 37th in math, 18th in science, and 13th in reading.  Some have questioned the validity of these techniques and argue that the “true” ranking of a country can vary significantly from that reported by PISA.

The rankings for the U.S. haven’t changed much since PISA was first administered in 2000.  And every three years, the scores trigger a fresh round of handwringing and angst over the sad state of American education and the failure to improve.  This time is no different.  The headline in the New York Times read, “It Just Isn’t Working: PISA Test Scores Cast Doubt on U.S. Education Efforts.”  In the Boston Globe: “School Reforms Fail to Lift US on Global Test.”  And the LA Times chimed in with, “Why do U.S. schoolchildren underperform academically compared with students in other countries?”  Based on prior experience, the U.S. reaction to the PISA scores would probably have been more intense if the release of the scores had not coincided with the impeachment of the president and the daily diet of the latest news on Trump, Ukraine, Giuliani, Biden, etc.

And the U.S. is not alone.  For example, the Sydney Morning Herald wonders, “Should Australia Be in PISA Shock?”  And the Guardian reports, “UK school reforms to come under scrutiny as world rankings released.”

Such is the demeaning power of PISA.  It has the ability to throw entire nations into doubt over the state of their education systems.  But is this power justified?  In a word, no.  Here’s why.

First, and most critically, any comparison requires the populations to be compared to be similar, but that’s not the case with PISA.  For example, the scores for China are based on only four provinces (Beijing, Shanghai, Jiangsu and Zhejiang), which are the most affluent in the country.  The remaining, lower-performing provinces are excluded.  The OECD permits this, even though it admits that affluence is positively correlated with test scores.

Similarly, Macau, which ranks third on PISA, is expected to become the richest country in the world (as measured by gross domestic product per capita) this year.  Singapore, which ranks second on PISA, is the third richest country in the world based on the same GDP per capita measure.  Singapore’s scores also reflect an education system that is based on rote memorization and test preparation beginning in preschool.  Most students in Singapore also receive several hours per week of private tutoring in addition to attending public schools.  Test scores are used to place students in one of four tracks in high school.

In contrast, the U.S. has proportionately more students in poverty than most other post-industrial countries.  This brings the U.S. scores down, even though low-income students in the U.S. actually outperform their peers in other post-industrial countries.  They simply account for a higher percentage of the U.S. test-taking population.  To make matters worse, there is some evidence that the U.S. oversampled low-income students in the past.

Participating countries also treat special needs students in different ways.  PISA has non-binding “guidelines” that govern the exclusion of students from the testing sample.  Students may be excluded if they have “functional” (i.e., physical) disabilities, “intellectual” (i.e., mental or emotional) disabilities, or “insufficient language experience.”  This further reduces comparability of testing samples, because countries decide for themselves which (if any) students to exclude and what criteria to use.  The U.S. includes special needs students in its sample.  These students comprise the lowest scoring subgroup on standardized assessments.

Valerie Strauss of the Washington Post (who I believe is the best education reporter in America) notes that in 2014 more than 100 academics from around the world called for a moratorium on PISA.  She writes that the reasons include: “PISA feeds into an overreliance on standardized tests and an emphasis on learning that can be easily measured and, some experts say, has major flaws with how the tests are administered, how samples of students are determined, and how some of the test questions are constructed.”

Strauss’s article (available here) includes a post by Yong Zhao, a Foundation Distinguished Professor in the School of Education at the University of Kansas, which summarizes common criticisms of PISA.  These include:

  • The claim that PISA measures knowledge and skills essential for the modern society or the future world is not based on any empirical evidence. 
  • PISA’s assumption that there is a set of measurable skills and knowledge that are universally valuable in all societies, regardless of their history and future, is problematic. 
  • PISA treats economic growth and competitiveness as the sole purpose of education.

This article is well worth reading and is a good start to understanding the problems with PISA.

My Take on Dan Walters’ Take on the Legislative Analyst’s Take on Closing the K-12 Achievement Gap

You knew it was going to happen.  Dan Walters, formerly with the Sacramento Bee and now with CalMatters, has weighed in with his take on the recent report from California’s Legislative Analyst’s Office on closing the K-12 achievement gap.  His post on the CalMatters website is here.  The LAO report is here.  

After reviewing data showing the size and scope of the achievement gaps, the LAO proposes four options for legislative action:  (1) make achievement gap information more readily available, (2) monitor efforts to improve school leadership, (3) create standards for reviewing districts’ academic plans, and (4) establish an academic assistance program for the lowest-performing districts.  It is this fourth recommendation that Walters finds the most “intriguing.”

He starts with the premise that, as he puts it, “LCFF’s [Local Control Funding Formula] underlying assumption was—and is—that expanding spending would automatically produce better outcomes.”  This is patently untrue.  As one who was at the table when the LCFF was being negotiated, I can report that many hours were devoted to tying LCFF dollars to developing a system of local planning and accountability.  It was never assumed that improvement would automatically follow from the redistribution of education dollars.

The planning process must involve the entire school community and locally developed action plans must address eight state priorities.  District plans are subject to review by county offices of education.  A system of targeted support and assistance is being developed for districts that fail to meet performance benchmarks.  One can argue that the system has not worked as intended, but one cannot argue that the LCFF dollars were expected to “automatically” produce improved results.  

Nor can one argue that the LCFF resulted in “expanded” spending.  Rather, it redistributed among districts dollars that would have been provided anyway pursuant to the minimum funding requirements of Proposition 98.  True, it results in some districts getting more than they otherwise would have, but it did not increase total statewide funding.  In fact, it took several years after the LCFF was enacted for funding to return to pre-recession levels.  

These misperceptions underpin his focus on the proposal for an academic assistance program as the “most intriguing” of the four recommendations.  This, he states, would be a way to “crack down” on chronically underperforming districts.  This big stick approach has been at the core of federal and state accountability systems since the No Child Left Behind Act, and it’s still not working.  However, it does appeal to those who believe that bad schools shoulder the primary blame for poor student performance.

On this point, the LAO report includes an important caveat.  While acknowledging that achievement gaps are influenced by many factors outside as well as inside the education system, the report—pursuant to legislative direction—focuses only on the school factors.  Outside factors, according to the LAO, include “family characteristics, residential segregation, health disparities, and public safety issues.”  We could also add childhood trauma and food insecurity to this list.  I’m about to propose a theorem:  the desire to crack down harshly on underperforming schools is positively correlated with the lack of desire to address seriously the out of school influences on student achievement.

It must be acknowledged that Mr. Walters’ preference for recommendation #4 is not without reservation. Whereas the LAO recommends that state academic intervention be voluntary, he believes it should be mandatory.  He supports this opinion by drawing a comparison to the Fiscal Crisis and Management Assistance Team (FCMAT) and noting that few districts volunteer for FCMAT services: “School officials would be just as unlikely to self-report educational failures as they are to call attention to their financial shortcomings.”  In other words, not very likely.  Here again he subverts the truth.  It took me about 90 seconds on the Internet to learn that roughly 80% of FCMAT’s local assistance activities are at the request of the local education agency and only 20% are pursuant to direction from the Legislature or other agency.

For what it’s worth, I believe the most important recommendation from the LAO report is to monitor implementation of the California School Leadership Academy.  I would go further and say we need to increase investments in leadership training.  Among the many schools and school districts I have visited, the common thread among the top performers is strong, intelligent, dedicated, and even inspirational leadership.  Policy makers readily acknowledge this, but it is more often in the form of lip service than actual funding.  Making wise and significant investments in developing effective school leaders could reduce, if not eliminate, the number of districts we need to crack down on.  Isn’t that what we really want?

Finding the Right Balance under California’s Local Control Funding Formula

Several years ago, the Ford Motor Company’s advertising claimed that, in its cars, quality was built in, not added on.  I thought that was a pretty catchy slogan and I am reminded of it when I think about the problem of how to balance supplemental educational services with the “base” program under the requirements of California’s Local Control Funding Formula (LCFF).  The LCFF provides funding to districts in three tiers:  base, supplemental (based on the number of high-need students), and concentration factor (provided to districts whose percentage of high needs students exceeds 55%).  High-need students are defined as those who are eligible for free or reduced-price meals, English learners, or in foster care.
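For readers who want the mechanics, the three tiers combine roughly as follows.  The 20% supplemental rate and 50% concentration rate below are the rates in the formula as originally enacted; the per-ADA base grant is purely illustrative (actual base grants vary by grade span and year).  A minimal sketch:

```python
def lcff_funding_per_ada(base_grant, unduplicated_pct):
    """Sketch of the LCFF three-tier calculation for one district.

    base_grant        -- base funding per unit of average daily attendance
                         (illustrative; actual grants vary by grade span)
    unduplicated_pct  -- share of high-need ("unduplicated count") students, 0-1
    """
    supplemental = 0.20 * base_grant * unduplicated_pct
    # Concentration funds flow only for the share of high-need students
    # above the 55% threshold.
    concentration = 0.50 * base_grant * max(0.0, unduplicated_pct - 0.55)
    return base_grant + supplemental + concentration

# A district at 70% high-need students, with an illustrative $8,000 base grant:
# base 8,000 + supplemental 1,120 + concentration 600 ≈ 9,720 per ADA
print(lcff_funding_per_ada(8000, 0.70))
```

Note the hard 55% threshold: a district at 54% high-need students receives no concentration funds at all.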

Just as leather seats and fancy chrome trim cannot make up for an underpowered engine or whatever we had before disc brakes, so high quality supplemental educational programs cannot fully compensate for a low-quality base program.  While providing supplemental programs and services is often necessary to address the needs of high-need students, we also want those students to have access to the highest-quality classroom instruction—the “base” program—where they spend most of their school day.  In other words, the value students gain from supplemental services may be diminished if those services come at the expense of maintaining a high-quality base program.  I think most people would agree that quality that is built in is better than quality that is added on.

Finding the right balance between base and supplemental programs in a zero-sum environment is not easy. This has been a bone of contention since the enactment of the LCFF, with advocates for high-needs students often objecting to the use of supplemental and concentration funds for base or core educational expenses instead of increasing or improving supplemental programs and services to specifically benefit the target students.

Two recently introduced bills in the California State Assembly, AB 1834 and AB 1835, both by Assembly Members Shirley Weber and Sharon Quirk-Silva, are sure to provide focal points for this debate during the upcoming legislative session.  AB 1834 requires the development of a mechanism to better track the expenditure of supplemental and concentration grant funds at the local level.  AB 1835 requires carryover supplemental and concentration grant funds to continue to be used to increase or improve services for the target students in subsequent fiscal years.

Nowhere does this debate loom larger than in the area of teacher compensation, which—after all—accounts for the largest single share of any school district budget.   In response to a question from the Fresno County Superintendent of Schools as to whether supplemental and concentration funds could be used to pay for across the board salary increases for teachers, the California Department of Education (CDE) released a letter dated April 14, 2015 that stated that “In some limited circumstances, it might be possible to demonstrate in an LCAP [Local Control and Accountability Plan] that a general salary increase will increase or improve services for unduplicated pupils.  However, the burden on a district to justify use of supplemental and concentration funds for such an increase is very heavy.”  The letter went on to state that “in our view, this additional burden…is extremely [difficult], if not impossible, to meet…”

However, this letter was followed by a notice to school administrators dated June 10, 2015, which sought to clear up some “misunderstandings” from and “supersedes” the April 14 letter.  The June 10 notice stated that “A district may use supplemental and concentration funds for a general salary increase in a manner consistent with the expenditure regulation and LCAP Template regulations.”  To do so, however, a district “must demonstrate in its LCAP how this use of the grant funds will increase or improve services for unduplicated pupils as compared to services provided to all pupils.”  And, “For example, a district may be able to document in its LCAP that its salaries result in difficulties in recruiting, hiring, or retaining qualified staff which adversely affects the quality of the district’s educational program, particularly for unduplicated pupils, and that the salary increase will address these adverse impacts.”

Objections to this interpretation were typified by comments to the LA Weekly from Assembly Member Shirley Weber, the lead author of AB 1834 and AB 1835:  “Once you open this up, you open up something else, and then you find yourself in a position having taken this money for these schools and these kids and not being able to produce the results.”  Using money for teacher compensation, she argued “is not what we intended at the state level.”

The fact that a major portion of the LCFF revenue stream is called “supplemental funding” strongly suggests that it is to be used for programs and services that would supplement—or be added to—the base program, rather than for directly improving the base program itself.  Advocates for the students that generate the supplemental and concentration grant funding are insistent that those funds be spent on additional programs and services for those students, and they want the expenditure of those funds to be transparent in school district budgets.  But does an unbending adherence to this practice run the risk of having schools in which the quality is added on, but not built in?

In this context, it’s worth revisiting the paper that provided the analytical and policy underpinnings of the LCFF, “Getting beyond the Facts:  Reforming California School Finance” by Alan Bersin, Michael Kirst, and Goodwin Liu.  You can read it here.

In that paper, the authors lay out four principles for reform:  (1) revenue allocations should be guided by student needs; (2) revenue allocations should be adjusted for regional cost differences (the LCFF does not do this); (3) the system as a whole should be simple, transparent, and easily understood by legislators, school officials, and the public; and (4) reforms should apply to new money going forward, without reducing any district’s current allocation.  

The first two principles are relevant to my point and are based on the authors’ findings that “high-poverty districts receive only slightly more revenue per ADA than low-poverty districts” and “district revenue per ADA does not reflect the regional cost of hiring school personnel.”  The main problem with schools in these areas was not that they didn’t have enough supplemental programs, but that the base program was threadbare.  They had insufficient instructional materials, science labs that were either poorly equipped or non-existent, dilapidated facilities, higher student-teacher ratios, and more teachers who were not fully credentialed. 

These are the problems the LCFF is intended to address.  (Incidentally, these are the same kinds of issues that were the subject of the complaint in Williams v. State of California, which charged the state with failing to provide students from low income communities and communities of color with the “basic necessities required for an education [emphasis added].”) 

Specifically, with regard to teachers, Bersin, et al. write: “Indeed, high-wage regions of the state tend to have higher student-teacher ratios and a higher percentage of teachers with emergency credentials.  A rational school finance system should strive to ensure that education dollars have the same purchasing power from region to region, especially when it comes to hiring and retaining high-quality teachers.”

It’s true that allocating supplemental and concentration factor dollars within a district to address these problems would benefit all students, not just those who generate them, but it could be more beneficial to the targeted students than alternative expenditures.  I’m not suggesting that the additional funds never be used to provide supplemental services.  Rather, I’m saying that if the purpose of the LCFF is to improve outcomes for needy students, the importance of a strong base program to achieve that goal should not be lost.  

The Public Policy Institute of California recently released a report showing that, since the adoption of the LCFF, districts serving low-need students have increased teacher compensation by a larger amount than districts that serve high-need students despite the former districts receiving smaller total annual revenue increases.  This suggests that the local pressure to avoid using supplemental and concentration funds on teacher compensation is having at least some effect.  It also means that high-need districts may be losing the competition for high quality teachers.

Ohio State University Study: “Schools Don’t Cause the Achievement Gap”

In 2004 Richard Rothstein wrote Class and Schools, which makes a meticulously documented and compelling case that schools alone cannot overcome the effects of poverty to close the academic achievement gap.  The idea that education needs the support of other programs to be successful goes back at least to President Johnson’s proposal for the Great Society.  The Elementary and Secondary Education Act (ESEA) of 1965 was one component of that initiative.  President Johnson recognized that education, although fundamental, was just one of several avenues to the Great Society.  In remarks to the White House Conference on Education in 1965, he said, “Education will not cure all the problems of society, but without it no cure for any problem is possible.”   Neither can education, by itself, overcome all of the problems of society.

But by the time the ESEA was reauthorized in the form of the No Child Left Behind (NCLB) Act, this idea was flipped on its head into the notion that education, by itself, indeed can and should overcome the effects of social and economic inequality to close the academic achievement gap.  All we need is better schools, and the only thing we need to get those better schools is more accountability. 

When NCLB was enacted, I sometimes quipped that, if we were really serious about it, then we should have at least three other supporting laws:  the No Child Goes to Bed Hungry Act, the No Child Goes without Medical Care Act, and the No Child is the Victim of Violence Act.  Although addressing these problems was important to his goal of leaving no child behind, George W. Bush did not launch or expand any federal programs to do so.  In fact, he vetoed a bill to expand the Children’s Health Insurance Program (CHIP) and signed a bill to prohibit the federal government from negotiating discounts with drug companies.  He also vetoed the 2007 Farm Bill, which increased food stamp benefits, but his veto was overridden by Congress.  He did nothing to speak of regarding neighborhood safety.  Education stood alone as the one program where “failure” would not be tolerated, even as the out-of-school needs of students went increasingly unaddressed.

Any talk of the obstacles to learning presented by poverty or under-resourced schools was shot down as making excuses.  “No excuse” schools became all the rage, and serious consideration of the need to address directly the conditions of poverty in order to improve academic performance was discouraged.  Better schools were all we needed.

Since the publication of Class and Schools other studies have documented the losing battle that schools are waging against the effects of poverty.  For example, studies have documented that low income (largely inner city) students retain less learning during the summer vacation months.  This results in learning deficits the next school year.  Even if the school produces a full year of academic growth, that deficit is not overcome, and is just compounded the following year.

A new study from The Ohio State University sheds fresh light on this problem.  As reported in the Sociology of Education (you can read it here and read about it here), the study finds no difference in reading gains between schools serving poor or black students and those serving nonpoor or white students.  In announcing the study, the lead author, Douglas Downey said that it “suggests schools are neutral or even slightly compensate for inequality elsewhere,” and challenges “the traditional story about how schools supposedly add to inequality.”  Schools are not the “engines of inequality” they are often portrayed to be.  

Don’t get me wrong.  I will never argue that it is unreasonable to expect more of schools.  But neither is it unreasonable to expect our state and our country to do more to directly address the effects of poverty, such as hunger, poor physical and mental health care, and housing insecurity. To be sure, there is a fine line between using poverty as an excuse and acknowledging poverty as an obstacle that schools alone cannot overcome.  But we’re not doing our students any favors if fear of excuse-making causes us to turn a blind eye to poverty and its effect on learning.  As Downey concluded, “We are probably better off putting more energy toward addressing the larger social inequalities that are producing these large gaps in learning before kids even enter school.”

Amen to that.

Grading ALEC’s “Report Card on American Education”

Every year the American Legislative Exchange Council (ALEC) issues a “Report Card on American Education.”  ALEC is a partly libertarian, partly conservative—some would say right wing—not-for-profit that issues reports and writes model legislation in a number of policy areas, including education.  Its primary interests in education are the expansion of vouchers and charter schools and ridding public schools of unions.  It is funded in large part by the Charles G. Koch Foundation and other like-minded organizations. The 23rd edition of the Report Card was issued this past September.

According to ALEC:

The education policy grade on each state’s Report Card is based on six factors: state academic standards, charter schools, home- school regulation burden, private school choice, teacher quality, and digital learning. Because the Education and Workforce Task Force at ALEC focuses the most on private school choice and charter schools, those factors were given double weight in the calculation of overall rank and grade. The weighted grades were converted into a GPA average and an individual rank [emphasis added]. 

Since most studies (including the most recent one from the National Center for Educational Statistics, which you can read here) conclude that charter schools do not outperform traditional schools, it may seem puzzling that they get double weight in the rating system.  But at least ALEC is upfront about the fact that this methodology is without any analytic foundation:  it’s simply because charter schools, along with private school choice, are the focus of its Education and Workforce Task Force.
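To see how much the double weighting matters, here is a hypothetical version of the calculation.  The six factor names come from ALEC’s description; the 4.0 grade scale and the mechanics beyond “double weight” are my assumptions:

```python
# Hypothetical sketch of ALEC's double-weighted GPA: six factor grades on a
# 4.0 scale, with charter schools and private school choice counted twice.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

WEIGHTS = {
    "state academic standards": 1,
    "charter schools": 2,          # double-weighted
    "home school regulation": 1,
    "private school choice": 2,    # double-weighted
    "teacher quality": 1,
    "digital learning": 1,
}

def alec_gpa(factor_grades):
    """Weighted GPA across the six factors for one state."""
    total = sum(GRADE_POINTS[factor_grades[f]] * w for f, w in WEIGHTS.items())
    return total / sum(WEIGHTS.values())

# A state graded B on everything except charters (A) and vouchers (D):
grades = {
    "state academic standards": "B",
    "charter schools": "A",
    "home school regulation": "B",
    "private school choice": "D",
    "teacher quality": "B",
    "digital learning": "B",
}
print(alec_gpa(grades))  # 2.75
```

With equal weights the same grades would average about 2.83; the double weighting pulls the GPA toward the two school-choice factors (here, downward, because of the D on vouchers).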

Here’s a quick look at the six factors ALEC uses for the Report Card and my own letter grade for each factor.  

State academic standards.  This is based exclusively on “the difference in the percentage of students considered proficient by the state exam and the percentage of students in that state who scored as proficient on NAEP.”  NAEP provides state-level reports for only two subjects—reading and math—and reports them for only three grades—4, 8, and 12.  Using only this measure to assess a state’s education system is a little like using only temperature to assess a patient’s health, and ignoring blood pressure, cholesterol levels, body mass index, and other vital measures.  In addition, because each state has its own methodology for selecting NAEP test-takers, across-state comparisons are iffy at best.

Moreover, ALEC is not clear on how it interprets these data.  If the percentage of students scoring proficient or above in a state is higher than the percentage scoring proficient or above on NAEP, does ALEC take that to mean that the state’s cut scores are set too low, or that the students are high scoring?  Alternatively, if the percentage of students scoring proficient or above in a state is lower than that state’s NAEP percentage, does that mean the state’s cut scores are too high or are its students underperforming?  We don’t know.  But in the end, it doesn’t matter anyway, because drawing either set of conclusions relies on each state’s assessments being perfectly aligned with NAEP so that the scores would be interchangeable.  That is not the case.

Grade:  F

Charter schools.  According to ALEC:

The charter school grade on the Report Card is based on a publication of the Center for Education Reform, which grades charter school laws across a series of factors. The Charter School Law Ranking and Scorecard takes into account features of a state’s charter law that influence how well charters are able to flourish, such as availability of independent authorizers, lack of growth caps, autonomous operation free from legal or regulatory red tape, and funding equity. The handful of states that have not yet passed a charter school law received Fs on their Report Cards. 

Alright, then, let’s take a look at how the Center for Education Reform (CER) grades charter school laws (you can read for yourself here).  CER evaluates state laws based on the extent to which they do the following:

  • Enable citizens to create schools that are independent from traditional school bureaucracies in oversight and operations.
  • Provide schools wide latitude to operate and innovate without onerous administrative rules and regulations, which dictate what they can do and how they can do it. 
  • Give parents numerous, meaningful school options, allowing them to provide their children with an education tailored to individual needs.

Notice the absence of any mention of oversight or accountability.  This is especially remarkable, because the first sentence that CER uses to introduce its guide to charter school laws is this:

The simple and original principle of charter schooling is that charter schools should receive enhanced operational autonomy in exchange for being held strictly accountable for the outcomes they promise to achieve [emphasis added]. 

But accountability plays no role in CER’s evaluation of state charter school laws.  Following CER’s lead, ALEC gives the highest rating to states that have the laxest laws governing the establishment of charter schools without regard to the quality of those schools.

Grade:  F

Home school regulation burden.  According to ALEC:

The policy grades in this category correspond to the Home School Legal Defense Association’s analysis of state laws, which categorizes the burdens states place on parents who wish to homeschool, from relatively-innocuous notice requirements, to high-regulatory environments that may make it difficult for parents who choose this form of education for their families. 

This group places each state in one of four categories based on how restrictive or unrestrictive it is regarding home schooling.  The categories are:

  • States requiring no notice to homeschool
  • States with low regulation
  • States with moderate regulation
  • States with high regulation

California is a “low regulation” state and may be fairly unique among states in that homeschooling can be publicly subsidized through charter school enrollment.  All well and good, but because homeschoolers intentionally separate themselves from formal public education, laws regarding homeschooling have no place in an evaluation of a state’s public education system, unless the objective is to weaken that system.

Grade:  Not applicable

Private school choice.  This refers only to the availability of publicly funded vouchers to pay for private school education.  ALEC uses three criteria to evaluate private school choice:  size and scope (the bigger the better), purchasing power (the more the better for voucher recipients), and flexibility and freedom (the less regulation the better).

As with charter schools, ALEC couldn’t care less about the quality of the schools or programs that are supported by public dollars.  Accordingly, there is no need for assurances that taxpayer dollars are spent either appropriately or for schools and programs with at least a modicum of quality.

Grade:  F

Teacher quality.   ALEC punts entirely to the National Council on Teacher Quality (NCTQ) by using its annual ranking of the states.  NCTQ does not define what a high quality teacher is, and it does not consider actual teacher quality in its rankings.  Instead, it ranks states on the basis of how closely their policies align with what NCTQ believes (rarely correctly) leads to high quality teachers.  These policies include differential pay, merit pay, and outcome-based accountability for teacher preparation programs, among other factors.

NCTQ has also been taken to task by Diane Ravitch and others for its slipshod methodology for evaluating teacher preparation programs.  Noting that NCTQ does not even visit the programs it evaluates and instead relies on course catalogues and Google searches, Linda Darling-Hammond says it’s like a restaurant reviewer reviewing a restaurant on the basis of its menu without ever tasting its food.  In reviewing NCTQ’s 2018 teacher preparation report, the National Education Policy Center wrote, “…the report has multiple logical, conceptual, and methodological flaws. Its rationale includes widely critiqued assumptions about the nature of teaching, learning, and teacher credentials. Its methodology, which employs a highly questionable documents-only evaluation system, is a maze of inconsistencies, ambiguities, and contradictions.”  You can read NEPC’s full report here.

Grade:  F

Digital learning.  ALEC uses the rankings of the “Digital Learning Report Card” produced by the Foundation for Excellence in Education (Jeb Bush is the CEO). However, the report card has been removed from the foundation’s website, so its criteria and methodology cannot be reviewed.

Grade:  Incomplete

There you have it.  As a measure of actual school quality across the states, ALEC’s Report Card is an utter failure.  However, it is a good measure of how closely each state’s education policies align with ALEC’s misguided notion of good policy.  In that sense, a low grade is to be coveted.  California scores a D-.

California’s Local Control Funding Formula Fails Moderate-Need Districts

The Public Policy Institute of California (PPIC) has just issued a report on how (or if) the Local Control Funding Formula (LCFF) has changed school funding in California (“School Resources and the Local Control Funding Formula”).  The report asks, “Is increased spending reaching high-need students?”  The answer is:  somewhat.

The report finds that spending per student increased by an average of $500 in high-need districts as compared to low-need districts, but that, on average, funding for the typical high-need student increased by only $350 relative to non-high-need students.  (A high-need district is defined as one with more than 55% high-need—or “unduplicated count”—students.  A low-need district has fewer than 30% high-need students.  And a moderate-need district has 30 to 55% high-need students.  I know, 30% and 55% fall into two different categories, but this is probably an inconsequential oversight.)

The disparity between the $500 increase for high-need districts and the $350 increase for high-need students is explained by two factors. One, the LCFF provides less additional funding for high-need students in low- and moderate-need districts (due to the concentration factor that is provided only to high-need districts); and two, districts may not necessarily spend all of their incremental LCFF dollars on the students who generate them.

This second explanation is troubling because, if true, it would mean that districts are not complying with the spirit of the LCFF. It is also troubling because school-level spending data are not available, so we don’t really know.  As the author acknowledges, “Because it is drawn from district-level spending data, the average difference in spending between high- and low-need schools, and high- and low-need students is based on the spending levels in the district in which that school or student is located [italics in the original].”  Furthermore, the analysis “is predicated on the assumption that districts spend a roughly equal amount on each student.” Accordingly, I believe the conclusion about per-student spending needs to be tempered with a great deal of caution.
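
The dilution at work here can be illustrated with a toy calculation (all figures below are hypothetical, invented only to mirror the report’s definitions): if each district spends equally on all of its own students, the measured gap between high- and low-need students comes out smaller than the gap between high- and low-need districts, because some high-need students attend lower-funded districts and vice versa.

```python
# Hypothetical two-district example: the high-need district spends $500 more
# per pupil than the low-need district, but within each district every
# student gets the same amount (the report's stated assumption).
districts = {
    "high_need_district": {"per_pupil": 10_500, "high_need": 700, "other": 300},
    "low_need_district":  {"per_pupil": 10_000, "high_need": 200, "other": 800},
}

def avg_spending(group):
    """District-level per-pupil spending, averaged over students in `group`."""
    dollars = sum(d["per_pupil"] * d[group] for d in districts.values())
    students = sum(d[group] for d in districts.values())
    return dollars / students

gap_districts = (districts["high_need_district"]["per_pupil"]
                 - districts["low_need_district"]["per_pupil"])
gap_students = avg_spending("high_need") - avg_spending("other")

print(f"district-level gap: ${gap_districts:.0f}")  # $500
print(f"student-level gap:  ${gap_students:.0f}")   # $253 -- smaller
```

The point of the sketch is only directional: with district-level data and an equal-spending assumption, a $500 district gap necessarily shrinks when expressed per high-need student, much as the report’s $500 shrinks to $350.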

My own takeaway from this report is that the impact of the LCFF is a mixed bag.  The primary purpose of the new formula is to improve funding equity by allocating more dollars to districts with high-need students.  However, the report shows that high-need districts were already receiving a higher level of funding than other districts under the old revenue limit system—a trend that may have continued, or even increased, depending on funding for categorical programs like Economic Impact Aid.

Specifically, the report shows that, in 2003, high-need districts received 9% more revenue than low-need districts and 11% more than moderate-need districts, while in 2017 they received 12% more than low-need districts and 16% more than moderate-need districts.  So, high-need districts gained relative to both of the other types of districts.

Meanwhile, as compared to low-need districts, moderate-need districts received 2% less in 2003 and nearly 4% less in 2017. In other words, high-need districts gained relative to low- and moderate-need districts, while moderate-need districts lost relative to both high-need and low-need districts.  In fact, between 2003 and 2017, funding for moderate-need districts (12%) grew at a lower rate than funding for both high-need districts (17%) and low-need districts (14%). This finding is counter-intuitive, given that low-need districts receive only the base grant while moderate-need districts receive the base grant plus supplemental grants. It may have something to do with the fact that the LCFF was not fully funded in 2017, as well as the inclusion in the formula of the economic recovery target, which ensures that nearly all districts will have their funding restored to pre-recession levels after accounting for inflation. Still, it’s not what was expected.

(Note:  These calculations are done based on a line graph in the report.  Neither the report nor the technical appendix provides data points for the graph, so the percentages may not be exact.  However, the direction of the changes is accurate.)
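
As a sanity check on my graph-reading, the stated gaps and growth rates are at least mutually consistent.  Indexing 2003 low-need funding at 100 (an arbitrary unit of my own choosing, not a figure from the report), the 2003 gaps and the 2003–2017 growth rates reproduce the 2017 gaps of roughly +12%, +16%, and −4%:

```python
# Index 2003 low-need funding at 100 (arbitrary units); everything else
# follows from the percentages read off the report's line graph.
low_2003 = 100.0
high_2003 = low_2003 * 1.09  # high-need: 9% more than low-need in 2003
mod_2003 = low_2003 * 0.98   # moderate-need: 2% less than low-need in 2003

# Apply the 2003-2017 growth rates read from the graph.
high_2017 = high_2003 * 1.17
low_2017 = low_2003 * 1.14
mod_2017 = mod_2003 * 1.12

print(f"high vs mod, 2003: {high_2003 / mod_2003 - 1:+.1%}")  # about +11%
print(f"high vs low, 2017: {high_2017 / low_2017 - 1:+.1%}")  # about +12%
print(f"high vs mod, 2017: {high_2017 / mod_2017 - 1:+.1%}")  # about +16%
print(f"mod  vs low, 2017: {mod_2017 / low_2017 - 1:+.1%}")   # about -4%
```

So even if the individual percentages are off by a point from imprecise graph-reading, they hang together as a set, and the direction of the changes holds.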

For moderate-need districts, the impact of the LCFF has been the opposite of its intent, at least so far.  This is a serious issue that must be addressed. This concern is not mentioned in the PPIC report, which instead focuses on the school- and student-level distribution of LCFF dollars.  But fixing this problem would be a big step toward achieving the improved school- and student-level equity that the PPIC favors.

One solution would be to reduce or even eliminate funding for the concentration factor and use it to increase funding for the supplemental grants.  I know this would be politically difficult (to put it mildly), but so was the shift to the LCFF.  I remember having reams of computer printouts on my desk comparing the district-by-district impact of different versions of the LCFF with each other and the revenue limit system.  Any potential changes to the current LCFF formula will necessarily involve a similar exercise in balancing good policy with political realities.

Pulling the Curtain Back on CCSA

Leaked documents received by the blogger Michael Kohlhaas (a fictitious name, check it out) provide an interesting insight into the long-term goals of the California Charter School Association (CCSA).  At its executive summit last October, CCSA considered several changes to its “Strategic Snapshot,” including a proposed change to its vision statement.  Documents prepared for that meeting show that the current vision is “Increasing student learning by growing [sic: “growing” is an adjective, not a verb] the number of families choosing high quality charter schools so that no child is denied the right to a great public education.”  The focus of this statement is exclusively on increasing the number of charter schools.

The proposed new vision is “Providing absolutely every young person a great public education by growing the number of families choosing high quality charter public schools and encouraging all public schools to become more charter-like” [emphasis added].  

The CCSA documents acknowledge that, up to now, it has been “silent on what is supposed to happen to the rest of public education [and] we have not been able to agree about what we want the public education system to evolve into.” But now that they’re getting “blowback” from people “who assume that the charter school movement is one big replacement strategy” they need a vision statement to allay those concerns.  In other words, the purpose of the change is to quell concerns within the traditional school community that the long-term goal of organizations like CCSA is some sort of hostile takeover of all public schools. Quoting again from the document, “By showing that we believe that all public schools can evolve to become more charter-like, we are signaling that we believe the end state we are moving to is one where, yes, all public schools will be charter schools or charter-like schools, but that we believe a part of the equation is that many existing public schools will be able to remake themselves so that they may play an important role in the future of public education.”  

The phrase, “we believe that all public schools can evolve to become more charter-like” is both condescending and arrogant.  It assumes a superiority for charter schools that does not stand up under close scrutiny and ignores the troubling level of corruption that distinguishes charter schools from traditional schools.  In addition, nowhere does CCSA explain what it means to be “charter-like.” This is a curious omission given the centrality of this concept to its new vision statement.  Let me take a stab at it.

At the classroom level, there is little, if anything, to distinguish between a charter school and a traditional school.  (Of course, I’m excluding virtual schools and charters that are organized as home schools).  Both address the same content standards with similar curricula and instructional materials. Both employ certificated teachers. And both are subject to the same state testing requirements.  

But apart from the classroom there are some important differences.  For example, a “charter-like school” is a school that:

  • Is only indirectly accountable to a publicly elected governing board for management and is instead governed by a self-appointed group of individuals.  
  • Is beyond the reach of a publicly elected governing board to take any corrective action for academic or financial failures short of the “nuclear option” of closing the school through non-renewal or revocation of the charter.
  • Can expel a student for poor academic performance.
  • Does not have to follow statutory due process procedures when suspending or expelling a student.
  • Allows teachers to teach classes for which they are not credentialed.
  • Is not required to participate in PERS or STRS.
  • Is exempt from the Field Act (earthquake safety requirements).
  • Has (on average) a higher rate of suspensions/expulsions.

There may be other differences that don’t come to mind right now.  But the point is this is a troubling vision for the future of public education.  It will be interesting to see how CCSA responds to this disclosure. 

Public Funds for Private Fun

How to Go to Disneyland on the Public Dime

On July 24, the Los Angeles Times ran an article that detailed how home schools that operate as charter schools use taxpayer dollars to pay for trips to places like Disneyland, Medieval Times, and SeaWorld as well as private horseback riding lessons and other extracurricular activities.  Public funds have even been used to purchase family memberships at the San Diego Zoo.  Some charter schools provide home school families as much as $3,200 per year for these purposes.  You can read the article here, and Diane Ravitch also covered it here.

Some expenses, like ice skating classes or acting classes (!), qualify as physical education, while trips to Disneyland or SeaWorld are considered field trips.  Generally, home school families can use public funds to make purchases from a list of charter school-approved materials and activities.  Vendors typically must have their products and services approved by the charter school to get on the list.  In this sense, the public funds are like scrip that can only be spent at the “company store.” 

At first glance this would seem to be a clear violation of the prohibition against making gifts of public funds that is contained in Article XVI, Section 6 of the California Constitution, which prohibits “the making of any gift, of any public money or thing of value to any individual, municipal or other corporation whatever…”  This prohibition applies to all units of government in California, including school districts.

However, the courts have determined that a “thing of value” may be provided to an individual if the private benefit is incidental to a public purpose.  As far back as 1940, the court, in County of Alameda v. Janssen, cited several prior court cases in stating that, “It is well settled that, in determining whether an appropriation of public funds or property is to be considered a gift, the primary question is whether the funds are to be used for a ‘public’ or a ‘private’ purpose.  If they are for a ‘public purpose,’ they are not a gift within the meaning of section 31 of article IV [now Section 6 of Article XVI].”  Presumably, home charter schools have determined that the private benefit (in the form of personal entertainment) of a ride through Pirates of the Caribbean is incidental to the value the public receives from it.  Sure, that may sound ridiculous, but how else would it be legal?

The same court also ruled that, “The determination of what constitutes a public purpose is primarily a matter for legislative discretion.”  The term “legislative” refers to the action of the legislative body of any government entity (including school district governing boards) and not just the state legislature.  In other words, the courts have granted substantial discretion to legislative bodies (including school district governing boards) in determining whether a private benefit is incidental to a public purpose.

The California Education Code is silent on this issue.  The California Department of Education, however, specifically prohibits the use of public funds for the cost of admission for students or staff to amusement/theme parks or other similar social events, but that prohibition applies only to the California Partnership Academies.

The ruling that “the determination of what constitutes a public purpose is primarily a matter for legislative discretion” has, in every case, been made with reference to a publicly-elected legislative body.  Charter school governing boards, by contrast, are non-elected and self-appointed private bodies with jurisdiction over public funds.  The question of whether this same deference should be granted to charter school governing bodies, therefore, may be ripe for a court challenge.  

But legal action would not be necessary if the Legislature exercised its authority to prohibit—or at least reduce—the inappropriate use of public funds by home charter schools.  The courts have made it clear that the Legislature has the authority to strictly prohibit the use of public funds for specific out-of-school activities, such as admission to amusement parks, etc.  

Alternatively, if the Legislature prefers a case-by-case approach, it could require the governing body of a charter school’s authorizer to approve each such expenditure at a public meeting, as a non-consent action item.  At least this would bring a publicly-elected legislative body back into the decision-making process and provide for public scrutiny.  In any event, the longer these questionable expenditures are allowed to continue, and the more families benefit from this public largesse, the harder it will be politically to stop or curtail the practice. 

Dan Walters Gets It Wrong (Again)

Yesterday CALmatters ran an editorial by Dan Walters called, “California tax revenue is soaring.”  You can read it here.  The Santa Cruz Sentinel published it today, but it seems to be behind their paywall. 

Anyway, the thesis is that tax revenues are soaring, and shame on “Democratic politicians and their allies in public-employee labor unions” for not acknowledging that fact and instead continuing the “drumbeat of impoverishment [that] is clearly aimed at persuading Californians, particularly voters, that vital services can be rescued from imminent collapse only by raising taxes.”  To prove his point (at least with respect to “soaring” revenues), Mr. Walters notes that 2018-19 state revenues were “a whopping 71.5% more than the state was collecting a decade ago.”  

True enough, but it seems there are a few things that Mr. Walters prefers not to acknowledge himself. Like, for example, the fact that a decade ago it was 2008-09, when California and the nation were in the depths of the Great Recession and state revenues took a dramatic dive.  In fact, revenues plummeted more than $15 billion from the prior year.  So, much of the “whopping” increase that he cites was used to restore whopping cuts that the state was forced to make during those years.  

In fact, if we use 2007-08 instead of 2008-09 as the base year, then the increase in state revenue is only 37%, even though this is an 11-year instead of a 10-year time span. This is a little more than 3% per year, and far less than the 71.5% that Mr. Walters uses.  It’s worth noting that, during that same time, California personal income increased 56%, so it can hardly be said that rising state revenues (which come mostly from the personal income tax) are taking a bigger percentage bite out of our pocketbooks.
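
The arithmetic behind that comparison is simple enough to sketch (the percentage totals come from the column and this post, not from any revenue data of my own).  A 37% rise over 11 years is about 3.4% per year as a simple average, or about 2.9% per year compounded, while the 71.5%-over-ten-years figure Mr. Walters uses works out to roughly 5.5% per year compounded:

```python
def annualized(total_growth, years):
    """Compound annual growth rate implied by `total_growth` over `years`."""
    return (1 + total_growth) ** (1 / years) - 1

# Walters' framing: 2008-09 (the recession trough) as base year, 10 years.
walters = annualized(0.715, 10)
# Pre-recession framing: 2007-08 as base year, 11 years, +37% total.
prerecession = annualized(0.37, 11)

print(f"from 2008-09 base: {walters:.1%} per year")       # 5.5%
print(f"from 2007-08 base: {prerecession:.1%} per year")  # 2.9%
print(f"simple average:    {0.37 / 11:.1%} per year")     # 3.4%
```

In other words, the choice of base year alone nearly doubles the apparent annual growth rate, which is the whole trick.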

Of course, Mr. Walters lays much of the blame for this greed on K-12 schools and their “skyrocketing” pension costs.  He states (wrongly, of course) that schools have “benefited from ever-rising property-tax revenues.”  Anybody who has taken California School Finance 101 knows that only a very small number of “basic aid” schools benefit from increased property tax revenue.  For all other schools an increase in local revenue is offset by a decrease in funding from the state.  But it’s a good line to use if you want people to believe that schools are drowning in money.

Just How Many More STEM Graduates Do We Need?

Science, technology, engineering, and mathematics (STEM) education is one of the few areas in education policy that has drawn bipartisan interest and support.  In the California State Legislature, 33 STEM-related bills were introduced during the 2017-18 session.  So far in the 2019-20 session, only 10 STEM bills have been introduced, but there’s still another year to go.

This interest arises from concerns that there will be a shortage of STEM workers in California and the U. S. in the near future and that the U. S. is in danger of ceding STEM supremacy to countries like India and (especially) China.  These fears are typified by a recent Forbes editorial written by Arthur Herman, a Senior Fellow at the conservative Hudson Institute, called “America’s High Tech STEM Crisis.”  Herman writes that, “leading trends in our higher education suggest that the U.S. is fast approaching a STEM crisis like no other—one that systematically benefits foreign countries and companies, at the expense of our own.” The main countries we are chasing in this race to the high-tech top, according to Herman, are China and India.  He points to data showing that China had at least 4.7 million recent STEM grads as of 2016 and India had 2.6 million as of 2017, while the U.S. had only 568,000.  He does not give a date for the U.S. number.

To address this problem, Herman suggests taking a page from the Sputnik-era playbook.  He writes: 

We are fast approaching another Sputnik moment, we can’t afford to ignore. Our national security, as well as economic security, depending [sic] on addressing it. We need major high-tech companies like Google and Microsoft; leading universities and colleges; the White House, the Department of Education and the Department of Defense; to come together to craft a high-tech STEM education strategy that can lead us forward to the future.

California has been responsive to this clarion call.  Countless STEM magnet and charter schools have been established, as well as STEM programs within comprehensive high schools.  Numbers are hard to come by, but I would venture to say that a substantial majority of California’s high school students have access to a STEM curriculum.  

But is that enough?  What about the dire estimates of a shortage of STEM workers?  It’s true that many STEM occupations are among the fastest growing in percentage terms. According to California’s Employment Development Department, the need for software developers, for example, is expected to grow by 40.1% between 2016 and 2026.  This is the third fastest growing occupational area and is significantly greater than the statewide average growth rate of 10.7%.  

But when we look at the projected number of new jobs instead of percentage growth, a different picture emerges. EDD projects that California will need an additional 53,800 software developers by 2026, which accounts for only 2.8% of all new job openings.  In fact, the top 11 occupations that will see the most job openings are non-STEM, and they account for 94% of all new job openings.  (EDD projects 1,933,100 new openings during this 10-year period; this is net of 2,087,900 new openings less a reduction of 53,800 in shrinking occupational fields.) Computer and mathematical occupations, the top STEM field in terms of the number of projected openings (116,200), represent 5.5% of all new job openings.  The top 10 STEM field openings, mostly computer-related occupations like computer systems analysts and software developers, account for 16% of all projected job openings. 

Okay, maybe the number of future job openings is not as big as popularly believed, but what about the current need to fill the existing vacancies that we hear employers complain about?  Surely filling those currently vacant positions will help prevent an oversupply of STEM graduates, right?  Well, according to the Pew Research Center, nearly half (48%) of all STEM graduates work in non-STEM fields.  This is despite the fact that, on average, STEM graduates working in STEM fields have higher earnings than STEM graduates working in non-STEM fields.  Either a significant number of STEM graduates choose to take a lower-paying job in a field other than that for which they prepared, or there are not enough STEM jobs to employ all STEM graduates.

A more nuanced analysis from the U. S. Bureau of Labor Statistics suggests that the STEM labor market is more heterogeneous than we typically think, and that there are both shortages and oversupplies of STEM graduates depending on level of the degree, specific STEM field, and geographic area.  This report, which you can read here, is worth reading just for its taxicab queuing metaphor.

Labor market concerns aside, some—like Mr. Herman—argue that there is a widening “STEM-gap” between the U. S. and our major competitors, like China, who are producing engineers and other STEM graduates at a much faster rate than the U. S.  This, the argument goes, is a threat to our national security as well as our global competitiveness. 

With respect to Bachelor’s-level engineering degrees, Duke University’s Pratt School of Engineering (no relation, alas) states here that the U.S. graduates about 70,000 engineers annually, compared to 600,000 for China and 350,000 for India.  (The U.S. produces more Ph.D. engineers than China, according to the National Science Board.)  This would seem to substantiate Herman’s point.  However, Duke’s analysis of U.S. employment data indicates there is no shortage of engineers, while anecdotal evidence from companies doing business in India and China reports shortages in those countries.

So, which is it?  Are we losing the numbers game or not?  Well, it turns out that China and India use different definitions of “engineer,” so comparing U. S. engineering degrees with Chinese and Indian engineering degrees is not an apples-to-apples comparison.  In China, each province reports the number of engineering graduates to a central agency, but definitions are not consistent across provinces, and they often fall short of what we think of as engineers.  For example, “engineer” can refer to a motor mechanic or technician, or to someone with a 2- or 3-year degree or certificate equivalent to an AA degree in the U.S.  These are many of the “engineers” in Mr. Herman’s numbers.

In addition to ignoring definitional differences, Herman also fails to account for qualitative differences among the degrees offered by the three countries.  And apparently the differences are substantial.  According to Zhang Duanhong, the director of the Education Policy Research Center at Tongji University, “China’s undergraduate programs are notorious for low standards and easy classes — and once you’re in, you’re practically guaranteed a degree.”  This assessment is validated by a report in the Proceedings of the National Academy of Sciences, which states:  

undergraduate students at the end of their CS [computer science] programs in the United States have much higher levels of CS skills than their counterparts in three major economic and political powers: China, India, and Russia. Seniors from the average CS program in the United States score far ahead of CS seniors from the average program and are on par with seniors from elite programs from these three countries. Furthermore, seniors from the top quintile of CS programs in the United States are far ahead of seniors from elite CS programs in the other countries. Notably, the advantage of the United States is not because its CS programs have a large number of highly skilled international students.

So maybe we can take a deep breath, sit back, and make a realistic assessment of what our STEM needs really are. I’m actually a supporter of STEM (and especially STEAM—the “A” stands for arts) programs.  And I’m especially supportive of programs that expand STEM opportunities to females and students of color.  But I’m also wary of promising every student with a STEM degree a high-paying STEM job.  The price of labor is subject to the same law of supply and demand as the price of any other service or commodity.  An oversupply will push wages down.  That would be a disservice to our students.  In addition, an overemphasis on STEM could displace educational opportunities that prepare students for perfectly rewarding careers in non-STEM fields. That would be a loss.