
Testimony of Dr. Rachel Goldberg

New Jersey State Board of Education 

August 7, 2024

 

Public Testimony

 

My name is Dr. Rachel Goldberg. My testimony today supports the changes proposed by the NJDOE. I urge you to approve restructuring the I&P calculation to create a more equitable and valid measure of district accountability.

 

I would like to thank Acting Commissioner Kevin Dehmer, Deputy Commissioner Dr. Jordan Schiff, and the team of dedicated NJDOE members. If you, as a policymaking body, are willing to develop a thoughtful and responsive leadership model in the NJDOE, this decision is a critical start.  

 

I come before you as a taxpayer, a parent, and an educational leader.

 

First, as a taxpayer, accountability for our tax dollars is critical. All schools receiving public funds, including charter and private schools, should be held to the same high standards. 

 

Second, as a parent of three children, the QSAC score is not a metric I use to measure the quality of my children’s school.  Between the NJDOE School Performance Reports, Niche, USNWR Rankings, and other metrics, I have a much more robust set of data available than QSAC offers. 

 

Finally, as Superintendent of Springfield Public Schools, I agree with those who spoke before me that QSAC is an inequitable and unreliable measure of a district's quality.

 

I compiled the QSAC data from the last 18 months of State Board of Education agendas and found several critical data points that should be considered:

 

First, using only data gathered from Cohort 1 districts, since that is a mostly complete data set whose I&P ratings include test data from before the pandemic:

 

  1. QSAC I&P averages correlate directly with the economics of the area.

    1. Districts known as former Abbott or A districts have an average I&P score of 72, while the wealthiest districts, in the J category, have an average score of 87.

    2. This is further seen in the score averages across counties: the top seven county averages include six of the eight counties with the highest per capita income.

  2. District grade configurations impact QSAC scores.

    1. Districts serving only high school students appear more likely to receive a passing score than those serving grades eight and below.

 

Second, I also compared the last two years of QSAC data from Cohorts 1 and 2, taken from the Board agendas. 

 

Of the 274 QSAC Ratings compiled:

  • 136 did not meet the 80% mark in I&P

  • For each of the other indicators, between 6 and 16 districts failed to meet the 80% threshold

  • Of the 50 districts with an interim report, 35 still failed I&P

  • Of those 35, one also failed Governance, and one also failed Personnel

 

These findings support the argument that the current I&P calculation is inherently flawed and inequitable and, therefore, may be an invalid model for accountability.  It is time to make constructive changes to QSAC so that the results more accurately reflect effort and student growth, reinforcing the state’s often-stated commitment to equity and excellence.  Please collaborate with Department leadership to foster a true growth mindset for our educators and children.

 

Written Testimony

 

The perspectives I am sharing as a practitioner are informed by my experience as a taxpayer, parent of three, teacher, and district administrator.

 

First, as a taxpayer, I believe accountability for our tax dollars is critical, and all schools receiving public funds, including charter and private schools, should be held to the same high standards. The ratings of all systems should be available to the public in a searchable database, similar to the ESSA and school performance report data.

 

Second, as a parent of three children, the QSAC score is not a metric I value when measuring the quality of my children’s schools.  Between the NJDOE School Performance Reports, Niche, U.S. News & World Report rankings, and other metrics, I have a much more robust set of data available than QSAC offers.  The school performance reports, aligned to the ESSA requirements, provide a clear picture of student achievement, and they do so every year, making it possible to look at a school and district over time.  QSAC, by contrast, offers only a snapshot of a single school year and is subject to variations in cut scores and testing instruments (e.g., NJGPA vs. NJSLA vs. PARCC).

 

I am entering my fifth year as Superintendent of Schools in Springfield, a small, diverse, suburban district in Union County.  I am proud that we achieved the designation of a “High Performing” district by receiving a passing score in all areas. Before serving in Springfield, I was an Assistant Superintendent of Curriculum and Instruction in Passaic Public Schools, a large urban district.  While I was there, Passaic met the other indicators but was never able to reach “high performing” in I&P; that in no way reflected on the incredible work of the educators, but was instead a precise measurement of the community’s socio-economic status.  That continues today, with Passaic receiving passing scores in all areas except I&P.

 

As a superintendent, my experience is that the QSAC calculation is inequitable and does not provide a reliable measurement of a district’s quality, which is most evident in the Instruction and Program component.  In addition, such an inequitable calculation results in an inefficient use of state and district personnel.  Finally, allowing the current calculations to persist encourages a deficit mindset over valuing growth in learning. 

 

Data Review

 

To go beyond my personal experience, I looked for a source to review QSAC data.  I was disappointed that I could not find any such data source.  I found that the ratings for districts were reported in the State Board of Education agendas, so I compiled ratings from October 2023 through last month to create a data set to study. 

 

Specifically, I first worked to compile a complete data set for the Cohort 1 districts that went through QSAC during the 2022-2023 school year.  I then gathered data from the other districts, distinguishing between the cohorts and between full and interim ratings. Currently, the spreadsheet contains 274 scores, including 10 districts with two sets of scores.
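As a rough illustration of this compilation, the sketch below shows how such a spreadsheet could be aggregated; the file name and column names are hypothetical stand-ins, not the actual workbook behind the tables that follow.

```python
# Hypothetical sketch: aggregating QSAC ratings transcribed from State Board
# agendas. "qsac_ratings.csv" and its column names are assumptions.
import pandas as pd

ratings = pd.read_csv("qsac_ratings.csv")
# Assumed columns: district, cohort, rating_type ("full"/"interim"),
# dfg, grade_span, ip_score

# Cohort 1 full ratings, mirroring the first analysis in this testimony.
cohort1 = ratings[(ratings["cohort"] == 1) & (ratings["rating_type"] == "full")]

# Table 1: average I&P score by District Factor Group.
print(cohort1.groupby("dfg")["ip_score"].mean().round())

# Table 2: average I&P score by grade span.
print(cohort1.groupby("grade_span")["ip_score"].mean().round())

# How many of all compiled ratings fall below the 80% I&P threshold.
below = (ratings["ip_score"] < 80).sum()
print(f"{below} of {len(ratings)} ratings below 80% in I&P")
```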

 

The Cohort 1 data provides a robust sample and is particularly valuable for evaluating the quality of the I&P calculation because the assessment data used was collected before the pandemic and therefore does not reflect any inequities the pandemic exacerbated.

 

Table 1 provides clear evidence that the current I&P score is not a measure of student achievement but rather a measure of community economics.  As the table shows, average I&P scores increase as the district factor group (DFG) moves from the former Abbott (A) districts up through the J districts, the areas with the highest household income and family education levels.

 

Below, please find tables related to the Cohort 1 data analysis.

 

Table 1: Average Cohort 1 Scores by DFG Group

 

DFG     Average I&P Score
A       72
B       73
CD      78
DE      76
FG      80
GH      83
I       84
J       87


 

Table 2 demonstrates how district grade spans impact the Cohort 1 average I&P scores.   As shown in Image 1, the science assessment indicator is worth 10 points in a K-8 district but only 5 points in a K-12 district, and 9-12 districts have no points allocated to the science assessment at all.  It is unsurprising that the districts with the lowest averages are K-8. This leads us to question whether comparing the quality of education in a K-12 district with that of a K-8 district is appropriate, when the different measures, and the ways those measures are weighted, produce such variation among the averages.

 

Table 2:  Average Cohort 1 Scores by Grade Span

 

Grades      Average I&P Score
7 to 12     84
HS          81
K-12        79
K-8         75
PK-5        80
PK-6        81
 

Image 1: DPR Snapshot, I&P Calculation, Science Indicator

 

At this time, indicator-level I&P data for districts is not available, but I hypothesize that the differing values of the indicators significantly impact these averages.  While many iterations of the data calculation are possible, we must answer critical questions about why we value this data, what a “failing” designation truly means, and whether such a designation is an accurate picture of the district or simply the result of a specific point allocation.
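To make that hypothesis concrete, here is a small illustration, with assumed numbers, of how the point allocations in Image 1 alone can separate otherwise-identical districts; it treats the I&P total as 100 points and is not an official calculation.

```python
# Illustrative arithmetic only: the 20% science proficiency figure and the
# assumption that every non-science point is earned are hypothetical.
SCIENCE_POINTS = {"K-8": 10, "K-12": 5, "9-12": 0}  # allocations per Image 1

def ip_score(span: str, science_fraction: float) -> float:
    """I&P score out of 100 when all non-science points are earned."""
    science = SCIENCE_POINTS[span]
    return science * science_fraction + (100 - science)

# Suppose each district earns only 20% of its science points, in line with
# the dismal statewide science results discussed later in this testimony.
for span in ("K-8", "K-12", "9-12"):
    print(span, ip_score(span, 0.20))
# K-8 92.0, K-12 96.0, 9-12 100.0: identical performance on everything but
# science, yet the K-8 district lands 8 points below the K-12 district.
```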

 

Inefficient Use of NJDOE and District Resources

 

When discussing accountability measures, it is also essential to look at how this model utilizes the resources of the New Jersey Department of Education staff and County offices. 

 

For the districts that did not meet the 80% threshold, the NJDOE requires that they create a “District Improvement Plan.”  For many districts with high percentages of students receiving free and reduced-price lunch, lagging student achievement is also monitored through the School Performance Reports required by ESSA.  Districts not meeting the student achievement metrics are identified and may be required to develop a “District Improvement Plan” to address those indicators.  This means the NJDOE manages two accountability mechanisms with the same data.  Instead of identifying support opportunities or providing professional support, our tax dollars support an accountability bureaucracy. Many years ago, in my graduate program, I researched the inequitable accountability measures placed on districts serving the students with the highest needs, and we continue to perpetuate that inequity today.

 

One of the most critical resources for the Department of Education and school districts is time. The current requirements mean that district staff spend time creating and presenting these plans instead of working to support instruction.  Likewise, allocating NJDOE personnel to review plans and conduct interim visits leaves them less able to engage in productive discussions regarding teaching and learning in the classrooms.

 

There is no question that there are school districts whose QSAC scores demonstrate management concerns.  However, putting NJDOE resources into districts whose only failure is operating in a high-risk urban environment is a misuse of accountability.  One solution would be to require the oversight and “District Improvement Plan” only for districts that fail one or more of the other four indicators of QSAC.  Districts that only fail to meet the threshold in Instruction and Program due to student achievement should not require more paperwork or oversight, especially if they are already meeting the thresholds of the ESSA accountability matrices. Ultimately, this is not about giving districts “passes” or “lowering the bar,” but instead utilizing the appropriate resources to ensure the goals of QSAC are met, primarily that of meeting expectations for responsible management of tax dollars and state requirements.

 

We must also ask whether the QSAC process identifies districts with critical management issues.  Over the past two years, multiple districts have had deeply concerning findings from auditors.  Unfortunately, I could not find those particular districts' QSAC scores, but we should examine whether QSAC is identifying these districts or merely relying on auditors' findings.

 

Rhetoric of Minimizing Growth and Maximizing Failure

 

As many will point out in their testimony, New Jersey has remained among the top performers in the nation.  However, recent discussions on early literacy also show a pattern of state achievement that is not growing at the same rate as that of other states.

 

The key here is growth.  If we overvalue a single metric and undervalue the process of meeting benchmarks, we value the ends over the means.  This is ultimately at odds with the goals of education and the research on learning and growth mindsets.  In capital markets, shareholders look toward growth as an investment goal.  When we think about stocks like Apple, we often think about what would have happened had we invested ten or twenty years ago; investors have gained returns based on growth over time.  Likewise, we are investing in students, and a single indicator of achievement has much less value than growth over time.

 

Many years ago, research found that students from lower socio-economic levels entered school with significantly smaller vocabularies, giving them a larger expectation for growth.  What happens when we identify districts that may not meet the state achievement averages but exceed the growth indicators?  Currently, we are likely to label them as failures.  This is deeply concerning, as it is only in this sector that we minimize the value of growth.  When we do this at the state level, it sends a clear message to our educators: their work is undervalued.

 

Our educators are incredible, but we are losing them faster than we can manage, and we are compounding the stresses educators already face with a constant message that whatever they are doing is not enough.  We can change how we talk about student learning, and starting at the top, we can share the value of growth.  Let’s be savvy investors in education and look for opportunities to support growth.

 

Valid and Reliable Measurement

 

One of the most significant weaknesses of the Instruction and Program measurement is its reliance on assessments that do not produce usable data on student learning and achievement.  An example is the use of the Science data in the I&P calculation.  The Science assessment is not offered consistently throughout a student’s academic career and provides no growth data.  It has been changed frequently over the past several years, and student achievement on it is dismal throughout the state, even in some of the highest-achieving districts.  Its use as an accountability mechanism is deeply concerning, as it appears to have significant reliability issues and provides little to support stronger instruction in the sciences.

 

Another example arose in the Cohort 1 calculation previously mentioned.  Under the regulations, districts were required either to use three-year-old data sets or to submit a waiver calculation for the I&P process. Our district had shown some growth despite the pandemic, but that growth was not used during QSAC.  When we submitted the waiver request, the NJDOE calculation was incorrect.  Either way, the district would meet the 80% threshold, and we chose not to appeal the waiver calculation, as we did not feel it was an efficient use of our time.  For many Cohort 1 districts, the same data generated two iterations of the I&P scores.  This is an example of wasteful accountability.  New Jersey can do better, and we must be prepared to ensure that we are truly measuring the teaching and learning work in our schools.

 

Additionally, using a single year of data is problematic, especially in smaller districts.  In Springfield, we have approximately 150-175 students per grade level.  A single student can determine whether a subgroup is large enough to be counted.  A single maternity leave can affect the scores of 25 to 100 students, leading to dramatic year-over-year shifts.  We must have a better way of looking at our data meaningfully. Aggregating three-year averages of student achievement or three-year growth trajectories, as sketched below, would make better use of the student achievement data and encourage the appropriate type of discussion around student learning.
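As a sketch of what that aggregation might look like, the snippet below uses hypothetical proficiency figures to show how a three-year rolling average dampens the single-year swings a small cohort can produce.

```python
# Hypothetical proficiency rates (% proficient) for a small district; the
# 2022 dip stands in for an anomaly such as a mid-year maternity leave.
proficiency = {2021: 61.0, 2022: 48.0, 2023: 58.0, 2024: 55.0}

years = sorted(proficiency)
for year in years[2:]:
    window = [proficiency[y] for y in (year - 2, year - 1, year)]
    print(year, round(sum(window) / 3, 1))
# Prints 2023 55.7 and 2024 53.7: a 13-point single-year drop becomes a
# two-point movement in the three-year average.
```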

 

Finally, the timeline for curriculum updates relative to state standards changes is also inherently inequitable.  Consider this school year, as districts work feverishly to meet the implementation timelines for new and updated standards.  The size of a district has a direct relationship to the resources available for this work.  For districts in Cohort 3, this is very stressful, as they have the least time to make the updates and put them into practice, while Cohort 2 districts will have three years.  By emphasizing accountability over quality, we are doing our districts a disservice.

 

Call to Action

 

As the State Board of Education, you have the power to develop and support policies that can move New Jersey education forward.  As you consider the recommendations made by Acting Commissioner Dehmer and the feedback from the public, I will summarize the requests of this testimony:

  • Provide transparent access to QSAC data for research and accountability

  • Revise the calculation of I&P metrics to value growth

    • Eliminate the use of any assessment data that cannot be validated (e.g., the Science Assessment)

    • Ensure that the metrics are equitable across districts serving all grade spans; a K-8 district should not be judged differently than a K-12 district

    • Include language that suspends the calculation of student achievement scores when changes in assessment tools or catastrophic events (e.g., a pandemic) result in invalid or unavailable data sets

    • Eliminate redundant accountability; QSAC and ESSA should not measure the same things, which wastes district and NJDOE knowledge and skill

  • Reduce the number of districts required to develop “District Improvement Plans” by exempting those whose only failure is the student achievement data calculation

  • Require enhanced oversight of districts that fail more than one indicator

  • Require charter schools to undergo the same process for accountability

  • Set a standard for responsive leadership; work with the NJDOE and districts to develop a more thoughtful, effective, and efficient set of accountability metrics 

 

In conclusion, please treat these updates to the QSAC regulations not as an end but rather as the beginning of a more extensive review of the efficacy of the instrument. NJQSAC can and should be improved to provide greater equity and transparency, ensuring a fairer and more accurate assessment of school performance.  Ultimately, we look forward to participating in an improved monitoring system that benefits our students, educators, and communities.