Monday, October 16, 2023

Auditor’s Response to the Department of Public Instruction’s Response to OSA’s Student Attendance and Truancy Analysis, 2020-2021 School Year Performance Audit

The Office of the State Auditor (OSA) is issuing this press release to clarify several misstatements and factual inaccuracies made by the Department of Public Instruction (DPI) in its response to OSA’s Student Attendance and Truancy Analysis, 2020-2021 School Year performance audit.
Raleigh

DPI chose to issue its response publicly after the report was released, rather than within the report itself, where its misstatements and misleading information could have been answered with facts.

DPI’s Superintendent stated that the delay in getting data to OSA was due to OSA not understanding how attendance was calculated.

This statement is incorrect.   

The initial delays had nothing to do with understanding how attendance was calculated.

To perform its analysis, OSA requested student attendance data from each school district, including all students’ total absences and types of absences. This request did not change throughout the audit. However, the following problems were identified in several of the datasets given to OSA:

  • Data was missing for individual students and, in some cases, for entire schools.
  • Datasets contained duplicate information.
  • Datasets showed students in two different grades on the same day.
  • Datasets did not reconcile to reports produced by the DPI system that actually houses the data and generates those reports for DPI management.

Later, in an attempt to reconcile the provided data to other DPI sources, it became necessary to gain an understanding of how attendance was determined and calculated. Throughout that process, however, it became clear that it was DPI staff who did not understand. By the end of the audit, DPI management admitted that DPI was 100% dependent on the vendor of its Student Information System (SIS), the system that stores and manages student data, including attendance data.

DPI staff attempted to explain to OSA the methodology used to calculate attendance. However, when OSA tried to reconcile Henderson County Public Schools’ attendance data to DPI reports, OSA discovered that what DPI staff had explained was not how attendance was actually calculated.

OSA Data Analytics staff studied DPI’s programming and determined that the methodology DPI explained was not how attendance was actually calculated in DPI’s SIS. Despite DPI’s claims that OSA lacked understanding of student attendance data and how attendance was calculated, OSA was able to recreate Henderson County Public Schools’ attendance data to match the Principal’s Monthly Report with 95% accuracy. OSA overcame DPI’s misinformation, and DPI’s own lack of understanding, with assistance from Henderson County Public Schools and through OSA’s own research and knowledge.

Additionally, DPI stated that the data OSA used to perform the analysis contained data on Pre-Kindergarten students, which skewed the results. First, the analysis could be completed only for Henderson County Public Schools because of the previously mentioned data quality issues. Second, if anyone at DPI had tried to confirm the statement in its published response, they would quickly have realized two things about the Henderson County Public Schools analysis:

  • DPI incorrectly included 135 Pre-Kindergarten students in the data provided to OSA. However, OSA identified these Pre-Kindergarten students during its data quality checks and removed them from the analysis.
  • Even if the Pre-Kindergarten students had been included in the analysis, they would not have skewed the results: they made up only 135 of the 13,044 (1.03%) total students used in the Henderson County Public Schools analysis.

Also, the pushback from DPI and the six school districts that “school year 2021 was during COVID” is misleading and attempts to distract the reader. That fact is precisely the point of the analysis. Given the crisis brought about by the COVID-19 pandemic, and its profound effect on North Carolina schools, students, families, and teachers, the analysis was intended to determine whether the state’s Truancy Law was enforced (it was never waived) and to analyze attendance.

Given that there was no enforcement of the state’s Truancy Law, student attendance decreased significantly, leading to “chronic absenteeism.” This raises the question: how many chronically absent students were promoted to the next grade level, or graduated, without being ready? Promoting students who are not prepared for the next grade level puts additional stress and strain on schools, teachers, administrative staff, and financial resources.

Lastly, DPI stated that OSA’s report provides nothing that schools can operationalize to get even a single student back in the classroom and that the report’s recommendations are without merit.

This is not true. The audit report made 10 unique, actionable recommendations that directly address the issues and opportunities identified in the audit in order to reduce chronic absenteeism. Many of the recommendations are under the direct control of school district management and staff, including working with the student and the student’s family to analyze the causes of the absences and determine the steps necessary to get the student back in school.

Further, if the recommendations were without merit, why would Charlotte-Mecklenburg Schools (the state’s second-largest school district) have stated that, as a result of this audit, it has implemented new policies and processes to ensure that attendance and truancy laws are followed and that all data is complete and accurate?