March 12, 2018
Ohio’s Department of Education continues to struggle with poor data reporting, as a recent state audit reveals. To counter this problem, we may need to rethink and standardize reporting protocols at the district level.
State Auditor Dave Yost recently slammed the Ohio Department of Education as “a mess,” dubbing it Ohio’s worst-run state agency.
Caught in Yost’s crosshairs is the department’s Education Management Information System, or EMIS, which he characterizes as severely outdated. Ohio House Education Committee Chair Representative Andrew Brenner confirmed that school officials often have trouble operating EMIS, which can lead to incorrect data inputs and hurt schools’ ability to get all the funding they deserve from the state.
Audits like Yost’s have revealed that the ODE has regularly reported inaccurate information about Ohio districts — specifically by exaggerating the number of students in attendance.
To name one example, auditors conducted physical headcounts at 54 schools in 2015. At the charter schools visited (81% of the total), the number of students actually present averaged 13.7% fewer than reported enrollment, while the public schools surveyed showed a 9% shortfall. More than a clerical error is at stake here: superintendents and treasurers can face serious professional consequences for over-reported attendance.
On the flip side, a district that accidentally underreports enrollment risks losing millions in badly needed government funding. And even when the numbers are accurate, absenteeism remains a major concern. At dropout recovery charter schools (designed specifically for students at risk of dropping out), auditors recorded average attendance of just 34.1%. This is an enormous problem for districts, because chronic absenteeism severely damages a school’s funding opportunities.
Another issue with the ODE’s current reporting processes is that it is nearly impossible to assign accountability for individual errors. Responsibility for tracking different record sets is spread across several overlapping hierarchies and levels of authority, which makes it difficult for anyone to know exactly where the buck is supposed to stop.
Given this bureaucratic confusion, should we blame simple human error, government red tape, or some combination of the two? Yost seems to have decided on the third option, as he points out the system’s shortcomings while also blaming the ODE for slow response times, scheduling issues, and a far too broad management spectrum. He cites inefficient design, claiming: “[ODE] has many, many missions, many identities, and they conflict with each other.”
As a solution, he proposes that the department split up and reallocate tasks to other agencies or state departments in order to reduce its overwhelming workload and shed its (inefficient) oversight of funding and district assessment. He suggests the Office of Budget and Management may be better suited to oversee funding distribution, and that the Department of Higher Education could reasonably manage the entire EMIS reporting system. This simpler, clearer management structure might enable the state to focus on pressing issues like inaccurate EMIS reporting.
In our view, the problem with EMIS is not just the system, but an incomplete understanding of how it works by the people who use it the most. In any given district, there may be dozens or even hundreds of stakeholders who share reporting responsibilities. And because each district utilizes different management structures and protocols, these reporting tasks notably lack standardization. Without standardization, problems surrounding accountability and stakeholder transparency are often made worse.
While Yost is right to point out EMIS’s deep-rooted issues, we can’t wait for a huge bureaucratic shakeup to sort out a system that Ohio’s K-12 students depend on for their education. Districts need a tool that will help create a streamlined management system for reporting — Vinson’s CheckPoint EMIS Platform is designed to do just that.
CheckPoint monitors EMIS reporting through a clean, three-step process. First, CheckPoint automates data processing. Then, it runs a thorough data validation process, which ensures that the numbers match up with a district’s reality. Finally, it records a comprehensive audit trail that denotes which records have been verified and by whom, which guarantees an unprecedented level of accountability.
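To make the three-step idea concrete, here is a minimal sketch of what a validate-and-audit workflow can look like in Python. This is purely illustrative — the record fields, the 2% tolerance, and the reviewer identity are our own hypothetical assumptions, not CheckPoint’s actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical enrollment record: a district ID plus the count the
# district reported and the count verified on the ground.
@dataclass
class EnrollmentRecord:
    district_id: str
    reported_count: int
    verified_count: int

# One audit-trail entry: which record was checked, by whom, the
# result, and when — the accountability piece of step three.
@dataclass
class AuditEntry:
    district_id: str
    verified_by: str
    passed: bool
    timestamp: str

def validate(record: EnrollmentRecord, tolerance: float = 0.02) -> bool:
    """Flag records whose reported count deviates from the verified
    headcount by more than the allowed tolerance (2% here, an
    assumed threshold)."""
    if record.verified_count == 0:
        return record.reported_count == 0
    deviation = abs(record.reported_count - record.verified_count) / record.verified_count
    return deviation <= tolerance

def process_reports(records, reviewer: str):
    """Run validation over every record and build an audit trail
    noting which records were checked, by whom, and the result."""
    trail = []
    for rec in records:
        trail.append(AuditEntry(
            district_id=rec.district_id,
            verified_by=reviewer,
            passed=validate(rec),
            timestamp=datetime.now(timezone.utc).isoformat(),
        ))
    return trail

# Example: one accurate report and one over-reported enrollment figure.
records = [
    EnrollmentRecord("district-001", reported_count=1000, verified_count=1005),
    EnrollmentRecord("district-002", reported_count=1200, verified_count=1035),
]
trail = process_reports(records, reviewer="emis.coordinator@example.org")
for entry in trail:
    print(entry.district_id, "OK" if entry.passed else "FLAGGED")
```

The point of the sketch is the shape of the workflow, not the numbers: validation compares reported figures against verified reality, and every check leaves a signed, timestamped trail entry so accountability for each record is unambiguous.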
Regardless of what happens to the ODE and EMIS in the coming years, our CheckPoint EMIS Platform is the key to ensuring the numbers are correct in the here and now.