GHOST

A privacy framework for ed-tech that doesn’t collect anything worth stealing.

Read the full spec on Zenodo · CC BY 4.0 · DOI 10.5281/zenodo.19547277

The Problem

In 2024, PowerSchool lost 62 million student records. Social Security numbers. Medical conditions. IEPs. Family income. Most of those records belonged to kids, a lot of them in elementary school.

Every ed-tech breach follows the same pattern. A system collects children’s information. Stores it. Eventually someone steals it.

And every time, the industry response is the same. Better encryption. Better access controls. Better breach detection. Better ways to protect the data after we’ve already collected it.

Here’s a different question. What if we didn’t collect it in the first place?

How it works

Most privacy frameworks tell you how to manage the data you’ve already collected. GHOST tells you how not to collect it.

No names. No emails. No accounts. No PII in the database, the logs, the error tracking, or the analytics.

Here’s what that looks like on AI Safety Adventures.

A teacher creates a classroom. The system generates a code and a handful of numbered slots. Students log in with the code and their player number. The software tracks everything Player 7 does, without ever knowing who Player 7 is.

The teacher knows. The software doesn’t need to.

If someone breaches the database, they get anonymous progress data for Player 7. There’s nothing to sell. Nothing to dox anyone with. The breach is survivable because there’s nothing of identity value to take.
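To make that concrete, here is a minimal TypeScript sketch of a join flow along these lines. The function names and the in-memory store are illustrative assumptions, not the AI Safety Adventures implementation; the point is what a student record contains, and what it never does.

```typescript
// Hypothetical sketch: classroom codes and numbered player slots, no PII anywhere.

import { randomBytes } from "crypto";

interface Classroom {
  code: string;   // short join code the teacher shares with the class
  slots: number;  // numbered player slots, e.g. 1..30
}

interface ProgressRecord {
  classroomCode: string;
  playerNumber: number; // "Player 7" -- meaningful only to the teacher
  lessonId: string;
  score: number;
}

const classrooms = new Map<string, Classroom>();
const progress: ProgressRecord[] = [];

// Teacher side: generate a join code and a fixed number of slots.
// No teacher name, no school, no roster.
function createClassroom(slots: number): Classroom {
  const code = randomBytes(4).toString("hex").toUpperCase();
  const room = { code, slots };
  classrooms.set(code, room);
  return room;
}

// Student side: authenticate with the code plus a player number.
// No name, no email, no account.
function joinAsPlayer(code: string, playerNumber: number) {
  const room = classrooms.get(code);
  if (!room || playerNumber < 1 || playerNumber > room.slots) {
    throw new Error("Invalid classroom code or player number");
  }
  return { code, playerNumber }; // this pair is the entire "session identity"
}

// Everything the system ever stores about Player 7:
function recordProgress(code: string, playerNumber: number, lessonId: string, score: number) {
  progress.push({ classroomCode: code, playerNumber, lessonId, score });
}
```

Dump that store and you get codes, slot numbers, and scores. Nothing that names a child.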

The Five Pillars

GHOST stands for the five things a system has to do to qualify.

G · Gated Access
Identity never enters the system. You authenticate without transmitting, collecting, or storing PII.

H · Hollow Database
Export the whole datastore. No one's identity should be in there.

O · Observable Outcomes
Admins see full progress, analytics, and engagement. They just can't see who generated it.

S · Severable Records
Wipe a classroom or a session in one operation. No residue. (Sketched below.)

T · Target-Free
Whoever breaches you gets nothing they can actually use.
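To illustrate Severable Records, a hedged sketch in the same spirit as the earlier one: all of a classroom's data hangs off its join code, so deleting that one key is the whole wipe. The data model and names here are hypothetical, not taken from the spec.

```typescript
// Hypothetical sketch: a classroom and its progress live under one key,
// so deleting that key severs everything at once.

interface ProgressRecord {
  playerNumber: number; // the only "identity" ever stored
  lessonId: string;
  score: number;
}

// Keyed by classroom code; everything a classroom generates lives in one place.
const store = new Map<string, ProgressRecord[]>();

function wipeClassroom(code: string): boolean {
  // One operation, no residue: there are no other tables, logs, or indexes
  // that refer to the classroom or its players by anything identifying.
  return store.delete(code);
}
```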

Who it’s for

GHOST was built for kids’ ed-tech because that’s the worst-case scenario for a breach. But the pattern works anywhere people have a right to use software without being identified.

Anonymous health screenings. Whistleblower platforms. Civic feedback tools. Research participant systems. Library activity tracking. Anywhere you’ve got a protected class of users who shouldn’t need to prove who they are to use the thing.

35 auditable requirements. Three compliance levels. Regulatory mapping for COPPA, FERPA, and GDPR. A formal audit methodology. Written for people building things, not people writing papers about things.

Deployed, not theoretical

GHOST isn’t a thought experiment. It’s the architecture behind AI Safety Adventures, a COPPA-compliant platform that teaches AI safety to kids in grades 3 through 8. Zero student PII collected. Zero student PII stored. Real teachers using it in real classrooms with real kids.


Read the spec

GHOST v2.0 is published under CC BY 4.0 on Zenodo. Read it. Critique it. Adopt it. Build on it. It’s yours.

Full spec on Zenodo · DOI 10.5281/zenodo.19547277

arXiv paper coming soon.


Why this exists

I’m Mike Jones, founder of Ethis.AI. While I was designing the auth system for AI Safety Adventures, I asked myself a question. Do we actually need to know who these kids are?

The answer was no.

GHOST is the standard that came out of that one question. If you’re building something for kids and you’re about to collect a bunch of their data, ask yourself the same question first. You might not need any of it.

ORCID: 0009-0003-2050-7706