I built Mora for one kid I love.
A child close to me lives with dyslexia. Last year our family moved across an ocean, and English became the second language they'd learn to read. The local school gates its IEP (the Individualized Education Program, the structured-literacy support plan that would actually help) behind finishing ESL first: a one-year delay before the intervention this kid needs can even begin.
Barton tutoring, the gold-standard Orton-Gillingham program, is wonderful and priced out of reach for a family paying out of pocket. We were the only ones in a position to fix this. So I built the tool I wished existed: one built around the way this kid learns, the language they came from, and a daily mentor they want to come back to.
Because the user is a child, the answer to “does any of this leave the iPad?” has to be no. No raw audio. No transcripts. No per-trial details. The pronunciation model runs entirely in CoreML on the device. The only cloud touchpoint we’ve even considered is a future Parent Mode sync over CloudKit private DB — and that isn’t in v1.
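A minimal sketch of what that posture looks like in code, assuming a compiled CoreML model shipped inside the app bundle. The model name `PronunciationScorer` and the helper function are illustrative placeholders, not Mora's actual internals; the CoreML calls themselves (`MLModelConfiguration`, `MLModel(contentsOf:configuration:)`) are standard API.

```swift
import Foundation
import CoreML

// Sketch: load a bundled, compiled CoreML model for fully on-device inference.
// "PronunciationScorer" is a hypothetical name; nothing is fetched remotely.
func loadOnDeviceScorer() throws -> MLModel? {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // prefer Neural Engine / GPU when available, fall back to CPU

    guard let url = Bundle.main.url(forResource: "PronunciationScorer",
                                    withExtension: "mlmodelc") else {
        return nil  // model must ship in the app bundle
    }
    // Inference runs entirely in-process: audio buffers are scored and
    // discarded in memory, never written to disk or sent over the network.
    return try MLModel(contentsOf: url, configuration: config)
}
```

Because the model is a bundle resource rather than a download, the app needs no network entitlement at all for the core loop, which makes the "nothing leaves the iPad" claim auditable rather than just promised.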
