You're on Zillow at 11pm. You found a house you can almost afford in a neighborhood you've never been to. You click the GreatSchools rating: 8/10. Okay, good school. You move on.
But what does 8/10 actually mean? What's it based on? And who decided that number should be sitting right there next to the listing price, quietly shaping one of the biggest decisions you'll ever make?
We started asking those questions. The answers bothered us enough to build something different.
## The incentive problem nobody talks about
GreatSchools is a nonprofit, but it gets the majority of its revenue through its partnership with Zillow and other real estate platforms. Its ratings appear on hundreds of millions of property listings. The incentive is to produce a number that helps sell houses — not necessarily one that helps you pick the right school for your kid.
Niche makes money from schools and colleges that pay to promote themselves on the platform. The schools you're evaluating are also Niche's customers. That's a conflict of interest that would get a financial advisor fired.
Neither platform is funded by parents. We think that matters.
SchoolScope is built to be funded by the people who use it — parents. No real estate partnerships. No school advertising. When we add paid features, they'll be paid for by the people making the decisions, not the people trying to influence them.
What "met or exceeded standard" actually hides
Here's the thing that made us build this.
California publishes detailed CAASPP test results every year. Most rating sites take those results and report a single number: the percentage of students who "met or exceeded" the state standard. Sounds reasonable. It's not.
Consider two schools. Both report that 75% of students met or exceeded the standard in math:
| | School A | School B |
|---|---|---|
| % Exceeded Standard | 55% | 12% |
| % Met Standard | 20% | 63% |
| % Met or Exceeded | 75% | 75% |
We haven't found a major rating site that surfaces this split prominently in its public-facing scores. On SchoolScope, these schools look completely different — because they are.
School A has more than half its students exceeding the standard. These kids aren't just clearing the bar; they're well past it. The school is pushing students higher.
School B has most students barely meeting the standard and very few exceeding it. The bar is being cleared, but only just. This might be a school that teaches to the test rather than one that develops deep understanding.
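To make the collapse concrete, here's a small sketch. The level names follow CAASPP's four performance bands, but the counts are invented to match the table above; this is an illustration, not real school data.

```python
# Illustrative only: CAASPP reports four performance levels, but a
# single "met or exceeded" number collapses the top two into one.
school_a = {"not_met": 10, "nearly_met": 15, "met": 20, "exceeded": 55}
school_b = {"not_met": 10, "nearly_met": 15, "met": 63, "exceeded": 12}

for name, levels in [("A", school_a), ("B", school_b)]:
    met_or_exceeded = levels["met"] + levels["exceeded"]
    # Both schools print 75 for the collapsed number, but the
    # exceeded rates (55 vs. 12) tell very different stories.
    print(name, met_or_exceeded, levels["exceeded"])
```

The headline number is identical for both schools; only the split reveals the difference.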
The exceeded rate is one of the strongest signals we've found for schools that genuinely challenge students. It gets the highest weight in our composite score — 30% — because it measures the ceiling, not just the floor.
## Why growth matters more than raw scores
A school in Palo Alto where 80% of students exceed the standard in third grade sounds incredible. But what if by fifth grade, that number drops to 75%? The school might be coasting on kids who arrive already ahead — kids whose parents are Stanford professors and tech executives who'd score well regardless of school quality.
Now consider a school where 40% of third graders exceed the standard, but by fifth grade it's 55%. That school is adding value. It's actually doing the thing schools are supposed to do: making kids smarter than they were when they walked in.
We measure this through grade 3-to-5 growth trajectory — comparing proficiency rates across grades at the same school. It's not perfect (we're comparing different cohorts, not tracking individual students), but a school where 5th graders consistently outscore 3rd graders is doing something right. And a school where scores decline from 3rd to 5th grade? That's a signal worth paying attention to.
This growth signal gets 15% weight in our composite. We'd give it more if the data let us track individual students over time. It doesn't, so we're transparent about that limitation.
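A minimal sketch of that cohort comparison, using the hypothetical rates from the two examples above. The function name and the scoring details are ours for illustration, not SchoolScope's published formula.

```python
# Hedged sketch of the grade 3-to-5 growth signal described above.
# It compares exceeded rates across grades at one school in one year,
# so it contrasts different cohorts rather than tracking students.

def growth_signal(pct_exceeded_by_grade: dict) -> float:
    """Grade-5 minus grade-3 exceeded rate, in percentage points."""
    return pct_exceeded_by_grade[5] - pct_exceeded_by_grade[3]

coasting_school = {3: 80.0, 5: 75.0}   # arrives ahead, loses ground
value_adding    = {3: 40.0, 5: 55.0}   # arrives behind, gains ground

print(growth_signal(coasting_school))  # -5.0
print(growth_signal(value_adding))     # 15.0
```

A positive number means later grades outscore earlier ones; a negative number is the decline signal worth paying attention to.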
## The signals hiding in plain sight
Beyond test scores, California publishes two other data points that we haven't found surfaced prominently in other rating sites' public-facing scores:
Chronic absenteeism — the percentage of students missing 10% or more of school days. This is one of the strongest predictors of academic outcomes in education research, and it tells you something test scores alone can't: whether families are engaged and whether the school culture is one kids want to show up for. A school with great test scores but 25% chronic absenteeism has a problem that the headline number is masking.
Suspension rate — the percentage of students suspended at least once. High suspension rates often indicate a discipline-heavy culture rather than a supportive one. They also disproportionately affect students of color and students with disabilities. We include this because a school's discipline philosophy is part of what you're choosing when you choose a school.
Both metrics are inverted in our formula — lower is better. Together they get 25% of the composite weight, because school culture and engagement matter alongside raw academic performance.
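As a hedged sketch of how lower-is-better metrics can fold into a 0-100 composite: the weights below match the ones stated in this post (exceeded 30%, growth 15%, absenteeism and suspension 25% together), but the 12.5/12.5 split, the growth scaling, and the remaining 30% are illustrative assumptions, not the real formula.

```python
# Illustrative composite. Only the weights stated in the post are
# real; the scaling choices and the "other" placeholder are assumptions.

def invert(rate_pct: float) -> float:
    """Map a lower-is-better percentage onto a higher-is-better score."""
    return 100.0 - rate_pct

def composite(exceeded: float, growth_pts: float,
              absenteeism: float, suspension: float,
              other: float = 50.0) -> float:
    # Growth is in percentage points; center it on a neutral 50.
    growth_score = max(0.0, min(100.0, 50.0 + growth_pts))
    return (0.30 * exceeded
            + 0.15 * growth_score
            + 0.125 * invert(absenteeism)   # lower absenteeism, higher score
            + 0.125 * invert(suspension)    # lower suspension, higher score
            + 0.30 * other)                 # unspecified remainder: placeholder

print(composite(exceeded=55, growth_pts=10, absenteeism=8, suspension=2))
```

The point of the inversion is that a school can't buy its way to a high composite on test scores alone while a quarter of its students are chronically absent.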
## What test scores fundamentally can't measure
Here's where we have to be honest in a way that most ranking sites aren't.
Test scores measure what tests measure. They correlate with many things we care about, but they miss enormous swaths of what makes a school good for your kid:
- A teacher who makes your struggling reader feel like a reader. No test captures this. It might be the most important thing that happens at school.
- Creativity and curiosity. Standardized tests measure convergent thinking. The kid who asks "but what if the question is wrong?" gets no credit.
- Social-emotional development. Does your kid have friends? Do they feel safe? Are they learning to navigate disagreement? None of this shows up in CAASPP data.
- School culture and community. Some schools feel warm the moment you walk in. Others feel like bureaucracies. You can sense this during a tour. It doesn't fit in a database.
- Fit for your specific child. A school that's perfect for a self-directed learner might be terrible for a kid who needs structure, and vice versa. Rankings can't know your kid.
We're not going to pretend test scores are the whole picture. They're not. What we can promise is that the partial picture we show you is honest, transparent, and more detailed than what we've found elsewhere.
## What we're building and why
SchoolScope is, right now, a California-only tool that analyzes public CAASPP test data, chronic absenteeism, and suspension rates through a transparent composite methodology that weights the signals we think matter most. You can see every school's breakdown, explore the data yourself, and understand exactly how we arrived at every number.
We're new. We're small. We have opinions about what matters in school data, and we've been transparent about every one of them. You can disagree with our weights, and we think that's healthy — we'll soon let you adjust them yourself.
What we're not going to do:
- Take money from schools or real estate companies
- Hide our methodology behind a proprietary black box
- Pretend test scores tell you everything you need to know
- Show you a number without showing you how we got there
The biggest platforms in education data are funded by real estate companies or by schools themselves. We think parents deserve a tool funded by parents. We think you should be able to see the actual data — the exceeded vs. met split, the growth trajectory, the absenteeism rates — and make your own informed judgment.
## Why we start with one state, not fifty
National rating sites face an impossible task: compare schools across 50 states with 50 different testing systems. The only way to do that is to flatten everything into a single proficiency threshold. That works for broad comparisons but loses the detail that matters when you're choosing between two schools in your zip code.
We take the opposite approach — we show you the most useful data your state actually produces, in its native form, without distilling it into a number that's comparable to a school in Ohio but meaningless for your decision.
California's CAASPP data is unusually rich. It publishes four performance levels, not just pass/fail. It reports by grade, by subgroup, by school. That's what lets us show you the exceeded vs. met split, the grade 3-to-5 growth trajectory, and all the detail that disappears when you compress it into a score designed to work in every state at once.
Starting with one state isn't a limitation we're embarrassed about. It's a choice that lets us go deeper. When we expand to other states, we'll do the same thing: show you what that state's data actually says, using the metrics that are most meaningful locally. Not a national average. Not a lowest-common-denominator score. The actual data, presented honestly.
## The honest partial picture
We'd rather give you an honest partial picture than a confident misleading one.
That's the whole thesis. Test scores are one lens, and we'll be the first to tell you that. But within that lens, we haven't found another tool that breaks apart the exceeded vs. met split the way we do, and we think the funding models of the biggest players — real estate partnerships and school advertising — create conflicts worth thinking about.
If you're a parent trying to understand California schools, start here. If you want to understand exactly how we score schools, read our methodology. If you think we're wrong about something, we genuinely want to hear about it.
We're not trying to replace school tours, parent networks, or your own instincts about what's right for your kid. We're trying to make sure that when you look at the data, you're looking at all of it — not just the parts that help someone else close a deal.