The Locality of Online Education: What the Data Actually Show

Two independent datasets complicate the "students choose local" mantra—and reveal 2020 as a structural turning point.


Two days ago, Risepoint released its 2026 Voice of the Online Learner (VOL) report. The series started in 2012 under Learning House, moved to Wiley (through corporate M&A) in 2021, and then to Risepoint (again through corporate M&A) in 2024. The survey's strength is that it collects data directly from students and has kept a fairly consistent survey design. From 2012 to 2020 the report was known as Online College Students (OCS); since then it has carried the VOL name.

What I’d like to explore is a broader question about the locality of online education.

There is a mantra that online students largely choose programs from local institutions, despite the anywhere-anytime nature of online education and despite the rise of mega-universities like Western Governors University (WGU), Southern New Hampshire University (SNHU), Arizona State University (ASU), University of Phoenix, Liberty University, and Grand Canyon University. The data, however, say the reality is more nuanced than the slogan, and 2020 changed the picture in ways that still haven't fully settled.

Rather than just reporting the new VOL numbers, I combined that survey data with the cleaner institutional enrollment census data from NC-SARA and IPEDS to get a more reliable read. One metric is proximity within 100 miles of the institution; the other, related metric is in-state residency.

And what we see is a bifurcated system: for a subset of large online institutions, locality doesn't matter, but for everyone else it is a big deal. The common view rests on averages, and averages sometimes obscure more than they illuminate.

Proximity Within 100 Miles

The VOL reports address the question of whether the institution is within 100 miles of where the student lives. The reports are based on surveys of student samples (roughly 1,500 to 3,000 respondents per year). The following chart tracks the responses over the past 14 years.

Notes

  • Directional rise through the 2010s. After a volatile early stretch (80% in 2012 → 54% in 2014), the share climbed steadily from the mid-60s to the high 70s by 2019, suggesting a real trend toward enrollment at nearby institutions during the pre-Covid decade.

  • 2020 registered the series peak at 82%, but the value is derived from explicit distance sub-buckets and is best read as Covid-distorted noise rather than signal — that year's remote-learning disruption temporarily made everyone "within 100 miles" of their program.

  • A sharp reversal post-Covid. The share dropped to 63% in 2021 and has ranged between 50% and 69% since, averaging roughly 10 to 15 percentage points below the 2016–2019 plateau (see the sketch after these notes). The late-2010s localism peak has not returned.

  • Methodology changes complicate the post-2020 trend. Three shifts in the VOL era (sample redesign in 2021, question wording change in 2023, publisher change in 2024) overlap with the decline, so the magnitude of the true drop is uncertain — but the direction (down from pre-Covid highs) is consistent across all three survey configurations.
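
To make the plateau-versus-post-Covid comparison concrete, here is a minimal sketch of the calculation. It assumes a hypothetical CSV (vol_within_100_miles.csv) holding the year-by-year VOL shares; the file name and column layout are illustrative, not part of the published data.

```python
# Minimal sketch: compare the pre-Covid plateau with the post-Covid average.
# Assumes a hypothetical CSV with columns "year" and "pct" containing the
# VOL "within 100 miles" shares taken from the survey reports.
import pandas as pd

vol = pd.read_csv("vol_within_100_miles.csv").set_index("year")["pct"]

pre_covid_plateau = vol.loc[2016:2019].mean()  # the late-2010s plateau
post_covid_avg = vol.loc[2021:].mean()         # excludes the distorted 2020 value

print(f"2016-2019 plateau: {pre_covid_plateau:.1f}%")
print(f"2021+ average:     {post_covid_avg:.1f}%")
print(f"Gap: {pre_covid_plateau - post_covid_avg:.1f} percentage points")
```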

In-State Enrollments

There are two other data sources to consider, both of which capture institutional enrollments through census-style reporting: NC-SARA and IPEDS. Their measure is whether online students reside in the same state as the enrolling institution. The following chart tracks the data over the past 11 years.

Notes

  • Remarkable pre-Covid stability. From 2015 to 2019, roughly 55–56% of fully-online students were enrolled at an in-state institution, varying by less than 2 percentage points across five years. This is the cleanest baseline in the study.

  • A 16-point Covid spike. The share jumped to 71.9% in 2020, the largest single-year movement in either series, reflecting students shifting enrollments closer to home as campuses went remote.

  • Only a partial reversion. The share has drifted down each year since 2020 (65.9% → 62.5% → 61.6% → 62.1%) but has stabilized roughly 6–7 percentage points above the pre-Covid baseline, suggesting a durable, if smaller, shift toward in-state enrollment (the arithmetic appears in the sketch after these notes).

  • Methodological continuity is validated. IPEDS nationwide figures (2015–2017) match the combined NC-SARA + IPEDS CA series (2018–2024) within 1–3 percentage points in the years where both sources can be compared, giving confidence that the 10-year trend is real and not an artifact of the 2018 data-source transition.
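
The 16-point spike and the residual 6–7 point shift can be reproduced directly from the values cited in these notes. A minimal sketch, approximating the pre-Covid baseline as 55.5% (the midpoint of the stated 55–56% range) and assuming the four post-2020 values map to 2021 through 2024:

```python
# Minimal sketch using the in-state shares cited in the notes above.
# Assumptions: baseline approximated at 55.5% (midpoint of the stated
# 55-56% range for 2015-2019); year labels for the post-2020 values
# inferred from the notes.
in_state = {2020: 71.9, 2021: 65.9, 2022: 62.5, 2023: 61.6, 2024: 62.1}
pre_covid_baseline = 55.5

covid_spike = in_state[2020] - pre_covid_baseline                      # ~16 points
post_covid_plateau = sum(in_state[y] for y in (2022, 2023, 2024)) / 3  # ~62%
durable_shift = post_covid_plateau - pre_covid_baseline                # ~6-7 points

print(f"2020 spike:        +{covid_spike:.1f} points")
print(f"2022-2024 plateau:  {post_covid_plateau:.1f}%")
print(f"Shift vs baseline: +{durable_shift:.1f} points")
```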

A Combined View of Locality

Let’s look at both measures of locality.
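
Since the underlying chart cannot be reproduced here, the sketch below shows one way to lay the two series out as a two-panel figure. The CSV file names and column layout are hypothetical; the actual charts in this post were produced separately (see the AI Note at the end).

```python
# Minimal two-panel sketch of the combined locality view.
# Assumes hypothetical CSVs with "year" and "pct" columns for each series.
import pandas as pd
import matplotlib.pyplot as plt

vol = pd.read_csv("vol_within_100_miles.csv")      # Panel A: survey self-report
sara = pd.read_csv("in_state_nc_sara_ipeds.csv")   # Panel B: institutional census

fig, (ax_a, ax_b) = plt.subplots(1, 2, figsize=(10, 4), sharey=True)

ax_a.plot(vol["year"], vol["pct"], marker="o")
ax_a.set_title("Panel A: within 100 miles (VOL survey)")
ax_b.plot(sara["year"], sara["pct"], marker="o")
ax_b.set_title("Panel B: in-state (NC-SARA/IPEDS)")

for ax in (ax_a, ax_b):
    ax.axvline(2020, linestyle="--", alpha=0.5)  # mark the Covid inflection point
    ax.set_xlabel("Year")
ax_a.set_ylabel("Percent of fully online students")

plt.tight_layout()
plt.show()
```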

Notes

  • Two independent data sources tell the same high-level story. A student-self-report survey and an institution-reported enrollment dataset both show (1) a pre-Covid tendency toward local institutions, (2) a 2020 Covid peak, and (3) an elevated-but-declining post-Covid state—strong triangulation across very different methodologies.

  • The institutional data (Panel B) provide the cleaner signal. The NC-SARA/IPEDS series shows a stable pre-Covid baseline, a clean spike, and a gradual post-Covid settling. The survey data (Panel A) are noisier, more volatile, and confounded by methodology changes, a reminder that self-report surveys at n≈1,500–3,000 carry meaningful year-over-year variance.

  • The "permanent shift" conclusion hinges on which lens you use. Panel B suggests a durable new floor ~6–7% above pre-Covid (students are staying closer to home than they used to). Panel A suggests the localism trend may have actually reversed post-Covid. These are not necessarily contradictory—"within 100 miles" (proximity) and "in-state" (geography) are different concepts and can diverge, particularly in large states or near state borders.

  • 2020 was the inflection point in both series. Whatever the long-run trajectory, Covid marks the moment that both measures broke from their prior patterns—the pre-Covid era is structurally distinct from what came after, and any analysis of online-learning geography needs to treat them as separate regimes.

Discussion

The data make more sense once you understand the structure of the online market itself. IPEDS data show 19 institutions with 25,000 or more fully-online students: WGU, SNHU, ASU, and their peers. These are schools with national brands, significant marketing reach, and student bodies drawn heavily from outside their home state and beyond 100 miles. For those institutions, the "students choose local" mantra does not apply.

But those 19 institutions are outliers. There are more than 1,000 institutions enrolling between 1,000 and 24,999 fully-online students. That long tail is mostly local—niche programs, regional reputations, students who already have ties to the institution or the area. The mantra has always been a description of that majority, not a universal law of online education.
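
For readers who want to check the segmentation themselves, here is a minimal sketch against IPEDS distance-education enrollment data. The file name and column names are placeholders; the actual IPEDS field for students enrolled exclusively in distance education varies by extract.

```python
# Minimal sketch of the institution-size segmentation.
# Assumes a hypothetical CSV "ipeds_exclusively_online.csv" with columns
# unitid, institution, and online_enrollment (exclusively-online headcount).
import pandas as pd

df = pd.read_csv("ipeds_exclusively_online.csv")

mega = df[df["online_enrollment"] >= 25_000]                    # national-brand outliers
long_tail = df[df["online_enrollment"].between(1_000, 24_999)]  # mostly local market

print(f"Institutions with 25,000+ fully online students: {len(mega)}")
print(f"Institutions with 1,000-24,999 fully online students: {len(long_tail)}")
```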

What the two datasets also add is a cleaner picture of how Covid reshuffled the baseline. Both series show the same three-act structure: a pre-Covid era with a modest drift toward local enrollment, a 2020 spike that was more disruption than trend, and a post-Covid plateau that has settled above where things were before. The NC-SARA/IPEDS data—the cleaner signal—suggest that new floor is roughly 6–7 percentage points above the pre-2020 baseline and has held there for three years. Whether that represents a durable behavioral shift or a slow reversion still in progress is the open question.

For institutional strategy, the implication is straightforward. If your institution is one of the roughly two dozen with a genuine national brand in online education, your competitive frame is national and the locality data are largely irrelevant to your planning. For everyone else (the overwhelming majority of institutions with online programs), you are competing primarily for local and in-state students, and that has been true throughout the entire period these data cover, including post-Covid. That is both a constraint and a defensible position: local reputation, employer relationships, and regional accreditation carry real weight with the students most likely to enroll in your programs.

AI Note

The charts in this post were built with AI-assisted analysis using Claude Code—a workflow that worked considerably better than my December attempt to do similar longitudinal work with NotebookLM, which I documented here. For premium subscribers, I'll cover what was different this time, where friction remained, and what these results suggest about AI-assisted data work in practice.
