Stale Inputs, Lasting Consequences: OBBBA's Accountability Math
An outlier year set the assumptions, but Do No Harm implementation will be applied to cohorts shaped by a very different economy

You’ve probably noticed a theme in the past few months in my coverage of OBBBA. While the concept of limiting student debt and holding institutions at least partially accountable for student economic outcomes (i.e., the college premium) is laudable, the data and metrics behind the new policies are going to have a much bigger impact than people realize.
One of the central changes from OBBBA comes from Do No Harm (institutional accountability), a program-level test: Title IV aid continues only if a program’s graduates aren’t worse off financially after completion. ED measures this over multiple years, centered on an Earnings Premium benchmark against state or US peers. Programs that fail the metrics in two of the last three years face loss of Title IV eligibility. The aim is to surface programs with poor economic returns, protect students, and make the underlying data and methods transparent.
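The two-of-three rule can be sketched in a few lines. This is an illustrative reading of the eligibility logic as described above; the function name and input shape are my own, not ED's implementation.

```python
# Illustrative sketch of the Do No Harm two-of-three rule; the function
# name and input format are assumptions, not ED's actual implementation.
def loses_title_iv(failed_by_year: list[bool]) -> bool:
    """True if the program failed the Earnings Premium test in at
    least two of the last three measured years."""
    assert len(failed_by_year) == 3
    return sum(failed_by_year) >= 2

print(loses_title_iv([True, False, True]))   # failed 2 of 3 -> True
print(loses_title_iv([False, False, True]))  # failed 1 of 3 -> False
```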
I’ve described some geographic and gender problems with the chosen Earnings Premium metric, but there is another dimension being ignored. Time.
A Snapshot in Time
The US Department of Education (ED) first released the College Scorecard in late 2015, and in 2019 it expanded to include program-level data on graduate earnings and debt levels. The data were updated again several times from 2021 - 2025.
In all that time, it turns out that there has only been one snapshot of earnings data at the program level, and that view is based on an old, outlier cohort. All of the research you may have read estimating how many programs might fail under OBBBA’s Do No Harm / Earnings Premium metrics is based on a two-year cohort from the AY2015 - AY2016 graduation years, with four-year earnings measured in AY2019 - AY2020.1
When you read that “just 3 percent of bachelor’s degree programs would fail the ‘do no harm’ test,” or that “only about 1 percent of students are enrolled in programs that would fail these tests in a single year,” know that those analyses are based on old, outlier data. The reality will be far greater than that.
Exploring More Recent Data
The baseline data at the undergraduate level (what programs are compared against) are from the Census Bureau’s American Community Survey (ACS) and its 5-year estimates of state-level earnings of adults in the workforce aged 25-34 with a high school degree but no college. The ACS data is updated annually, so I looked at the trends and not just the snapshot.
Since the College Scorecard has not been updated for earnings past 2019-2020, we can’t do a direct analysis of the official data - that is the point, we’re flying blind - but we can use ACS to calculate a proxy measure of a Bachelor’s Earnings Premium: essentially, the median earnings of those with a bachelor’s degree minus the median earnings of those with a high school degree but no college.
To get this measure, I took the same population from ACS used for the baseline and simply added adults with bachelor’s degrees. This approach differs from the official OBBBA Do No Harm metric - the latter includes people out of the workforce, of all ages, and measures earnings specifically four years after program completion. But the Bachelor’s Earnings Premium2 should be directionally consistent and expose some trends that we will likely see in the Do No Harm implementation.
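As a sketch, the proxy calculation works like this. The records below are invented stand-ins for ACS-style microdata - the field names and earnings values are assumptions for illustration, not actual survey data.

```python
# Sketch of the proxy BEP calculation described above. Records mimic
# ACS-style microdata; field names and earnings values are invented.
from statistics import median

records = [
    {"age": 28, "in_workforce": True, "educ": "hs_only", "earnings": 31_000},
    {"age": 30, "in_workforce": True, "educ": "hs_only", "earnings": 34_000},
    {"age": 26, "in_workforce": True, "educ": "ba",      "earnings": 52_000},
    {"age": 33, "in_workforce": True, "educ": "ba",      "earnings": 60_000},
    {"age": 41, "in_workforce": True, "educ": "hs_only", "earnings": 36_000},  # too old: excluded
]

def median_earnings(educ: str) -> float:
    """Median earnings for workers aged 25-34 with the given attainment."""
    return median(
        r["earnings"] for r in records
        if 25 <= r["age"] <= 34 and r["in_workforce"] and r["educ"] == educ
    )

# BEP = bachelor's median minus high-school-only median
bep = median_earnings("ba") - median_earnings("hs_only")
print(bep)  # 56000.0 - 32500.0 = 23500.0
```

Note that the same age and workforce filters apply to both groups; only the attainment field changes, mirroring how the proxy simply adds bachelor’s holders to the baseline population.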
The first view is of the Bachelor’s Earnings Premium using median data across the US from 2014 - 2023 (the last year available in ACS data), with all earnings inflation-adjusted to 2023 dollars. I present it in raw form and expressed as the percentage difference each year compared to 2019. The idea is to get a sense of how big the changes over time might be compared to the snapshot years of 2019-2020.
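The inflation adjustment is a simple price-index ratio. A minimal sketch, using placeholder index values rather than actual CPI figures:

```python
# Minimal sketch of expressing nominal dollars in 2023 dollars via a
# price-index ratio; the index values below are placeholders, not CPI data.
def to_2023_dollars(nominal: float, index_year: float, index_2023: float) -> float:
    return nominal * (index_2023 / index_year)

# If prices rose 25% between the earnings year and 2023:
print(to_2023_dollars(20_000, 100.0, 125.0))  # 25000.0
```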

In raw numbers, the BEP ranges from $23,532 in 2017 to a peak of $24,820 in 2019, but since 2020 the premium has been decreasing.
The bottom view - percentage difference from 2019 - shows the big drop in 2023, with the premium 4.5% lower than the Bachelor’s Earnings Premium in 2019.
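The percentage-difference view can be reproduced as follows. The 2017 and 2019 values come from the chart; the 2023 value is back-calculated from the 4.5% figure and is approximate.

```python
# Sketch of the percent-difference-from-2019 view. The 2017 and 2019 BEP
# values are from the chart above; the 2023 value is back-calculated from
# the 4.5% figure and is approximate.
bep_by_year = {2017: 23_532, 2019: 24_820, 2023: 23_703}  # 2023 dollars

def pct_diff_from_2019(series: dict[int, float]) -> dict[int, float]:
    """Percentage difference of each year's BEP from the 2019 value."""
    base = series[2019]
    return {yr: round(100 * (v - base) / base, 1) for yr, v in series.items()}

print(pct_diff_from_2019(bep_by_year))  # {2017: -5.2, 2019: 0.0, 2023: -4.5}
```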
Note that 2019 - 2020, the same two-year period used for ED’s snapshot of data, is an outlier. That period had the highest BEP of any in the past decade, yet we’re making policy decisions based on this outlier.
The 4.5% drop is interesting by itself, especially if 2023 marks a trend and the premium drops even further in 2024 and beyond, as is likely based on recent economic news. But the state-level data is even more interesting. Remember that OBBBA Do No Harm is measured against state baselines, for the most part.
Let’s add three states to get a broader view.

Alaska has the biggest drop, with its Bachelor’s Earnings Premium fluctuating significantly and 2023 coming in 38% lower than 2019. New Mexico, for some reason, is nearly the opposite, with 2023 being 31% higher than 2019. My state of Arizona has seen decreases, with 2023 being 16% lower than 2019.
The more that OBBBA Earnings Premiums drop, the more academic programs will fail and lose access to federal financial aid. And the BEP data indicate that many states are going to see a big drop in earnings premiums, likely leading to far more program failures than predicted.
To get a sense of the geographic spread, the following view shows the change in Bachelor’s Earnings Premium by state from 2019 - 2023.

There’s a lot more red (meaning the BEP is worse in 2023 than in 2019) than green. And this view shows the scope of changes in just four years (albeit four momentous years).
There’s a sea change coming with program accountability combined with changes in student loan limits. And the changes over time of Do No Harm Earnings Premiums are going to surprise a lot of university administrators and perhaps even federal policy makers.
And for goodness sake, can we get some new data?
1 AY = academic year, measured generally from July to the following June.
2 For this analysis, I’ll use this term “Bachelor’s Earnings Premium” or “BEP” to designate this proxy metric from ACS.