A Bachelor's Program View of the Senate's Accountability Proposal
Looking at academic program differences in terms of passing the Senate's Part of Gainful Employment for All proposal

Was this forwarded to you by a friend? Sign up for the On EdTech newsletter and get your own copy of the news that matters sent to your inbox every week. Interested in additional analysis? Try the On EdTech+ newsletter with a 30-day free trial.
Well, that took longer than I had hoped.
On Monday I described the Senate Committee on Health, Education, Labor and Pensions (HELP) proposal for institutional accountability as part of the reconciliation bill debate. It builds on Gainful Employment but uses only the Earnings Premium concept, expands the scope of programs being evaluated, and changes the metrics.
I finally have a data model built out that combines the College Scorecard, IPEDS, and Census Bureau ACS data to represent the HELP proposal. But there is one huge caveat: this new proposal looks at earnings for both program completers and non-completers, and there are no publicly available data for the latter.
Preston Cooper at AEI has adjusted the earnings to give his rough estimate of the impact, while Jason Cohn at the Urban Institute limited his analysis to completer cohorts only. For this first pass, I will also look only at completers, meaning that actual program performance will generally be somewhat worse (more programs failing) if the Senate HELP committee sticks to its current proposal.
Most analyses have aggregated the results across sectors or specific program levels. Since the accountability is program by program, I’d like to extend the available analysis down to these lower levels of detail. I’ll start at the Control level for Associate’s and Bachelor’s programs, then jump to Bachelor’s programs across the US, then jump into specific Bachelor’s programs per school.
Failing Programs by Control
Given that I am using a similar approach to Cohn’s, I made sure that I was getting similar results, at least within a few percentage points. Here is his description of the data he used, which mirrors my own approach.
I use data from the College Scorecard, the American Community Survey, and the Integrated Postsecondary Education Data System to estimate how the proposed accountability structure would affect institutions and students. The College Scorecard does not include program-level earnings 6 and 10 years after entry, so I instead use the median earnings of completers four years after graduation to approximate the earnings outcomes. Because I do not have data on noncompleters’ earnings, this analysis likely represents a lower bound for the shares of programs that would fail the earnings test. Earnings data are for the pooled cohort that completed their program in 2014–15 or in 2015–16, with earnings measured in 2019 and 2020, adjusted to 2021 dollars.
I use institutions’ state locations from the Integrated Postsecondary Education Data System to compare these earnings with the state earnings thresholds calculated from the American Community Survey. To align with the program earnings measurement timeline, I use 2017–21 American Community Survey five-year estimates, which are inflated to 2021 dollars. State earnings thresholds for undergraduate programs range from $26,500 in Mississippi to $36,634 in New Hampshire. For graduate programs, thresholds can depend on field of study, but 98 percent of programs would be judged against a threshold between $33,000 and $58,000, with a median of $44,211.
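The core comparison described above can be sketched in a few lines of pandas. This is a minimal illustration, not the actual analysis code: the column names, institutions, and earnings values below are hypothetical stand-ins for the real College Scorecard, IPEDS, and ACS fields.

```python
import pandas as pd

# Hypothetical stand-ins for College Scorecard program earnings (median
# completer earnings four years after graduation) with the institution's
# state location (drawn from IPEDS in the real analysis).
programs = pd.DataFrame({
    "institution": ["School A", "School A", "School B"],
    "major": ["Nursing", "Fine Arts", "Comp Sci"],
    "state": ["MS", "MS", "NH"],
    "median_earnings": [52_000, 24_000, 80_000],
})

# ACS-derived state earnings thresholds for undergraduate programs
# (the two values quoted above: Mississippi and New Hampshire).
thresholds = pd.DataFrame({
    "state": ["MS", "NH"],
    "threshold": [26_500, 36_634],
})

# Join each program to its state threshold, then compute the Earnings
# Premium: program earnings minus the threshold. Positive = pass.
merged = programs.merge(thresholds, on="state", how="left")
merged["earnings_premium"] = merged["median_earnings"] - merged["threshold"]
merged["passes"] = merged["earnings_premium"] > 0

print(merged[["major", "earnings_premium", "passes"]])
```

In this toy data, the Fine Arts program at School A falls about $2,500 below the Mississippi threshold and would fail, while the other two programs clear their thresholds comfortably.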
As with Gainful Employment, a large number of programs lack sufficient data for a pass / fail evaluation due to low cohort size and privacy concerns. These “no data” programs essentially pass by exclusion.
The following table shows that 6% (ranging from 4% to 20% by Control) of students are in Associate’s degree programs that would fail, and 3% (ranging from 3% to 5% by Control) of students are in Bachelor’s degree programs that would fail.

I find it ironic that the American Association of Community Colleges (AACC) lobbied so hard against the House Risk-Sharing when those schools would be the biggest winners if that version passes, whereas Associate’s programs would be the biggest losers if the Senate HELP version passes.
Median Program Status by HELP Earnings Premium and Net Price
Now let’s focus just on Bachelor’s programs and get a sense of how different majors lead to very different results. The chart below shows the median HELP Earnings Premium (less than $0 to the left in red = fail; more than $0 to the right in green = pass) versus Institution Net Price per year (tuition + fees + expenses without grants), just for programs with earnings data available, and just for completers. I have separated out Control as the color coding to give an idea of how the programs differ for Public, Private, and For-profit institutions. To keep this readable, I have limited the view to programs with more than 15,000 graduates per year.
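The aggregation behind a chart like this is straightforward: take the median Earnings Premium per major, then drop majors below the graduate-count floor. A hedged sketch follows, with made-up values and hypothetical column names (the real Scorecard fields differ):

```python
import pandas as pd

# Hypothetical program-level records: each row is one program at one school.
df = pd.DataFrame({
    "major": ["Comp Sci", "Comp Sci", "Fine Arts", "Fine Arts", "Nursing"],
    "earnings_premium": [55_000, 48_000, 2_500, -3_000, 41_000],
    "annual_grads": [20_000, 18_000, 9_000, 5_000, 16_000],
})

# Median premium per major, plus total annual graduates per major.
by_major = df.groupby("major").agg(
    median_premium=("earnings_premium", "median"),
    total_grads=("annual_grads", "sum"),
)

# Apply the readability filter used in the chart above: keep only majors
# with more than 15,000 graduates per year.
large = by_major[by_major["total_grads"] > 15_000]
print(large.sort_values("median_premium", ascending=False))
```

In this toy data, Fine Arts (14,000 total graduates) drops out of the filtered view even though its median premium would place it deep in the failing region, which is why the prose notes that some poorly performing sectors are "filtered out of the view above."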

As with the House Risk-Sharing analysis, the worst performing major at the Bachelor’s level is Fine and Studio Arts (along with other Visual Arts), as the median program’s Earnings Premium is less than $4,000 for Public institutions. Private institutions also score below $4,000 but are not shown due to size filtering.
The higher performing programs are near and dear to my heart (take that, Morgan). Electrical Engineering, Computer Science, Mechanical Engineering - all with Earnings Premiums above $50,000. Private institutions are even higher (up to $72,000) but filtered out of the view above. But also note that the data is from before the gen AI explosion and would likely be different in the future, at least for those Comp Sci folks (go EE!).
You can also see in this view how Public institutions have similar net prices per year, as do Private institutions.
Two Views of Specific Programs
Now let’s jump down to the level of specific academic programs. There are more than 229,000 programs in the College Scorecard, and more than 36,000 with sufficient cohort size to have valid data. So I’ll pick two majors to give a more detailed view.
Fine Arts Troubles
The first is Fine Arts as a discipline, covering four related specific majors. As noted above and in the House Risk-Sharing analysis, these programs perform the worst at the Bachelor’s degree level. But they are not judged by medians across the country; they are judged school by school. This chart shows these programs with the same metrics, but without the size filtering.
Each circle is a specific program. Those to the left of $0 in red would fail with the Senate HELP proposal, those to the right of $0 in green would pass.

There is a wide variation of results, but as you can see, there are a lot of programs with a lot of students that would fail. And that big circle at the top is the Art Institute in Chicago with an Earnings Premium of less than $1,800, barely passing. Once I model in non-completers as well as completers (in future posts), even that program would likely fail.
Nursing For the Win
Enough about Engineering and Comp Sci. What about Registered Nursing, where all three sectors above (Public, Private, For-profit) show Earnings Premiums above $40,000?

In this case, every single Practical and Registered Nursing Bachelor’s program would pass the Senate HELP metrics. By a lot.
There are a lot more interesting findings in the data, but hopefully these initial views give a sense of how the Senate HELP institutional accountability proposal would impact specific programs.
If and when I can, I’ll model in the non-completer effect on these metrics and share more details.
Feedback and suggestions welcome.
The main On EdTech newsletter is free to share in part or in whole. All we ask is attribution.
Thanks for being a subscriber.