Clarifications But No Answers on Growing Discrepancy in Enrollment Reporting
Last week I noted a growing discrepancy in US higher education enrollment reporting between the National Student Clearinghouse (NSC) and the Department of Education’s Integrated Postsecondary Education Data System (IPEDS). The issue is not that there is a discrepancy – different data methods, after all – but that the discrepancy is growing. I had the chance to interview Doug Shapiro, executive research director for NSC, and the short answer is that we don’t know why the difference is growing, but it is concerning, and data coming out within a few weeks may help explain it. For the longer answer, let’s look deeper.
With the release of the Fall 2021 enrollment data from the US Department of Education’s IPEDS system, I had planned a post comparing three commonly used data sources to help provide context. But as I looked further, there seemed to be a significant discrepancy between the National Student Clearinghouse (NSC) enrollment reports and the official IPEDS data release. The NSC data show larger enrollment declines for every year in the past decade except 2020, leading to an estimate of a 12.0% decline (2.4 million students) from Fall 2012 to Fall 2021, whereas the IPEDS fall numbers show an 8.0% decline (1.7 million) over that period.
While both data sources show decade-long enrollment declines, that difference of 12.0% vs 8.0% has big implications, and we need to understand why.
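To make the size of that disagreement concrete, here is a quick arithmetic sketch in Python using only the figures above. Note that the implied Fall 2012 starting totals are rough, since the published percentages and counts are rounded:

```python
# Reported declines, Fall 2012 to Fall 2021, from the two sources.
nsc_decline_pct, nsc_decline_abs = 0.120, 2.4e6      # NSC: 12.0%, 2.4M students
ipeds_decline_pct, ipeds_decline_abs = 0.080, 1.7e6  # IPEDS: 8.0%, 1.7M students

# How far apart the two pictures of the decade are:
print(f"Percentage-point gap: {(nsc_decline_pct - ipeds_decline_pct) * 100:.1f} points")
print(f"Gap in measured decline: {(nsc_decline_abs - ipeds_decline_abs) / 1e6:.1f}M students")

# Each source's implied Fall 2012 base (absolute decline / percent decline).
# These are rough because the published inputs are rounded.
print(f"NSC implied base:   {nsc_decline_abs / nsc_decline_pct / 1e6:.1f}M")
print(f"IPEDS implied base: {ipeds_decline_abs / ipeds_decline_pct / 1e6:.1f}M")
```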
The following chart shows Fall enrollment reporting from NSC and IPEDS, along with the difference between the two. Note the scale of the left vertical axis for total enrollment reporting and the right vertical axis for the discrepancy (difference) between the two.
The big issue is that in Fall 2012 the difference was just under 500 thousand students, but in Fall 2021 the difference was nearly 1.5 million students. The clarifications below are based on my interview with Doug Shapiro. I’d like to thank Doug for being so transparent and helpful in our interview – this was very useful.
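For readers who want to see the chart’s structure, here is a minimal matplotlib sketch. The two series are placeholder values chosen only to match the endpoints described above (a difference of roughly 0.5 million in Fall 2012 growing to roughly 1.5 million in Fall 2021), not the actual NSC or IPEDS data:

```python
# Sketch of the dual-axis chart: total fall enrollment (left axis) for
# IPEDS and NSC, and the difference between them (right axis).
# All values are placeholders for illustration, not the real series.
import matplotlib.pyplot as plt

years = list(range(2012, 2022))
ipeds = [21.1, 20.8, 20.6, 20.4, 20.2, 20.1, 20.0, 19.9, 19.7, 19.4]  # millions
nsc   = [20.6, 20.2, 19.9, 19.6, 19.3, 19.1, 18.9, 18.7, 18.5, 17.9]  # millions

fig, ax_left = plt.subplots()
ax_left.plot(years, ipeds, marker="o", label="IPEDS")
ax_left.plot(years, nsc, marker="o", label="NSC")
ax_left.set_xlabel("Fall term")
ax_left.set_ylabel("Total enrollment (millions)")
ax_left.legend(loc="upper right")

ax_right = ax_left.twinx()  # second y-axis for the gap between the series
gap = [i - n for i, n in zip(ipeds, nsc)]
ax_right.bar(years, gap, alpha=0.3, color="gray")
ax_right.set_ylabel("Difference, IPEDS minus NSC (millions)")

plt.title("Fall enrollment: NSC vs IPEDS, with growing difference")
plt.show()
```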
Data Methods
IPEDS uses a census method: schools accepting Title IV financial aid are required to report survey data on enrollment as of a particular date (typically October 15th for fall enrollment). The data are typically released more than a year after the fact (the December 2022 release covered the Fall 2021 term), and the core data element is institutional enrollment, not records for specific students.
NSC’s Current Term Enrollment Estimates reporting began in 2012, and NSC acts as an authorized agent of schools providing information to NSLDS (National Student Loan Data System). The Enrollment Estimates reporting was never intended to duplicate IPEDS; the focus instead is on maintaining a consistent counting methodology over time.
NSC aims to be a census of schools, but with 97% reporting, a sampling method is involved. This is the only use of IPEDS data within NSC’s reporting: it informs the weighting used to adjust the sample data. Most schools report their fall enrollment to NSC monthly, with the updates used for the official reports – there is no census date.
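As a rough illustration of what that kind of coverage weighting can look like, here is a toy sketch. The sector labels, coverage rates, and enrollment figures are all hypothetical, and NSC’s actual weighting methodology is certainly more involved:

```python
# Toy coverage adjustment: scale each sector's reported enrollment up by
# the inverse of that sector's coverage rate, where coverage is informed
# by an external census source (here, IPEDS). All numbers are hypothetical.

reported = {            # enrollment at schools reporting to NSC (millions)
    "public_4yr": 8.0,
    "public_2yr": 4.5,
    "private_4yr": 4.2,
}
coverage = {            # share of each sector's IPEDS enrollment covered
    "public_4yr": 0.99,
    "public_2yr": 0.96,
    "private_4yr": 0.94,
}

# Inverse-coverage weighting: estimated sector total = reported / coverage.
estimated = {sector: reported[sector] / coverage[sector] for sector in reported}

for sector, total in estimated.items():
    print(f"{sector}: reported {reported[sector]:.2f}M -> estimated {total:.2f}M")
print(f"Overall estimate: {sum(estimated.values()):.2f}M"
      f" (reported: {sum(reported.values()):.2f}M)")
```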
There were two primary drivers behind NSC’s public reporting:
To provide data in a more timely manner than IPEDS (NSC’s December 2021 release covered the Fall 2021 term, a full year ahead of IPEDS); and
To remove student duplications due to concurrent enrollment in multiple institutions.
Possible Explanations
Let’s review the possible explanations of the growing differences in enrollment reporting.
International Students – Given NSC’s role in NSLDS reporting, it does not count international students, whereas IPEDS does. Close to half of the schools reporting to NSC choose to share international enrollments, but NSC excludes this information from its Enrollment Estimates reporting. International student counting does explain a lot of the base difference in enrollment numbers, but it does not explain the changes over time, particularly in the past five years.1
Duplicated Student Counts – NSC uses a unique student identifier, typically the Social Security number, along with algorithms matching other demographic information, to track enrollments at the student level. Each Enrollment Estimates report shows both total enrollment and unduplicated enrollment, and the numbers I used above are the total enrollments, not unduplicated headcounts. Furthermore, the share of concurrently enrolled students has remained quite low, in the 1.5 – 2.0% range, and has not grown. This issue does not explain the growing discrepancy.
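To make the total-versus-unduplicated distinction concrete, here is a small sketch with invented records. NSC’s real matching uses demographic algorithms on top of the identifier, so treat this exact-match version as a simplification:

```python
# Deduplicating enrollment records by a student identifier. Total
# enrollment counts one row per (student, institution) pair, while
# unduplicated headcount counts each student once. Records are invented.
from collections import Counter

records = [
    {"student_id": "A1", "institution": "State U"},
    {"student_id": "A2", "institution": "State U"},
    {"student_id": "A2", "institution": "City CC"},   # concurrently enrolled
    {"student_id": "A3", "institution": "City CC"},
    {"student_id": "A4", "institution": "Private U"},
]

enrollments_per_student = Counter(r["student_id"] for r in records)
total_enrollment = len(records)
unduplicated = len(enrollments_per_student)
concurrent = sum(1 for n in enrollments_per_student.values() if n > 1)

print(f"Total enrollments:      {total_enrollment}")
print(f"Unduplicated headcount: {unduplicated}")
# Nationally NSC finds this rate in the 1.5-2.0% range; it is inflated
# here only because the toy dataset is tiny.
print(f"Concurrently enrolled:  {concurrent / unduplicated:.0%} of students")
```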
Non-degree / Credit Students – NSC asks schools to report students enrolled in credit-bearing courses, but that does not mean these students are in a degree program. IPEDS reports students in both degree and non-degree programs. Non-degree student enrollments are increasing over time, and this difference in data elements could account for some of the growing discrepancy, but we just do not know how much.
Valid Identifiers – When students do not have a valid identifier to support unduplicated counting, NSC has chosen to exclude them from Enrollment Estimates reporting. Historically, the majority of these students were international, but for reasons that NSC researchers do not understand, the number of domestic students without unique identifiers has been growing, which could also explain a lot of the growing discrepancy.
Over time, NSC’s algorithm for student matching has gotten much better, even for students without a valid identifier. For the Fall 2022 report that is due in a few weeks, NSC has adjusted its reporting to account for these students, and its expectation is that the adjusted numbers will align more closely with IPEDS.
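Here is a hypothetical before-and-after sketch of the identifier issue. Every count below is invented, but the mechanics show how a growing pool of excluded records would overstate the decline, and how folding those students back in pulls the numbers toward IPEDS:

```python
# Hypothetical: records without a usable identifier were excluded from
# Enrollment Estimates. If that excluded group grows over time, the
# published series overstates the decline. All counts are invented.

with_id  = {"Fall 2012": 20.0e6, "Fall 2021": 17.6e6}  # invented totals
excluded = {"Fall 2012": 0.2e6,  "Fall 2021": 0.9e6}   # invented, growing

for term in with_id:
    published = with_id[term]                  # old style: drop no-ID records
    adjusted = with_id[term] + excluded[term]  # adjusted style: include them
    print(f"{term}: published {published / 1e6:.1f}M -> adjusted {adjusted / 1e6:.1f}M")

pub = (with_id["Fall 2012"] - with_id["Fall 2021"]) / with_id["Fall 2012"]
adj_start = with_id["Fall 2012"] + excluded["Fall 2012"]
adj_end = with_id["Fall 2021"] + excluded["Fall 2021"]
adj = (adj_start - adj_end) / adj_start
print(f"Published decline: {pub:.1%}; adjusted decline: {adj:.1%}")
```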
We’ll Learn More Soon
The two best guesses at explanations for the growing discrepancy in enrollment reporting are therefore 1) a growing number of students reported without a valid identifier, and 2) a growing number of students in non-degree programs. We should find out a lot more with the Fall 2022 report. The former would mean that enrollment declines have been overstated; the latter is just confusing.
This leaves my original question of whether US higher education enrollment declines are overstated or understated, depending on the data source. We’ll learn more soon, but it sounds like the declines have been somewhat overstated.
Stay tuned.
1 Furthermore, international student enrollment peaked in 2017 according to Open Doors.