Problems of Scale in CUNY and Ohio ASAP Programs
The much-touted programs are not as scalable as media and official reports would have you believe
Community colleges are vital for education access, especially for marginalized students, but their completion rates are far too low. Given this, you would think I would be excited that a program that has vastly improved completion rates at some community colleges is being adopted elsewhere and is achieving similar results. I would be, if the experiments were as good as the reports (both popular and official) claim. But not only do the reports gloss over some of the important limitations of the program, they also overstate the scalability of the approach. Both faults have consequences for our ability to, you know, actually improve community college completion and graduation.
The original CUNY ASAP studies
CUNY's Accelerated Study in Associate Programs (ASAP) project, which first reported results in 2015, demonstrated impressive outcomes by providing comprehensive support services to community college students who attended full-time. However, as Phil Hill noted at the time, the program had specific limitations, highlighted in the reports but overlooked in popular accounts, that restricted its broader applicability:
- the circular reasoning inherent in the study design;
- the focus on full-time students, which excluded the many students for whom full-time study is impossible;
- the cost, which amounted to an extra $13,838 per student over the course of the program; and
- the absence of any analysis of the individual interventions, which made it difficult to determine which approaches were effective and which were not.
ASAP replicated in Ohio
The ASAP program has been replicated at a few community colleges nationwide, including Cincinnati State Technical and Community College, Cuyahoga Community College, and Lorain County Community College in Ohio. At all three colleges, students were chosen via lottery to participate in ASAP, with a similar group of students identified as a control. Some small changes were made from CUNY's original design, including tracking student employment and earnings. A recent six-year report highlighted the program's achievements.
As in the original CUNY studies, students received academic and financial support but had to fulfill certain requirements, summarized below.

ASAP Ohio Supports and Requirements

Additional support provided to ASAP students:

- Enhanced advising – advising required twice per month in the first semester; adviser caseloads under 125, compared to close to 300 for non-ASAP students
- Enhanced career development services – students required to meet with a career advisor and attend a career event once per semester
- Enhanced tutoring – required if taking remedial courses
- Tuition waiver – any gap between financial aid and tuition and fees waived
- Assistance with textbook costs – voucher for free textbooks
- Monthly financial incentive – $50 gas/grocery gift card
- Scheduling – seats held in specific sections and early registration

Requirements of ASAP students:

- Full-time enrollment in fall and spring; summer enrollment encouraged
- Taking remedial courses early
- Enrolling in a first-year seminar
- Graduating within three years
- Enrolling in a consolidated or block schedule (or both), with seats held for ASAP students in special sections
- Being Pell-eligible
The results are impressive. 44% of ASAP students earned a degree within six years, compared to only 29% of control group students (see chart below). 14% of ASAP students went on to earn a bachelor's degree versus only 9% in the control group. Not only were graduation rates higher, they were consistently higher, which the report suggests means the additional graduates would not have earned a degree without the program. Additionally, in Year 6 ASAP students earned an extra $1,948 on top of the control group's average of $17,626.
While impressive, these results raise several questions. An almost 50% relative improvement in graduation rates (from 29% to 44%) is great, but it raises the question of why, despite all the support provided, 56% of ASAP students were still unable to earn a degree within six years. Knowing this would really help us understand the barriers to student success, and the fact that it was not explored is a missed opportunity.
More importantly, the program is subject to the same flaws identified by Phil in the earlier CUNY program.
The logic underlying the program is circular, meaning it essentially creates its own results. Students are required to register full-time to be part of the program, and if a student carries a full course load it is difficult to avoid graduating in a timely fashion. This is the logic behind university campaigns designed to encourage students to take a full course load and so increase their odds of graduating on time, for example the University of Hawaii's Fifteen to Finish program.
In the Ohio ASAP, they doubled down on this circularity by requiring students to graduate on time. It is literally listed as a requirement. The outcome they want to explore is part of the treatment. Given this, and the requirement for full-time study, the fact that only 44% graduated in six years is concerning.
The Ohio program is expensive. Over three years the ASAP students cost the colleges $8,030 per student more than non-ASAP students. This is substantial and not easily replicable. The report does some complicated calculations to show that with improved graduation rates the overall cost per degree is lower. But the costs are still substantial.
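The logic behind that cost-per-degree claim can be sketched with back-of-the-envelope arithmetic. The $8,030 incremental cost and the 44% versus 29% graduation rates come from the report; the baseline three-year cost per student below is a hypothetical placeholder, not a figure from the study.

```python
# Back-of-the-envelope cost-per-degree comparison.
# Graduation rates and the $8,030 incremental cost are from the Ohio report;
# the baseline three-year cost per student is a hypothetical placeholder.
BASELINE_COST = 30_000       # hypothetical three-year cost per student
ASAP_EXTRA = 8_030           # reported incremental cost of ASAP per student
GRAD_RATE_CONTROL = 0.29
GRAD_RATE_ASAP = 0.44

# Cost per degree = total spend per student / fraction who earn a degree
cost_per_degree_control = BASELINE_COST / GRAD_RATE_CONTROL
cost_per_degree_asap = (BASELINE_COST + ASAP_EXTRA) / GRAD_RATE_ASAP

print(f"Control: ${cost_per_degree_control:,.0f} per degree")
print(f"ASAP:    ${cost_per_degree_asap:,.0f} per degree")
```

Under these illustrative assumptions, the higher graduation rate more than offsets the extra per-student spending, which is the shape of the report's argument, but the up-front outlay per student is still larger either way.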
In another modified expansion, California's Skyline Community College planned to spend $2.9 million to enroll 300 students in an ASAP-style program. It is difficult for community colleges to come up with that sort of capital. The researchers' advice to community colleges is that they should be entrepreneurial in seeking funding to scale these projects. That's the think tank equivalent of Boomers telling Millennials that they should eat less avocado toast if they want to afford a house.
But mostly the expansion of ASAP in Ohio illustrates that the ASAP program is not easily scalable. This is especially problematic given that scalability and replicability have been a central thrust of the program from the beginning and continue to be core to the way the program is being hyped by think tanks, the media, and now Congress.
While the program has been replicated in several places beyond CUNY, the number of students involved has been small. The Ohio study involved just 806 program students (1,501 students total, of whom 695 were in the control group). The original CUNY study had 896 students. As mentioned above, one of the expansions in California involved just 300 students. So while the experiment has expanded geographically, reports like those on Ohio give us no insight into the critical question of how the program scales in size.
Lessons learned from actually implementing ASAP at scale
Fortunately, the Community College Research Center (CCRC) at Columbia University conducted a study on the scaling of ASAP at Bronx Community College (BCC). The CCRC's research provides valuable insights into the challenges, beyond the financial, of increasing the program's enrollment.
BCC successfully scaled ASAP from an initial cohort of 750 students to 5,000, roughly 50% of its student population, and achieved completion-rate gains in line with other ASAP projects.
But we learned that, to scale ASAP, BCC had to:
work to establish stronger cross-college communication mechanisms;
integrate ASAP into normal college admissions and enrollment;
make significant changes to remedial education;
make significant changes to how they scheduled classes (anyone who has ever worked in a college or university will understand how big and tough of a job that is);
change how they did advising and implement a student success CRM and decrease advisor caseloads across the board - again, a series of giant projects; and
increase support for part-time students.
In short, they had to change a whole lot of things. The changes boil down to the kinds of shifts you need to make when you go from offering something to a very small group to offering the same services and support at an enterprise level. Scaling these programs is not easy and has implications for many tough-to-change systems in higher education.
This post could be seen as another On EdTech rant, parsing and debunking sloppy research and reporting. But the fact that we keep having to do it points to a larger problem. Why can't places like MDRC and the Gates Foundation, as well as education think tanks and the media, do a better job of reporting on experiments, digging into what happened, why, and what could be done better next time?* Instead, they mostly seem to operate in a world in which the only possible outcome of a funded proposal is unmitigated success, described in the most glowing way possible.
I believe that folks at these places:
are deeply and sincerely invested in student success; and
are smart and capable researchers and analysts.
To the people who produce this kind of reporting, I would argue that it is not good for student success, and student success is a hair-on-fire problem we need to address. Enough with the rose-colored reporting lenses. Tough questions and good research designs will help us find the way.
* I think universities, too, are guilty of some of this behavior, and I plan a whole post about that soon.
The main On EdTech newsletter is free to share in part or in whole. All we ask is attribution.
Thanks for being a subscriber.