After the Breakup: Comparing 10 years of the Online College Students survey
One of the more useful series of research on online education over the years has come from Wiley Education Services (essentially Wiley’s OPM division) via its acquisition of Learning House (a smaller OPM). The survey series “Online College Students” has been very useful in capturing the perceptions of students – both undergraduate and graduate – who are enrolled in, considering, or recently graduated from fully-online programs in the US. What are their perceptions of online education, what are their motivations to get an online degree, why did they choose their specific program, etc.? From 2012 – 2018 the survey report was created by Aslanian Research along with Learning House. After the corporate acquisition in Summer 2019, the 2019 – 2020 reports were created by Aslanian Market Research in partnership with Wiley. This year things changed.
It turns out that Aslanian Research was acquired by Education Dynamics in 2009. Education Dynamics “has focused on combining innovative marketing solutions and powerful data analytics to help colleges and universities find and attract career-focused adult students”, which sounds a lot like the core of the OPM market. In 2018 the company started expanding into bundled services and contact centers to fully embrace its OPM status (at least the front end, as they partner with other firms for the instructional design and academic support elements). I have seen Education Dynamics and Wiley Education Services directly compete for OPM contracts, which is likely why there was a breakup.
This summer Education Dynamics put out the latest Online College Students report, and last week Wiley put out their own “new iteration” report, the Voice of the Online Learner 2021. With the divorce and second marriages in place, we have an interesting opportunity to compare survey results, both from a time perspective (2012 to 2021) and from a research method perspective (Aslanian-led vs. Wiley-led).
Report Demographics and Differences
The first question I typically get when referring to these surveys is whether they represent true research or just marketing-based looks at students from schools that partner with the sponsoring companies (Education Dynamics, Learning House, Wiley). The answer is that the survey reports are based on a representative group of students, with no bias toward or over-representation of the companies’ own clients. These companies put out the report, but it is not about the students they directly serve. Furthermore, I have found that they describe the demographics quite accurately, and you can cross-check the results. For example, with the earliest report I have (2012), Aslanian reported that 44% of respondents came from public nonprofit institutions, 35% from for-profit institutions, and 21% from private nonprofit institutions. Fall 2012 was the first data collection where IPEDS reported distance education data, and in 2014 we learned that the 2012 actual data for fully-online students (combining undergrad and grad) had 47% public nonprofit, 35% for-profit, and 18% private nonprofit. Not bad.
In short, these reports have real research based on meaningful student demographics.
I find both of the 2021 reports valuable, and I recommend that people read them in full. The Education Dynamics / Aslanian report has a more straightforward and consistent set of questions, enabling cross-referencing to earlier reports, and it is better organized. The Wiley report’s sub-head captures their approach: “Amplifying Student Voices in Extraordinary Times”. That is more of an activist, storytelling mode that deviates from straight market research. If you want a report laced with recommendations, this is the report for you. I prefer straight reporting of the survey, letting the reader draw conclusions, but there is still a lot of useful information contained in the report.
Wiley has new questions that give insight into student perceptions of online course experiences and program preferences. Education Dynamics included students who chose online programs and were forced into online programs due to the pandemic.
In 2012 the Learning House / Aslanian report had roughly 1,500 responses. In 2021, Wiley had nearly 3,000 while Education Dynamics / Aslanian had only 675. Wiley clearly wins in this regard.
With all of that setup, let’s look at a few key findings, remembering that these come from students who are in, considering, or recently graduated from fully-online programs.
Plus ça Change, Plus C’est La Même Chose
The Achilles’ heel of online education in aggregate has been human engagement – meaning student-faculty and student-student interactions. While many individual courses and a number of programs have solved this problem – even creating more engagement online than would be possible in person – the majority of online programs have not. From the 2012 Learning House / Aslanian report on the less positive [come on, just say negative or worst] features of online learning, we see that interaction and communication with instructors and students represent the biggest problem, followed by the need for self-motivation and discipline to deal with the heavy workload.
With the Wiley 2021 report we see the exact same issues – engagement and the need for self-motivation and discipline to deal with the workload.
Strong Demand for Multiple Starts per Year and for Asynchronous Delivery
The Wiley report added some valuable questions asking about preferences on program design. Fully 87% of students prefer multiple start dates per year over fewer starts with a larger cohort. In a related preference, 68% of students prefer asynchronous delivery of courses with no live sessions over options with at least some synchronous sessions.
Multiple starts per year is a known preference for online programs, and most for-profits and primarily online nonprofits include this feature as a core operational design requirement. It is not rocket science, but it is useful to put a number on it – nearly 9 in 10 want multiple starts. However, this is one of the biggest weaknesses of many online programs started in schools without centralized program design (i.e., traditional schools) – they typically cannot handle more than one start per quarter, or they take too long to offer these options. Online students do not want to search for a program in January, make a decision, then start in June or September. But that is what traditional schools offer far too often.
I was somewhat surprised by the high number, 68%, of students that prefer fully asynchronous courses. Clearly the anywhere / anytime aspects of fully-online asynchronous courses trump the problem of poor faculty and peer interaction mentioned above. But another way to look at this finding is that nearly one out of three fully-online students would like to have some synchronous components in their courses. I described this dynamic in a recent webinar with Contact North that argued that video platforms are likely to share top billing with the LMS due to changes in pandemic experiences.
I think that the challenge, or opportunity, over the next few years is for schools to figure out how to combine asynchronous methods that preserve anywhere / anytime access with synchronous methods, increasingly with video, that meaningfully increase student engagement. That’s not a new concept, but as William Gibson noted, “The future is already here – it’s just not very evenly distributed.” How do we increase the large-scale adoption of the methods that work? That is the key opportunity.
The post After the Breakup: Comparing 10 years of the Online College Students survey appeared first on Phil Hill & Associates.