Results from Top Hat’s COVID-19 Student Survey about Online Learning
In my most recent blog post, I stated that too few institutions are talking to students as they make plans for the fall and beyond. Little did I know that Top Hat was conducting a nationwide survey that got over 3,000 students to respond. Here’s a quote from Nick Stein, Top Hat’s Chief Marketing Officer, describing why they engaged students with a survey:
We have been closely following the impact of the abrupt and unplanned shift to remote learning this Winter, both in MindWires and directly through conversations with our customers. We were inspired to do the survey to hear directly from students about how they’ve been affected and how they’re feeling about remote learning — and whether the technology available to them today was providing an engaging and effective learning experience. We hope that by sharing our findings, we can all better understand how COVID-19 is impacting technology adoption, usage, and effectiveness in higher ed.
For those who don’t know, Top Hat is an ed tech company that both a) fosters active learning through software that goes beyond typical Learning Management System (LMS) functionality, and b) provides services to help faculty move their courses online. Kudos to Top Hat for launching this effort! For our part, MindWires has full access to the anonymized results, is conducting an independent analysis of the survey, and will be providing feedback to Top Hat about their data collection efforts.
We know how students are feeling, but we’re not sure why
After one demographic question – over two-thirds of the survey respondents are in their first or second year in college – Top Hat’s survey jumped right in by asking students “how you feel right now.” The results match the Yale Center for Emotional Intelligence study that I mentioned on April 9 – namely, like the educators surveyed in the Yale study, these students are feeling anxious (52%), worried (38%), and nervous (37%).
Although students could pick more than one answer option, it’s unlikely that students would select both a positive feeling (happy, calm) and a negative feeling (anxious, worried, nervous). If this is true, then we have what looks like a “bi-modal distribution” – over a third are calm, over half are anxious. However, there are no neutral response options, so students may not have had enough choices to provide a real picture of their collective emotional state.
Analyzing students’ emotional state another way, many of the response options (or their synonyms) appear on Robert Plutchik’s wheel of emotions. In the image below, I have plotted the results related to students’ emotions on Plutchik’s wheel. As you move from the outer ring toward the center of the wheel, the emotions go from weaker to stronger – for example, “happy” is a stronger emotion than “calm.” It’s telling, then, that of Top Hat’s four “stronger emotion” options, over half of the students chose “anxious.”
Top Hat survey results (Q2) plotted on Plutchik’s Wheel of Emotions
[Image credit: Machine Elf 1735]
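For readers who want to reproduce a rough version of this chart, here is a minimal sketch in Python using matplotlib. It is not Top Hat’s methodology: the anxious/worried/nervous percentages come from the survey results quoted above, the “calm” value is my approximation of “over a third,” and the simple wedge layout is a simplification of Plutchik’s wheel.

```python
# Minimal sketch: plot emotion-response shares on a polar chart, in the spirit
# of Plutchik's wheel. The anxious/worried/nervous figures come from the survey
# write-up above; the "calm" value approximates "over a third" and is illustrative.
import numpy as np
import matplotlib.pyplot as plt

emotions = ["anxious", "worried", "nervous", "calm"]
shares = [52, 38, 37, 35]  # percent of respondents selecting each option

# Spread the emotions evenly around the circle
angles = np.linspace(0, 2 * np.pi, len(emotions), endpoint=False)

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.bar(angles, shares, width=2 * np.pi / len(emotions) * 0.8, alpha=0.6)
ax.set_xticks(angles)
ax.set_xticklabels(emotions)
ax.set_title("Share of students selecting each emotion (percent)")
plt.show()
```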
To obtain more useful results when asking higher ed students how they feel, survey designers could include a) more opposites from the wheel to get a better sense of what students are feeling, b) neutral options (and/or “I don’t know”), and c) response options like “sad” (e.g., about not getting a real graduation), “acceptance” (of the situation), or “trust” (in their institution). Further, they could add a question that asks, “What is most responsible for making you feel this way?” We want to know what’s driving these emotions so we can offer better support or make existing support more visible during a crisis. If students are anxious about their health or their family, the follow-up response would be different than if they’re anxious about their education.
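To make that suggestion concrete, here is one hypothetical way the revised item and its follow-up could be structured. The wording, options, and groupings below are illustrative assumptions on my part, not Top Hat’s.

```python
# A hypothetical structure for a revised emotion item and its follow-up question.
# All wording and options are illustrative, not taken from Top Hat's survey.
revised_items = [
    {
        "question": "How are you feeling right now? (Select all that apply.)",
        "options": [
            "happy", "calm", "accepting", "trusting",   # positive / coping
            "anxious", "worried", "nervous", "sad",     # negative
            "neutral", "I don't know",                  # neutral / opt-out
        ],
    },
    {
        "question": "What is most responsible for making you feel this way?",
        "options": [
            "my health or my family's health",
            "my education and coursework",
            "my finances",
            "something else (please describe)",
        ],
    },
]
```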
How students are feeling about “online learning”
Some of Top Hat’s other questions may also help us determine why most students are anything but “happy,” but more realistically they offer insights into what’s working and what’s not working in “Covid-converted” courses.
Students miss the social aspects of their learning experience
We are social creatures – even the introverts among us – so it stands to reason that not having that same sense of connectedness to others contributes to things like anxiety, worry, nervousness, and annoyance. The survey results show students’ strong desire to return to in-person connections:
Not seeing other students in person:
Over 85% said that they “miss the social experience with other students.”
Over two-thirds of students selected lack of “regular access to classmates” as having an impact on them.
On a scale of one (low) to four (high), students rated their “ability to stay connected with classmates” as pretty low.
Not seeing faculty in person:
84% said that they “miss face-to-face interaction with faculty.”
Over half of the students selected lack of “regular access to faculty” as having an impact as well.
Educational researchers and theorists like Vincent Tinto study and write about the need to balance the academic and social aspects of the learning experience – regardless of course format (online, hybrid, classroom) – to support student persistence. In Tinto’s model of institutional departure, students should have four types of institutional experiences: formal and informal activities in both social and academic contexts.
With the recent, rapid shift to remote learning, there are few to no formal extracurricular social activities (e.g., sports, concerts, plays), limited informal social peer group interactions, and fewer informal academic interactions with faculty and staff. That leaves only formal academic activities to keep students from thinking about leaving school. At the moment, most of those formal academic activities are made up of Covid-converted courses. If campuses are really worried about reduced enrollment in the fall, they need to address these gaps.
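As a quick reference, here is Tinto’s two-by-two of institutional experiences expressed as a small data structure; the example activities are drawn from the paragraphs above plus my own assumptions, not from Tinto’s text.

```python
# Tinto's four types of institutional experiences (academic/social x formal/informal).
# The example activities are illustrative, based on the discussion above.
institutional_experiences = {
    ("academic", "formal"): ["courses (currently mostly Covid-converted)"],
    ("academic", "informal"): ["office-hours chats", "drop-in questions for faculty and staff"],
    ("social", "formal"): ["sports", "concerts", "plays"],
    ("social", "informal"): ["peer study groups", "casual interactions with classmates"],
}

# Under emergency remote learning, every cell other than ("academic", "formal")
# is thin or empty -- the gap the paragraph above argues campuses need to address.
```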
Students are not satisfied with some academic aspects of their learning experience
On the academic side, Top Hat devoted either answer options or full questions to understanding the remote learning experience better.
Missing face-to-face interaction (again): Over 75% chose “Lack of face-to-face interaction with faculty and students during class” as a difficulty in adjusting to online learning. It would be interesting to see how many of those students have experienced a collaborative course activity that was well-conceived and well-executed.
More difficult to complete coursework: Over 75% said that they “miss access to study spaces” and 62% lack “regular and reliable access to a quiet study space.” This is not surprising given that most students are living with their families or roommates (their “quarantine cohort”) and cannot go anywhere else to get away from noise and distractions. What’s more troubling is that almost a quarter (24%) cited a lack of “regular and reliable access to the Internet” as a difficulty when learning online. That’s an equity issue that must be addressed. I’d be worried and anxious, too, if I couldn’t get online to do my online coursework.
Online course experience not engaging: Over 77% chose “Lack of an engaging in-class experience” as a difficulty in adjusting to online learning. I really would like to know how many of those students felt their previous in-person classes were “engaging.” It may be that they don’t know how to engage effectively, either synchronously in Zoom or asynchronously using discussions or other tools. Or it may be that faculty need to adopt effective engagement techniques. Before we begin again in the fall, we need to deal with both cases. Perhaps as a result, over 50% of students are spending less time on coursework. Less time-on-task could also be attributed to a lack of self-directed learning skills.
Throughout the survey, Top Hat correctly used the language students are using – “online learning” – rather than the language institutions are using – “emergency remote teaching and learning.” However, this means it’s harder to tell what students think about online learning in general, versus their experience this spring. We also don’t know how much online learning experience these students have had or how they would rate their online learning readiness. Students need to know that online learning is different and requires additional skills.
Building on this data to address persistence
Two-thirds of students have not changed their plans to attend school – any school – in Fall 2020, which means the remaining one-third is undecided, unsure if they can afford it, or unhappy with an online-only learning experience. Further, around a quarter (24.83%) of respondents did not commit to return to their current school in the fall, answering “too soon to say,” “unlikely,” or “very unlikely.” Persistence is an issue, both globally and locally.
The survey then asked what students’ schools can do to encourage them to return in the fall. Although an open-ended question leaves the answer up to students’ imaginations, some patterns did emerge. For example, a number of the written responses asked for an extension on paying fall tuition due to tight family budgets. Another set of students wanted easier ways to find and connect with campus support staff. With campuses worried about declines in fall enrollment and tuition revenue, follow-up surveys need to go into more depth with specific examples of support – academic, administrative, financial, health and wellness, social, and more.
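As an illustration of how such patterns might be surfaced from a large batch of open-ended responses, here is a minimal keyword-tagging sketch. The themes and keywords are assumptions based on the two examples above, not Top Hat’s actual coding scheme, and real qualitative coding would go well beyond keyword matching.

```python
# Minimal sketch of keyword-based theme tagging for open-ended survey responses.
# Themes and keywords are assumptions drawn from the examples in the text above.
THEMES = {
    "tuition_flexibility": ["tuition", "payment plan", "afford", "extension"],
    "support_access": ["advisor", "counselor", "support staff", "contact"],
}

def tag_response(response: str) -> list[str]:
    """Return the themes whose keywords appear in a single response."""
    text = response.lower()
    return [theme for theme, keywords in THEMES.items()
            if any(keyword in text for keyword in keywords)]

responses = [
    "An extension on paying fall tuition would help my family a lot.",
    "Make it easier to find and contact support staff and advisors.",
]
for r in responses:
    print(tag_response(r), "-", r)
```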
Carolyn Hart’s review of online learning research showed that students’ satisfaction with online learning, sense of belonging, time management skills, peer and family support, and communication with the instructor all affect their persistence. Looking at one of Vincent Tinto’s more recent articles, “From Retention to Persistence,” he identified three factors that affect students’ decisions about staying in and finishing college: “students’ self-efficacy, sense of belonging and perceived value of the curriculum.” Studies like system-wide student satisfaction surveys (e.g., see the California Community Colleges’ 2017 Distance Education Report) include these factors when they ask students to rate their online learning experiences.
So if you are creating a student survey right now, don’t start from scratch. In my graduate course on needs assessment, we covered analyzing existing data before we started developing instruments to fill in the gaps. If you want to ask questions that have been asked and answered elsewhere, do so intentionally – e.g., to determine if your local students feel the same way as the national survey participants. Most importantly, draw from existing research and add questions that would help your campus or system determine the student perspective on factors related to persistence.
As an important side note, we also need to seek diverse perspectives to provide feedback on our instruments. We all have our own core values and unique ways of looking at the world. So when we list answer options for multiple-choice or multiple-answer questions, we may force students to choose responses that don’t quite match their experience. For example, if I were to base a survey design on my own higher ed experiences as a straight, white male whose parents went to college, I might not think of response options that work for first-generation students, non-white students, or LGBTQ students. Moving courses online in the wake of COVID-19 has exposed several equity issues for different groups of students. Gathering appropriate data helps us be more proactive in removing institutional barriers to persistence and success for each and every student.
Next steps
I said it at the start – Top Hat deserves a good deal of credit for conducting a nationwide survey of students. We need to hear that student voice now more than ever. This survey is a good starting point, especially given that Top Hat is an educational software company, rather than a college, university or educational nonprofit organization (e.g., EDUCAUSE). As a next step, collectively we can build on this work by asking questions that will help us make informed decisions about things like a) training faculty to teach online courses (or online aspects of hybrid or hybrid flexible courses) and b) supporting students as they go through the online learning experience while balancing non-academic lives and life events.
As I’ve written in an earlier COVID-19 blog post (scroll down to the Data Collection section), higher ed institutions should survey their students locally, rather than ask faculty to survey students on a class-by-class basis. The information Top Hat has collected gives institutions more direction about the types of questions to ask – e.g., to confirm national trends locally. Hopefully the article above gives additional ideas about both the product and the process. I invite you all to comment below to share links to your institutions’ survey instruments for students and, if possible, to share the students’ perspective through the results.