
Looking for Empathy in Wiley’s COVID-19 Remote Learning Survey Results (Spring 2020)

Recently Phil Hill and I had an opportunity to give a talk together. One of the core topics we raised was empathy, which Oxford Languages defines as “the ability to understand and share the feelings of another.” We discussed demonstrating empathy for a wide range of higher education stakeholders, from students to campus leaders.

Soon after this talk, Wiley released a useful and informative infographic, “Emergency Remote Learning Satisfaction for Students and Instructors.” The infographic used a compelling visualization strategy that compared student and faculty responses to questions about their experience shifting to remote learning after the COVID-19 outbreak in spring 2020. Perhaps because the talk was so fresh in my mind, the side-by-side comparison technique also highlighted for me the challenge that we have in demonstrating empathy for our students throughout the survey process. On their own, the results showed that students rated remote learning less favorably than faculty did, revealing an initial disconnect. Moreover, the infographic as a whole does not fully convey how students viewed the spring term. Let’s look at the data first, and then I’ll come back to the empathy issues.

Side-by-side results highlight different views on remote learning experiences

To gather the data for their infographic, “Wiley Education Services surveyed 4,280 students and 499 faculty from seven universities.” We don’t know what types of universities, but we can infer that no community colleges were included. It was not clear from the infographic or accompanying webpage when the data was collected. All that said, Wiley deserves kudos for collecting and presenting both sets of data together, which makes it easier to see how differently faculty and students viewed their spring 2020 experiences.

Satisfaction with remote teaching and learning: Over three-quarters (77%) of instructors surveyed were satisfied with the remote teaching experience in spring 2020, while just over half (55%) of students were satisfied. On the flip side, over a third (35%) of students were dissatisfied with remote learning, compared to a fifth (21%) of teachers.

Perception of the learning experience: Echoing many other COVID-19 student surveys, very few teachers or students felt that students learned better in emergency remote learning environments than they had learned in previous terms. However, there was a dramatic difference in perspective about just how poorly students felt they learned in comparison. Almost twice as many students – 45% compared to 24% of teachers – felt they “definitely did not learn as well” as they had in classes before the pandemic.

Rating effectiveness of remote course elements: On average, students rated the effectiveness of every remote course element lower than faculty did, ranging from just under half a point lower to well over a full point lower on a five-point scale. In the graphic below, I have converted Wiley’s percentages of people who selected each rating level to a rough average rating score for each course element. Pre-recorded lectures offered the closest perception of effectiveness (3.55 average for faculty, 3.18 average for students). Whatever “Other” course elements may be, they offered the most disparate ratings (3.56 average for faculty, 2.17 average for students). Both faculty and students rated discussions fairly low in comparison to the other course elements, which confirms what other surveys have told us about poor engagement and interaction last spring. [NOTE: I used the percentages that Wiley rounded to the nearest whole number, so the total numbers of students and faculty were close but not exact for every course element in my average rating scores.]
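The percentage-to-average conversion described above can be sketched in a few lines of Python. The rating distribution below is purely illustrative (not Wiley’s actual data, which the infographic reports per course element); dividing by the summed percentages rather than by 100 accounts for the rounding issue noted above.

```python
def average_rating(pct_by_level):
    """Convert a distribution of rating-level percentages (1-5 scale)
    into a rough average rating score.

    pct_by_level maps each rating level (1..5) to the percentage of
    respondents who selected it. Because published percentages are
    rounded to whole numbers, they may not sum to exactly 100, so we
    normalize by the actual total rather than assuming 100.
    """
    total = sum(pct_by_level.values())
    return sum(level * pct for level, pct in pct_by_level.items()) / total

# Hypothetical distribution for one course element (illustrative only):
example = {1: 5, 2: 10, 3: 25, 4: 40, 5: 20}
print(round(average_rating(example), 2))  # 3.6
```

Applying this to each course element’s faculty and student distributions yields the side-by-side average scores referenced above.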

Timely feedback: I’m glad Wiley asked about timely feedback, which online learning research shows is a big factor in online student persistence and success. The chart showed that faculty and students have dramatically different perceptions of what is considered timely. 86% of faculty felt they provided timely feedback, while only 60% of students felt they received it. It would be useful to know what the feedback time frames actually were, and what students expect on average. Regardless, the Wiley team offers good advice that instructors should set deadlines both for students and for themselves. Faculty also should include a syllabus statement that tells students how quickly they intend to return graded work with feedback. Anything over a week after the students submit their work may be too long.

Instructor involvement: This question provided the closest side-by-side ratings of the survey, showing only an 8% difference in perspective. 70% of faculty and 62% of students felt the teachers were actively involved during the spring 2020 course experiences (also see chart below). The Wiley team provides relevant guidance for faculty to communicate to the whole class, as well as to individual students as necessary.

chart - 62% of students and 70% of teachers felt teachers were actively involved in spring 2020 remote courses

Clarity of expectations: Ironically, Wiley followed the closest ratings with the largest gap – a 34% difference in perception. 91% of faculty felt they set clear expectations, while only 57% of students did (also see chart below). In a brief statement following the results, the Wiley team refers to the Community of Inquiry framework. It’s a powerful framework, but it might be better suited as a suggestion for instructor involvement or instructor presence than for setting clear expectations.

chart - 57% of students and 91% of teachers felt teachers set clear expectations in spring 2020 remote courses

Ways to increase empathy throughout the survey process

So far, so good. We have a straightforward, side-by-side display of the results with a few brief statements that offer reasonable suggestions to faculty. So how does this tie back to empathy?

Faculty perceptions of their students’ experiences may require calibration

As I mentioned earlier, this is one of the few COVID-19 survey efforts to collect faculty and student data at the same time. We’re not looking at faculty survey data from one group and comparing it to student survey data from another. So one might expect to see results that are not just presented side-by-side, but results that reflect a common understanding of a common experience. That’s not the case here. At these seven universities, at least, faculty felt the remote teaching and learning experience was much more positive than the students did.

Why is this important? Higher education institutions around the country moved mountains to train and prepare thousands of faculty for the fall 2020 term. Odds are good that most of that training focused primarily on designing online or hybrid courses and using online technologies. All subsequent rounds of training should include sessions on empathy. In blog posts and during online events over the spring and summer, I’ve said that we faculty need to put ourselves in our students’ shoes. Try to submit one of your own assignments using only a smartphone. Park your car in a Starbucks parking lot and work on your laptop for a couple of hours to see what it’s like for some students to get a decent Internet connection. And these are just a couple of the technology challenges.

In episode 2 of an EdSurge podcast series called Pandemic Campus Diaries, an SF State student named Marjorie described the empathy challenges she faced as a first-generation student who is also a single mother of two children:

I don’t know why my teachers act like it’s a problem for me to ask questions…I’ve never been to a four-year school. I don’t know how this works and it’s even harder during COVID-19 and fires happening. Please be patient with me. My kids were not letting me, like, focus. …My teacher seems very annoyed and I have to apologize ‘cause I don’t want to get a bad grade starting the semester in, like, a bad moment. 

That quote echoes what student panelists reported during a Student Senate of the California Community Colleges webinar last spring, with statements like “…not all professors have been flexible” and “Faculty say ‘Why isn’t this done? You have all day.’ “

With all of the preparation that so many faculty completed over the summer, I expect faculty ratings of the online teaching and learning experience will be even higher this fall. As we did not do as much collectively to prepare students for online learning, I don’t expect student ratings of the same fall 2020 experience to increase as much, but I hope I’m wrong.

How we interpret and present survey results matters, too

The primacy-recency principle states that we best remember what we see first and last. That means that how organizations sum up survey results – in executive summary sections at the front of a report or a summary paragraph at the bottom of an infographic – carries increased significance. I raise this point because the Wiley infographic closes with a somewhat misleading summary statement: “Both students and faculty agreed they were satisfied with the remote teaching implemented, and the majority felt they were actively involved and accessible while remotely learning and teaching.”

In and of themselves, these two statements are true (to a point). However, even though 55% of students is a majority, I would not characterize that percentage as ‘students being satisfied with their remote learning experience’ last spring – especially when over a third of students were dissatisfied. Further, the student scores were lower – often significantly lower – than the faculty scores for every indicator in this survey. The closest scores pertained to instructor participation, and that’s the only other item from the survey included in the closing sentence. By omitting the large perception gaps, the summary leaves a door open to maintain a status quo that doesn’t work for students.

This is not just a one-time, one-survey issue. For example, Instructure recently shared the results of months of work in its September 2020 report, “The State of Student Success and Engagement in Higher Education.” Although the report overall offers very few new insights (coming after over two dozen other COVID-19 student surveys), it makes salient points about socioeconomic factors that affect students’ access to technology and, ultimately, student success. It goes so far as to recommend providing options for students to “engage on their terms.” Just a couple of pages later, though, the report recommends using immersive technologies and virtual reality to increase student engagement. When almost 60% of higher ed students faced basic needs insecurity in the spring, that type of recommendation may need to sit on the back burner for a while.

Next steps

We all measure success differently. Using hyperbole to make my point, baseball statisticians feel batters are successful if they get a hit 30% of the time. For online or remote learning, though, that’s not good enough. If we believe the research about successful online learning, then the student survey results that Wiley reported are not good enough either. As faculty participate in surveys and as organizations interpret and report the results, it’s imperative that we approach the process with some empathy. In that talk Phil and I gave, I told participants that we need to move from the golden rule to the platinum rule. The golden rule says “treat others as you would like to be treated.” The platinum rule says “treat others as they would like to be treated.” Get to know your students.