What Students Want When It Comes To AI
The Digital Education Council Global AI Student Survey 2024
Was this forwarded to you by a friend? Sign up for the On EdTech newsletter and get your own copy of the news that matters sent to your inbox every week. Interested in additional analysis? Try a 30-day free trial and upgrade to the On EdTech+ newsletter.
The Digital Education Council (DEC) this week released the results of a global survey of student opinions on AI. It's a large survey with nearly 4,000 respondents conducted across 16 countries, but more importantly, it asks some interesting questions. There are many surveys about AI out there right now, but this one stands out. I’m going to go into some depth here, as the entire survey report is worth reading.
Levels and types of AI usage
Students report making heavy use of AI, although the numbers drop considerably when frequency of use is taken into account.
For me this raises a question about the extent to which students recognize AI tools and usage. Are they counting all uses of AI or simply the use of specific tools associated with AI such as ChatGPT, Grammarly or Microsoft Copilot (the most used tools according to the survey)? As AI becomes increasingly ubiquitous and integrated into textbooks, learning management systems (LMSs), and even basic technology, are we accurately measuring overall AI usage or simply a specific type? It seems likely that many AI applications now operate seamlessly in the background, perceived as standard features rather than distinct AI tools.
I was also struck by how students were using AI. Using AI for writing (as opposed to checking grammar and helping with summaries) is fairly far down the list. Perhaps we could tone down some of the moral panic about cheating. The use of AI in search is surprising, though perhaps it shouldn't be, given the rather rocky state of search right now.
But students anticipate and see far more potential uses of AI, mostly around career exploration and learning support. This is not surprising, given the strong career focus of many higher education students, and the fact that many parts of higher education institutions, outside of business and engineering schools, offer lower levels of career support than many students might want.
AI training and literacy
Related to this, the survey makes clear the extent to which students feel that colleges and universities are dropping the ball when it comes to preparing them for a world where AI plays a much bigger role.
Students expect far more from their institutions, including greater use of AI in teaching and learning, training on AI tools, and increased student involvement in AI decision-making and policy development. The students' frustration evident in this survey is unsurprising given the results of another recent survey by Wiley, which found that most instructors are not using AI.
And while a majority of instructors felt that the tools could be useful if used well, their concerns about AI focused on the potential for poor use of those tools.
Back to the DEC survey: students, for their part, would like to see not only more training on the use of AI to address the gaps in AI knowledge and skills described above, but also training for instructors and faculty in the use of the technology.
I think that the best path toward AI literacy is more likely to come from greater use of AI by faculty themselves, in teaching, research, and administration, than from courses and training. There seems to be far more talk about AI literacy than actual practice of it.
Given students' interest in being trained in the use of AI, there appear to be many lost opportunities for instructors to teach students how to think critically about AI by using the tools (the whole "glue on pizza" issue seems like a giant teaching moment to me) and to model good use of the technology.
The importance of trust
Another issue that jumped out at me from the DEC survey was that of trust. Students are worried about the use of AI in a range of settings, including the use of AI to generate content.
Students are especially worried about AI in assessment.
This is an issue to which EdTech developers, instructors, and higher education administrators need to pay close attention. AI holds great promise for providing high-quality content and feedback in a scalable and timely way. But it all hinges on students' trust in the system: they need to believe that the course they are taking, or the feedback or grade they receive from an AI-assisted or AI-enabled system, is fair and accurate; otherwise, it simply isn't going to work.
Guidelines, what guidelines?
An issue that perplexed me in the DEC report was the analysis of student awareness of and perceptions about university guidelines on AI use.
I would not classify students who report low awareness of AI guidelines as being lost. According to an Inside Higher Ed survey of provosts from earlier this year, only a minority of institutions have policies and guidelines in place (or alternatively, many provosts would fall into the lost category along with students). Having read many of the policies and guidelines that do exist, I agree with the 93% of students who say that there is room for improvement.
Parting thoughts
Back when I was working on campus, I loved working with students more than almost any other task. They brought a refreshing optimism coupled with realism that I found invigorating. The AI survey from DEC reflects both qualities and is a valuable contribution to our understanding of the current and future role of AI on campus.
The main On EdTech newsletter is free to share in part or in whole. All we ask is attribution.
Thanks for being a subscriber.