When ChatGPT was released in November 2022, it sparked many conversations and moral panics, centred on the impact of generative artificial intelligence (AI) on the information environment.
People worry that AI chatbots can undermine the integrity of creative and academic work, especially since they can produce human-like text and images.
ChatGPT is a generative AI model built with machine learning. Trained on vast amounts of data to recognise patterns, it produces human-like responses. While it appears to be engaging in natural conversation, the model is drawing on the features and patterns it extracted from that data to generate coherent replies.
Higher education is one sector in which the rise of AI such as ChatGPT has sparked concerns. Some of these relate to ethics and integrity in teaching, learning and knowledge production.
We’re a group of academics in the field of media and communication, teaching in South African universities. We wanted to understand how university students were using generative AI and AI-powered tools in their academic practices. We administered an online survey to undergraduate students at five universities: the University of Cape Town, Cape Peninsula University of Technology, Stellenbosch University, Rhodes University and the University of the Witwatersrand.
The results suggest that moral panic around the use of generative AI is unwarranted. Students are not hyper-focused on ChatGPT. We found they often use generative AI tools for engaged learning and that they have a critical and nuanced understanding of these tools.
Of greater concern from a teaching and learning perspective is that, second only to clarifying concepts, students most often use AI-powered tools to generate ideas for assignments and essays, or when they feel stuck on a specific topic.
The survey was completed by 1 471 students. Most spoke English as their home language, followed by Xhosa and Zulu. The majority were first-year students. Most respondents were registered in humanities, followed by science, education and commerce.
While the survey is thus skewed towards first-year humanities students, it provides useful indicative findings as educators explore new terrain.
We asked students whether they had used individual AI tools, listing some of the most popular across several categories. Our survey did not explore lecturers’ attitudes or policies towards AI tools.
This will be probed in the next phase of our study, which will comprise focus groups with students and interviews with lecturers.
Our study was not on ChatGPT specifically, although we did ask students about their use of this specific tool. We explored broad uses of AI-powered technologies to get a sense of how students use these tools, which tools they use, and where ChatGPT fits into these practices.
These were the key findings:
Forty-one percent of respondents indicated that they primarily used a laptop for their academic work, followed by a smartphone (29.8%). Only 10.5% used a desktop computer and 6.6% used a tablet.
Students tended to use a range of AI-powered tools other than ChatGPT, including translation and referencing tools.
Asked about online writing assistants such as Quillbot, 46.5% of respondents said they had used these tools to improve their writing style for an assignment. In addition, 80.5% said they had used Grammarly or similar tools to help them write in appropriate English.
Just over a third (37.3%) said they had used ChatGPT to answer an essay question.
Students acknowledged that AI-powered tools could lead to plagiarism and affect their learning. However, they also stated that they did not use these tools in problematic ways.
Respondents were overwhelmingly positive about the potential of digital and AI tools to make it easier for them to progress at university.
They indicated that these tools could help them clarify academic concepts; formulate ideas; structure essays; improve academic writing; save time; check spelling and grammar; clarify assignment instructions; find information or academic sources; summarise academic texts; improve the academic writing of students for whom English is not a native language; study for a test; paraphrase better; avoid plagiarism; and reference better.
Most students who viewed these tools as beneficial to the learning process used tools such as ChatGPT to clarify concepts they could not fully grasp or felt were not properly explained by lecturers.
We were particularly interested to find that students often used generative AI tools for engaged learning. This is an educational approach in which students take responsibility for their own learning: they actively develop thinking and learning skills and strategies, and formulate new ideas and understanding through conversation and collaborative work.
By using AI tools, students can tailor content to their specific strengths and weaknesses, making for a more engaged learning experience. AI tools can also act as a sort of personalised online “tutor” with whom students have “conversations” that help them understand difficult concepts.
Concerns about how AI tools potentially undermine academic assessment and integrity are valid. However, those working in higher education must factor in students’ perspectives as they work towards new pathways of assessment and learning.
This article first appeared in The Conversation.