Students fear over-reliance on AI ‘devalues’ higher education

Report stresses AI is ‘new standard’ and universities need to better communicate policies

August 7, 2024

Rising use of AI in higher education could cause students to question the quality and value of the education they receive, a report warns. 

The survey of more than 3,800 students from 16 countries found that more than half (55 per cent) believed overuse of AI within teaching devalued education, and 52 per cent said it negatively impacted their academic performance, the Digital Education Council Global AI Student Survey 2024 said. 

Courses primarily created and delivered by AI were perceived less favourably by students, with only 18 per cent saying such courses were more valuable than traditional ones. 

“Students do not want to become over-reliant on AI, and they do not want their professors to do so either. Most students want to incorporate AI into their education, yet also perceive the dangers of becoming over-reliant on AI,” the report says.

Despite this, significant numbers of students admitted to using such technology. Some 86 per cent said they “regularly” used programmes including ChatGPT in their studies, 54 per cent said they used it on a weekly basis, and 24 per cent said they used it to write a first draft of a submission.

However, Danny Bielik, president of the Digital Education Council, said he expected this figure to be higher, and said many students are too nervous to admit using AI in their essays because of stigmatisation and uncertainty over whether AI is permitted at their university. 

The survey found 86 per cent of students said they were not fully aware of the AI guidelines at their university, and Mr Bielik said students often received “conflicting information”, highlighting the need for greater communication between students, staff and wider university ecosystems.

“Often communication happens at the faculty level. So, students might be in one class being told one thing about what they are and aren’t allowed to use generative AI for, and then they go into another class and they’re told something completely different,” Mr Bielik said. 

The report comes as universities step up their efforts to detect AI in student essays as AI drafting becomes increasingly prevalent. Turnitin claims that 11 per cent of the 200 million submissions it has received featured "at least" 20 per cent AI-drafted content, and a recent study found that undetected ChatGPT-generated exam answers scored more highly than those of real students.

Some 58 per cent of respondents said they felt they lacked sufficient AI knowledge and skills, and the report adds that students increasingly expect AI training to be incorporated into their studies. Consequently, training staff members on AI is a "baseline expectation" before integrating it into courses. 

“The application and use of AI in education is becoming the new standard, and universities will need to take steps to ensure that they have appropriate measures and preparations in place to guide AI integration in their organisations,” the report says.

“Students are as conflicted as the rest of us,” Mr Bielik said. “Communication is more than just writing policies and publishing them on your website.”

juliette.rowsell@timeshighereducation.com



Reader's comments (2)

"55 per cent of students believed overuse of AI within teaching devalued education, and 52 per cent said it negatively impacted their academic performance. 86 per cent said they 'regularly' used programmes including ChatGPT in their studies, 54 per cent said they used it on a weekly basis, and 24 per cent said they used it to write a first draft of a submission."

So what students are saying is that it's OK for them to overuse LLMs to produce their coursework, but not OK for teachers to overuse LLMs to create the content. Hmm, seems a bit hypocritical to me.

Here is the honest situation: at least 67 HE institutions are facing financial crisis resulting in redundancy and restructuring. What does that mean in real terms? Fewer teaching staff to do the work. But the institutions need more students to get money to overcome insolvency. So more work and fewer staff. Without some sort of aid, how are the dwindling staff supposed to meet the increasing workload? https://www.theguardian.com/education/article/2024/aug/09/english-universities-face-autumn-tipping-point-as-financial-crisis-looms
I know someone who designed a coding course for biologists that encouraged students to write code using GPT, and that focused on the skills still required: working out what to ask it for (you still need to know how to think about what a program will do in order to ask) and how to tell whether what it wrote was correct. However, they found that students were very reluctant to use it. Those of us who know code well feel (rightly or wrongly) that we can tell whether the output is correct; students did not have that confidence.

The research that showed AI essays scored better actually showed that first- and second-year essays scored better, but that third-year essays scored worse. So here is the problem: we use essays in the first and second year to teach students the art of essay writing, so that when the questions requiring genuinely critical and creative thought come in the third year, the writing part comes as second nature. If students come to believe that AI can write their essays in first/second year, where the questions are less challenging, they won't have the benefit of that practice by third year.
