AI in Education - Universities Adapting to Change

Are British Universities Prepared for the AI Revolution in Education?

A recent survey of 1,000 undergraduate students has revealed a significant increase in the use of generative artificial intelligence (genAI) in academic work. The findings show that 88% of students now use tools such as ChatGPT for their assessments, a steep rise from 53% the previous year. This shift highlights the growing impact of AI in education and its role in transforming how students approach their studies.

The proportion of students using any form of AI has surged from 66% in 2024 to 92% in 2025, leaving only 8% who do not engage with these tools. This rapid change in behavior underscores the growing reliance on AI technologies in higher education.

How Should Universities Respond to the AI Boom?

Experts warn that universities must take urgent action to address this transformation in student learning. One education researcher emphasized the necessity for institutions to “stress-test” all assessments to determine whether they can be easily completed using AI.

“There are urgent lessons here for institutions,” he stated. “Every assessment must be reviewed. That will require bold retraining initiatives for staff in the power and potential of generative AI.”

He further urged universities to collaborate and share best practices, stating that AI in education should be harnessed to enhance learning rather than inhibit it.

Why Are Students Using AI Tools?

Students reported using genAI for various purposes, including explaining concepts, summarizing articles, and generating research ideas. However, nearly one in five students (18%) admitted to including AI-generated text directly in their work.

When asked about their reasons for using AI, 51% cited time-saving benefits, while 50% said it improved the quality of their work. Despite these advantages, concerns about academic misconduct and the potential for false or biased results discouraged some from fully embracing AI.

One student shared their mixed feelings: “I enjoy working with AI as it makes life easier when doing assignments; however, I do get scared I’ll get caught.”

Who Is More Likely to Use AI?

The report found demographic differences in AI usage. Men, wealthier students, and those studying science, technology, engineering, and maths (STEM) subjects were more likely to use AI tools. Half of students from privileged backgrounds used genAI for summarizing articles, compared to 44% from less privileged backgrounds.

“The digital divide we identified in 2024 appears to have widened,” the report concluded, raising concerns about unequal access to technology.

How Are Universities Addressing AI Use?

Most students believe their universities have responded effectively to concerns over academic integrity, with 80% stating that their institution’s policy on AI use is clear. Additionally, 76% believe their university can detect AI-generated work.

Despite this, only 36% of students have received training in AI skills from their university. Some students expressed frustration with mixed messages from lecturers. “They dance around the subject,” one student noted. “It’s not banned but not advised, it’s academic misconduct if you use it, but lecturers tell us they use it. Very mixed messages.”

Is Avoiding AI a Disadvantage?

A computer science expert highlighted that students who resist using AI may be placing themselves at a competitive disadvantage. “Students who aren’t using generative AI tools are now a tiny minority,” he observed. “I know some students are resistant to AI, and I can understand the ethical concerns, but they’re really putting themselves at quite a competitive disadvantage, both in education and in preparing for future careers.”

How Can Universities Strike a Balance?

A spokesperson for a higher education organization emphasized the need for universities to equip students for an AI-driven world while maintaining academic integrity. “To effectively educate the workforce of tomorrow, universities must increasingly equip students to work in a world that will be shaped by AI, and it’s clear progress is being made.”

The spokesperson added that universities must also address the risks posed by AI in assessments. “All have codes of conduct that include severe penalties for students found to be submitting work that is not their own, and they engage students from day one on the implications of cheating.”

As AI in education becomes a permanent fixture in learning, universities face the challenge of ensuring that technology serves as a tool for academic growth rather than an enabler of dishonesty.
