How the North-West University is embracing ethical use of AI and safeguarding its degrees

The North-West University (NWU) has established a committee to protect the integrity of its degrees from artificial intelligence tools that can hamper students’ critical thinking skills. Picture: Freepik

Published Sep 23, 2024

In an era where artificial intelligence (AI) is transforming the educational landscape, both positively and negatively, the North-West University (NWU) is taking proactive steps to integrate AI into its academic framework and protect the sanctity of its degrees.

Rather than shunning AI, the NWU is embracing its potential, all while ensuring that students use these tools ethically and responsibly.

Recognizing the risks and rewards associated with AI, NWU has established an AI Steering Committee, comprising members from various departments across the university.

Led by Professor Anné Verhoef, the director of the School of Philosophy, the committee’s mission is clear: to harness AI’s capabilities while safeguarding the integrity of academic work and ensuring that students are served, not hindered, by this technology.

Chair of the NWU AI Steering Committee, Professor Anné H. Verhoef. Picture: Supplied

Prof. Verhoef points out that while AI can simplify research and academic tasks, it can also lead students to rely on shortcuts that ultimately undermine their learning and skill development.

When students depend too much on AI tools, they may miss out on developing essential skills like problem-solving, creativity, and time management.

“Students may graduate with degrees, but without the skills needed to thrive in the workforce,” warns Prof. Verhoef.

At the heart of the university’s strategy is a robust framework that includes ethical guidelines for AI usage by lecturers, researchers and, most importantly, students.

In practice, Prof. Verhoef explains, lecturers must indicate with every assignment the level of AI use allowed, which can range from none at all to a specified amount.

There is also an automated online reporting system through which lecturers can quickly and easily report the unethical use of AI.

The system sends email warnings to students and directs them to complete a remedial online course on the ethical use of AI. Students who complete the course can have their mark for that assignment capped at 50%; those who do not receive zero.

Depending on the severity of the transgression, an investigation and penalties ranging from mild to severe can also be applied, but the system is intended to be educative first.

“This system is unique to the NWU and our statistics on the use of the system for the last 18 months show fantastic results for creating a culture of academic integrity at the NWU,” adds Prof. Verhoef.

AI has great potential; however, the associated risks, such as excessive reliance on generative AI, which undermines the quality of assessments (AI’s capabilities are limited and it is often prone to biased information) and eventually the student’s entire degree, as well as privacy, security and safety risks, can far outweigh the good.

To combat these various risks of AI use, NWU has developed AI literacy courses designed to equip students with the knowledge they need to use these tools responsibly.

“If they don’t use AI tools properly, we can teach them how to use them effectively,” Prof. Verhoef explains.

Developments and challenges regarding AI and higher education are communicated to students and lecturers through the AI@NWU website, where various resources, information on events and training opportunities are also made available.

“There is nothing artificial about an NWU degree. It is more than a piece of paper, it is more than a testament to what you have done. It is a promise of what you are capable of doing,” said Prof. Verhoef.

IOL