Editorial | Academia must ensure AI is used responsibly
Cases of misuse by students and scholars should prompt universities to reconsider how the technology is integrated, to ensure standards are maintained

Artificial intelligence (AI) has been a great boost to productivity in many professional fields, including academia. But it can also be a double-edged sword. Easy access, low cost and convenience mean more students are handing in AI-generated assignments without doing their own reading and writing. Misuse is not confined to students; it has also ensnared some scholars. Such cases must be addressed, and awareness of the responsibilities that come with using AI must be raised, to uphold academic standards and integrity.
AI can be a great research tool, but its accuracy and reliability are not guaranteed. As with any source used in research, great care and repeated checks are needed.
Yip is a leading scholar and government adviser, and hopefully his important work will not be affected. But let this be a cautionary tale for all about the use of AI. Appropriately, the university is now implementing mandatory training and assessments for all researchers to ensure they use AI-related resources properly.
Universities around the world are integrating AI tools into teaching and research. At the same time, they must balance such use with safeguards against misuse. It is now an uphill battle for professors to discourage students from relying on AI to generate essays and assignments, a practice that defeats the very purpose of academic learning, if not worse. Meanwhile, in more scientific fields, AI-assisted research has in some instances raised questions about the validity of its outcomes. Such a powerful and rapidly evolving technology is bound to change how universities teach and conduct research. It has opened up a brave new world of knowledge, and of errors, for students and professors to navigate.
