NEWS

Guide developed to catch students doing homework with ChatGPT: "AI can't think."


Artificial Intelligence tends to avoid questions, limits personal comments, is less persuasive, and lacks a solid perspective on any topic

Students walk through Sproul Plaza at the University of California. AP

It must be acknowledged that it is tempting. ChatGPT has become the favorite shortcut for students worldwide to complete their assignments, but according to a study by the University of East Anglia (United Kingdom), conducted in collaboration with Jilin University in China, the difference shows.

The study compared the work of 145 real students with 145 essays generated by ChatGPT. And although the AI essays were impressively coherent and grammatically correct, they lacked what we call "a personal touch," the study points out.

The findings are expected to help teachers identify students in schools, colleges, and universities worldwide who cheat by presenting machine-generated essays as their own.

And yet the line between human writing and machine writing continues to blur day by day. "Since its public release, ChatGPT has generated considerable anxiety among educators, who fear that students will use it to write their assignments, and the truth is they do not have reliable tools to detect texts created by AI," explains Professor Ken Hyland from the School of Education and Lifelong Learning at UEA.

"In response to these concerns, we wanted to verify how accurately AI can mimic human essay writing, focusing especially on how writers interact with readers," Hyland explains.

During the analysis of the 145 essays written by students and the other 145 generated by ChatGPT, researchers tried to find what they called 'engagement markers,' such as questions and personal comments. "We found that essays written by real students presented a wide range of engagement strategies, making them more interactive and persuasive. They were full of rhetorical questions, personal comments, and direct appeals to the reader, all techniques that enhance clarity, connection, and produce a solid argument," Hyland explains.

However, essays written by ChatGPT, although linguistically fluent, were more impersonal. "They mimicked the conventions of academic writing, but they failed to give the text a personal touch or demonstrate a clear stance," explains the UEA researcher.

In fact, Artificial Intelligence tended to avoid questions and limit personal comments. Overall, its essays were less engaging, less persuasive, and lacked a solid perspective on the topic. "This reflects the nature of their training data and their statistical learning methods, which prioritize coherence over conversational nuances," Hyland adds.

There is a fear that ChatGPT and future AI writing tools may weaken students' basic literacy and critical thinking skills. But despite its shortcomings, the researchers believe that AI should be used as a teaching aid. "When students come to school, college, or university, we not only teach them how to write, we teach them how to think, and that is something no algorithm can replicate," Professor Hyland concludes.