Study Reveals ChatGPT Performs Better On Writing Than Students

The rise of ChatGPT and other generative AI technologies has prompted a wave of hand-wringing about their potential impacts across society. Nowhere has this been more evident than in education, where concerns have been raised that such technologies will fundamentally undermine the education system.

Those fears were hardly allayed by a study from New York University, which found that ChatGPT can perform as well as, or even better than, the average university student when answering questions in subjects such as computer science, political science, engineering, and psychology.

Writing test

To test this, the researchers asked faculty at New York University Abu Dhabi to provide three student answers to each of ten questions drawn from their courses. They then prompted ChatGPT to generate three sets of answers to the same questions. Graders who did not know the source of each answer assessed them all, and ChatGPT achieved grades similar to or higher than those of students in nine of the 32 courses.

ChatGPT performed best in a public policy course, where it earned a markedly higher grade than the students. The researchers also surveyed respondents in several countries about their attitudes toward using ChatGPT for coursework. Around 74% of students said they would use it, but educators were more critical: 70% said they would treat its use as plagiarism.

The researchers also evaluated tools designed to detect whether an answer was written by a human or a machine. These detectors proved unreliable, at times classifying ChatGPT's answers as human-written. Together, the findings could help inform policies on the use of AI tools such as ChatGPT in education.
