Sci Simple

New Science Research Articles Every Day

# Computer Science # Software Engineering

Is ChatGPT Safe for Student Coders?

Students face risks using ChatGPT for programming assignments due to plagiarism concerns.

Julia Kotovich, Manuel Oriol

― 6 min read



With the rise of artificial intelligence (AI) in our daily lives, one tool has caught the attention of students and educators alike: ChatGPT. This AI chat engine allows users to ask questions and get answers that feel quite reasonable. Many students are using it to tackle programming assignments. But the big question is: is it safe for students to use ChatGPT for this purpose? Let's dive into the details.

What Does “Safe” Mean?

Before we get into the findings, let's clarify what we mean by "safe." In this context, "safe" means that students can use ChatGPT to complete assignments without worrying about being caught for plagiarism. If they can use the tool without getting into trouble, then it's considered safe. If not, well, let's just say they might want to think twice before relying on it.

Students Are Getting Caught

Studies have shown that relying on ChatGPT for programming tasks is not as safe as one might hope. Researchers used a tool called Codequiry to check code generated by ChatGPT for similarities with real-world sources. The results indicated that students using ChatGPT stand a good chance of being accused of plagiarism. Spoiler alert: this isn't the best news for students hoping to skate by without detection.

In one study, Codequiry frequently flagged ChatGPT-generated code as too similar to existing code found online. A quick Google search also turned up a plethora of similar code snippets. These findings suggest that using ChatGPT for coding assignments can land students in hot water.

ChatGPT's Accuracy is Remarkable

Let's give credit where it's due: ChatGPT is pretty good at coding. In situations where it was tested against simple requests, the AI produced correct answers every time. Want a Bubble Sort algorithm? Done. Need a Python implementation for a linked list? No problem. The AI seems to hit the mark when it comes to basic coding tasks. However, there's a catch.
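For context, a request like "write Bubble Sort in Python" typically yields something very close to the textbook version below — which is precisely why such answers match so much code already online. This is an illustrative rendition, not a captured ChatGPT response:

```python
def bubble_sort(items):
    """Sort a list in place using the classic Bubble Sort algorithm."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # Each pass bubbles the largest remaining element to the end.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break
    return items
```

Calling `bubble_sort([5, 2, 9, 1])` returns `[1, 2, 5, 9]`. There are only so many sensible ways to write this algorithm, so near-identical copies are everywhere.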

While ChatGPT might generate code that works, the real concern lies in how original that code is. If students repeat an assignment using the same prompts, they're likely to get similar, if not identical, results. This is not ideal when you're trying to pass off someone else's work as your own.

The Experiment

In a recent experiment, researchers set out to see just how much code generated by ChatGPT matched existing online sources. They focused on common algorithms and data structures, using Python as the programming language. Why Python, you may ask? Well, it's one of the most popular programming languages, making it a prime candidate for testing.

Students were asked to use ChatGPT to create code for various algorithms. The research team used Codequiry to check the generated code for matches. They also conducted Google searches for each piece of code to see how many similar snippets appeared online.
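The paper doesn't describe Codequiry's internals, but the core idea of the check — scoring how much a generated snippet overlaps with a known source — can be sketched with Python's standard `difflib`. The snippets and the 50% threshold below are illustrative assumptions, not Codequiry's actual algorithm or data:

```python
import difflib

def similarity(code_a: str, code_b: str) -> float:
    """Return a similarity ratio between 0.0 and 1.0 for two code strings."""
    return difflib.SequenceMatcher(None, code_a, code_b).ratio()

# Hypothetical "generated" vs. "found online" snippets for illustration.
generated = "def add(a, b):\n    return a + b\n"
online = "def add(x, y):\n    return x + y\n"

score = similarity(generated, online)
if score >= 0.5:  # illustrative threshold: "at least half similar"
    print(f"flagged: {score:.0%} similar")
```

Even with different variable names, these two functions score well above the threshold, which is roughly the situation the study kept running into.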

The results were eye-opening. Out of the total tests, Codequiry found numerous instances where ChatGPT’s code was strikingly similar to existing code. The research indicated that there was a significant chance of being called out for plagiarism.

The Numbers Don't Lie

The data showed that approximately 38% of the time, at least half of the code generated by ChatGPT was flagged as similar to already existing work. When looking at Google search results, nearly 96% of the time, similar code was identified. This means that students using ChatGPT would likely find themselves in a tight spot if their instructors took a closer look at their assignments.

Challenges and Concerns

While the initial results seem pretty convincing, there are some challenges to consider. First, the algorithms tested don't represent all coding scenarios. Sure, ChatGPT can handle simple requests, but what about more complex ones? The concern is that as the complexity of the task increases, the likelihood of ChatGPT producing useful, non-copied code might decrease.

Another issue is that ChatGPT tends to deliver similar results for identical prompts. If one student uses a prompt, and another uses the same one, chances are they'll end up with very similar code. This isn't good news for anyone trying to pass off AI-generated work as their own.

Additionally, the definition of "safe" used in the study was quite conservative. Many universities might not even use tools to detect plagiarism, leaving the door wide open for students to submit similar work without consequences. The study also only measured the tools currently available, meaning future improvements could further complicate things for students trying to use AI without being caught.

Related Tools and Their Effectiveness

The emergence of AI tools like ChatGPT has sparked curiosity and concern among educators and students. ChatGPT is not the only game in town, though. Other bots and tools are available to assist in coding, documentation, and answering questions about programming languages. Some tools seek to automate tasks to improve productivity. However, as these tools become more prevalent, we may see an increase in similar-looking assignments across classrooms.

Many plagiarism detection tools, like Codequiry, are making strides to identify AI-generated content. While initial attempts may not be perfect, advancements likely mean these tools will soon become better at spotting similarities in code—even if they originated from AI.
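One common trick such tools use — and a reasonable guess at why simple variable renaming doesn't fool them — is comparing token streams with all identifiers collapsed to a placeholder. The sketch below uses Python's standard `tokenize` module; it is an assumption about the general technique, not a description of Codequiry's method:

```python
import io
import keyword
import token
import tokenize

def fingerprint(code: str) -> list:
    """Reduce Python source to a token fingerprint: identifiers become 'ID'
    and numbers become 'NUM', so renaming variables cannot hide a copy."""
    toks = []
    for tok in tokenize.generate_tokens(io.StringIO(code).readline):
        if tok.type == token.NAME and not keyword.iskeyword(tok.string):
            toks.append("ID")  # collapse every non-keyword identifier
        elif tok.type in (token.NAME, token.OP):
            toks.append(tok.string)  # keep keywords and operators as-is
        elif tok.type == token.NUMBER:
            toks.append("NUM")
        # whitespace, newlines, and comments are ignored entirely
    return toks

# Two "different" submissions that differ only in naming:
a = "def add(a, b):\n    return a + b\n"
b = "def total(x, y):\n    return x + y\n"
print(fingerprint(a) == fingerprint(b))  # the structures match exactly
```

Because the two functions produce identical fingerprints, a detector built this way would flag them as the same code despite the cosmetic renaming.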

The Future of Coding and Education

As AI continues to evolve in the education sector, it’s safe to say that both students and educators must adapt. Educators may need to rethink how they assess students' abilities and what kind of assignments they give. Assignments may need to evolve to ensure that students can’t rely solely on AI for help.

However, students must also tread carefully. If they make the choice to use AI-driven tools, they should be aware of the potential pitfalls. The risk of being caught for plagiarism should be a serious consideration before they take the plunge. In the long run, it might be better to use AI as a supplementary tool, rather than a crutch to lean on.

Conclusion: Is ChatGPT Safe?

So, what's the bottom line? ChatGPT can produce working code for simple programming tasks. However, relying on it for assignments that are supposed to be original is a risky move. The chances of being caught for plagiarism are notably high, which can lead to serious consequences for students.

While AI tools can enhance productivity and ease the workload, they come with their own set of challenges. Moving forward, it will be crucial for students to stay informed and use these tools wisely—if at all. The world of coding is rapidly changing, and staying one step ahead will be essential for success.

Original Source

Title: Is ChatGPT 3 safe for students?

Abstract: ChatGPT3 is a chat engine that fulfils the promises of an AI-based chat engine: users can ask a question (prompt) and it answers in a reasonable manner. The coding-related skills of ChatGPT are especially impressive: informal testing shows that it is difficult to find simple questions that ChatGPT3 does not know how to answer properly. Some students are certainly already using it to answer programming assignments. This article studies whether it is safe for students to use ChatGPT3 to answer coding assignments (safe means that they will not be caught for plagiarism if they use it). The main result is that it is generally not safe for students to use ChatGPT3. We evaluated the safety of code generated with ChatGPT3 by performing a search with Codequiry, a plagiarism detection tool, and searching for plagiarized code in Google (only considering the first page of results). In 38% of the cases, Codequiry finds a piece of code that is partially copied by the answer of ChatGPT3. In 96% of the cases, the Google search finds a piece of code very similar to the generated code. Overall, it is not safe for students to use ChatGPT3 in 96% of the cases.

Authors: Julia Kotovich, Manuel Oriol

Last Update: Dec 10, 2024

Language: English

Source URL: https://arxiv.org/abs/2412.07564

Source PDF: https://arxiv.org/pdf/2412.07564

Licence: https://creativecommons.org/licenses/by/4.0/

Changes: This summary was created with assistance from AI and may have inaccuracies. For accurate information, please refer to the original source documents linked here.

Thank you to arxiv for use of its open access interoperability.
