• Phanatik
    2 years ago

    There’s a difference between using ChatGPT to help you write a paper and having ChatGPT write the paper for you. The latter constitutes plagiarism, which schools and universities are strongly against.

    The problem is being able to differentiate between a paper that’s been written by a human (which may or may not be written with ChatGPT’s assistance) and a paper entirely written by ChatGPT and presented as a student’s own work.

    I want to strongly stress that the latter situation is plagiarism. The argument doesn’t even need to involve the plagiarism that ChatGPT itself commits. The definition of plagiarism is simple: ChatGPT wrote the paper, you the student did not, and you are presenting ChatGPT’s paper as your own; ergo, plagiarism.

    • RiikkaTheIcePrincess
      2 years ago

      There’s a difference between using ChatGPT to help you write a paper and having ChatGPT write the paper for you.

      Yeah, one is what many “AI” fans insist is happening, and the other is what people actually do, because humans are lazy, intellectually dishonest piles of crap. “Just a little GPT,” they say. “I don’t see a problem, we’ll all just use it in moderation,” they say. Then somehow we only see more garbage full of errors: BS numbers; references to studies or legal cases or anything else that simply don’t exist; images of people with extra rows of teeth and hands where feet should be; gibberish non-text where text could obviously be… maybe we’ll even get ads injected into everything, because why not screw up our already shitty world even more?

      So now people have this “tool” they think is simultaneously smarter and more creative than humans at all of the things humans have historically claimed make them better than not only machines but other animals, yet is also “just a tool” that they’re only going to use a little bit, to help out but not replace. They’ll trust this tool to be smarter than they are, which it will arguably, impressively, turn out not to be. They’ll expect everyone else to accept the costs this incurs, from the environmental damage of running the damn things to the social, scientific, economic, and other harms caused by everything being generated by “hallucinating” “AI” that’s incapable of thinking.

      It’s all very tiring.

      (And now I’m probably going to get more crap for both things I’ve said and things I haven’t, because people are intellectually lazy/dishonest and can’t take criticism. Even more tiring! Bleh.)