The forum post quoted below is by a university professor who has to deal with students' use of AI, and it highlights very well some of the main issues with the use of AI generally. More and more people are mindlessly using AI to do their thinking for them, and the resulting output is often completely flawed or blatantly incorrect. As mentioned in a previous post, people using AI generally have no understanding of how it works or what its limitations are, yet they trust it completely, as if it were a human authority. Also as previously mentioned, AI is programmed to reinforce the position and ego of the user no matter what, even at the expense of hallucinating totally incorrect answers. AI will not challenge, contradict, or admit that it is incorrect; it is programmed that way to retain and attract more users. AI is a tool, not an authority, and accordingly everything it produces should be verified by the human mind. Here is the post by the professor.
Link
Professor here. ChatGPT has ruined my life. It's turned me into a human plagiarism-detector. I can't read a paper without wondering if a real human wrote it and learned anything, or if a student just generated a bunch of flaccid garbage and submitted it. It's made me suspicious of my students, and I hate feeling like that because most of them don't deserve it.
I actually get excited when I find typos and grammatical errors in their writing now.
The biggest issue—hands down—is that ChatGPT makes blatant errors when it comes to the knowledge base in my field (ancient history). I don't know if ChatGPT scrapes the internet as part of its training, but I wouldn't be surprised because it produces completely inaccurate stuff about ancient texts—akin to crap that appears on conspiracy theorist blogs. Sometimes ChatGPT's information is weak because—gird your loins—specialized knowledge about those texts exists only in obscure books, even now.
I've had students turn in papers that confidently cite non-existent scholarship, or even worse, non-existent quotes from ancient texts that the class supposedly read together and discussed over multiple class periods. It's heartbreaking to know they consider everything we did in class to be useless.
My constant struggle is how to convince them that getting an education in the humanities is not about regurgitating ideas/knowledge that already exist. It's about generating new knowledge, striving for creative insights, and having thoughts that haven't been had before. I don't want you to learn facts. I want you to think. To notice. To question. To reconsider. To challenge. Students don't yet get that ChatGPT only rearranges preexisting ideas, whether they are accurate or not.
And even if the information was guaranteed to be accurate, they're not learning anything by plugging a prompt in and turning in the resulting paper. They've bypassed the entire process of learning.