Just do the research, and you don't have to qualify it. "GPT said that Don Knuth said..." Just verify that Don said it, and report the real fact! And if something turns out to be too difficult to fact check, that's still valuable information.
I don't think I've ever seen anyone lambasted for citing stackoverflow as a source. At most, they were chastised for not reading the comments, but there's nowhere near as much pushback as there is for LLMs.
Also, using Stack Overflow correctly requires more critical thinking. You have to determine whether any given question-and-answer is actually relevant to your problem, rather than just pasting in your code and seeing what the LLM says. Requiring more work is not inherently a good thing, but it does mean that if you’re citing Stack Overflow, you probably have a somewhat better understanding of whatever you’re citing it for than if you cited an LLM.
> Not to mention that you are technically not allowed to just copy-paste stuff from SO.
Sure you can. Over the last ten years, I have probably copied at least 100 snippets of code from StackOverflow into my corporate code base (and included a link to the original code). The stuff that was published before the generative-AI slop era started is unbeatable as a source of code snippets. I am a developer of internal CRUD apps, so we don't care about licenses (except AGPL, due to FUD from the legal & compliance teams). Anything goes because we do not distribute our software externally. If anything, StackOverflow having verified answers helps its credibility slightly compared to LLMs, which are all known to regularly hallucinate (see: literally this post).
And all the other examples will have a chain of "upstream" references, data and discussion.
I suppose you can use those same phrases to reference things that lack that chain: random "summaries" without references or research, "expert opinion" from someone with no experience in that sector, opinion pieces from similarly reputation-less people, and so on. But I'd say those are just as worthless as references as "According to GPT...", and should be treated similarly.
Copying and pasting from ChatGPT has the same consequences as copying and pasting from StackOverflow, which is to say you're now on the hook for supporting code in production that you don't understand.
I can use ChatGPT to teach me and help me understand a topic, or I can use it to give me an answer that I just copy-paste without double-checking.
It just shows how much you care about the topic at hand, no?
Starting the answer with "I asked ChatGPT and it said..." almost 100% means the poster did not double-check.
(This is the same with other systems: If you say, "According to Google...", then you are admitting you don't know much about this topic. This can occasionally be useful, but most of the time it's just annoying...)
It sucks at sports trivia. It will confidently return information that is straight-up wrong [0]. This should be a walk in the park for an LLM, but it fails spectacularly at it. How is this useful for learning at all?
[0] https://en.m.wikipedia.org/wiki/Gell-Mann_amnesia_effect
If you don't know anything about the subject area, how do you know if you are asking the right questions?
- I had to Google it...
- According to a StackOverflow answer...
- Person X told me about this nice trick...
- etc.
Surely stating your sources should not be a bad thing, no?