Like it or not, in our society scientists' job is to churn out papers. Of course they'll use the most efficient way to churn out papers.
The problem with this analogy is that it makes no sense.
LLMs aren’t guns.
The problem with using them is that humans have to review the content for accuracy. And that gets tiresome because the whole point is that the LLM saves you time and effort doing it yourself. So naturally people will tend to stop checking and assume the output is correct, “because the LLM is so good.”
Then you get false citations and bogus claims everywhere.
But regardless, I thought the point was that...
> The problem with using them is that humans have to review the content for accuracy.
There are (at least) two humans in this equation. The publisher, and the reader. The publisher at least should do their due diligence, regardless of how "hard" it is (in this case, we literally just ask that you review your OWN CITATIONS that you insert into your paper). This is why we have accountability as a concept.
Right. A gun doesn't misfire 20% of the time.
> The problem with using them is that humans have to review the content for accuracy.
How long are we going to push this same narrative we've been hearing since the introduction of these tools? When can we trust these tools to be accurate? For technology that is marketed as having superhuman intelligence, it sure seems dumb that it has to be fact-checked by less-intelligent humans.
Absolutely. Many guns don't have safeties. You don't load a round in the chamber unless you intend to use it.
A gun going off when you don't intend is a negligent discharge. No ifs, ands or buts. The person in possession of the gun is always responsible for it.
False. Guns go off when not intended too often to claim that. It has happened to me - I then took the gun to a qualified gunsmith for repairs.
A gun that fires and hits anything you didn't intend to hit is a negligent discharge, even if you intended to shoot. Gun safety is about assuming that any gun that could possibly fire will, and ensuring nothing bad can happen when it does. When looking at a gun in a store (one you might want to buy), you aim it at an upper corner, where even if it fires, something bad is least likely to result (it should be unloaded - and you may have checked - but you still aim there!)
Same with cat-toy lasers - they should be safe to shine in an eye, but you still point them in a safe direction.
If someone performs a negligent discharge, they are responsible, not Glock. A Glock does have other safety mechanisms to prevent accidental discharges not resulting from a trigger pull.
Another way LLMs are not guns: you don’t need a giant data centre owned by a mega corp to use your gun.
Can’t do science because GlockGPT is down? Too bad I guess. Let’s go watch the paint dry.
The reason I made the analogy is that this is inherent to how LLMs are designed. They will make bad citations, and people need to be careful.
That's the issue here. Of course you should be aware of the fact that these things need to be checked - especially if you're a scientist.
This is no secret only known to people on HN. LLMs are tools. People using these tools need to be diligent.
Yes, and they are the ones responsible for the poor quality of work that results from that.
The issue is when you give EVERYONE guns and then are surprised when enough people do bad things with them to create externalities for everyone else.
There is some sort of trip-up where personal responsibility and society-wide behavior intersect. Sure, most people will be reasonable, but the issue is often the cost imposed by the number of irresponsible or outright bad actors.
I'm stating the actual position of many gun-rights advocates: anyone might have a gun.
So let me choose the 50 and you give them guns! Why not?