Depending on the decision, yes. An LLM might confidently hallucinate incorrect information and misinform, which is worse than simply not knowing.