willis936
The answer isn't a technical advancement but a cultural shift. We need to develop a discipline of skepticism and mistrust. No amount of authority, understanding, reasoning, etc. can be delegated to something that comes from a screen. This will take generations.
> We need to develop a discipline of skepticism and mistrust. No amount of authority, understanding, reasoning, etc. can be delegated to something that comes from a screen. This will take generations.
Authoritarian dream.
Please elaborate. Authoritarians seek to consolidate power, which AI enables. Individuals must build immunity to reality distortion fields. This comes from within, not from some centralized authority.
The problem with this line of thinking is that you can then only really trust your own personal bubble, or actual trial and error, which is costly.
These models get ever better at producing plausible text. Once they permeate academia completely, we're cooked.
And even academia is not clean or complete on some matters.
> only really trust your own personal bubble.
I don't know how you got to this conclusion, but I trust my own thinking the least, since it is my own personal bubble. Just because it is mine doesn't make it good; it just makes it mine.
If the stakes are low, do whatever. But when you need solid answers, that is what rigor is for. You address the argument on its merits, not on who made it.
Don't suffer from open-loop opinions, even your own.