No, it is supposed to find an answer that matches your persistence. That's what it does, and understanding that is the key to understanding its strengths and weaknesses. Otherwise you may just keep drinking the investors' kool-aid and pretend that it's a tool that's supposed to tell the truth. That's not what it does, that's not how it works, and it's a safe bet that's not how it's gonna work in the foreseeable future.
No, it is supposed to tell the truth and that is what is advertised, matching your persistence is what it sometimes actually does. But people are using it because it sometimes tells the truth, not because it sometimes matches your persistence.
Then they're just confused by false marketing. LLMs predict plausible text, that's all they do. Anything else is a side effect.
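To make the "predict plausible text" point concrete, here is a toy sketch (not a real LLM, just a bigram model over a made-up corpus): the model only learns which token tends to follow which, so "plausibility" means frequency in the training data, never factual correctness. The corpus, function names, and tokens here are all invented for illustration.

```python
import random
from collections import defaultdict

# Tiny made-up training corpus; note it contains both a correct and a
# misspelled state name -- the model has no way to know which is "true".
corpus = ("the list of states includes alabama "
          "the list of states includes alabamer").split()

# Count word -> next-word transitions (the whole "training" step).
transitions = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)

def generate(start, length, seed=0):
    """Sample a continuation; the model picks what is frequent, not what is right."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        candidates = transitions.get(out[-1])
        if not candidates:
            break
        out.append(rng.choice(candidates))
    return " ".join(out)

# The continuation after "includes" is sampled from what the corpus
# contains -- either spelling is equally "plausible" to the model.
print(generate("includes", 1))
```

Scaled up by many orders of magnitude, the same dynamic applies: truthful output falls out only when truthful text dominates the training distribution.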
It is not supposed to find an answer that matches my persistence, it's supposed to tell the truth or admit that it does not know. And even if there is an "alabamer" in the training set, that is either something else, not a US state, or a misspelling; in neither case should it end up on the list.