This is moving the goalposts versus the original claim upthread that LLMs are just regurgitating human-authored Stack Overflow answers and that without those answers they would be useless.
It’s silly to say that something LLMs can reliably do is impossible and every time it happens it’s “dumb luck”.