Then what about new, unseen riddles that don't have a similar pattern to existing ones? That's the question people are asking.

If an LLM can solve riddles of arbitrary complexity that aren't similar to already-solved riddles, then have it solve this one: "how can this trained machine-learning model be adjusted to improve its riddle-solving abilities without regressing in any other meaningful capability?"

It's apparent that this particular riddle is not presently solved by LLMs; if it were, people would already be having LLMs improve themselves in the wild.

So, constructively, there exists at least one riddle that doesn't have a pattern similar to existing ones and that is unsolvable by any existing LLM.

If you present a SINGLE riddle an LLM can solve, people will reply that that particular riddle isn't good enough. To succeed, LLMs need to solve all the riddles, including the one I presented above.

Unfortunately, that's a "could an omnipotent god create a boulder so heavy he can't move it" level of "logic puzzle" and does your argument no favors.
It's quite the opposite. Putting it in your terms, the argument is "could a powerful but not omnipotent god make themself more powerful", and the answer is "probably".

If the god cannot grant themself powers, they're not very powerful at all, are they?
