From facts and processes we know, we can derive novel information and conclusions. If you really understand why the sky is blue, you might be able to draw conclusions about why other things appear a certain color, like human eye color.

GPT can't make those kinds of inferences or extensions. It can only regurgitate what is already known and has been stated somewhere in its training set.

It's very impressive; I just think people over-hype it into something it is not.
