
This is exactly right. Attention is all you need. It's all about attention. Attention is finite.

The more data you load into the context, the more you dilute attention.
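
A minimal numeric sketch of why this follows from standard scaled dot-product attention (the function name and sizes here are illustrative, not from the comment): softmax weights over the context sum to 1, so the average share of attention any single token gets is exactly 1/n and shrinks as the context grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_attention_weight(context_length: int, d: int = 64) -> float:
    """Average softmax attention weight one query assigns to each key."""
    q = rng.normal(size=d)
    keys = rng.normal(size=(context_length, d))
    scores = keys @ q / np.sqrt(d)        # scaled dot-product scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()              # softmax: weights sum to 1
    return weights.mean()                 # equals 1 / context_length

for n in (128, 1024, 8192):
    # the per-token share of attention falls as the context lengthens
    print(n, mean_attention_weight(n))
```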


people who criticize LLMs for merely regurgitating statistically related token sequences have very clearly never read a single HN comment
