simonw
I think you just coined "context rot", what an excellent term! Quoted you on my blog https://simonwillison.net/2025/Jun/18/context-rot/
I don’t know why, but going out of your way to make sure the coining of this is attributed to a random user on the internet made me incredibly nostalgic for the pre-Web 2.0 internet, sans the 4chans, LiveLeaks, and their forebears on Usenet
maybe someday soon, LLMs will learn to smoothly forget their own irrelevant context
imagine if, instead of predicting just the next token, the LLM also predicted a mask over the previous tokens, which is then thresholded so that only the “relevant” tokens are kept for the next inference step
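A toy sketch of that mask-and-threshold loop (purely hypothetical: the scores here are random stand-ins for whatever a learned mask head would actually output):

```python
# Sketch of "predict a relevance mask, threshold it, keep only the survivors".
# A real model would emit these scores from a trained head; random values
# are used below just to make the pruning step concrete and runnable.
import random

def prune_context(tokens, scores, threshold=0.5):
    """Keep only tokens whose predicted relevance clears the threshold."""
    return [t for t, s in zip(tokens, scores) if s >= threshold]

random.seed(0)
context = ["the", "cat", "sat", "on", "the", "mat"]
scores = [random.random() for _ in context]  # stand-in for a learned mask
pruned = prune_context(context, scores)
print(pruned)  # only the "relevant" tokens are carried into the next step
```

In a real system the thresholding would have to be differentiable (or trained with some surrogate) for the mask head to learn anything, which is where the hard part lives.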
one key distinction between humans and LLMs is that humans are excellent at forgetting irrelevant data. we discard incoming information constantly and keep only what's necessary