Things would improve a lot if we added offline abilities to HTML documents. As far as I know, there is currently no way to guarantee a website stays in the cache(?)
Designing a system to decide when to keep something is tricky. Maybe each visit and each click should extend the expiration date and increase the storage allowance for static documents. Say, 10 visits should be enough to buy 1 MB of permanent storage, to be spent on however many pages it takes: the frequently visited pages first, then a manifest (or the order of links on the front page), then the first page from each sub-page, and so on.
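Something like this, as a rough sketch (every name and number here is hypothetical; no browser exposes such an API today):

```ts
// Hypothetical sketch of the visit-to-storage idea.
interface SiteRecord {
  visits: number;
  expires: number; // epoch ms
}

const MS_PER_DAY = 24 * 60 * 60 * 1000;
const BYTES_PER_10_VISITS = 1024 * 1024; // 10 visits buy 1 MB

function recordVisit(site: SiteRecord, now = Date.now()): void {
  site.visits += 1;
  // Each visit pushes the expiration date out further.
  site.expires = Math.max(site.expires, now) + 30 * MS_PER_DAY;
}

function storageBudget(site: SiteRecord): number {
  return Math.floor(site.visits / 10) * BYTES_PER_10_VISITS;
}

// Spend the budget in priority order: the most frequently visited
// pages first, then whatever else fits.
function pagesToKeep(
  pages: { url: string; hits: number; size: number }[],
  budget: number,
): string[] {
  const keep: string[] = [];
  for (const p of [...pages].sort((a, b) => b.hits - a.hits)) {
    if (p.size <= budget) {
      budget -= p.size;
      keep.push(p.url);
    }
  }
  return keep;
}
```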
It should also be possible to have the browser manage updates, rather than every site fending for itself: testing the connection, checking for updates, and stitching things back together again. There are quite a few schemes it could follow; smaller requests would require more complicated backends, and different pages could have different update frequencies.
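One possible scheme: the site publishes a manifest saying how often each page changes, and the browser revalidates on that schedule with ordinary conditional HTTP requests (ETag/If-None-Match are real HTTP; the manifest shape is invented for illustration):

```ts
// Invented manifest entry; the browser, not the page, runs the refresh loop.
interface ManifestEntry {
  url: string;
  refreshSeconds: number; // how stale this page is allowed to get
  etag?: string;          // last ETag the browser saw
  lastChecked: number;    // epoch ms
}

async function refreshIfDue(entry: ManifestEntry, now = Date.now()): Promise<void> {
  if (now - entry.lastChecked < entry.refreshSeconds * 1000) return;
  const res = await fetch(entry.url, {
    headers: entry.etag ? { "If-None-Match": entry.etag } : {},
  });
  entry.lastChecked = now;
  if (res.status === 304) return; // unchanged, keep the cached copy
  entry.etag = res.headers.get("ETag") ?? undefined;
  // ...store the response body in the offline cache
}
```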
I think the single-star bookmark button could be extended to 1-5 stars, with 5 stars assigning a fairly generous storage allowance to the website and 3+ stars allowing a prompt for very large downloads.
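For illustration, one possible mapping (the quota numbers are made up):

```ts
// Hypothetical star-to-quota mapping.
function quotaForStars(stars: 1 | 2 | 3 | 4 | 5): number {
  const MB = 1024 * 1024;
  const quotas = [5 * MB, 20 * MB, 100 * MB, 500 * MB, 2000 * MB];
  return quotas[stars - 1];
}

// 3+ stars unlock the "may I download something huge?" prompt.
function mayPromptForLargeDownload(stars: number): boolean {
  return stars >= 3;
}
```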
Then, since I'm serving static content anyway, I really couldn't care less how the user obtains the files. If there is a copy of the website somewhere on the network, all you need is the site's public key, or to just trust the user (at the price of annoying warning prompts on every page view and every request).
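Roughly what verification could look like. This uses the real Web Crypto API; how the browser learns the site's public key in the first place is hand-waved here:

```ts
// Files can come from any peer, as long as each one verifies
// against the site's public key.
async function verifyPage(
  publicKey: CryptoKey,   // the site's ECDSA P-256 public key
  signature: ArrayBuffer, // signature shipped alongside the file
  body: ArrayBuffer,      // the page bytes, obtained from anywhere
): Promise<boolean> {
  return crypto.subtle.verify(
    { name: "ECDSA", hash: "SHA-256" },
    publicKey,
    signature,
    body,
  );
}
// If this returns false, fall back to the "trust the user" path:
// show the page only behind an explicit warning prompt.
```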
If it all works well enough, HN could be a tiny website managing only the active discussions. If you have the key and a working connection to some other users, most of the archive could live with them. The caching priority could then shift to the rarest pages.
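The rarest-pages-first part is the same trick BitTorrent uses. A toy version, assuming peers gossip replica counts (which is an assumption, not anything that exists):

```ts
// Rarest-first: given how many known peers hold each archived page,
// cache the pages with the fewest replicas first.
function nextPagesToCache(
  replicaCount: Map<string, number>, // page URL -> peers known to hold it
  slots: number,
): string[] {
  return [...replicaCount.entries()]
    .sort((a, b) => a[1] - b[1]) // fewest replicas first
    .slice(0, slots)
    .map(([url]) => url);
}
```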