That does force you to duplicate some assets heavily. It also matters more the slower your seeks are: the technique is perfect for disc media, which has a fixed physical size (so wasting space on it costs nothing) and slow seeks.
I'd love to see it analysed, specifically the average number of non-sequential jumps vs. the overall size of the level. I'm sure you could avoid jumps within megabytes, but if someone ever got close to filling up the disk in the past, the chances of finding contiguous gigabytes are much lower. This paper effectively says that with long files, gaps are almost guaranteed: https://dfrws.org/wp-content/uploads/2021/01/2021_APAC_paper... So at that point you may be better off preallocating the individual files and eating the cost of switching between them.
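A minimal sketch of that preallocation idea, assuming an installer that knows each file's final size up front (the manifest and file names are made up). `os.posix_fallocate` is Unix-only; on Windows/NTFS, extending the file with `truncate` should reserve the clusters, though neither call guarantees the run is contiguous:

```python
import os

def preallocate(path: str, size: int) -> None:
    """Reserve `size` bytes for `path` before writing any data,
    giving the filesystem allocator the best chance to pick one large extent."""
    with open(path, "wb") as f:
        if hasattr(os, "posix_fallocate"):   # Linux/BSD: explicit allocation
            os.posix_fallocate(f.fileno(), 0, size)
        else:                                # Windows/NTFS: extending the file
            f.truncate(size)                 # should reserve space (files aren't sparse by default)

# Hypothetical install manifest: file name -> final size in bytes
for name, size in {"textures.pak": 80 << 30, "audio.pak": 20 << 30}.items():
    preallocate(name, size)
```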
But it also depends on how the assets are organized: you can probably group the level-specific assets into a sequential section, and maybe shared assets could be grouped so that related assets end up sequential too.
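A rough sketch of that layout, assuming a build step that writes one pack per level: level-specific assets laid out first, then copies of the shared assets that level references, duplicated across packs on purpose so a level load is one long sequential read. The manifest format and file names here are invented for illustration:

```python
import json

def build_level_pack(level_name, level_assets, shared_assets, out_path):
    """Write one contiguous pack per level: level-specific assets first,
    then copies of the shared assets this level needs (duplicated across
    packs deliberately, trading disc space for sequential reads)."""
    index = {}   # asset name -> (offset, size) within the pack
    offset = 0
    with open(out_path, "wb") as pack:
        for name in level_assets + shared_assets:
            with open(name, "rb") as src:
                data = src.read()
            pack.write(data)
            index[name] = (offset, len(data))
            offset += len(data)
    # Store the index next to the pack; a real build would embed it in a header.
    with open(out_path + ".idx", "w") as f:
        json.dump(index, f)

# Hypothetical usage: shared UI textures get copied into every level's pack.
build_level_pack("level01",
                 ["level01/terrain.bin", "level01/npcs.bin"],
                 ["shared/ui_atlas.bin", "shared/common_sfx.bin"],
                 "level01.pak")
```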
Nowadays? No. Even machines that still have hard disks have far more RAM and thus disk cache, and consoles now guarantee SSDs. I don't think anyone uses this technique anymore.
By default, Windows automatically defragments filesystems weekly if necessary. It can be configured in the "Defragment and Optimize Drives" dialog.
An old article on the process: https://web.archive.org/web/20100529025623/http://blogs.tech...
Someone installing a 150GB game surely has 150GB+ of free space, and a lot of it is likely to be contiguous.
If you break it up into smaller files, those are likely to be allocated all over the disk, plus you'll have delays on reading because Windows Defender makes opening files slow. If you have a single large file that contains all resources, even if that file is mostly sequential, there will be sections you don't need, and the read-ahead cache may work against you by pulling in data you won't use.
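For the single-large-file side of that trade-off, a minimal sketch of reading one resource by offset out of a pack file (using the same hypothetical .idx layout sketched earlier in the thread): you open one handle once and seek to just the byte ranges you need, instead of opening thousands of small files.

```python
import json

class Pack:
    """Read individual resources out of one large pack file by offset."""
    def __init__(self, pack_path):
        self.f = open(pack_path, "rb")      # one open() call, one Defender scan
        with open(pack_path + ".idx") as idx:
            self.index = json.load(idx)     # asset name -> [offset, size]

    def read(self, name):
        offset, size = self.index[name]
        self.f.seek(offset)
        return self.f.read(size)            # touch only the bytes this asset needs

pack = Pack("level01.pak")
terrain = pack.read("level01/terrain.bin")
```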
Which makes me think: have there been any advances in disk scheduling in the last decade?