- `zstd --train` a dictionary over all the files, then compress everything using that dictionary and decompress on demand (see the sketch after this list). Make sure you don't lose the dictionary, or none of the compressed files will be recoverable.
- OR stick it into a Restic/Borg/Kopia/Duplicacy repository. These use a rolling hash to deduplicate identical sections across files before applying compression. You can try it with `--dry-run` first to see how much space you would save. Use their `mount` command to access the files as a directory.
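A minimal sketch of the dictionary route (the `games/` path and `.mpq` glob are placeholders for your layout; zstd dictionaries help most with lots of small, similar files):

```sh
zstd --train games/*.mpq -o game.dict          # build a shared dictionary from samples
zstd -19 -D game.dict games/*.mpq              # compress each file against the dictionary
zstd -d -D game.dict games/some-file.mpq.zst   # decompress a single file on demand
```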
I would not be surprised if it gets the total size down close-ish to the size of only a single copy of the game.
See the other comment about `precomp` if compression gives you issues.
Personally I would just go with the `restic` route, as that's what it's meant for.
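A minimal restic sketch, assuming a local repository (repo path, mountpoint, and password handling are placeholders):

```sh
export RESTIC_PASSWORD='change-me'                 # use a real secret manager in practice
restic -r /srv/game-repo init                      # create the repository once
restic -r /srv/game-repo backup --dry-run ./games  # preview dedup savings without writing
restic -r /srv/game-repo backup ./games            # store an actual snapshot
restic -r /srv/game-repo mount /mnt/restic         # browse snapshots as a directory tree
```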
I will probably write the MPQ blobs out to disk and deduplicate via hardlinks, and additionally at the block level.
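A rough sketch of the hardlink idea (bash 4+; `./games` is a placeholder; this only catches byte-identical whole files on a single filesystem, so block-level dedup would still need the filesystem or a tool like restic to handle it):

```sh
declare -A seen
while IFS= read -r -d '' f; do
  h=$(sha256sum "$f" | cut -d' ' -f1)     # content hash of the file
  if [[ -n "${seen[$h]:-}" ]]; then
    ln -f "${seen[$h]}" "$f"              # duplicate: replace with a hardlink
  else
    seen[$h]="$f"                         # first copy of this content
  fi
done < <(find ./games -type f -print0)
```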
I don't know about restic (or Borg, which was also recommended), but I will read up on it and do some tests with it regardless, since this seems to be a very nice tool for a lot of problem scenarios.
Thanks for the input!