Hi all, just a random thought I wanted to share. At work we use Rust and have CI/CD systems for automatic deployment. Cold compiles are very slow, especially on cheap CI machines, so it's important that we keep our ~1.5 GB build cache and restore it every time we run a new build.
The way we do this is by tarring the cache and transferring the tar file to an S3 bucket. The latter part is easy, but I thought the commands we use to create the tars might be useful to other people dealing with archives that are large in file count rather than file size.
Make sure you have pigz installed. It’s probably in your package manager.
We just use…
tar pcf - target | pigz > target.tar.gz
To archive and compress the target folder. And…
unpigz --stdout target.tar.gz | tar xf -
To restore it. By default, pigz uses one compression thread per core; decompression itself runs on a single thread, but it spins up a few helper threads for reading, writing, and checksumming, so it still uses around four threads. This really sped up our CI/CD pipeline (along with some more Rust-specific cleaning of the target folder, which I can talk about in another post if anyone wants). Hope this helps somebody :)
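If your runners are shared and you don't want pigz grabbing every core, you can pin the thread count with -p (the 4 below is just an example):

tar pcf - target | pigz -p 4 > target.tar.gz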
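And for anyone curious how this fits into a CI job, here's a rough sketch of the restore/save steps using the aws CLI. The bucket name, object key, and file names are made up, and your runner might fetch the cache some other way:

# restore at the start of the job; a missing cache just means a cold build
if aws s3 cp s3://my-ci-cache/rust-target.tar.gz cache.tar.gz; then
  unpigz --stdout cache.tar.gz | tar xf -
fi

cargo build --release   # or whatever your build step is

# save at the end of the job
tar pcf - target | pigz > cache.tar.gz
aws s3 cp cache.tar.gz s3://my-ci-cache/rust-target.tar.gz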
Here’s the GitHub repo for pigz, in case anyone’s curious
Pigz: parallel implementation of gzip