Whenever I see source packages or binaries compressed with gzip, I wonder if there are still reasons to favor gz over xz (excluding time travel to 2000): the savings of the LZMA compression algorithm are substantial, and decompression isn't orders of magnitude slower than gzip.
"Lowest Common Denominator". The extra space saved is rarely worth the loss of interoperability. Most embedded Linux systems have gzip, but not xz. Many old system as well. Gnu Tar which is the industry standard supports flags -z
to process through gzip, and -j
to process through bzip2, but some old systems don't support the -J
flag for xz, meaning it requires 2-step operation (and a lot of extra diskspace for uncompressed .tar
unless you use the syntax of |tar xf -
- which many people don't know about.) Also, uncompressing the full filesystem of some 10MB from tar.gz
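For anyone who hasn't seen it, a minimal sketch of the options (the archive names are hypothetical):

```
# Modern GNU tar decompresses in one step:
tar xzf pkg.tar.gz           # gzip, via -z
tar xjf pkg.tar.bz2          # bzip2, via -j

# Old tar without -J: pipe xz output into tar,
# avoiding an intermediate uncompressed .tar on disk:
xz -dc pkg.tar.xz | tar xf -

# The naive two-step alternative needs the extra disk space:
xz -d pkg.tar.xz             # leaves pkg.tar behind
tar xf pkg.tar
```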
Also, uncompressing a full filesystem of some 10 MB from `tar.gz` on embedded ARM takes around 2 minutes and isn't really a problem. No clue about `xz`, but `bzip2` takes around 10-15 minutes. Definitely not worth the bandwidth saved.