I have a set of servers, each filled with a bunch of files that can be gzipped. The servers all have different numbers of cores. How can I write a bash script that launches one gzip process per core while making sure no two processes compress the same file?
There is a multithreaded implementation of gzip, pigz. Since it compresses a single file across multiple threads, it should be able to read from disk more efficiently than compressing several files at once.
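If pigz is not an option, a common alternative (a sketch, not from the original posts) is to let xargs schedule one single-threaded gzip per core: xargs hands each file to exactly one process, so no file is compressed twice, and nproc makes the script adapt to servers with different core counts. This assumes GNU findutils and coreutils, and a hypothetical target directory /data:

#!/bin/bash
# Compress every uncompressed file under /data, one gzip per core.
# -print0 / -0 keep file names with spaces or newlines safe.
# -P "$(nproc)" runs as many gzip processes as this server has cores.
# -n 1 gives each gzip exactly one file, so no file is handled twice.
find /data -type f ! -name '*.gz' -print0 |
  xargs -0 -P "$(nproc)" -n 1 gzip

The trade-off versus pigz: pigz parallelizes within one file, while the xargs approach parallelizes across files, which can mean more random disk access when many processes read at once.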
Possible Duplicate:
gzipping up a set of directories and creating a tar compressed file
This post describes how to gzip each file individually within a directory structure. However, I need to do something slightly different. I need to produce one …
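For reference, gzipping each file individually within a directory tree (the behavior the linked post describes) is a one-liner; this is a sketch assuming a hypothetical root directory ./backups:

# Compress every regular file under ./backups in place;
# gzip replaces each file with file.gz, skipping ones already compressed.
find ./backups -type f ! -name '*.gz' -exec gzip {} +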
I have a bash script that creates a tar.gz, encrypts it, and then sends it to drive. However, I cannot open the .tar.gz afterwards. Here is my process...
Bash script that encrypts:
#!/bin/sh
# tar the automysqlbackup directory
tar -zcf "…