I need to compress a large file of about 17-20 GB and split it into several files of around 1 GB each. I searched for a solution via Google and found ways using the split and cat commands, but they did not work for large files at all. They also won't work on Windows; I need to extract the archive on a Windows machine.
You can use the split command with the -b option:
split -b 1024m file.tar.gz
The pieces can be reassembled on a Windows machine using @Joshua's answer:
copy /b file1 + file2 + file3 + file4 filetogether
Edit: As @Charlie stated in the comment below, you might want to set a prefix explicitly, because otherwise split uses the default prefix x, which can be confusing.
split -b 1024m "file.tar.gz" "file.tar.gz.part-"
# Creates files: file.tar.gz.part-aa, file.tar.gz.part-ab, file.tar.gz.part-ac, ...
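To put the pieces back together, concatenate them in name order. A minimal sketch, assuming the part names produced by the command above (three parts shown for the Windows case):
cat file.tar.gz.part-* > file.tar.gz
On Windows, the same idea using @Joshua's copy /b syntax:
copy /b file.tar.gz.part-aa + file.tar.gz.part-ab + file.tar.gz.part-ac file.tar.gz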
Edit: Updating this answer because the question is closed and the most effective solution is very close to the content of this answer:
# create the archive and split it into 1 GiB pieces
$ tar cz my_large_file_1 my_large_file_2 | split -b 1024MiB - myfiles_split.tgz_
# reassemble the pieces and extract
$ cat myfiles_split.tgz_* | tar xz
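If you want to check that all the pieces are intact before deleting the originals, you can stream them through tar's list mode instead of extracting. A minimal sketch (the > /dev/null just discards the file listing; a non-zero exit status indicates a damaged or incomplete archive):
$ cat myfiles_split.tgz_* | tar tzf - > /dev/null && echo "archive is readable"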
This approach avoids writing a large intermediate file to disk when compressing or decompressing. Use tar's -C option to extract the resulting files into a different directory (see the sketch after the gzip example below). By the way, if the archive consists of only a single file, tar can be skipped and gzip used on its own:
# compress and split into 1 GiB pieces
$ gzip -c my_large_file | split -b 1024MiB - myfile_split.gz_
# reassemble and uncompress
$ cat myfile_split.gz_* | gunzip -c > my_large_file
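As mentioned above, tar's -C option controls where the archive is extracted. A minimal sketch, assuming /tmp/restore as the target directory (the path is just an example):
$ mkdir -p /tmp/restore
$ cat myfiles_split.tgz_* | tar xz -C /tmp/restore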
For Windows you can download ported versions of the same commands or use Cygwin.