Greetings,
I am writing a backup program in C# (.NET 3.5), using the latest DotNetZip. The basic idea is that the program is given a location on a server and a maximum size for each spanned zip segment, then traverses all of the folders and files under that location and adds them to the archive, keeping the exact directory structure. It should also compress everything down to a reasonable size. A given uncompressed collection of folders/files could easily be 10-25 GB, with the created spanned files limited to about 1 GB each.
I have everything working (using DotNetZip). My only problem is that there is little to no compression actually happening. I chose the "AddDirectory" method for the simplicity of the code and because it generally seemed to fit my project well, but after reading around I am second-guessing that decision.
Given the code below and the large number of files in an archive, should I compress each file as it is added to the zip, or should the AddDirectory method provide about the same compression?
I have tried every level of compression offered by Ionic.Zlib.CompressionLevel and none of them seem to help. Should I think about using an outside compression algorithm and streaming the result into my DotNetZip file?
using (ZipFile zip = new ZipFile())
{
    zip.AddDirectory(root.FullName);

    if (zipPassword.Length > 0)
        zip.Password = zipPassword;

    float size = zipGbSize * 1024 * 1024 * 1024;   // convert GB to bytes

    zip.CompressionLevel = Ionic.Zlib.CompressionLevel.BestCompression;
    zip.AddProgress += new EventHandler<AddProgressEventArgs>(Zip_AddProgress);
    zip.ZipError += new EventHandler<ZipErrorEventArgs>(Zip_ZipError);
    zip.Comment = "This zip was created at " + System.DateTime.Now.ToString("G");
    zip.MaxOutputSegmentSize = (int)size;          // segment size in bytes
    zip.Name = archiveDir.FullName + @"\Task_" + taskId.ToString() + ".zip";
    zip.Save();
}
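For comparison, here is roughly the per-file variant I am considering (just a sketch, not tested at scale). The relative-path handling is my own guess at how to preserve the folder structure, and I have set the password and compression level before the Add calls in case those settings only apply to entries added after them:

// needs: using System.IO; and using Ionic.Zip;
using (ZipFile zip = new ZipFile())
{
    // set these before adding entries, in case they only affect
    // entries added after they are assigned
    zip.CompressionLevel = Ionic.Zlib.CompressionLevel.BestCompression;
    if (zipPassword.Length > 0)
        zip.Password = zipPassword;

    foreach (string file in Directory.GetFiles(root.FullName, "*", SearchOption.AllDirectories))
    {
        // keep the same directory layout inside the archive
        string relativePath = file.Substring(root.FullName.Length).TrimStart('\\');
        zip.AddFile(file, Path.GetDirectoryName(relativePath));
    }

    float size = zipGbSize * 1024 * 1024 * 1024;   // convert GB to bytes
    zip.MaxOutputSegmentSize = (int)size;
    zip.Save(archiveDir.FullName + @"\Task_" + taskId.ToString() + ".zip");
}

As far as I can tell this produces the same archive layout; I just don't know whether adding files one at a time actually changes the compression behavior at all.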
Thank you for any help!