I have written a process where a file is encrypted and uploaded to Azure; on download, the file has to be decrypted, and that's the part that fails with either a "Padding is invalid and cannot be removed" error or a "Length of the data to decrypt is invalid." error.
I've tried numerous solutions online, including C# Decrypting mp3 file using RijndaelManaged and CryptoStream, but none of them work - I just end up bouncing back and forth between these two errors. The encryption process uses the same key/IV pair that decryption uses, and since it does decrypt a portion of the stream I believe that part is working - it just dies partway through with the above errors.
Here is my code - any ideas? Please note that the three variants (cryptoStream.CopyTo(decryptedStream), the do {} loop, and the while loop) aren't run together; they're shown here as the options I've already tried, all of which fail.
byte[] encryptedBytes = null;
using (var encryptedStream = new MemoryStream())
{
    //download from Azure
    cloudBlockBlob.DownloadToStream(encryptedStream);
    //reset positioning for reading it back out
    encryptedStream.Position = 0;
    encryptedBytes = encryptedStream.ConvertToByteArray();
}

//used for the blob stream from Azure
using (var encryptedStream = new MemoryStream(encryptedBytes))
{
    //stream where decrypted contents will be stored
    using (var decryptedStream = new MemoryStream())
    {
        using (var aes = new RijndaelManaged { KeySize = 256, Key = blobKey.Key, IV = blobKey.IV })
        {
            using (var decryptor = aes.CreateDecryptor())
            {
                //decrypt stream and write it to parent stream
                using (var cryptoStream = new CryptoStream(encryptedStream, decryptor, CryptoStreamMode.Read))
                {
                    //variant 1: fails here with "Length of the data to decrypt is invalid." error
                    cryptoStream.CopyTo(decryptedStream);

                    //variant 2: fails with the same error after it loops a number of times,
                    //implying it is in fact decrypting part of it, just not everything
                    int data;
                    do
                    {
                        data = cryptoStream.ReadByte();
                        decryptedStream.WriteByte((byte)data);
                    } while (!cryptoStream.HasFlushedFinalBlock);

                    //variant 3: fails with the same error after it loops a number of times
                    while ((data = cryptoStream.ReadByte()) != -1)
                    {
                        decryptedStream.WriteByte((byte)data);
                    }
                }
            }
        }
        //reset position in prep for reading
        decryptedStream.Position = 0;
        return decryptedStream.ConvertToByteArray();
    }
}
One of the comments asked what ConvertToByteArray is - it's just a simple extension method:
/// <summary>
/// Converts a Stream into a byte array.
/// </summary>
/// <param name="stream">The stream to convert.</param>
/// <returns>A byte[] array representing the current stream.</returns>
public static byte[] ConvertToByteArray(this Stream stream)
{
    byte[] buffer = new byte[16 * 1024];
    using (MemoryStream ms = new MemoryStream())
    {
        int read;
        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, read);
        }
        return ms.ToArray();
    }
}
The code never reaches this point, though - it dies during decryption, before the conversion is ever called.
After a lot of back and forth across various blogs, I found I actually had a couple of errors in the above code that were tripping me up. First, the encryption process was writing the array incorrectly - the stream was wrapped in a CryptoStream instance, but the writes never actually went through it, so I was uploading the unencrypted data to Azure. Here is the proper route to take (fileKey is part of a custom class I created to generate Key/IV pairs, so wherever it's referenced you can swap in the built-in generation from RijndaelManaged - sketched after the code below - or anything else you'd use to come up with a Key/IV pair):
using (var aes = new RijndaelManaged { KeySize = 256, Key = fileKey.Key, IV = fileKey.IV })
{
    using (var encryptedStream = new MemoryStream())
    {
        using (ICryptoTransform encryptor = aes.CreateEncryptor())
        {
            using (CryptoStream cryptoStream = new CryptoStream(encryptedStream, encryptor, CryptoStreamMode.Write))
            {
                using (var originalByteStream = new MemoryStream(file.File.Data))
                {
                    int data;
                    while ((data = originalByteStream.ReadByte()) != -1)
                        cryptoStream.WriteByte((byte)data);
                }
                //disposing the CryptoStream flushes the final padded block into encryptedStream
            }
        }
        var encryptedBytes = encryptedStream.ToArray();
        return encryptedBytes;
    }
}
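As an aside, for anyone without a custom key class, the built-in generation mentioned above is just a matter of letting RijndaelManaged produce the values itself. A minimal sketch (persisting the Key/IV somewhere safe for later decryption is up to you):

//minimal sketch - RijndaelManaged can generate a random Key/IV pair;
//these must be stored securely, since decryption needs the exact same values
using (var aes = new RijndaelManaged { KeySize = 256 })
{
    aes.GenerateKey();
    aes.GenerateIV();
    byte[] key = aes.Key; //32 bytes for KeySize = 256
    byte[] iv = aes.IV;   //16 bytes - the Rijndael/AES block size
}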
Second, since my encryption process involves multiple steps (three keys total per file - container, filename and the file itself), I was decrypting with the wrong key: the blobKey referenced above was actually the key used for encrypting the filename, not the file itself. The proper decryption method, using the file's own key, was:
//used for the blob stream from Azure
using (var encryptedStream = new MemoryStream(encryptedBytes))
{
    //stream where decrypted contents will be stored
    using (var decryptedStream = new MemoryStream())
    {
        //fileKey is the pair for the file itself, not the filename
        using (var aes = new RijndaelManaged { KeySize = 256, Key = fileKey.Key, IV = fileKey.IV })
        {
            using (var decryptor = aes.CreateDecryptor())
            {
                //decrypt stream and write it to parent stream
                using (var cryptoStream = new CryptoStream(encryptedStream, decryptor, CryptoStreamMode.Read))
                {
                    int data;
                    while ((data = cryptoStream.ReadByte()) != -1)
                        decryptedStream.WriteByte((byte)data);
                }
            }
        }
        //reset position in prep for reading
        decryptedStream.Position = 0;
        return decryptedStream.ConvertToByteArray();
    }
}
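With three keys per file in play, a quick round-trip sanity check is worth doing before anything touches Azure. The sketch below assumes hypothetical EncryptBytes/DecryptBytes wrappers around the two methods above (not part of my actual code):

//hypothetical helpers wrapping the encryption/decryption code shown above -
//confirms a Key/IV pair round-trips correctly, entirely in memory
byte[] original = Encoding.UTF8.GetBytes("round-trip test");
byte[] encrypted = EncryptBytes(original, fileKey);
byte[] decrypted = DecryptBytes(encrypted, fileKey);
Debug.Assert(original.SequenceEqual(decrypted), "wrong Key/IV pair");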
I had looked into the Azure Encryption Extensions (http://www.stefangordon.com/introducing-azure-encryption-extensions/), but it was a little more local-file-centric than I wanted - everything on my end is streams/in-memory only, and retrofitting that utility was going to be more work than it was worth.
Hopefully this helps anyone looking to encrypt Azure blobs with zero reliance on the underlying file system!