Internally ASP.NET has a 2 GB address space, but in reality you have less than 1 GB free for uploads (see http://support.microsoft.com/?id=295626 ). In addition, IIS 7 caps uploads at about 30 MB (see http://www.iislogs.com/steveschofield/iis7-post-40-adjusting-file-upload-size-in-iis7 ), and you supposedly have to run
appcmd set config "My Site/MyApp" -section:requestFiltering -requestLimits.maxAllowedContentLength:104857600 -commitpath:apphost
on the server to go beyond this 30 MB limit. But how can I run this on my Azure servers?
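I'm assuming (I haven't verified this on Azure yet) that the same limits can be set declaratively in web.config, which does get deployed with the web role, so no appcmd call would be needed on the instances. Something like this untested sketch, where the 100 MB values are just examples:
<system.web>
  <!-- ASP.NET limit, in KB (here ~100 MB), plus a longer timeout for large uploads -->
  <httpRuntime maxRequestLength="102400" executionTimeout="3600" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <!-- IIS 7 request filtering limit, in bytes (here 100 MB) -->
      <requestLimits maxAllowedContentLength="104857600" />
    </requestFiltering>
  </security>
</system.webServer>
If that works it would sidestep the appcmd question entirely, but it still doesn't address the memory issue below.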
Also, according to http://support.microsoft.com/?id=295626 :
"During the upload process, ASP.NET loads the whole file in memory before the user can save the file to the disk."
So I will quickly exhaust the memory limit if many users upload large files at the same time. In my code below I use streams, but I'm guessing that the whole file is loaded into memory first anyway. Is this the case? (See also the block-upload sketch right after the code.)
using System;
using System.Web.Security;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

namespace WebPages
{
    public partial class Upload : System.Web.UI.Page
    {
        CloudBlobClient BlobClient = null;
        CloudBlobContainer BlobContainer = null;

        void InitBlob()
        {
            // Setup the connection to Windows Azure Storage
            var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            BlobClient = storageAccount.CreateCloudBlobClient();

            // Get and create the container
            BlobContainer = BlobClient.GetContainerReference("publicfiles");
            BlobContainer.CreateIfNotExist();
        }

        protected void Page_Load(object sender, EventArgs e)
        {
            //if (Membership.GetUser() == null) return; // Only allow registered users to upload files

            InitBlob();

            try
            {
                var file = Request.Files["Filedata"];

                // Make a unique blob name
                var extension = System.IO.Path.GetExtension(file.FileName);

                // Create the Blob and upload the file
                var blobAddressUri = String.Format("{0}{1}", Guid.NewGuid(), extension);
                var blob = BlobContainer.GetBlobReference(blobAddressUri);
                blob.UploadFromStream(file.InputStream);

                // Set the metadata into the blob
                blob.Metadata["FileName"] = file.FileName;
                //blob.Metadata["Submitter"] = Membership.GetUser().UserName;
                blob.Metadata["Type"] = "Video";
                blob.Metadata["Description"] = "Test";
                blob.SetMetadata();

                // Set the properties
                blob.Properties.ContentType = file.ContentType;
                blob.SetProperties();
            }
            catch (Exception ex)
            {
                System.Diagnostics.Trace.TraceError("Upload file exception: {0}", ex.ToString());
                // If any kind of error occurs return a 500 Internal Server error
                Response.StatusCode = 500;
                Response.Write("An error occurred while uploading the file");
                Response.End();
            }
        }
    }
}
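One thing I'm considering, to at least avoid holding a second full copy of the file while pushing it to storage, is to split the request stream into blocks myself instead of calling UploadFromStream. A rough sketch of what I have in mind, which would slot into the page class above (the method name is my own, and I'm assuming the StorageClient's PutBlock/PutBlockList behave the way I think they do):
// Sketch: upload a stream to a block blob in 4 MB blocks so that only one
// block at a time is buffered in memory on the blob-upload side.
void UploadStreamInBlocks(CloudBlobContainer container, string blobName, System.IO.Stream source)
{
    var blob = container.GetBlockBlobReference(blobName);
    var blockIds = new System.Collections.Generic.List<string>();
    var buffer = new byte[4 * 1024 * 1024];
    int read, index = 0;

    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Block ids must be Base64 strings of equal length
        var blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
        using (var blockStream = new System.IO.MemoryStream(buffer, 0, read))
        {
            blob.PutBlock(blockId, blockStream, null);
        }
        blockIds.Add(blockId);
    }

    // Commit the uploaded blocks in order to make the blob visible
    blob.PutBlockList(blockIds);
}
Of course this doesn't help if ASP.NET has already buffered the whole request before Page_Load even runs, which is really the core of my question.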
I am aware of non-web-page upload tools like http://azureblobuploader.codeplex.com/ , but I really need the upload to happen from a web page.
So, my questions are: how do I get past the 30 MB limit on my Azure web role, and how do I avoid exhausting server memory when many users upload large files at the same time?
I can also mention that my code above works fine by default with files smaller than 30 MB; I use SWFUpload V2.2.0 on the client.
Update 19. June 19:09: @YvesGoeleven on Twitter gave me the tip of using a Shared Access Signature (see msdn.microsoft.com/en-us/library/ee395415.aspx ) and uploading the file directly to Azure Blob Storage without going through ASP.NET at all. I created a JSON WCF service that returns a valid SAS URL to my blob storage.
using System.ServiceModel;
using System.ServiceModel.Web;

namespace WebPages.Interfaces
{
    [ServiceContract]
    public interface IUpload
    {
        [OperationContract]
        [WebInvoke(Method = "GET",
            ResponseFormat = WebMessageFormat.Json)]
        string GetUploadUrl();
    }
}
--------
using System;
using System.IO;
using System.Runtime.Serialization.Json;
using System.ServiceModel.Activation;
using System.Text;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

namespace WebPages.Interfaces
{
    [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
    public class UploadService : IUpload
    {
        CloudBlobClient BlobClient;
        CloudBlobContainer BlobContainer;

        public UploadService()
        {
            // Setup the connection to Windows Azure Storage
            var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            BlobClient = storageAccount.CreateCloudBlobClient();

            // Get and create the container
            BlobContainer = BlobClient.GetContainerReference("publicfiles");
        }

        string JsonSerialize(string url)
        {
            var serializer = new DataContractJsonSerializer(url.GetType());
            var memoryStream = new MemoryStream();
            serializer.WriteObject(memoryStream, url);
            return Encoding.Default.GetString(memoryStream.ToArray());
        }

        public string GetUploadUrl()
        {
            // Create a write-only Shared Access Signature that is valid for one hour
            var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy()
            {
                Permissions = SharedAccessPermissions.Write,
                SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(60)
            });

            // The SAS string starts with '?', so appending it to the blob URI gives a complete upload URL
            return JsonSerialize(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
        }
    }
}
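For completeness: I host this as a normal .svc in the web role. As far as I remember, the easiest way to get the JSON/REST endpoint working without extra endpoint configuration is the WebServiceHostFactory in the .svc file (plus aspNetCompatibilityEnabled="true" under serviceHostingEnvironment in web.config, since I use AspNetCompatibilityRequirements). Roughly like this:
<%@ ServiceHost Language="C#" Service="WebPages.Interfaces.UploadService"
    Factory="System.ServiceModel.Activation.WebServiceHostFactory" %>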
It works, but I can't use it with SWFUpload, since SWFUpload uses the HTTP POST verb and not the HTTP PUT verb that Azure Blob Storage expects when creating a new blob item. Does anyone know how to get around this without writing my own custom Silverlight or Flash client component that uses HTTP PUT? I wanted a progress bar when uploading the files, so a plain form submit is not optimal either.
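For reference, the service hands back a URL that looks roughly like https://<account>.blob.core.windows.net/publicfiles/<guid>.<ext>?se=...&sr=c&sp=w&sig=... (values made up), and as far as I understand, all Blob Storage expects against that URL is a single PUT with an x-ms-blob-type header. Something like this quick server-side test sketch (sasUrl and fileStream are placeholders, and I'm assuming .NET 4's Stream.CopyTo):
// Quick sketch of the PUT request Blob Storage expects against the SAS URL
var request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(sasUrl);
request.Method = "PUT";
request.Headers.Add("x-ms-blob-type", "BlockBlob");

using (var requestStream = request.GetRequestStream())
{
    fileStream.CopyTo(requestStream);
}

using (var response = (System.Net.HttpWebResponse)request.GetResponse())
{
    // Expect 201 Created on success
    System.Diagnostics.Trace.TraceInformation("Upload status: {0}", response.StatusCode);
}
If I remember correctly, a single Put Blob request is also limited to 64 MB, so really large files would need a block-based upload anyway.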
For those interested in the client code (which won't work, since SWFUpload uses HTTP POST and not PUT as Azure Blob Storage expects):
<div id="header">
<h1 id="logo"><a href="/">SWFUpload</a></h1>
<div id="version">v2.2.0</div>
</div>
<div id="content">
<h2>Application Demo (ASP.Net 2.0)</h2>
<div id="swfu_container" style="margin: 0px 10px;">
<div>
<span id="spanButtonPlaceholder"></span>
</div>
<div id="divFileProgressContainer" style="height: 75px;"></div>
<div id="thumbnails"></div>
</div>
</div>
<script type="text/javascript" language="javascript">
$(document).ready(function () {
$.ajax({
url: '/Interfaces/UploadService.svc/GetUploadUrl',
success: function (result) {
var parsedResult = $.parseJSON(result);
InitUploadFile(parsedResult);
}
});
function InitUploadFile(uploadUrl) {
//alert(uploadUrl);
var swfu = new SWFUpload({
// Backend Settings
upload_url: uploadUrl,
post_params: {
"ASPSESSID": "<%=Session.SessionID %>"
},
// File Upload Settings
file_size_limit: "100 MB",
file_types: "*.*",
file_types_description: "All file types",
file_upload_limit: "0", // Zero means unlimited
// Event Handler Settings - these functions as defined in Handlers.js
// The handlers are not part of SWFUpload but are part of my website and control how
// my website reacts to the SWFUpload events.
file_queue_error_handler: fileQueueError,
file_dialog_complete_handler: fileDialogComplete,
upload_progress_handler: uploadProgress,
upload_error_handler: uploadError,
upload_success_handler: uploadSuccess,
upload_complete_handler: uploadComplete,
// Button settings
button_image_url: "Images/swfupload/XPButtonNoText_160x22.png",
button_placeholder_id: "spanButtonPlaceholder",
button_width: 160,
button_height: 22,
button_text: '<span class="button">Select files <span class="buttonSmall">(2 MB Max)</span></span>',
button_text_style: '.button { font-family: Helvetica, Arial, sans-serif; font-size: 14pt; } .buttonSmall { font-size: 10pt; }',
button_text_top_padding: 1,
button_text_left_padding: 5,
// Flash Settings
flash_url: "Js/swfupload-2.2.0/swfupload.swf", // Relative to this file
custom_settings: {
upload_target: "divFileProgressContainer"
},
// Debug Settings
debug: false
});
}
});
</script>
Update 19. June 21:07:
I figured that, since SWFUpload is open source, I could download the source and change the verb from POST to PUT. Sadly, the Flash Player's URLRequestMethod does not support verbs other than GET and POST. I did find a supposed work-around:
private function BuildRequest():URLRequest {
    // Create the request object and try to smuggle a PUT through a POST
    var request:URLRequest = new URLRequest();
    request.method = URLRequestMethod.POST;
    request.requestHeaders.push(new URLRequestHeader("X-HTTP-Method-Override", "PUT"));
    return request;
}
Unfortunately, this only works in Adobe AIR and not with the Flash Player.
I've read that Silverlight 3 and later supports the HTTP PUT verb, so I think I have to write some Silverlight code to get my way here. I did find a blog article series that will probably help me: http://blog.smarx.com/posts/uploading-windows-azure-blobs-from-silverlight-part-1-shared-access-signatures .
Update @ 27. June '11:
I have now successfully managed to upload large files (tested with 4.5 GB files) from a web page, using a custom Silverlight client I wrote based on the project in http://blog.smarx.com/posts/uploading-windows-azure-blobs-from-silverlight-part-1-shared-access-signatures . Since Silverlight supports both the HTTP PUT verb that Azure Blob Storage requires and progressive uploads, I can now upload massive files directly to Azure Blob Storage without going through an ASP.NET solution. I also get nice progress bars, and the user can cancel in the middle of the upload if he/she wants to. Memory usage on the server is minimal, since the whole file never passes through it before being placed in Azure Blob Storage. I use a Shared Access Signature (see msdn.microsoft.com/en-us/library/ee395415.aspx ) that is supplied from a RESTful WCF service on request. I think this is the best solution we found. Thanks.
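The heart of the Silverlight client is basically just the Put Block / Put Block List REST operations issued with PUT against the SAS URL the WCF service hands out. Stripped of the Silverlight async plumbing and progress reporting, the idea is roughly this (a simplified sketch with names of my own, not the actual client code):
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Text;

static class BlockUploadSketch
{
    // Upload a stream as a block blob through a SAS URL using the
    // Put Block (?comp=block) and Put Block List (?comp=blocklist) operations.
    public static void UploadInBlocks(Stream source, string blobSasUrl)
    {
        var blockIds = new List<string>();
        var buffer = new byte[4 * 1024 * 1024]; // 4 MB blocks
        int read, index = 0;

        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Block ids must be Base64 strings of equal length
            var blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
            var request = (HttpWebRequest)WebRequest.Create(
                blobSasUrl + "&comp=block&blockid=" + Uri.EscapeDataString(blockId));
            request.Method = "PUT";
            using (var body = request.GetRequestStream())
                body.Write(buffer, 0, read);
            request.GetResponse().Close();
            blockIds.Add(blockId);
        }

        // Commit the blocks in order so the blob becomes visible
        var commit = (HttpWebRequest)WebRequest.Create(blobSasUrl + "&comp=blocklist");
        commit.Method = "PUT";
        var xml = new StringBuilder("<?xml version=\"1.0\" encoding=\"utf-8\"?><BlockList>");
        foreach (var id in blockIds)
            xml.Append("<Latest>").Append(id).Append("</Latest>");
        xml.Append("</BlockList>");
        var bytes = Encoding.UTF8.GetBytes(xml.ToString());
        using (var body = commit.GetRequestStream())
            body.Write(bytes, 0, bytes.Length);
        commit.GetResponse().Close();
    }
}
In the Silverlight version the same requests are issued with the asynchronous Begin/End pattern, and progress is reported per block, which is what drives the progress bar.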
Update @ 18. July '11:
I have created an open source project with what I found here: