How can I avoid OutOfMemoryErrors when using Commons FileUpload's DiskFileItem to upload large files?

Robert Campbell · Nov 7, 2009 · Viewed 8.9k times

I am getting OutOfMemoryErrors when uploading large (>300MB) files to a servlet utilizing Commons FileUpload 1.2.1. It seems odd, because the entire point of using DiskFileItem is to prevent the (possibly large) file from residing in memory. I am using the default size threshold of 10KB, so that's all that should ever be loaded into the heap, right? Here is the partial stack trace:

java.lang.OutOfMemoryError
       at java.io.FileInputStream.readBytes(Native Method)
       at java.io.FileInputStream.read(FileInputStream.java:177)
       at org.apache.commons.fileupload.disk.DiskFileItem.get(DiskFileItem.java:334)
       at org.springframework.web.multipart.commons.CommonsMultipartFile.getBytes(CommonsMultipartFile.java:114)

Why is this happening? Is there some configuration I'm missing? Any tips/tricks to avoid this situation besides increasing my heap size?

I really shouldn't have to increase my heap, because in theory the most that should be loaded into memory from this operation is a little over 10KB. Plus, my heap max (-Xmx) is already set to 1GB, which should be plenty.

Answer

Carl Smotricz · Nov 7, 2009

When dealing with file uploads, especially big ones, you should process the file as a stream: read it into a medium-sized in-memory buffer and copy it straight into your output file. The wrong way to do it is to inhale the whole thing into memory before writing it out.
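Concretely: in the stack trace above, the error comes from CommonsMultipartFile.getBytes(), which asks DiskFileItem for the file's entire contents as a single byte[]. A minimal sketch of the streaming alternative, assuming a Spring MultipartFile parameter; the method name, destination handling, and buffer size here are only illustrative:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import org.springframework.web.multipart.MultipartFile;

public class UploadHandler {

    private static final int BUFFER_SIZE = 1024 * 1024; // 1 MB chunks

    // Streams the upload to disk; at most BUFFER_SIZE bytes live on the heap
    // at any moment, regardless of how large the upload is.
    public void saveToDisk(MultipartFile upload, File destination) throws IOException {
        InputStream in = upload.getInputStream();    // NOT upload.getBytes()
        OutputStream out = new FileOutputStream(destination);
        try {
            byte[] buffer = new byte[BUFFER_SIZE];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            in.close();
            out.close();
        }
    }
}
```

Spring's MultipartFile also offers transferTo(File), which performs essentially this copy for you in one call.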

The Commons FileUpload documentation mentions, a bit below the middle of the user guide, how to "Process a file upload". If you remember to copy from the input stream to the output stream in reasonably sized chunks (say, 1 MB), you should have no problem.
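For what it's worth, if you are calling Commons FileUpload directly rather than through Spring, its Streaming API (documented on a separate page of the same user guide) never buffers whole items at all. A rough sketch, with the destination directory as a placeholder:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import javax.servlet.http.HttpServletRequest;
import org.apache.commons.fileupload.FileItemIterator;
import org.apache.commons.fileupload.FileItemStream;
import org.apache.commons.fileupload.FileUploadException;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.commons.fileupload.util.Streams;

public class StreamingUploadServlet {

    // Iterates over the multipart request and streams each uploaded file
    // straight to disk, never holding a whole item in memory.
    protected void processUpload(HttpServletRequest request)
            throws FileUploadException, IOException {
        ServletFileUpload upload = new ServletFileUpload();
        FileItemIterator iterator = upload.getItemIterator(request);
        while (iterator.hasNext()) {
            FileItemStream item = iterator.next();
            if (!item.isFormField()) {
                InputStream in = item.openStream();
                // "/var/uploads" is an illustrative destination directory.
                FileOutputStream out = new FileOutputStream(
                        new File("/var/uploads", item.getName()));
                Streams.copy(in, out, true); // buffered copy; true closes out
            }
        }
    }
}
```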