I am wondering why I'm not able to allocate more than 1,000 MB of memory in my 32-bit .NET process. The following mini application throws an OutOfMemoryException after allocating 1,000 MB. Why 1,000 MB, and not, say, 1.8 GB? Is there a process-wide setting I could change?
using System;
using System.Collections;

class Program
{
    static void Main(string[] args)
    {
        ArrayList list = new ArrayList();
        int i = 0;
        while (true)
        {
            // Keep a reference to every block so none of them can be collected.
            list.Add(new byte[1024 * 1024 * 10]); // 10 MB
            i += 10;
            Console.WriteLine(i); // running total in MB
        }
    }
}
PS: Forcing a garbage collection does not help.
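For reference, a minimal sketch of what such an attempt might look like (the explicit GC.Collect()/GC.WaitForPendingFinalizers() loop is my assumption of what "garbage collecting" means here). A collection cannot reclaim anything in this program, because the list still references every array:

using System;
using System.Collections;

class GcAttempt
{
    static void Main()
    {
        ArrayList list = new ArrayList();
        int i = 0;
        try
        {
            while (true)
            {
                // Force a full collection before each allocation. This cannot
                // free the earlier arrays: 'list' still references all of them,
                // so they are reachable and ineligible for collection.
                GC.Collect();
                GC.WaitForPendingFinalizers();

                list.Add(new byte[1024 * 1024 * 10]); // 10 MB
                i += 10;
            }
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine("Failed after {0} MB", i);
        }
    }
}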
Edit, to clarify what I want: I have written a server application which deals with very large amounts of data before writing them to the database/disk. Instead of creating temporary files for everything, I wrote an in-memory cache, which makes the whole thing super fast. But memory is limited, so I tried to find out what the limits are, and wondered why my small test program threw the OutOfMemoryException after exactly 1,000 MB.
Allocating enormous blocks of memory is never a good idea, even on 64-bit. You run into big problems with contiguous memory and fragmentation.
The problem here is finding a contiguous block of address space. You could try enabling 3 GB mode (the /3GB boot switch, combined with marking the executable large-address-aware), which might help the allocator find a few more bytes, but I really advise against it. The answers here are: allocate the data in many smaller blocks instead of one enormous one, or move to a 64-bit process, where the address space is vastly larger; a sketch of the small-blocks approach follows.
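To illustrate the small-blocks approach, here is a minimal sketch of a chunked buffer (the class name ChunkedBuffer and the 64 KB chunk size are my own assumptions, not from the original post). Data is stored in many small arrays, so the runtime never has to find one huge contiguous region:

using System;
using System.Collections.Generic;

// Minimal sketch: data lives in many small arrays, so no single
// allocation needs a large contiguous region of address space.
class ChunkedBuffer
{
    private const int ChunkSize = 64 * 1024; // 64 KB, below the 85,000-byte large-object-heap threshold
    private readonly List<byte[]> chunks = new List<byte[]>();
    private int offsetInLastChunk = ChunkSize; // forces a new chunk on first write

    public void Append(byte[] data, int start, int count)
    {
        while (count > 0)
        {
            // Start a fresh chunk once the current one is full.
            if (offsetInLastChunk == ChunkSize)
            {
                chunks.Add(new byte[ChunkSize]);
                offsetInLastChunk = 0;
            }
            byte[] last = chunks[chunks.Count - 1];
            int toCopy = Math.Min(count, ChunkSize - offsetInLastChunk);
            Array.Copy(data, start, last, offsetInLastChunk, toCopy);
            offsetInLastChunk += toCopy;
            start += toCopy;
            count -= toCopy;
        }
    }

    public long Length
    {
        get { return (long)(chunks.Count - 1) * ChunkSize + offsetInLastChunk; }
    }
}

Because each chunk stays under the large-object-heap threshold, the arrays live on the normal generational heap, and the cache can keep growing even when no single large contiguous block is available.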
You might also want to read Eric Lippert's blog (he seems to have a blog entry for every common .NET question...)