I have written some code that loads an XML document into an XmlDocument object so as to count its nodes. Here is the code:
XmlDocument xml = new XmlDocument();
xml.Load(textBox1.Text);
XmlNodeList nodes = xml.SelectNodes("//File");
int number_of_childs = 0;
foreach (XmlNode node in nodes)
{
    number_of_childs++;
}
The problem I am facing is that importing a large file takes around 700 MB of RAM. If I then try to do some operation on the file, or even just read from it to display its data in a ListView, the application takes around 2 GB of RAM. So I was wondering: is there a method that closes the XmlDocument and frees its memory, releasing the RAM? It is as if it forgets to remove the document's content from memory.
No. The XmlDocument class does not implement IDisposable, so there is no way to force it to release its resources at will. If you really need to immediately free the memory used by an XmlDocument, the only way to do that is to remove all references to it and then ask the garbage collector to run:
nodes = null;
xml = null;
GC.Collect();
Even then, the memory may not all be released right away: finalizers run on a separate finalizer thread, so objects waiting to be finalized can survive the collection. To block until those finalizers have run before continuing execution of your code, also call GC.WaitForPendingFinalizers, like this:
nodes = null;
xml = null;
GC.Collect();
GC.WaitForPendingFinalizers();
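
Putting that together with the counting code from the question, a minimal sketch might look like this (assuming, as in the question, that textBox1.Text holds the path to the XML file; the final GC.Collect() is the usual extra step to reclaim objects whose finalizers have just run):

XmlDocument xml = new XmlDocument();
xml.Load(textBox1.Text);

// XmlNodeList already knows its size, so there is no need to loop just to count.
int number_of_childs = xml.SelectNodes("//File").Count;

// Drop every reference to the document before asking the GC to run.
xml = null;
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();   // reclaim objects whose finalizers ran during the wait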
XmlDocument always loads the entire document into memory at once. If you simply want to iterate through the nodes in the document as a stream, only loading a little at a time, that is what the XmlReader class is for. However, you lose a lot of functionality that way. For instance, there is no way to select nodes via XPath, as you were doing in your example. With XmlReader, you have to write your own logic to determine where in the document you are and whether the current node matches what you are looking for.
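
For example, counting the File elements from the question could look roughly like this with XmlReader (a sketch, assuming the document does not use namespaces and that textBox1.Text holds the file path, as in the question; it needs a using System.Xml; directive):

int count = 0;
using (XmlReader reader = XmlReader.Create(textBox1.Text))
{
    while (reader.Read())
    {
        // Count only element start tags named "File", matching the //File XPath above.
        if (reader.NodeType == XmlNodeType.Element && reader.LocalName == "File")
        {
            count++;
        }
    }
}

Because the reader only ever holds the current node, the whole document is never materialized in memory, so memory usage stays flat regardless of file size.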