Best approach to write huge xml data to file?

Kayes · Apr 20, 2010 · Viewed 9.6k times

I'm currently exporting a database table with a huge amount of data (100,000+ records) into an XML file using the XmlTextWriter class, writing directly to a file on the physical drive.

_XmlTextWriterObject = new XmlTextWriter(_xmlFilePath, null);

While my code runs OK, my question is: is this the best approach? Should I instead write the whole XML to a memory stream first and then write the XML document to the physical file from the memory stream? And what are the effects on memory/performance in both cases?
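
To illustrate the alternative I'm asking about, the memory-first version would look roughly like this (the record-writing loop is omitted):

using (MemoryStream memoryStream = new MemoryStream())
using (XmlTextWriter writer = new XmlTextWriter(memoryStream, Encoding.Unicode))
{
   // write the 100000+ records here, entirely in memory
   writer.Flush();

   // then copy the buffered document to the physical file in one pass
   using (FileStream fileStream = File.Create(_xmlFilePath))
   {
      memoryStream.WriteTo(fileStream);
   }
}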

EDIT

Sorry that I could not convey what I actually meant to ask; thanks, Ash, for pointing that out. I will indeed be using XmlTextWriter, but I meant to ask whether to pass a physical file path string to the XmlTextWriter constructor (or, as John suggested, to the XmlTextWriter.Create() method) or to use the stream-based API. My current code looks like the following:

XmlWriter objXmlWriter = XmlTextWriter.Create(
   new BufferedStream(
      new FileStream(@"C:\test.xml", FileMode.Create,
         System.Security.AccessControl.FileSystemRights.Write,
         FileShare.None, 1024, FileOptions.SequentialScan)),
   new XmlWriterSettings { Encoding = Encoding.Unicode, Indent = true, CloseOutput = true });
using (objXmlWriter)
{
   //writing xml contents here
}
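
The alternative I'm weighing against this is simply passing the file path (with the same settings), along these lines:

using (XmlWriter objXmlWriter = XmlTextWriter.Create(@"C:\test.xml",
   new XmlWriterSettings { Encoding = Encoding.Unicode, Indent = true }))
{
   //writing xml contents here
}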

Answer

Kyle Rosendo · Apr 20, 2010

The rule of thumb is to use XmlWriter when the document need only be written and not worked with in memory, and to use XmlDocument (or the DOM) where you do need to work with it in memory.

Remember though, XmlWriter implements IDisposable, so do the following:

using (XmlWriter _XmlTextWriterObject = XmlWriter.Create(_xmlFilePath))
{
   // Code to do the write here
}
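
For the export scenario in the question, the body of that using block might look something like the sketch below. The IDataReader parameter and the "Records"/"Record" element names are only illustrative, and it assumes the column names are valid XML element names:

static void ExportToXml(IDataReader reader, string xmlFilePath)
{
   // Dispose flushes and closes the underlying file for us
   using (XmlWriter writer = XmlWriter.Create(xmlFilePath,
      new XmlWriterSettings { Indent = true }))
   {
      writer.WriteStartDocument();
      writer.WriteStartElement("Records");        // root element

      while (reader.Read())
      {
         writer.WriteStartElement("Record");      // one element per row
         for (int i = 0; i < reader.FieldCount; i++)
         {
            writer.WriteElementString(reader.GetName(i),
               Convert.ToString(reader.GetValue(i)));
         }
         writer.WriteEndElement();
      }

      writer.WriteEndElement();
      writer.WriteEndDocument();
   }
}

Because the writer streams each row straight to disk, memory use stays flat no matter how many records you export, which is the main reason to prefer it over building the whole document in memory first.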