How to export large SQL Server table into a CSV file using the FileHelpers library?

mircea · Oct 18, 2013

I'm looking to export a large SQL Server table into a CSV file using C# and the FileHelpers library. I could also consider C# with bcp, but FileHelpers seems more flexible than bcp. Speed is not a special requirement. An OutOfMemoryException is thrown at storage.ExtractRecords() when the code below is run (some less essential code has been omitted):

    SqlServerStorage storage = new SqlServerStorage(typeof(Order));
    storage.ServerName = "SqlServer";
    storage.DatabaseName = "SqlDataBase";
    storage.SelectSql = "select * from Orders";
    storage.FillRecordCallback = new FillRecordHandler(FillRecordOrder);
    Order[] output = storage.ExtractRecords() as Order[];

When the code below is run, a 'Timeout expired' exception is thrown at link.ExtractToFile():

    SqlServerStorage storage = new SqlServerStorage(typeof(Order));
    string sqlConnectionString = "Server=SqlServer;Database=SqlDataBase;Trusted_Connection=True";
    storage.ConnectionString = sqlConnectionString;
    storage.SelectSql = "select * from Orders";
    storage.FillRecordCallback = new FillRecordHandler(FillRecordOrder);
    FileDataLink link = new FileDataLink(storage);
    link.FileHelperEngine.HeaderText = headerLine;
    link.ExtractToFile("file.csv");

The SQL query takes longer than the default 30 seconds to run, hence the timeout exception. Unfortunately, I can't find anything in the FileHelpers docs about setting the SQL command timeout to a higher value.
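For comparison, plain ADO.NET does expose the timeout via SqlCommand.CommandTimeout (in seconds; 0 means no limit). A minimal sketch, assuming the same example connection string and Orders table as above:

    // Sketch only: raise the query timeout with plain ADO.NET and stream rows,
    // instead of materializing the whole table in memory.
    using (var connection = new SqlConnection("Server=SqlServer;Database=SqlDataBase;Trusted_Connection=True"))
    using (var command = new SqlCommand("select * from Orders", connection))
    {
        command.CommandTimeout = 600; // raise from the 30-second default
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // process one row at a time here
            }
        }
    }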

I could loop an SQL SELECT over small batches until the whole table is exported, but that procedure would be too complicated. Is there a straightforward way to use FileHelpers to export large DB tables?

Answer

Jay Sullivan · Jan 2, 2014

Rei Sivan's answer is on the right track: it will scale well to large files because it avoids reading the entire table into memory. However, the code can be cleaned up.

shamp00's solution requires external libraries.

Here is a simpler table-to-CSV-file exporter that will scale well to large files, and does not require any external libraries:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Linq;

public class TableDumper
{
    public void DumpTableToFile(SqlConnection connection, string tableName, string destinationFile)
    {
        // Note: set command.CommandTimeout here if the query exceeds the 30-second default.
        using (var command = new SqlCommand("select * from " + tableName, connection))
        using (var reader = command.ExecuteReader())
        using (var outFile = File.CreateText(destinationFile))
        {
            string[] columnNames = GetColumnNames(reader).ToArray();
            int numFields = columnNames.Length;
            outFile.WriteLine(string.Join(",", columnNames));
            if (reader.HasRows)
            {
                // Stream one row at a time via the data reader, so the whole
                // table is never held in memory.
                while (reader.Read())
                {
                    // Quote every value and escape embedded quotes by doubling
                    // them (RFC 4180 style).
                    string[] columnValues =
                        Enumerable.Range(0, numFields)
                                  .Select(i => reader.GetValue(i).ToString())
                                  .Select(field => string.Concat("\"", field.Replace("\"", "\"\""), "\""))
                                  .ToArray();
                    outFile.WriteLine(string.Join(",", columnValues));
                }
            }
        }
    }

    private IEnumerable<string> GetColumnNames(IDataReader reader)
    {
        // Read column names from the reader's schema table.
        foreach (DataRow row in reader.GetSchemaTable().Rows)
        {
            yield return (string)row["ColumnName"];
        }
    }
}
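
Usage is a one-liner; a hypothetical call, where the connection string, table name, and output path are placeholders:

    using (var connection = new SqlConnection("Server=SqlServer;Database=SqlDataBase;Trusted_Connection=True"))
    {
        connection.Open();
        // Dump the Orders table to file.csv, header row included.
        new TableDumper().DumpTableToFile(connection, "Orders", "file.csv");
    }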

I wrote this code, and declare it CC0 (public domain).