Line breaks in generated csv file driving me crazy

Melle Groenewoud · May 10, 2011

I'm trying to export some data I have (stored in a DataTable). Some of the values contain a line break. Every time I try to import the file into Excel (2010), the line breaks are recognised as new rows instead of actual line breaks.

I've searched for hours and seen many solutions, but I just can't seem to get it fixed.

This is how I output my CSV file (the variable csvfile is a StringBuilder):

context.Response.Clear();
context.Response.ContentType = "text/csv";
context.Response.ContentEncoding = System.Text.Encoding.UTF8;
context.Response.AppendHeader("Content-Disposition", "attachment; filename=" + name + ".csv");
context.Response.Write(csvfile.ToString());
context.Response.End();

When I open it in Excel manually, it displays fine. But because Excel 2003 doesn't support the file format, I have to import it, and the import treats the line breaks (\n in the fields) as new rows.

Unfortunately I can't give you an example of the real data I work with (it's all personal data), but I can give an example of how it goes wrong:

Header1,Header2,Header3
"value1","value2","value 3
and this is where its going wrong"

It's a simple CSV file, and when you import it you'll see where it goes wrong. I wrap fields in double quotation marks by default, and I also strip leading spaces from values by default.
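In case it helps, this is roughly how the StringBuilder gets filled. The names (BuildCsv, Quote) are placeholders rather than my real code, but the quoting is the same: trim leading spaces, double any embedded quotes, and wrap every field in double quotes so that embedded line breaks stay inside the quotes.

using System.Data;
using System.Linq;
using System.Text;

static string Quote(object value)
{
    // Trim leading spaces, double embedded quotes, wrap the field in quotes
    string s = (value ?? "").ToString().TrimStart();
    return "\"" + s.Replace("\"", "\"\"") + "\"";
}

static StringBuilder BuildCsv(DataTable table)
{
    var csvfile = new StringBuilder();

    // Header row
    csvfile.AppendLine(string.Join(",",
        table.Columns.Cast<DataColumn>().Select(c => Quote(c.ColumnName)).ToArray()));

    // Data rows -- a \n inside a field stays inside its quotes
    foreach (DataRow row in table.Rows)
    {
        csvfile.AppendLine(string.Join(",", row.ItemArray.Select(Quote).ToArray()));
    }

    return csvfile;
}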

I've spent at least two days on this seemingly simple problem, but for the life of me I can't figure out how to fix it. I've seen multiple topics on this same problem, but none of the solutions offered there seem to work.

Answer

Ben Schwehn · May 10, 2011

This works for me:

a) Setting Response.ContentEncoding = System.Text.Encoding.UTF8 isn't enough to make Excel open UTF-8 files correctly. In addition, you have to manually write a byte-order mark (BOM) at the start of the file:

if (UseExcel2003Compatibility)
{
    // Write a UTF-16 BOM, even though we export as UTF-8. Wrong, but *I think* the only thing Excel 2003 understands
    response.Write('\uFEFF');
}
else
{
    // Use the correct UTF-8 BOM. Works in Excel 2008 and should be compatible with all other editors
    // capable of reading UTF-8 files
    byte[] bom = new byte[] { 0xEF, 0xBB, 0xBF };
    response.BinaryWrite(bom);
}
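As an aside, Encoding.UTF8.GetPreamble() returns that same three-byte sequence, so if you prefer you can let the framework supply the BOM in the else branch:

// GetPreamble() on Encoding.UTF8 returns the UTF-8 BOM bytes 0xEF 0xBB 0xBF
byte[] bom = System.Text.Encoding.UTF8.GetPreamble();
response.BinaryWrite(bom);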

b) Send the file as an octet stream, use a filename with a .csv extension, and do quote the filename as required by the HTTP spec:

response.ContentType = "application/octet-stream";
response.AppendHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");

c) Use double quotes around all fields.

I just checked, and for me Excel opens files downloaded this way correctly, including fields with line breaks.
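For reference, here is a rough sketch that puts a), b) and c) together. The names (WriteCsv, fileName, useExcel2003Compatibility) are only illustrative; adapt them to your own handler.

using System.Text;
using System.Web;

// Illustrative only: combines the BOM, content type and quoted filename from above
static void WriteCsv(HttpResponse response, string fileName, StringBuilder csvfile, bool useExcel2003Compatibility)
{
    response.Clear();
    response.ContentType = "application/octet-stream";               // b) octet-stream instead of text/csv
    response.ContentEncoding = Encoding.UTF8;
    response.AppendHeader("Content-Disposition",
        "attachment; filename=\"" + fileName + "\"");                // b) quoted filename ending in .csv

    if (useExcel2003Compatibility)
    {
        response.Write('\uFEFF');                                    // a) BOM character for old Excel
    }
    else
    {
        response.BinaryWrite(Encoding.UTF8.GetPreamble());           // a) proper UTF-8 BOM
    }

    response.Write(csvfile.ToString());                              // c) fields already double-quoted
    response.End();
}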

But note that Excel still won't open such a CSV correctly on systems whose default list separator is not ",". For example, if a user runs Excel on a Windows system with German regional settings, Excel will not open the file correctly, because it expects a semicolon instead of a comma as the separator. I don't think there is anything that can be done about that.