Million inserts: SqlBulkCopy timeout

billybob · Dec 9, 2014 · Viewed 12.6k times

We already have a running system that handles all connection strings (DB2, Oracle, SQL Server).

Currently, we are using ExecuteNonQuery() to do some inserts.

We want to improve performance by using SqlBulkCopy() instead of ExecuteNonQuery(). Some of our clients have more than 50 million records.

We don't want to use SSIS, because our system supports multiple databases.

I created a sample project to test the performance of SqlBulkCopy(). I wrote a simple read-and-insert function for SQL Server.

Here's the small function:

public void insertIntoSQLServer()
{
    using (SqlConnection sourceConnection = new SqlConnection(_sourceConnectionString))
    {
        //Open the connection to get the data from the source table
        sourceConnection.Open();
        using (SqlCommand command = new SqlCommand("select * from " + _sourceSchemaName + "." + _sourceTableName + ";", sourceConnection))
        {
            //Read from the source table
            command.CommandTimeout = 2400;
            using (SqlDataReader reader = command.ExecuteReader())
            using (SqlConnection destinationConnection = new SqlConnection(_destinationConnectionString))
            {
                destinationConnection.Open();
                //Clean the destination table
                using (SqlCommand deleteCommand = new SqlCommand("delete from " + _destinationSchemaName + "." + _destinationTableName + ";", destinationConnection))
                {
                    deleteCommand.ExecuteNonQuery();
                }

                using (SqlBulkCopy bc = new SqlBulkCopy(destinationConnection))
                {
                    bc.DestinationTableName = string.Format("[{0}].[{1}]", _destinationSchemaName, _destinationTableName);
                    bc.NotifyAfter = 10000;
                    //bc.SqlRowsCopied += bc_SqlRowsCopied;
                    bc.WriteToServer(reader);
                }
            }
        }
    }
}

When I have fewer than 200,000 records in my dummy table, the bulk copy works fine. But when there are more than 200,000 records, I get one of the following errors:

  • Attempt to invoke bulk copy on an object that has a pending operation.

OR

  • The wait operation timed out (for the IDataReader)

I increased the CommandTimeout for the reader, which seems to have solved the timeout issue related to the IDataReader.

Am I doing something wrong in the code?

Answer

Spock picture Spock · Dec 16, 2014

Can you try adding the following before the call to WriteToServer?

bc.BatchSize = 10000;
bc.BulkCopyTimeout = 0;

The default BulkCopyTimeout is 30 seconds, and the default BatchSize is 0, which means the entire result set is sent as a single batch, so with this many rows the copy easily exceeds the timeout. Setting BulkCopyTimeout = 0 disables the timeout entirely. Hope that helps.

Also, you can try experimenting with different batch sizes for optimal performance.
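
For context, here is a minimal sketch of where those two settings would slot into the asker's original using block (BatchSize and BulkCopyTimeout are real SqlBulkCopy properties; the field names and NotifyAfter value are taken from the question's code):

```csharp
using (SqlBulkCopy bc = new SqlBulkCopy(DestinationConnection))
{
    bc.DestinationTableName = string.Format("[{0}].[{1}]", _destinationSchemaName, _destinationTableName);
    bc.NotifyAfter = 10000;
    bc.BatchSize = 10000;      // commit every 10,000 rows instead of one giant batch
    bc.BulkCopyTimeout = 0;    // 0 disables the timeout (the default is 30 seconds)
    bc.WriteToServer(reader);
}
```

With a nonzero BatchSize, each batch is committed to the server separately, so a failure partway through rolls back only the current batch rather than the entire copy.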