Exception of type 'System.OutOfMemoryException' was thrown. C# when using IDataReader

DA_Prog · Dec 24, 2012 · Viewed 31k times

I have an application in which I have to get a large amount of data from the database. Since it failed to get all of the rows at once (there are close to 2,000,000 of them), I split the work into chunks and run the SQL query repeatedly, fetching only 200,000 rows each time.

I use a DataTable into which I load all of the data (meaning all 2,000,000 rows should end up in it).

The first few runs are fine. Then it fails with the OutOfMemoryException.

My code works as follows:

    private static void RunQueryAndAddToDT(string sql, string lastRowID, SqlConnection conn, DataTable dt, int prevRowCount)
    {
        if (string.IsNullOrEmpty(sql))
        {
            sql = generateSqlQuery(lastRowID);
        }

        if (conn.State == ConnectionState.Closed)
        {
            conn.Open();
        }

        using (IDbCommand cmd2 = conn.CreateCommand())
        {
            cmd2.CommandType = CommandType.Text;
            cmd2.CommandText = sql;
            cmd2.CommandTimeout = 0;

            using (IDataReader reader = cmd2.ExecuteReader())
            {
                while (reader.Read())
                {
                    DataRow row = dt.NewRow();
                    row["RowID"] = reader["RowID"].ToString();
                    row["MyCol"] = reader["MyCol"].ToString();
                    ... //In one of these rows it returns the exception.

                    dt.Rows.Add(row);
                }
            }
        }

        if (conn != null)
        {
            conn.Close();
        }

        if (dt.Rows.Count > prevRowCount)
        {
            lastRowID = dt.Rows[dt.Rows.Count - 1]["RowID"].ToString();
            sql = string.Empty;
            RunQueryAndAddToDT(sql, lastRowID, conn, dt, dt.Rows.Count);
        }
    }

It seems to me as if the reader keeps holding on to rows, and that's why it throws the exception only on the second or third round.

Shouldn't the using block free the memory when it's done? What might solve my problem?

Note: I should explain that I have no choice but to get all of those rows into the DataTable, since I do some manipulation on them later and the order of the rows is important. I can't split the processing up, because sometimes I have to take the data from several rows and combine it into one row, and so on. So I can't give that up.

Thanks.

Answer

Shift Technology · Jun 26, 2013

Check that you are building a 64-bit process and not a 32-bit one, which is Visual Studio's default compilation mode. To do this, right-click your project, then Properties -> Build -> Platform target: x64. Like any 32-bit process, an application compiled as 32-bit is limited to 2 GB of virtual address space.

64-bit processes do not have this limitation: they use 64-bit pointers, so their theoretical maximum address space is 16 exabytes (2^64). In practice, Windows x64 limits the virtual memory of a process to 8 TB. The solution to the memory-limit problem is therefore to compile for 64-bit.
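As a quick sanity check, a small sketch like the one below (using only the standard Environment.Is64BitProcess, IntPtr.Size and Process.VirtualMemorySize64 APIs) can confirm at runtime which mode the process is actually running in and how much virtual memory it is consuming while the DataTable grows:

using System;
using System.Diagnostics;

class BitnessCheck
{
    static void Main()
    {
        // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit process.
        Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);
        Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
        Console.WriteLine("Pointer size:   {0} bytes", IntPtr.Size);

        // Virtual memory committed by this process; a 32-bit process starts
        // throwing OutOfMemoryException when this approaches roughly 2 GB.
        using (Process current = Process.GetCurrentProcess())
        {
            Console.WriteLine("Virtual memory: {0:N0} bytes", current.VirtualMemorySize64);
        }
    }
}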

However, the size of a single object in .NET is still limited to 2 GB by default. You will be able to create several arrays whose combined size is greater than 2 GB, but by default you cannot create a single array bigger than 2 GB. Fortunately, if you still want to create arrays bigger than 2 GB, you can do so by adding the following to your app.config file:

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
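As a rough illustration, a tiny console program like the sketch below can confirm that the setting has taken effect: it tries to allocate a single 2.4 GB array, which should only succeed in a 64-bit build with gcAllowVeryLargeObjects enabled (and enough memory available). Note that even with the flag, each array dimension is still capped at roughly 2.1 billion elements, so the sketch uses a larger element type rather than more elements:

using System;

class LargeArrayDemo
{
    static void Main()
    {
        try
        {
            // 300,000,000 longs = 2.4 GB in a single object: over the default
            // 2 GB per-object limit, but under the per-dimension element cap
            // that still applies even with gcAllowVeryLargeObjects.
            long[] big = new long[300000000];
            Console.WriteLine("Allocated {0:N0} elements ({1:N0} bytes).",
                big.LongLength, big.LongLength * sizeof(long));
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine("Allocation failed - check Platform target and gcAllowVeryLargeObjects.");
        }
    }
}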