How to copy a huge table data into another table in SQL Server

sqlchild · Mar 14, 2011 · Viewed 113.6k times

I have a table with 3.4 million rows. I want to copy all of this data into another table.

I am performing this task with the query below:

select * 
into new_items 
from productDB.dbo.items

I need to know the best possible way to do this task.

Answer

Mathieu Longtin · Sep 8, 2011

I had the same problem, except I have a table with 2 billion rows, so the log file would grow without end if I did this, even with the recovery model set to bulk-logged:

insert into newtable select * from oldtable
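For reference, switching the database to the bulk-logged recovery model looks like this (the database name here is just a placeholder taken from the question):

-- Example only: bulk-logged reduces logging for bulk operations,
-- but as noted above it did not solve the problem by itself
ALTER DATABASE productDB SET RECOVERY BULK_LOGGED;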

So I operate on blocks of data. This way, if the transfer is interrupted, you just restart it. Also, you don't need a log file as big as the table. You also seem to get less tempdb I/O, though I'm not sure why.

set identity_insert newtable on
DECLARE @StartID bigint, @LastID bigint, @EndID bigint

-- Resume from wherever the last run stopped (or from 1 on a fresh copy)
select @StartID = ISNULL(max(id), 0) + 1
from newtable

select @LastID = max(id)
from oldtable

while @StartID <= @LastID
begin
    -- Copy one block of roughly a million rows per iteration
    set @EndID = @StartID + 1000000

    insert into newtable (FIELDS,GO,HERE)
    select FIELDS,GO,HERE
    from oldtable WITH (NOLOCK)
    where id BETWEEN @StartID AND @EndID

    set @StartID = @EndID + 1
end
set identity_insert newtable off
go
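Once the loop finishes, a quick sanity check (not part of the original answer) is to compare row counts between the two tables:

-- Both counts should match when the copy is complete
select (select count_big(*) from oldtable) as old_rows,
       (select count_big(*) from newtable) as new_rows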

You might need to change how you deal with IDs; this works best if your table is clustered by ID.
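If you are creating the target table yourself, a minimal sketch of a layout that suits this approach is a clustered primary key on the identity column (the column list here is an assumption; mirror oldtable's real columns):

create table newtable
(
    id        bigint identity(1,1) not null,
    -- FIELDS, GO, HERE: the remaining columns, copied from oldtable's definition
    somefield varchar(100) null,
    constraint PK_newtable primary key clustered (id)
)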