I have an SSIS package that exports 2.5 GB of data (about 10 million records) into a SQL Server database that has 10 partitions, including the PRIMARY filegroup.
Before changing the default Maximum insert commit size (2147483647) and Rows per batch, the transfer took 7 minutes to complete with the fast load option.
After changing them to reasonable values based on the formulas below, the execution finished in only 2 minutes.
FYI: DefaultBufferMaxRows and DefaultBufferSize were left at their default values in both scenarios, i.e. 10,000 rows and 10 MB respectively.
The following calculations were used to derive Maximum insert commit size and Rows per batch.
1) Calculated the length of a record coming from the source, which is around 1038 bytes (a query-based check is sketched after this list):
CREATE TABLE [dbo].[Game_DATA2](
[ID] [int] IDENTITY(1,1) NOT NULL,       -- auto calculated
[Number] [varchar](255) NOT NULL,        -- 255 bytes
[AccountTypeId] [int] NOT NULL,          -- 4 bytes
[Amount] [float] NOT NULL,               -- 8 bytes
[CashAccountNumber] [varchar](255) NULL, -- 255 bytes
[StartDate] [datetime] NULL,             -- 8 bytes
[Status] [varchar](255) NOT NULL,        -- 255 bytes
[ClientCardNumber] [varchar](255) NULL   -- 255 bytes
)
2) Rows per batch = packet size / bytes per record = 32767 / 1038 = 32 approx.
3) Maximum insert commit size = packet size * number of transactions = 32767 * 100 = 3276700 (packet size and number of transactions are variables and can be changed as per requirement).
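As a sanity check for step 1, the declared row size can also be read from the catalog views rather than counted by hand. Below is a minimal sketch: the table name comes from the question, and the packet size of 32767 and the 100 transactions come from steps 2 and 3. Note that sys.columns reports float and datetime as 8 bytes each, so the total can differ slightly from the hand count.

-- Sum the declared maximum column lengths for the destination table
DECLARE @bytes_per_record int =
(
    SELECT SUM(c.max_length)
    FROM sys.columns AS c
    WHERE c.object_id = OBJECT_ID(N'dbo.Game_DATA2')
);

DECLARE @packet_size  int = 32767;  -- network packet size used in the formulas
DECLARE @transactions int = 100;    -- chosen per requirement, as in step 3

SELECT
    @bytes_per_record                AS bytes_per_record,
    @packet_size / @bytes_per_record AS rows_per_batch,          -- step 2
    @packet_size * @transactions     AS max_insert_commit_size;  -- step 3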
Questions:
Is there any relationship between Rows per batch and Maximum insert commit size? There is no information about this in the archived article on tuning DFT (Data Flow Task) execution.
Do these settings work together with DefaultBufferSize and DefaultBufferMaxRows? If yes, how?
These parameters apply to the DFT OLE DB Destination only. The OLE DB Destination issues an insert bulk command, and the two parameters control it in the following way: Rows per batch supplies the ROWS_PER_BATCH hint (an estimate of the number of incoming rows), while Maximum insert commit size sets how many rows are committed together as one batch/transaction.
BULK INSERT (Transact-SQL) - the MS article on this command describes the equivalent options.
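As an illustration of the analogy, here is a hedged sketch of a stand-alone BULK INSERT using the two documented options that play the corresponding roles; the file path and field terminator are made up for the example, only the option names and the values from the question are real.

-- Illustration only: the two options that correspond conceptually to the
-- OLE DB Destination settings discussed above
BULK INSERT dbo.Game_DATA2
FROM 'C:\load\game_data.csv'    -- hypothetical source file
WITH
(
    FIELDTERMINATOR = ',',      -- hypothetical file format
    ROWS_PER_BATCH  = 32,       -- row-count hint, like Rows per batch
    BATCHSIZE       = 3276700   -- rows committed per transaction, like Maximum insert commit size
);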
DefaultBufferSize and DefaultBufferMaxRows control RAM buffer management inside the DFT itself and do not interfere with the options mentioned above.
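To see why the two sets of settings are independent, it helps to work out the pipeline buffer numbers from the question. Assuming the defaults mentioned there (DefaultBufferSize = 10 MB = 10,485,760 bytes, DefaultBufferMaxRows = 10,000) and roughly 1038 bytes per record, a quick sketch of the arithmetic:

-- DFT pipeline buffer sizing; unrelated to the destination's
-- Rows per batch / Maximum insert commit size options
DECLARE @default_buffer_size int = 10485760; -- 10 MB default
DECLARE @default_buffer_rows int = 10000;    -- DefaultBufferMaxRows default
DECLARE @bytes_per_record    int = 1038;     -- from the question

SELECT
    @default_buffer_size / @bytes_per_record AS rows_that_fit_in_10mb, -- about 10101
    CASE WHEN @default_buffer_size / @bytes_per_record < @default_buffer_rows
         THEN @default_buffer_size / @bytes_per_record
         ELSE @default_buffer_rows
    END AS rows_per_pipeline_buffer;         -- the smaller limit applies, here 10000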