I'm migrating a large amount of data (40M+ rows) to a new version of the tables. My question is: what would be the best way to improve performance, in terms of logging, execution time, and so on?
I have considered the checklist below:
- Set the recovery model to bulk-logged. After bulk importing the data, switch the recovery model back to full.
- Load the data with the `TABLOCK` hint, so the statement takes a single table lock (and can qualify for minimal logging) instead of many row locks during execution.
- Set `MAXDOP` to 30 for the migration and change it back to normal afterwards.
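The checklist above could be sketched roughly as follows. This is a hedged sketch, not a tested script: `[MyDb]` is a placeholder database name, the column list is shortened, and minimal logging with `TABLOCK` only applies under certain conditions (e.g. inserting into a heap, or an empty clustered index):

```sql
-- Assumption: [MyDb] stands in for the actual database name.
ALTER DATABASE [MyDb] SET RECOVERY BULK_LOGGED;

-- TABLOCK takes one table-level lock and can enable minimally logged inserts;
-- OPTION (MAXDOP 30) raises parallelism for this statement only, avoiding a
-- server-wide sp_configure change.
INSERT INTO [dbo].[Order] WITH (TABLOCK)
       ([CREATED], [number], [Owner_FK])
SELECT  [CREATED], [number], [Owner_FK]
FROM    [Order_old]
OPTION (MAXDOP 30);

ALTER DATABASE [MyDb] SET RECOVERY FULL;
-- Take a log backup after switching back: the bulk-logged operations must be
-- backed up before point-in-time restore is possible again.
```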
Examples of the statements:

```sql
/* Insert example */
INSERT INTO [dbo].[Order]
       ([CREATED], [LAST_MODIFIED], [number], [Owner_FK], [Owner2_FK], [Owner3_FK])
SELECT  [CREATED], [LAST_MODIFIED], [number], [Owner_FK], [Owner2_FK], [Owner3_FK]
FROM    [Order_old]
WHERE   [Owner_FK]  IS NOT NULL
    OR  [Owner2_FK] IS NOT NULL
    OR  [Owner3_FK] IS NOT NULL;

/* Delete example */
DELETE Ord
FROM   [Order] Ord
WHERE  [Owner_FK]  IS NOT NULL
   OR  [Owner2_FK] IS NOT NULL
   OR  [Owner3_FK] IS NOT NULL;

/* Update example */
UPDATE Order_Type
SET    [TYPE] = NULL
WHERE  [TYPE] = '';
```

I thought about Integration Services, but that is a limitation I have. I also had a look at the BCP tool, but it seems there is no way to transfer data from one table to another unless we export it to a file and then import it from that file.
```sql
UPDATE F
SET    F.UpdateDate = GETDATE()
FROM   FactOnlineSales F
WHERE (LoadDate  IS NULL            --<NORMAL FILTERS>
    OR ETLLoadID IS NULL
    OR UpdateDate IS NULL)
  AND F.OnlineSalesKey > @MaxKey
  AND F.OnlineSalesKey <= @BatchMaxN;
```

Note the parentheses around the `OR` group: without them, `AND` binds tighter than `OR`, so the key-range filter would apply only to the last `OR` branch instead of to the whole predicate.
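The batched `UPDATE` above assumes a key-range loop around it that advances `@MaxKey` and `@BatchMaxN`. A minimal sketch (the variable names are taken from the snippet, but the batch size of 1,000,000 is an assumption):

```sql
DECLARE @MaxKey    BIGINT = 0,
        @BatchSize BIGINT = 1000000,   -- assumed batch size; tune to your log capacity
        @BatchMaxN BIGINT;

SET @BatchMaxN = @MaxKey + @BatchSize;

-- Walk the table in key ranges so each transaction (and its log usage) stays small.
WHILE EXISTS (SELECT 1 FROM FactOnlineSales WHERE OnlineSalesKey > @MaxKey)
BEGIN
    UPDATE F
    SET    F.UpdateDate = GETDATE()
    FROM   FactOnlineSales F
    WHERE (LoadDate IS NULL OR ETLLoadID IS NULL OR UpdateDate IS NULL)
      AND F.OnlineSalesKey > @MaxKey
      AND F.OnlineSalesKey <= @BatchMaxN;

    SET @MaxKey    = @BatchMaxN;
    SET @BatchMaxN = @MaxKey + @BatchSize;
END;
```

Batching this way keeps each transaction short and gives the log a chance to truncate (or be backed up) between batches, instead of one 40M-row transaction.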