I'm migrating a large amount of data (40M+ rows) to a new version of the tables. My question is: what is the best way to improve performance, in terms of logging, execution time, etc.?
I have considered the checklist below:
Set the recovery model to bulk-logged. After bulk importing the data, switch the recovery model back to full.
Load data with the TABLOCK hint to take a table-level lock and allow minimally logged inserts.
Set MAXDOP to 30 for the migration and change it back to normal afterwards.
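Putting the checklist together, the switch-import-switch sequence would look roughly like this (a sketch only; the database name [MyDb] is a placeholder for my actual database, and minimal logging also depends on the target table being a heap or empty):

```sql
-- Placeholder database name; adjust to the real environment.
ALTER DATABASE [MyDb] SET RECOVERY BULK_LOGGED;

-- TABLOCK takes a table-level lock on the target, which is one of the
-- prerequisites for minimal logging; MAXDOP caps parallelism per query.
INSERT INTO [dbo].[Order] WITH (TABLOCK)
       ([CREATED], [LAST_MODIFIED], [number], [Owner_FK], [Owner2_FK], [Owner3_FK])
SELECT [CREATED], [LAST_MODIFIED], [number], [Owner_FK], [Owner2_FK], [Owner3_FK]
FROM   [Order_old]
WHERE  [Owner_FK]  IS NOT NULL
   OR  [Owner2_FK] IS NOT NULL
   OR  [Owner3_FK] IS NOT NULL
OPTION (MAXDOP 30);

-- Switch back when the bulk load is done, then take a log backup
-- to restart the log chain under FULL recovery.
ALTER DATABASE [MyDb] SET RECOVERY FULL;
```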
Example of the statements:
/*Insert Example*/
INSERT INTO [dbo].[Order]
       ([CREATED], [LAST_MODIFIED], [number], [Owner_FK], [Owner2_FK], [Owner3_FK])
SELECT [CREATED], [LAST_MODIFIED], [number], [Owner_FK], [Owner2_FK], [Owner3_FK]
FROM   [Order_old]
WHERE  [Owner_FK]  IS NOT NULL
   OR  [Owner2_FK] IS NOT NULL
   OR  [Owner3_FK] IS NOT NULL

/*Delete Example*/
DELETE Ord
FROM   [Order] Ord
WHERE  [Owner_FK]  IS NOT NULL
   OR  [Owner2_FK] IS NOT NULL
   OR  [Owner3_FK] IS NOT NULL

/*Update Example*/
UPDATE Order_Type
SET    [TYPE] = NULL
WHERE  [TYPE] = ''

I thought about Integration Services, but not being able to use it is a limitation I have. I also had a look at the BCP tool, but it seems there is no way to transfer data from one table to another unless we export it to a file and then import it from that file.
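For the DELETE and UPDATE, I have also been thinking about running them in batches, so that each transaction stays small and the log can truncate between batches (a sketch; the batch size of 10,000 is an assumption to be tuned):

```sql
-- Delete in small batches so each transaction commits quickly and
-- the transaction log can be reused between batches.
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (10000) Ord
    FROM   [Order] Ord
    WHERE  [Owner_FK]  IS NOT NULL
       OR  [Owner2_FK] IS NOT NULL
       OR  [Owner3_FK] IS NOT NULL;

    SET @rows = @@ROWCOUNT;  -- loop ends when no rows remain to delete
END
```

The same TOP (n) loop pattern would apply to the UPDATE that sets empty [TYPE] values to NULL.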