
I used SSMS to add a composite primary key to a table with 64 million rows.

Because the table was so large, I used the change script generated by SSMS. One of the PK fields (REVENUE_CODE) originally allowed NULLs, so the change script changed it to NOT NULL in the CREATE TABLE statement.

Here's a simplified version of the change script from SSMS:

BEGIN TRANSACTION
GO
-- PART 1 Succeeded
CREATE TABLE dbo.Tmp_Charges
(
    RECORD_ID        VARCHAR(12) NOT NULL,
    REVENUE_CODE     VARCHAR(4)  NOT NULL,
    UNITS_OF_SERVICE VARCHAR(7)  NULL
)
GO
-- PART 2 Succeeded
ALTER TABLE dbo.Tmp_Charges SET (LOCK_ESCALATION = TABLE)
GO
-- PART 3 Failed because there were NULL values in the REVENUE_CODE field
IF EXISTS (SELECT * FROM dbo.Charges)
    EXEC('INSERT INTO dbo.Tmp_Charges (RECORD_ID, REVENUE_CODE, UNITS_OF_SERVICE)
          SELECT RECORD_ID, REVENUE_CODE, UNITS_OF_SERVICE
          FROM dbo.Charges WITH (HOLDLOCK TABLOCKX)')
GO
-- PART 4 Succeeded
DROP TABLE dbo.Charges
GO
-- PART 5 Succeeded
EXECUTE sp_rename N'dbo.Tmp_Charges', N'Charges', 'OBJECT'
GO
-- PART 6 Succeeded
ALTER TABLE dbo.Charges ADD CONSTRAINT PK_Charges
    PRIMARY KEY CLUSTERED (RECORD_ID, REVENUE_CODE)
GO
COMMIT

When I ran the script, Parts 1 and 2 succeeded, but Part 3 failed because there were NULL values in the REVENUE_CODE field that I didn't know about or check for.

I assumed the entire transaction would fail and the changes would be rolled back, but that's not what happened. All the other parts succeeded, and I was left with an empty table.

What did I misunderstand about this script? Why weren't the changes rolled back?

  • If you ever do need multiple batches (so removing GO doesn't work for you), you can enable SQLCMD mode in SSMS and add :on error exit to the top of the script to stop it from continuing on and executing subsequent batches. Still use XACT_ABORT, though. Commented Sep 12 at 18:08

3 Answers


ready steady

There are noble attempts at answers, with some reasonable advice in them, but the main reason your script kept going is that you have a GO after every step. Even if you SET XACT_ABORT ON;, it will keep going.

In SSMS, GO acts as a batch separator, so the statements between the GOs are treated as separate batches, each unaffected by errors in the previous batch.

compare

In this script, every command is followed by GO:

SET XACT_ABORT ON;

CREATE TABLE #t
(
    id integer PRIMARY KEY CLUSTERED
);
GO

INSERT #t ( id )
VALUES (NULL);
GO

DROP TABLE #t;
GO

SELECT t.*
FROM #t AS t;
GO

The error messages that come back are:

Msg 515, Level 16, State 2, Line 10 Cannot insert the value NULL into column 'id', table 'tempdb.dbo.#t__000000000198'; column does not allow nulls. INSERT fails.

And...

Msg 208, Level 16, State 0, Line 21 Invalid object name '#t'.

Meaning the DROP TABLE was run, and the SELECT attempted to retrieve data from an object that no longer exists.

contrast

In this script, there is one GO at the end for absolutely no good reason.

SET XACT_ABORT ON;

CREATE TABLE #t
(
    id integer PRIMARY KEY CLUSTERED
);

INSERT #t ( id )
VALUES (NULL);

DROP TABLE #t;

SELECT t.*
FROM #t AS t;
GO

This time, we only get one error message back:

Msg 515, Level 16, State 2, Line 34 Cannot insert the value NULL into column 'id', table 'tempdb.dbo.#t__00000000019A'; column does not allow nulls. INSERT fails.

Execution stops when the error is hit on the INSERT: the DROP TABLE doesn't run, and neither does the SELECT. If you run the SELECT independently after receiving the error, you can confirm that the #temp table was not dropped in the batch. It's there and empty.

Merry Christmas.

  • So, because of the GO batch separator, it didn't matter that all the commands were enclosed within a TRANSACTION? It's frustrating that SSMS would generate code like this, but it's my responsibility to make sure it's going to work correctly. Commented Sep 10 at 14:13
  • @Daniel you have comprehended the answer correctly! Commented Sep 10 at 15:27

XACT_ABORT is your friend here:

Specifies whether SQL Server automatically rolls back the current transaction when a Transact-SQL statement raises a run-time error.

You have to explicitly set it ON (SET XACT_ABORT ON) before beginning the transaction if you want the entire transaction to abort and roll back whenever any statement within the scope of the transaction throws an error:

When SET XACT_ABORT is ON, if a Transact-SQL statement raises a run-time error, the entire transaction is terminated and rolled back.

Otherwise, this is the default behavior you're experiencing (when not in the scope of a trigger):

When SET XACT_ABORT is OFF, in some cases only the Transact-SQL statement that raised the error is rolled back and the transaction continues processing...

...OFF is the default setting in a T-SQL statement, while ON is the default setting in a trigger.

In some severe cases, the entire transaction does get rolled back even when XACT_ABORT is off:

Depending upon the severity of the error, the entire transaction may be rolled back even when SET XACT_ABORT is OFF

But again, the default is usually the behavior you're experiencing.
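A minimal sketch of that default behavior, run as a single batch with XACT_ABORT OFF (the #demo table is illustrative): the failing INSERT is rolled back on its own, but the transaction stays open and commits the surviving row.

```sql
SET XACT_ABORT OFF;  -- the default in an ordinary session

BEGIN TRANSACTION;

CREATE TABLE #demo ( id integer NOT NULL );

INSERT #demo ( id ) VALUES (NULL);  -- Msg 515: only this statement is rolled back

INSERT #demo ( id ) VALUES (1);     -- still runs; the transaction is still open

COMMIT TRANSACTION;

SELECT COUNT(*) FROM #demo;         -- the second row survived the failed statement
```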


Yes, and see also Erik's answer, with its infinite wisdom and astute observations. GO is a batch separator recognized by SSMS (and sqlcmd); it is not a T-SQL statement. Each batch runs independently of the previous batch's success or failure, so you would also need to remove each of those GOs. A single one at the end is sufficient.
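Putting the two together, a sketch of the change script made atomic: XACT_ABORT ON, everything in one batch, and a single GO at the very end. This is an outline only, not tested against the actual table.

```sql
SET XACT_ABORT ON;  -- any run-time error now terminates and rolls back the transaction

BEGIN TRANSACTION;

CREATE TABLE dbo.Tmp_Charges
(
    RECORD_ID        VARCHAR(12) NOT NULL,
    REVENUE_CODE     VARCHAR(4)  NOT NULL,
    UNITS_OF_SERVICE VARCHAR(7)  NULL
);

ALTER TABLE dbo.Tmp_Charges SET (LOCK_ESCALATION = TABLE);

-- If this fails on a NULL REVENUE_CODE, nothing below it ever runs
IF EXISTS (SELECT * FROM dbo.Charges)
    EXEC('INSERT INTO dbo.Tmp_Charges (RECORD_ID, REVENUE_CODE, UNITS_OF_SERVICE)
          SELECT RECORD_ID, REVENUE_CODE, UNITS_OF_SERVICE
          FROM dbo.Charges WITH (HOLDLOCK TABLOCKX);');

DROP TABLE dbo.Charges;

EXECUTE sp_rename N'dbo.Tmp_Charges', N'Charges', 'OBJECT';

ALTER TABLE dbo.Charges
    ADD CONSTRAINT PK_Charges PRIMARY KEY CLUSTERED (RECORD_ID, REVENUE_CODE);

COMMIT TRANSACTION;
GO
```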


I think that your issue is the absence of a rollback statement (and error handling). Since you explicitly created the transaction, you need to do the error handling / rollback work as well.

Unless this is going to be a repeatable process, I am not 100% sure why you put all this in a transaction to begin with.

Why did you not create the table with the PK in place? You could then just bulk insert into the new table with a WHERE clause that skips the records where REVENUE_CODE IS NULL, and deal with those separately.
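That alternative might be sketched like this (the dbo.Charges_New name is illustrative, not from the question):

```sql
-- Create the target with the PK already in place
CREATE TABLE dbo.Charges_New
(
    RECORD_ID        VARCHAR(12) NOT NULL,
    REVENUE_CODE     VARCHAR(4)  NOT NULL,
    UNITS_OF_SERVICE VARCHAR(7)  NULL,
    CONSTRAINT PK_Charges_New PRIMARY KEY CLUSTERED (RECORD_ID, REVENUE_CODE)
);

-- Load only the rows that can satisfy the PK
INSERT INTO dbo.Charges_New (RECORD_ID, REVENUE_CODE, UNITS_OF_SERVICE)
SELECT RECORD_ID, REVENUE_CODE, UNITS_OF_SERVICE
FROM dbo.Charges
WHERE REVENUE_CODE IS NOT NULL;  -- handle the NULL rows separately
```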

I would use the bulk insert so as not to blow up my tlog. In any case, if you need this to be in a transaction, use proper error handling and a rollback block.
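A minimal TRY/CATCH pattern with an explicit rollback might look like this (dbo.Target is a hypothetical table, purely for illustration):

```sql
SET XACT_ABORT ON;  -- belt and braces: doomed transactions always roll back

BEGIN TRY
    BEGIN TRANSACTION;

    INSERT INTO dbo.Target ( id ) VALUES (1);

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Undo everything if any statement in the TRY block failed
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    THROW;  -- re-raise the original error to the caller
END CATCH;
```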

  • The data came from an external source, and I didn't know much about the structure or content. Because the file is large (3+ GB, > 60 million rows), I needed to load it to MSSQL to explore the content. That's when I discovered the NULL values in the compound PK. I didn't want to reimport the data, so I used the SSMS change script to handle it for me... Commented Sep 10 at 14:27
  • "I think that your issue is the absence of a rollback statement (and error handling). Since you explicitly created the transaction, you need to do the error handling / rollback work as well." Not true at all, XACT_ABORT does error handling. The issue is splitting the batch. Commented Sep 10 at 17:14
