SQL Sam and the Never-Ending Delete - Solution!
"It's not alien abduction, psycho-kinesis, or some other unexplained phenomona?" said Nullder, sounding disappointed.
"No, it's really much simpler than that," said Sam. "First, you said you were deleting about a million rows from a five-million row table. You're doing it with one DELETE, which makes it an implicit transaction that must succeed in its entirety. So, SQL Server has to log the deletion of a million records before it can actually commit. That's why you ran out of log space at first."
"But we had Truncate on Checkpoint turned on," said Chuck.
"True," said Sam. "But the truncate process can only remove committed transactions. Your single transaction can't be committed until all 1 million rows are logged and written to the database, so Truncate on Checkpoint won't help you in this case.
"Second, the way the query is structured, using a LIKE clause that begins with a % wildcard, guarantees that the query optimizer cannot use any index on the table. So, it has to do a table scan to find the rows to delete. With a large table like this, that adds to the time it takes to run the query. The query plan confirmed this."
"What about that sp_who2 that showed an INSERT, not a DELETE?" asked Nully.
"That's why I ran the query plan," said Sam. "The plan proved that the DELETE wasn't just a simple DELETE. There's a trigger firing, too."
Sam pulled up the trigger defined for the table. It looked like this:
CREATE TRIGGER Update_Customers ON dbo.Customers
FOR UPDATE, DELETE
AS
DECLARE @deleted_num char(25)

SELECT @deleted_num = cust_num FROM DELETED

IF @deleted_num IS NOT NULL
    INSERT INTO Updated_Customers (cust_num)
    (SELECT cust_num FROM deleted
     WHERE cust_num NOT IN
        (SELECT cust_num FROM Updated_Customers))
"That's why the query plan was so complicated. As rows are being deleted, they are added to the Updated_Customers table. So, not only does SQL Server have to log the deletion of the records in the customers table, it has to see if they exist in the updated_customers table and then log those changes, too."
"Well, how do we make this work, Sam?" asked Chuck.
"If you don't need the trigger for this test environment, we can drop it. And, we can make the DELETE operation happen in smaller chunks so we don't overrun the log."
Chuck agreed. SQL Sam dropped the trigger, then rewrote the DELETE operation like this:
SET ROWCOUNT 10000

WHILE 1 = 1
BEGIN
    DELETE FROM customers
    WHERE cust_num LIKE '%8009______'

    IF @@ROWCOUNT = 0
        BREAK
    ELSE
        CHECKPOINT
END

SET ROWCOUNT 0  -- reset, so later statements in the session aren't limited
This broke the delete into batches of 10,000 rows each. Each batch committed as its own transaction, and the CHECKPOINT between batches let Truncate on Checkpoint reclaim the log space. With this change, and the trigger dropped, the million test rows were deleted in 15 minutes.
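As an aside: on SQL Server 2005 and later, SET ROWCOUNT is deprecated for data-modification statements, and the same chunking pattern is usually written with DELETE TOP instead. A sketch, using the same table and pattern as above:

```sql
WHILE 1 = 1
BEGIN
    -- Delete in batches of 10,000; each batch is its own transaction
    DELETE TOP (10000) FROM customers
    WHERE cust_num LIKE '%8009______'

    IF @@ROWCOUNT = 0 BREAK

    CHECKPOINT  -- give log truncation a chance between batches
END
```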