Deleting a large number of records from a database without locking it (for ages) #SQL

If you have a large database, with tables containing millions of rows, and you need to run a batch operation on it, like deleting lots of old records, then you'll quickly find that the operation will often lock the table and prevent users from getting at their data, as well as bloating the transaction log.
So, the trick is to break the operation down into manageable chunks. Let's say the operation I want to run is this:
delete from log where success=0
If the “log” table has millions of rows, then this will undoubtedly lock the table for as long as the statement is running, preventing new log entries from being written.
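You can actually watch this happen from a second query window while the big delete runs. Here's a rough sketch using the sys.dm_tran_locks DMV (the query itself is just an illustration); once lock escalation kicks in you'll typically see a single exclusive object-level lock held against the table.
-- run from a second session while the big delete is executing
select request_session_id,
       resource_type,
       request_mode,
       count(*) as lock_count
from sys.dm_tran_locks
where resource_database_id = DB_ID()
group by request_session_id, resource_type, request_mode
order by request_session_id, lock_count desc;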
If I then write something like:
delete top (100) from log where success=0
This completes in a jiffy, and deletes 100 rows from the log table. The actual 100 rows that get deleted are beyond your control; it's just the first 100 that it finds. But for a clean-up job like this, that isn't really a problem.
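If you ever do need control over which rows go first (say, deleting the oldest failures before newer ones), you can point the delete at an ordered CTE. This is just a sketch, and it assumes the log table has a LogDate column to order by:
-- delete the oldest 100 failed entries first (LogDate is an assumed column name)
with OldestFailures as
(
    select top (100) *
    from log
    where success = 0
    order by LogDate
)
delete from OldestFailures;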
So, let's write this as a loop, as follows:
declare @FirstPass bit
select @FirstPass = 1;
while @@ROWCOUNT = 100 or @FirstPass = 1
begin
    set @FirstPass = 0;
    -- keep deleting in batches of 100 until a pass removes fewer than 100 rows
    delete top (100) from log where success=0
end
This just means that it will run the delete over and over again, 100 rows at a time, until a pass deletes fewer than 100 rows, at which point there is nothing left to delete.
Now, to improve upon this a bit more, I'm going to add these lines to the loop:
WAITFOR DELAY '00:00:01';
RAISERROR('ok',0,1) WITH NOWAIT;
These put a one-second delay into each pass of the loop, to make sure we're not hogging all of the database's processing power, and the RAISERROR (with WITH NOWAIT) just flushes the output straight to the Messages window as it happens, so we can see that everything is running smoothly.
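One wrinkle when you put these lines inside the loop: RAISERROR resets @@ROWCOUNT to zero, so the loop condition above would stop working. It's safest to capture the delete's row count into a variable immediately after the delete. Putting it all together, the whole thing looks something like this (just a sketch, with the loop reworked to use a @Deleted variable instead of @FirstPass):
declare @Deleted int;
select @Deleted = 100;  -- prime the loop so the first pass runs

while @Deleted = 100
begin
    delete top (100) from log where success=0;

    -- capture the row count immediately; the RAISERROR below resets @@ROWCOUNT
    select @Deleted = @@ROWCOUNT;

    -- breathe for a second between batches so other sessions can get at the table,
    -- then flush a progress message straight to the Messages window
    WAITFOR DELAY '00:00:01';
    RAISERROR('ok',0,1) WITH NOWAIT;
end
The batch size (100) and the delay are just the values from above; on a quiet server you could bump the batch size up to a few thousand rows and still keep each individual transaction small.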