We should always take care of database server performance, but sometimes you need to perform a huge update on one or more tables. This will affect performance for concurrent users, and they will complain about it.
For example, suppose you have a table that contains more than a billion rows, your database must be accessible 24 hours a day, 7 days a week, and you want to update a column across that data. It is difficult to run this as a single UPDATE statement: a lot of data will be fetched into memory, and a lot of other users' data will be pushed out of memory.
The solution is a batch update: divide the work into small batches, e.g. update 1,000 rows at a time, pause for a moment, then update the next 1,000, and so on. This lets other users share the server with you.
Sample code (the table dbo.Products and the PriceUpdated flag column are placeholders; the flag marks rows that have already been changed so each batch picks up new rows):
-- SQL update in batches of 1000
WHILE (1 = 1)
BEGIN
    BEGIN TRANSACTION

    -- Update the next 1000 rows that have not been changed yet
    UPDATE TOP (1000) dbo.Products
    SET Price = Price * 1.08,
        PriceUpdated = 1
    WHERE PriceUpdated = 0

    -- Stop when there is nothing left to update
    IF @@ROWCOUNT = 0
    BEGIN
        COMMIT TRANSACTION
        BREAK
    END

    COMMIT TRANSACTION

    -- 2 second delay so other sessions can do their work
    WAITFOR DELAY '00:00:02'
END
GO
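Note that the WHERE clause relies on being able to tell which rows have already been changed (here, the made-up PriceUpdated flag); without a filter of this kind the loop would keep re-updating the same 1,000 rows. If your table has no such column, you can batch on the primary key instead, walking forward through the key range on each iteration.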
* The above code is an updated version of code that is available online.
Written by: Elmozamil Elamir Hamid
This is not only true for UPDATEs but also for DELETEs. It is good practice for the transaction log file too: long-running UPDATEs and DELETEs may cause the transaction log file to fill up and may end with a service interruption ("The transaction log file for database XXX is full").
So using this method you can hit two targets with one shot.
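For completeness, here is a sketch of the same pattern applied to a DELETE. The table dbo.OrderHistory and its OrderDate column are made up for illustration; the idea is simply to remove old rows 1,000 at a time so that each transaction stays small and the transaction log has a chance to truncate between batches.
-- SQL delete in batches of 1000
WHILE (1 = 1)
BEGIN
    BEGIN TRANSACTION

    -- Delete the next 1000 old rows
    DELETE TOP (1000)
    FROM dbo.OrderHistory
    WHERE OrderDate < '2015-01-01'

    -- Stop when no matching rows are left
    IF @@ROWCOUNT = 0
    BEGIN
        COMMIT TRANSACTION
        BREAK
    END

    COMMIT TRANSACTION

    -- 2 second delay so other sessions can do their work
    WAITFOR DELAY '00:00:02'
END
GO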