Sunday, October 25, 2015

SQL - Concurrent operations | Problems | Solutions

Concurrent Operations: When more than one user / application attempts an operation (select / insert / update) on the same data row(s) at the same time, it is called a concurrent operation.

Concurrent operations may cause the following issues:

1. Dirty Reads: If Session 2 executes during the 15-second delay below, it reads the data updated by Session 1. Session 1 then rolls back, so Session 2 has read data that was never committed. This is called a Dirty Read.

Session 1
Begin tran
update tb_city set regionid = 2 where id between 11 and 20
waitfor delay '00:00:15'
--Commit
Rollback

Session 2
select * from tb_city where id = 15

2. Non-repeatable Reads: If all the sessions below execute in parallel, the four identical SELECT statements in Session 1 can return different values each time. Reading the same rows repeatedly and getting different results is called a Non-repeatable (Unrepeatable) Read.

Session 1
select * from tb_city where id between 100 and 200
select * from tb_city where id between 100 and 200
select * from tb_city where id between 100 and 200
select * from tb_city where id between 100 and 200


Session 2
update tb_city set regionid = 2 where id between 100 and 200

Session 3
update tb_city set regionid = 4 where id between 100 and 200

Session 4
update tb_city set regionid = 9 where id between 100 and 200

Session 5
update tb_city set regionid = 3 where id between 100 and 200

3. Phantom Rows: Session 1 intends to set regionid = 20 for all active records. Session 2, running in parallel, inserts new active rows, so rows now exist that Session 1's update did not cover (a repeated read would suddenly see them). These newly appearing rows are called Phantom Rows.

Session 1
update tb_city set regionid = 20 where IsActive = 1

Session 2
insert into tb_city values (201, 1)
insert into tb_city values (202, 1)
insert into tb_city values (203, 1)

4. Lost Update: Session 1 sets regionid to 200, but Session 2, running in parallel, sets it to 100 right afterwards. Session 1's update is silently overwritten, so user 1 has lost his update.

Session 1
update tb_city set regionid = 200 where IsActive = 1

Session 2
update tb_city set regionid = 100 where IsActive = 1
--------------------------------------------------------------------------
Solutions for these problems
  1. Optimistic Lock
  2. Pessimistic Lock

1. Optimistic Lock: It does not actually place any lock on the data. Instead, it takes a snapshot of the data before starting the operation and compares the current data with that snapshot before committing. If the data has changed in the meantime, the operation is rejected (and typically retried).
          There are three ways to achieve it -
  • Using a data adapter: the Update method does this comparison internally.
    objDataAdapter.Update()
  • Timestamp (rowversion) column: this special column is updated automatically every time any data in the row changes. Before starting a transaction, take a snapshot of the timestamp values of the rows you are going to modify; before committing, compare the current timestamp values with the snapshot. If they are the same, no other session has changed the rows and you can commit your operation.
  • Whole-data comparison: compare the entire row contents against the snapshot.
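The timestamp-column approach can be sketched in T-SQL as follows. This is only a sketch: it assumes tb_city has been given a rowversion column, and the column name RowVer and the id value are illustrative, not from the original post.

```sql
-- Assumption: a rowversion column was added beforehand, e.g.
--   ALTER TABLE tb_city ADD RowVer rowversion
DECLARE @snap binary(8);

-- 1. Snapshot the row's version before the user starts editing
SELECT @snap = RowVer FROM tb_city WHERE id = 15;

-- ... user edits the data offline ...

-- 2. Commit only if the row is unchanged since the snapshot
UPDATE tb_city
SET regionid = 2
WHERE id = 15 AND RowVer = @snap;

IF @@ROWCOUNT = 0
    RAISERROR('Row was changed by another session; reload and retry.', 16, 1);
```

If another session updated the row in the meantime, its RowVer no longer matches @snap, the UPDATE affects zero rows, and the caller knows to reload and retry.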



2. Pessimistic Lock: It uses various locks to prevent the problems of concurrent operations.
There is a range of locks / isolation levels that can be used:

Isolation Level  | Select behaviour                                   | Insert      | Update      | Remark
Read Uncommitted | Reads data that is updated but not yet committed   | Allowed     | Allowed     | No lock
Read Committed   | Reads only committed data                          | Allowed     | Allowed     | Select lock
Repeatable Read  | Reads only committed data                          | Allowed     | Not allowed | Select, update lock
Serializable     | Reads only committed data                          | Not allowed | Not allowed | Select, insert, update lock
Snapshot         | Reads from a snapshot (row versions) of the data taken at transaction start, until the transaction completes | Allowed | Allowed | No lock (row versioning)

Set the isolation level for the session (T-SQL syntax):
SET TRANSACTION ISOLATION LEVEL
    { READ UNCOMMITTED
    | READ COMMITTED
    | REPEATABLE READ
    | SNAPSHOT
    | SERIALIZABLE
    }
[ ; ]

Example:
-- Snapshot isolation must first be enabled on the database:
-- ALTER DATABASE YourDb SET ALLOW_SNAPSHOT_ISOLATION ON
SET TRANSACTION ISOLATION LEVEL SNAPSHOT

BEGIN TRAN
SELECT * FROM TB_CITY WHERE id = 2
COMMIT
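As a sketch of how these levels map back to the problems above, REPEATABLE READ prevents the non-repeatable read from section 2 (the same tb_city table is assumed; the delay is only there to widen the window for a parallel update):

```sql
SET TRANSACTION ISOLATION LEVEL REPEATABLE READ

BEGIN TRAN
-- Shared locks are held until COMMIT, so parallel updates on
-- these rows are blocked; both reads return the same values
SELECT * FROM tb_city WHERE id BETWEEN 100 AND 200
WAITFOR DELAY '00:00:05'
SELECT * FROM tb_city WHERE id BETWEEN 100 AND 200
COMMIT
```

The trade-off is concurrency: the updating sessions from section 2 now wait until this transaction commits.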

The isolation level can also be set on the transaction object in C#:
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        SqlCommand command = connection.CreateCommand();
        // Begin a transaction with the desired isolation level
        SqlTransaction transaction = connection.BeginTransaction(IsolationLevel.ReadCommitted);
        command.Connection = connection;
        command.Transaction = transaction;
        // ... execute commands, then transaction.Commit() or transaction.Rollback()
    }
