However, if conflicts happen often, the cost of repeatedly restarting transactions hurts performance significantly; other concurrency control methods have better performance under these conditions."
So-called pessimistic concurrency checks for conflicts when an object is first accessed for read or write. If other threads are already using the object in a conflicting manner, the accessing thread pauses until they finish with it. When there really is contention, this is a good thing: you don't waste CPU doing calculations based on a stale object, only to throw all that work away at the end and start over. That reduces CPU load, and while the blocked thread is waiting, other threads get to use the CPU. The end result is that more users can be processed in parallel with less total CPU usage. A minimal sketch of the idea is below.
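Here is a rough, hypothetical sketch of that behavior in plain Java (not Darkstar code): the lock is taken at the moment of first access, so a conflicting thread simply blocks rather than computing on stale data. The `Account` class and its fields are invented for illustration.

```java
import java.util.concurrent.locks.ReentrantLock;

// Pessimistic style: conflicts are checked up front, at first access.
class Account {
    private final ReentrantLock lock = new ReentrantLock();
    private long balance;

    Account(long balance) { this.balance = balance; }

    void deposit(long amount) {
        lock.lock();               // conflict check happens here, before any work
        try {
            // Any expensive processing here runs exactly once, because no
            // other thread can be mutating the object underneath us.
            balance += amount;
        } finally {
            lock.unlock();         // waiting threads now get their turn
        }
    }

    long balance() {
        lock.lock();
        try {
            return balance;
        } finally {
            lock.unlock();
        }
    }
}
```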
That check does come with a small cost, however. In an environment where a thread accesses hundreds or thousands of objects this way, it can add up. That's where so-called optimistic concurrency comes in. It skips those checks and just acts as if the object is always free. It keeps its own copy of the object and, at the end, checks for consistency. If it finds a conflict, it dumps all of its investment and starts over. (No government bailout applies.)
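And a matching hypothetical sketch of the optimistic style, again plain Java rather than Darkstar code: the thread works on a private snapshot and only checks for a conflict at the end, retrying from scratch if someone else got there first. The `Counter` class is invented for illustration.

```java
import java.util.concurrent.atomic.AtomicReference;

// Optimistic style: no locking up front, a consistency check at the end.
class Counter {
    private static final class State {
        final long value;
        State(long value) { this.value = value; }
    }

    private final AtomicReference<State> state = new AtomicReference<>(new State(0));

    void increment(long amount) {
        while (true) {
            State snapshot = state.get();                       // take a private copy
            State updated = new State(snapshot.value + amount); // do all the work
            if (state.compareAndSet(snapshot, updated)) {       // end-of-work consistency check
                return;                                         // no conflict: commit
            }
            // Conflict detected: all of that work is thrown away and we start over.
        }
    }

    long value() {
        return state.get().value;
    }
}
```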
So, which is better? It depends on your expected usage. If you expect a high ratio of data accesses to data processing, with very little contention, optimistic concurrency makes sense.
In Darkstar/RedDwarf, however, where we expect maybe a few dozen data accesses per thread, and where we expect our data processing (game logic) to carry significant cost, it's better to be safe (and pessimistic) than sorry.