The power of the double-linked list

(Written by Lawrence Krubner; indented passages are often quotes.) You can contact Lawrence at:

This gave me a completely new appreciation for a data structure that I have rarely thought much about:

– First we mark the node deleted

– We walk our prev chain backwards until we find a live node, and we walk our next chain forwards until we find a live node

– We then set the next of our predecessor and the prev of our successor to each other, unconditionally

– We leave the pointers of the deleted node untouched
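The steps above can be sketched in a few lines. This is a single-threaded illustration under my own assumptions (head/tail sentinel nodes that are never marked deleted; the names are mine, not the original author's), not the concurrent implementation itself:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None
        self.deleted = False

def make_list(values):
    # head <-> v1 <-> ... <-> vn <-> tail, with always-live sentinels.
    head, tail = Node("head"), Node("tail")
    prev = head
    for v in values:
        n = Node(v)
        n.prev, prev.next = prev, n
        prev = n
    prev.next, tail.prev = tail, prev
    return head, tail

def delete(node):
    node.deleted = True        # step 1: mark the node deleted
    pred = node.prev           # step 2a: walk back to a live node
    while pred.deleted:
        pred = pred.prev
    succ = node.next           # step 2b: walk forward to a live node
    while succ.deleted:
        succ = succ.next
    pred.next = succ           # step 3: link the live neighbours to
    succ.prev = pred           #         each other, unconditionally
    # step 4: node.prev and node.next are deliberately left untouched
```

Because the deleted node keeps its own pointers, anything that lands on it can still escape in either direction, which is what the next section relies on.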

Now, if deletes occur in isolation (i.e. each deleted node is separated from any other by a live node), then this trivially works.

However, what happens if we have a string of adjacent deletes? The next and prev pointers within the range may then describe different lists; perhaps all of them point to a different node than they should. But each must still point to a node at or before the live node in its direction of travel, since every mutator stops when it encounters a live node, and the property held before we started. Regardless of the interleaving, and of whose live node ends up persisted in any node en route, the worst we can do is insert a stale reference.
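To make the stale-reference claim concrete, here is a single-threaded re-enactment of one bad interleaving of two adjacent deletes (a sketch under my assumptions; the real algorithm runs these steps concurrently). Thread 1 snapshots its neighbours before thread 2's mark lands, then splices a pointer to an already-deleted node:

```python
class Node:
    def __init__(self, value):
        self.value, self.prev, self.next = value, None, None
        self.deleted = False

def make_list(values):
    head, tail = Node("head"), Node("tail")
    prev = head
    for v in values:
        n = Node(v)
        n.prev, prev.next = prev, n
        prev = n
    prev.next, tail.prev = tail, prev
    return head, tail

def live_values(head):
    # Walk next pointers from head, skipping nodes marked deleted.
    out, n = [], head.next
    while n.next is not None:        # stop at the tail sentinel
        if not n.deleted:
            out.append(n.value)
        n = n.next
    return out

head, tail = make_list(["a", "b", "c"])
a, b, c = head.next, head.next.next, head.next.next.next

# Thread 1 starts deleting a: it marks a, then finds its neighbours.
# At this instant b has not yet been marked, so b looks live.
a.deleted = True
pred1, succ1 = head, b

# Thread 2 now deletes b entirely: it marks b, walks back past the
# deleted a to head, forward to c, and splices them together.
b.deleted = True
head.next, c.prev = c, head

# Thread 1 resumes and splices its stale snapshot, unconditionally.
pred1.next, succ1.prev = succ1, pred1   # head.next = b: a stale reference

# The stale reference points at a *deleted* node, but that node's own
# pointers were left untouched, so skipping deleted nodes still
# recovers the correct live list.
assert live_values(head) == ["c"]
```

The stale pointer never skips past the live neighbour; it only lands short of it, on a deleted node whose forward chain still reaches the right place.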

Wait, I thought you said this was guaranteed constant time?

I did. This can be made constant time by simply bounding the number of nodes you traverse to find the live neighbour, to, say, 10. If there are so many competing adjacent deletes that you exceed this bound, you just stop on the 10th deleted node and modify it instead, editing out the 10 deleted nodes between you and it. The upshot is a list that steadily shrinks, instead of immediately reflecting the correct state, amortizing the cost over multiple operations. This has a stronger effect than it might first appear, switching aggregate behaviour when competition gets too fierce: once a run exceeds 10 items, mutators start editing different nodes, and so stop competing with each other, ensuring we tend rapidly back towards the ideal list. In reality these modifications occur in such a short space of time that only a small fraction fail to apply cleanly, so the bound matters only for worst-case behaviour.
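The bounded variant only changes the neighbour search. A sketch, again under my own assumptions and naming (`TRAVERSAL_LIMIT`, `seek_live`, and the Node shape are mine): if the walk gives up inside a run of deleted nodes, we splice to the deleted node we stopped on, shrinking the list partially rather than all at once.

```python
TRAVERSAL_LIMIT = 10   # bound on nodes visited per neighbour search

class Node:
    def __init__(self, value):
        self.value, self.prev, self.next = value, None, None
        self.deleted = False

def make_list(values):
    head, tail = Node("head"), Node("tail")
    prev = head
    for v in values:
        n = Node(v)
        n.prev, prev.next = prev, n
        prev = n
    prev.next, tail.prev = tail, prev
    return head, tail

def seek_live(start, step):
    # Follow `step` ("prev" or "next") through deleted nodes, visiting
    # at most TRAVERSAL_LIMIT of them. Returns the first live node, or
    # the TRAVERSAL_LIMIT-th deleted node if the budget runs out.
    n = start
    for _ in range(TRAVERSAL_LIMIT - 1):
        if not n.deleted:
            break
        n = getattr(n, step)
    return n

def bounded_delete(node):
    node.deleted = True
    pred = seek_live(node.prev, "prev")
    succ = seek_live(node.next, "next")
    # If pred or succ is itself still deleted, this splice edits a
    # deleted node, cutting out the run between them; later operations
    # finish the job, so the list shrinks steadily rather than
    # reflecting the final state immediately.
    pred.next = succ
    succ.prev = pred
```

With isolated deletes this behaves exactly like the unbounded version; only under heavy adjacent competition does the budget kick in and spread the splicing work across several mutators.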