class Concurrent::Collection::AtomicReferenceMapBackend
  # Internal versions of the insertion methods, each a
  # little more complicated than the last. All have
  # the same basic structure:
  #
  #   1. If table uninitialized, create
  #   2. If bin empty, try to CAS new node
  #   3. If bin stale, use new table
  #   4. Lock and validate; if valid, scan and add or update
  #
  # The others interweave other checks and/or alternative actions:
  #
  #   * Plain +get_and_set+ checks for and performs resize after insertion.
  #   * +compute_if_absent+ prescans for mapping without lock (and fails to add
  #     if present), which also makes pre-emptive resize checks worthwhile.
  #
  # Someday when details settle down a bit more, it might be worth
  # some factoring to reduce sprawl.
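  # A minimal sketch of the four-step loop the comment above describes.
  # This is illustrative only, not one of the backend's real methods:
  # +cas_new_node+ and +lock_and_insert+ are hypothetical stand-ins for
  # the CAS and locked-scan logic, +initialize_table+ is assumed to lazily
  # create the table, and resize checks and error handling are omitted.
  def internal_insert_sketch(key, value)
    hash          = key_hash(key)
    current_table = table || initialize_table # 1. if table uninitialized, create
    while current_table
      i = current_table.hash_to_index(hash)
      if !(node = current_table.volatile_get(i))
        # 2. bin empty: try to CAS a new node into it (hypothetical helper)
        break if cas_new_node(current_table, i, hash, key, value)
      elsif node.hash == MOVED
        # 3. bin stale: forwarding node points at the new table
        current_table = node.key
      else
        # 4. lock and validate; if valid, scan and add or update (hypothetical helper)
        break if lock_and_insert(current_table, i, node, hash, key, value)
      end
    end
    value
  end
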
  def internal_replace(key, expected_old_value = NULL, &block)
    hash          = key_hash(key)
    current_table = table
    while current_table
      if !(node = current_table.volatile_get(i = current_table.hash_to_index(hash)))
        break # bin empty: nothing to replace
      elsif (node_hash = node.hash) == MOVED
        current_table = node.key # bin stale: follow the forwarding node to the new table
      elsif (node_hash & HASH_BITS) != hash && !node.next # precheck
        break # rules out possible existence
      elsif Node.locked_hash?(node_hash)
        try_await_lock(current_table, i, node)
      else
        succeeded, old_value = attempt_internal_replace(key, expected_old_value, hash, current_table, i, node, node_hash, &block)
        return old_value if succeeded
      end
    end
    NULL
  end
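
  # For context, the public replace operations funnel into +internal_replace+
  # roughly as follows. This is a sketch of the calling pattern, not
  # necessarily the backend's exact definitions:
  def replace_pair(key, old_value, new_value)
    # succeeds only if +key+ is currently mapped to +old_value+
    NULL != internal_replace(key, old_value) { new_value }
  end

  def replace_if_exists(key, new_value)
    # replaces unconditionally, but only if +key+ is already present
    result = internal_replace(key) { new_value }
    result if NULL != result
  end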