ReaLNero's blog

By ReaLNero, 5 hours ago, In English

I have a really succinct implementation of a persistent map with $$$O(\log N)$$$ querying and (hopefully) $$$O(\log N)$$$ amortized average-case insertion. Is this correct, or is the insertion actually $$$O(\log^2 N)$$$?

I know there's a pathological case: if you keep inserting at a version of the LinkedList where the total of the dict sizes is a power of two, you can end up repeatedly copying the whole dictionary. Let's say that's not the average case. Besides, I think randomizing the merge condition, e.g. changing len(parent.curr_dict) <= len(curr_dict) to len(parent.curr_dict) <= len(curr_dict) and random.random() < 0.5, would probably fix that (see the sketch after the code below).

FWIW I've heard there's a paper with an $$$O(1)$$$ lookup/insert persistent map, but this approach seems deceptively simple?

Also, for anyone unfamiliar -- Python dictionaries are hash maps, so insertion and querying are amortized $$$O(1)$$$.

from __future__ import annotations
from dataclasses import dataclass

@dataclass
class LinkedList:
  parent: LinkedList | None
  curr_dict: dict[str, int]

def update(parent: LinkedList, key: str, value: int) -> LinkedList:
  curr_dict = {key: value}
  # Keep merging while the parent's dict is no bigger than ours.
  while parent.parent is not None and len(parent.curr_dict) <= len(curr_dict):
    # Merge the parent *under* curr_dict so newer values win on key collisions.
    curr_dict = parent.curr_dict | curr_dict
    parent = parent.parent
  return LinkedList(parent=parent, curr_dict=curr_dict)

def query(node: LinkedList | None, key: str) -> int | None:
  # Walk toward the root; the first dict containing the key holds the newest value.
  while node is not None:
    if (value := node.curr_dict.get(key)) is not None:
      return value
    node = node.parent
  return None
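
For illustration, here's a quick usage sketch. The empty root node is my own assumption about how the structure gets seeded (the post doesn't show it); everything else is just the two functions above:

root = LinkedList(parent=None, curr_dict={})  # assumed empty initial version

v1 = update(root, "a", 1)
v2 = update(v1, "b", 2)
v3 = update(v2, "a", 3)  # overwrite "a" in a newer version

print(query(v3, "a"))  # 3 -- the newest value wins
print(query(v2, "a"))  # 1 -- older versions are untouched
print(query(v1, "b"))  # None -- "b" doesn't exist yet in v1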
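
And a minimal sketch of the randomized condition mentioned earlier (update_randomized is a hypothetical name; the coin flip is the only change from update):

import random

def update_randomized(parent: LinkedList, key: str, value: int) -> LinkedList:
  curr_dict = {key: value}
  # The coin flip desynchronizes merges across versions, so no single version
  # deterministically triggers a whole-chain copy on every insertion.
  while (parent.parent is not None
         and len(parent.curr_dict) <= len(curr_dict)
         and random.random() < 0.5):
    curr_dict = parent.curr_dict | curr_dict
    parent = parent.parent
  return LinkedList(parent=parent, curr_dict=curr_dict)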
