I have a really succinct implementation for a persistent map with $$$O(\log N)$$$ querying and (hopefully) $$$O(\log N)$$$ amortized average-case insertion. Is this correct, or is the insertion actually $$$O(\log^2 N)$$$?
I know there's a pathological case where the sum of the dict sizes is a power of two and an insertion ends up copying the whole map, but let's say that's not the average case.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class LinkedList:
    parent: LinkedList | None
    curr_dict: dict[str, object]

def update(parent: LinkedList, key: str, value: object) -> LinkedList:
    curr_dict = {key: value}
    # Absorb ancestors no larger than the new dict so node sizes stay roughly geometric.
    while parent.parent is not None and len(parent.curr_dict) <= len(curr_dict):
        # Parent goes on the left so that newer entries shadow older ones.
        curr_dict = parent.curr_dict | curr_dict
        parent = parent.parent
    return LinkedList(parent=parent, curr_dict=curr_dict)
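For concreteness, here's a runnable sketch with the lookup I have in mind (the `lookup` helper is mine, not part of the structure above: walk toward the root, nearest dict containing the key wins, so newer entries shadow older ones):

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class LinkedList:
    parent: LinkedList | None
    curr_dict: dict[str, object]

def update(parent: LinkedList, key: str, value: object) -> LinkedList:
    curr_dict = {key: value}
    while parent.parent is not None and len(parent.curr_dict) <= len(curr_dict):
        curr_dict = parent.curr_dict | curr_dict  # parent on the left: newer keys win
        parent = parent.parent
    return LinkedList(parent=parent, curr_dict=curr_dict)

def lookup(node: LinkedList | None, key: str) -> object | None:
    # Walk toward the root; the nearest dict containing the key wins.
    while node is not None:
        if key in node.curr_dict:
            return node.curr_dict[key]
        node = node.parent
    return None

root = LinkedList(parent=None, curr_dict={})
v1 = update(root, "a", 1)
v2 = update(v1, "b", 2)
v3 = update(v2, "a", 3)   # shadows "a" in v3 without touching v1 or v2
print(lookup(v3, "a"), lookup(v3, "b"), lookup(v1, "a"))  # prints: 3 2 1
```

Older versions stay queryable after every `update`, which is the persistence property I'm after.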