Skew heap

Invented by Sleator, Tarjan (1986)
Idea:
  • a self-adjusting variant of a leftist heap
  • recall that, when melding two leftist heaps, we merge their rightmost paths and, for each node on the merged path, we exchange the left and right children where needed to maintain the invariant that the rank of the left child is \(\geq\) the rank of the right child
  • when melding two skew heaps, we merge their rightmost paths and then unconditionally exchange the left and right children of every node on the merged path, proceeding bottom up
Advantages: compared with leftist heaps, there are no ranks and no rank invariant to maintain; still, the complexity of all operations remains \(O(\log n)\) in the amortized sense

Merging

Merging is similar to that of a leftist heap – we simply merge the two rightmost paths. The only difference is that, after merging the paths, we unconditionally swap the left and right children of every node on the merged path, so the merged rightmost path becomes a leftmost path.
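To make this concrete, here is a minimal sketch in Python (the names Node and skew_merge are just for illustration). It merges the right spines recursively; the swap at each node is performed as the recursion unwinds, i.e. bottom up, as described above.

    class Node:
        def __init__(self, key):
            self.key = key
            self.left = None
            self.right = None

    def skew_merge(a, b):
        """Meld two skew heaps rooted at a and b; return the new root."""
        if a is None:
            return b
        if b is None:
            return a
        # keep the smaller key on top (min-heap)
        if b.key < a.key:
            a, b = b, a
        # merge b into the right spine of a ...
        a.right = skew_merge(a.right, b)
        # ... then unconditionally swap the children of this node
        # (the deeper nodes on the merged path have already been swapped)
        a.left, a.right = a.right, a.left
        return a

The recursion depth equals the length of the merged path, which can be linear for a single merge; an iterative version avoids deep recursion, but the recursive one matches the description above most directly.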

Once we implement merging, all the other operations are easy:

  • to insert a node, we merge the heap with a new single-node heap,
  • to extract-min, we delete the root node and merge the two remaining subtrees.
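Continuing the sketch above (reusing Node and skew_merge), both operations reduce to a single call of the merge:

    def insert(root, key):
        """Insert key; return the new root of the heap."""
        return skew_merge(root, Node(key))

    def extract_min(root):
        """Remove the minimum (the root); return (min_key, new_root)."""
        if root is None:
            raise IndexError("extract_min from an empty heap")
        return root.key, skew_merge(root.left, root.right)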

Analysis

We will use the accounting method of amortized analysis, so imagine we have to pay for all computation and 1$ can buy us \(O(1)\) time.

Theorem: If we get \(2\log n+1\)$ for each merge (insert/extract-min) operation, we can pay for any sequence of operations.

Proof: Take any node \(v\) with a child \(c\); we call the edge from \(v\) to \(c\) heavy if the size of \(c\)’s subtree is more than half the size of \(v\)’s subtree, or in other words, if more than half of the descendants of \(v\) are under \(c\). Otherwise we call the edge \((v,c)\) light. (This is the heavy-light decomposition of a tree. Note that this information is not stored by the data structure and is used only for the sake of the analysis.)

Note that any path down the tree has at most \(\log n\) light edges, because each time we traverse a light edge, the subtree size drops to at most half of its previous value.
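In symbols: the subtree sizes along the path are non-increasing, and each of the \(k\) light edges at most halves the size, so the subtree at the bottom of the path has size at most \(n/2^k\); since every subtree has size at least 1, \[ 1 \le \frac{n}{2^k} \quad\Longrightarrow\quad k \le \log_2 n. \]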

Invariant: Each right heavy edge will have 1$ saved for the future.

Let \(\ell\) and \(h\) be the number of light and heavy edges on the merged rightmost path, respectively. Obviously, merging takes \(O(\ell+h+1)\) time. However, \(\ell \leq \log n\), and all the right heavy edges have 1$ saved, so they can pay for themselves – when we swap the children, all right heavy edges become left edges, so they no longer need any money saved. On the other hand, all left heavy edges become right heavy edges, so we need to deposit 1$ on their account; however, there are at most \(\ell\leq\log n\) such edges (if the edge to one child is heavy, the edge to its sibling must be light, and each left heavy edge that turns into a right heavy edge was swapped with a right light edge).

In the potential method, we would define potential \(\Phi(H) =\) number of right heavy edges. The real cost of merge is \(O(\ell + h + 1)\), but the change in potential is \(\Delta\Phi\leq \ell - h\) so the \(+h\) and \(-h\) cancel out and the amortized complexity is \[ T_{amort} = T_{actual} + c\cdot\Delta\Phi = O(\ell+h+1) + c\cdot (\ell-h) = O(\log n+1). \]
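For completeness, this is why the amortized bound covers any sequence of operations (as the theorem claims): summing over \(m\) operations, the potential terms telescope, and since \(\Phi\geq 0\) always and \(\Phi=0\) for an initially empty structure, \[ \sum_{i=1}^{m} T_{actual,i} = \sum_{i=1}^{m} T_{amort,i} - c\,(\Phi_{final}-\Phi_{initial}) \leq \sum_{i=1}^{m} T_{amort,i} = O(m\log n). \]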

References

  • D. D. Sleator and R. E. Tarjan. Self-adjusting heaps. SIAM J. Comput., 15(1):52-69, 1986.
  • A. Kaldewaij and B. Schoenmakers. The derivation of a tighter bound for top-down skew heaps. Information Processing Letters, 37(5):265-271, 1991.
  • B. Schoenmakers. A tight lower bound for top-down skew heaps. Information Processing Letters, 61(5):279-284, 1997.