
One of the finer points of amortized analysis about which I have been able to find relatively little information is the broad question of what happens to the amortized cost of a structure's existing operations if we add a new one. I'll use two popular examples to illustrate my confusion:

In a binary counter that supports only the increment operation, choosing the potential function $\Phi = $ the number of $1$s in the current configuration gives a constant amortized cost per increment. In every example I can find in the literature, adding a decrement operation creates a situation where both it and the increment operation have linear amortized cost: the argument is that we can toggle $m$ times between the values $2^n$ and $2^n - 1$ with a series of alternating decrements and increments, for a total running time of $\Theta(mn)$.
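
Just to make the cost model concrete, here is a toy version of such a counter (the class and variable names are my own, purely for illustration); the `flips` field records the actual work done:

```python
# A toy binary counter: bits[0] is the least-significant bit, and
# `flips` counts the total number of bit flips performed so far.
class BinaryCounter:
    def __init__(self, n_bits):
        self.bits = [0] * n_bits
        self.flips = 0

    def increment(self):
        i = 0
        # the carry turns trailing 1s into 0s...
        while i < len(self.bits) and self.bits[i] == 1:
            self.bits[i] = 0
            self.flips += 1
            i += 1
        # ...and then sets the first 0 it meets (unless the counter overflowed)
        if i < len(self.bits):
            self.bits[i] = 1
            self.flips += 1

    def decrement(self):
        i = 0
        # the borrow turns trailing 0s into 1s...
        while i < len(self.bits) and self.bits[i] == 0:
            self.bits[i] = 1
            self.flips += 1
            i += 1
        # ...and then clears the first 1 it meets
        if i < len(self.bits):
            self.bits[i] = 0
            self.flips += 1


# Toggling m times between 2^n and 2^n - 1: every call flips about n bits.
n, m = 20, 1000
c = BinaryCounter(n + 1)
c.bits[n] = 1                    # start the counter at the value 2^n
for _ in range(m):
    c.decrement()                # 2^n     -> 2^n - 1: n + 1 flips
    c.increment()                # 2^n - 1 -> 2^n    : n + 1 flips
print(c.flips)                   # 2 * m * (n + 1), i.e. Theta(m * n)
```

Each decrement/increment pair adds $2(n+1)$ to `flips`, so the alternating sequence really does cost $\Theta(mn)$.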

In a binomial heap, using the number of binomial trees in the structure as the potential function gives constant amortized time for insertion, while popping the minimum takes logarithmic time, both amortized and in the worst case.
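
The potential argument I have in mind is the standard one, counting one unit per tree link: if an insertion triggers $k$ links (carries), its actual cost is $k+1$ while the number of trees changes by $1-k$, so

$$\hat{c}_{\text{insert}} = (k+1) + \Delta\Phi = (k+1) + (1-k) = 2 = O(1).$$

For pop-min on a heap with $N$ elements the actual cost is proportional to the number of roots plus $\log N$, and at most $\lfloor\log N\rfloor + 1$ trees remain afterwards, so the drop in potential cancels the roots term and the amortized cost is $O(\log N)$.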

Perhaps I'm mistaken, but I see a certain contradiction here. In a binomial heap we could reach a point where the structure consists of one large binomial tree with $2^n$ elements, and then repeatedly alternate pop and push operations, which costs $\Theta(mn)$ time in total, since each operation is logarithmic in the number of elements. Even if each vertex keeps its children in a linked list, which would make the deletions constant time, the insertions all take logarithmic time.
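
To spell out the arithmetic: with $N = 2^n$ elements forming a single tree $B_n$, popping the minimum removes the root and leaves the forest $B_0, B_1, \dots, B_{n-1}$; the next push then cascades through $n$ link operations, merging everything back into a single $B_n$ and returning the heap to its starting shape. So even if the pops were free, each push does $\Theta(n) = \Theta(\log N)$ work, and $m$ pop/push pairs cost $\Theta(mn)$ overall, mirroring the counter example.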

Having said that, the insertion operation in both structures still works the same way: for any sufficiently large $m$, a sequence of $m$ consecutive insertions into either structure still takes $O(m)$ time; the inclusion of the deletion operation (decrement, in the counter) doesn't affect how insertion itself works.
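
As a quick sanity check of that claim for the counter, using the toy class sketched above:

```python
# m consecutive increments starting from zero: each increment sets exactly
# one bit, and every bit that is cleared was set by an earlier increment,
# so the total number of flips never exceeds 2m.
m = 10_000
c = BinaryCounter(64)
for _ in range(m):
    c.increment()
print(c.flips, "<=", 2 * m)
```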

So why do we say that adding decrement to a binary counter increases the amortized cost of increment, yet ignore the same argument when analyzing binomial heaps?

  • Where did you read that binomial heaps have $O(1)$ amortized time for insertion? I wonder if there's some context that is missing. Wikipedia's claim about amortized complexity only relates to consecutive insertions: en.wikipedia.org/wiki/Binomial_heap#Insert, so it doesn't apply to alternating insertions and pops. Can you credit the source where you read that and give full context on what that resource claims? Commented May 12, 2021 at 1:46
  • I'm going based on what I was taught when I took the course in data structures, but of course there is the possibility that I misunderstood that part of the material. If I've understood you correctly, it's possible that an operation can have one "amortized cost" when it's part of a specific series, but in a complete analysis of the data structure its actual amortized cost is different. If this is what you mean, it helps to clarify things a great deal. Commented May 12, 2021 at 2:40

1 Answer


I don't know much about the amortized running time of binomial heaps, but after reading Wikipedia, I suspect the following might be true: the amortized time of a long sequence of consecutive insert operations is $O(1)$, but this holds only for consecutive inserts with no other operations mixed in. Based on that, I wouldn't summarize the running time of a binomial heap by saying that insert takes $O(1)$ amortized time without further caveats.

As you quite astutely point out with your example based on a binary counter, the amortized running time of an operation depends on the full set of operations that are allowed to be performed. So I suspect that the resolution is that the claim about the amortized running time of inserts into a binomial heap wasn't quite right, or had some additional caveats. I don't know that for sure, as I don't know much about binomial heaps, but that would be my running hypothesis at this stage.

  • Yep, I think that about covers it. Thanks! Commented May 12, 2021 at 6:52

