Analysis of Universal Hashing

I was reading about universal hashing in Introduction to Algorithms by Cormen et al. and came across the following corollary regarding the Search, Insert, and Delete operations on universally hashed tables:

Corollary 11.4

Using universal hashing and collision resolution by chaining in an initially empty table with $m$ slots, it takes expected time $\Theta(n)$ to handle any sequence of $n$ Insert, Search, and Delete operations containing $O(m)$ Insert operations.

Proof Since the number of insertions is $O(m)$, we have $n = O(m)$ and so $\alpha = O(1)$. The Insert and Delete operations take constant time and, by Theorem 11.3, the expected time for each Search operation is $O(1)$. By linearity of expectation, therefore, the expected time for the entire sequence of $n$ operations is $O(n)$. Since each operation takes $\Omega(1)$ time, the $\Theta(n)$ bound follows. $\quad\blacksquare$
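
For concreteness, here is a minimal sketch of how I picture the setting (my own code, not from the book; the class and method names are just illustrative). It uses chaining and a hash function drawn at random from the multiply-add family $h_{a,b}(k) = ((ak + b) \bmod p) \bmod m$, which the same chapter shows to be universal:

```python
import random

class ChainedHashTable:
    """Hash table with chaining. The hash function is drawn at random from
    the family h_{a,b}(k) = ((a*k + b) mod p) mod m, with p a prime larger
    than every key."""

    def __init__(self, m, p=(1 << 31) - 1):
        self.m = m                               # number of slots
        self.p = p                               # prime > any key
        self.a = random.randrange(1, p)          # a in {1, ..., p-1}
        self.b = random.randrange(0, p)          # b in {0, ..., p-1}
        self.slots = [[] for _ in range(m)]      # one chain per slot
        self.size = 0                            # elements currently stored

    def _h(self, key):
        return ((self.a * key + self.b) % self.p) % self.m

    def insert(self, key, value):
        # O(1): just append the element to its chain.
        self.slots[self._h(key)].append((key, value))
        self.size += 1

    def search(self, key):
        # Expected O(1 + alpha), where alpha = size / m is the load factor.
        for k, v in self.slots[self._h(key)]:
            if k == key:
                return v
        return None

    def delete(self, key):
        # With doubly linked chains this is O(1) given the element;
        # with Python lists we scan the chain instead.
        chain = self.slots[self._h(key)]
        for i, (k, _) in enumerate(chain):
            if k == key:
                del chain[i]
                self.size -= 1
                return True
        return False
```

(Assumptions in the sketch: keys are nonnegative integers smaller than $p$, and the prime $p = 2^{31} - 1$ is just a convenient choice.)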

  1. How is the author able to say that $n = O(m)$ in the first line of the proof?

  2. Also, what does $n=O(m)$ mean? Since $n$ is a variable and $m$ is a constant, the statement seems wrong.

  3. Also, if $n=O(m)$ is true, then obviously $n=\Omega(m)$ is true, thus yielding $n=\Theta(m)$.
