# Tutorial

## Question 1

In the open addressing scheme for hash tables, three probing techniques have been introduced: linear probing, quadratic probing, and double hashing. State how many distinct probe sequences each scheme produces, and compare the advantages and disadvantages of each technique.
What is open addressing? In an open-addressing hash table, all elements are stored in the hash table itself rather than in external chains.
Linear probing
Given an auxiliary hash function $h\text{'}:U\to \left\{0,1,\dots ,m-1\right\}$.
Let $h\left(k,i\right)=\left(h\text{'}\left(k\right)+i\right)\phantom{\rule{0.3em}{0ex}}\mathrm{mod}\phantom{\rule{0.3em}{0ex}}m$, for $i=0,1,\dots ,m-1$.
There are $\Theta \left(m\right)$ probe sequences, because there are only $m$ possible starting positions and any two probes starting from the same position follow the same sequence.
While linear probing is simple and takes less time, there is the problem of primary clustering.
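A minimal sketch of the linear probe sequence, assuming `h1` is the auxiliary hash value $h\text{'}\left(k\right)$ and `m` is the table size (the helper name is illustrative, not from the text):

```python
def linear_probes(h1, m):
    """Return the full probe sequence h(k, i) = (h'(k) + i) mod m."""
    return [(h1 + i) % m for i in range(m)]

# Two keys with the same h'(k) probe exactly the same slots,
# which is why only m distinct sequences exist.
print(linear_probes(3, 8))  # -> [3, 4, 5, 6, 7, 0, 1, 2]
```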
Quadratic probing
Let $h\left(k,i\right)=\left(h\text{'}\left(k\right)+{c}_{1}i+{c}_{2}{i}^{2}\right)\phantom{\rule{0.3em}{0ex}}\mathrm{mod}\phantom{\rule{0.3em}{0ex}}m$, where ${c}_{1}$ and ${c}_{2}$ are auxiliary constants with ${c}_{2}\ne 0$.
There are $\Theta \left(m\right)$ probe sequences, because the initial probe position $h\text{'}\left(k\right)$ determines the entire sequence; any two keys with the same starting position follow the same sequence.
Like linear probing, quadratic probing is simple. However, while it avoids the primary clustering problem, there is a problem of secondary clustering.
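A sketch of one common parameter choice: with ${c}_{1}={c}_{2}=1/2$ and $m$ a power of two, the probe sequence is a full permutation of the slots (the helper name is illustrative):

```python
def quadratic_probes(h1, m):
    """Probe sequence h(k, i) = (h'(k) + (i + i*i) // 2) mod m.
    With c1 = c2 = 1/2 and m a power of two, this visits every slot."""
    return [(h1 + (i + i * i) // 2) % m for i in range(m)]

print(quadratic_probes(0, 8))  # -> [0, 1, 3, 6, 2, 7, 5, 4]
```

Note that two keys with the same $h\text{'}\left(k\right)$ still share the whole sequence, which is exactly the secondary clustering problem.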
Double hashing
Let $h\left(k,i\right)=\left({h}_{1}\left(k\right)+i\phantom{\rule{0.2em}{0ex}}{h}_{2}\left(k\right)\right)\phantom{\rule{0.3em}{0ex}}\mathrm{mod}\phantom{\rule{0.3em}{0ex}}m$.
There are $\Theta \left({m}^{2}\right)$ probe sequences because each possible $\left({h}_{1}\left(k\right),{h}_{2}\left(k\right)\right)$ pair yields a distinct probe sequence. As we vary the key, the initial probe position and the offset may vary independently.
Double hashing comes closest to the ideal of uniform hashing. It avoids both primary and secondary clustering. However, it is more complicated and each probe costs more, since two hash functions must be evaluated.
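A sketch of double hashing with one standard choice of hash functions, ${h}_{1}\left(k\right)=k\phantom{\rule{0.2em}{0ex}}\mathrm{mod}\phantom{\rule{0.2em}{0ex}}m$ and ${h}_{2}\left(k\right)=1+\left(k\phantom{\rule{0.2em}{0ex}}\mathrm{mod}\phantom{\rule{0.2em}{0ex}}\left(m-1\right)\right)$, with $m$ prime (these particular functions are an assumption, not prescribed by the text):

```python
def double_probes(k, m):
    """Probe sequence h(k, i) = (h1(k) + i * h2(k)) mod m.
    m is assumed prime and h2(k) is never 0, so every slot is probed."""
    h1 = k % m
    h2 = 1 + (k % (m - 1))
    return [(h1 + i * h2) % m for i in range(m)]

# Keys 1 and 14 collide on h1 (both start at slot 1), but their
# offsets differ, so the sequences diverge -- no secondary clustering.
print(double_probes(1, 13)[:3])   # -> [1, 3, 5]
print(double_probes(14, 13)[:3])  # -> [1, 4, 7]
```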

## Question 2

Design a hash function for the open addressing scheme such that it suffers from neither primary nor secondary clustering. In addition, the number of probe sequences it produces must be at least $\Theta \left({m}^{3}\right)$ (where $m$ is the number of slots in the hash table).
Let $h\left(k,i\right)=\left({h}_{1}\left(k\right)+i\phantom{\rule{0.2em}{0ex}}{h}_{2}\left(k\right)+{i}^{2}\phantom{\rule{0.2em}{0ex}}{h}_{3}\left(k\right)\right)\phantom{\rule{0.3em}{0ex}}\mathrm{mod}\phantom{\rule{0.3em}{0ex}}m$, where ${h}_{1}$, ${h}_{2}$, and ${h}_{3}$ are hash functions that map a key $k$ to one of the $m$ slots available. This scheme avoids both the primary and secondary clustering problems. Also, there are $\Theta \left({m}^{3}\right)$ probe sequences, as each $\left({h}_{1}\left(k\right),{h}_{2}\left(k\right),{h}_{3}\left(k\right)\right)$ 3-tuple yields a distinct probe sequence.
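A minimal sketch of this triple-hash scheme; the concrete choices of ${h}_{1}$, ${h}_{2}$, and ${h}_{3}$ below are illustrative assumptions, since the answer only requires that they map keys to slots:

```python
def triple_probes(k, m):
    """Sketch of h(k, i) = (h1(k) + i*h2(k) + i^2*h3(k)) mod m.
    h1, h2, h3 here are illustrative choices, not prescribed ones."""
    h1 = k % m
    h2 = 1 + (k % (m - 1))
    h3 = 1 + (k % (m - 2))
    return [(h1 + i * h2 + i * i * h3) % m for i in range(m)]

print(triple_probes(1, 13)[:3])  # -> [1, 5, 0]
```

Since the starting position, linear offset, and quadratic offset can all vary independently with the key, there are $\Theta \left({m}^{3}\right)$ possible sequences.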

## Question 3

Why should the priority queue be introduced? What is its purpose?
The purpose of the priority queue is to maintain a dynamic set on which the following operations can be implemented efficiently: inserting an element, deleting (extracting) the highest-priority element, and finding the minimum/maximum-priority key.

## Question 4

Show how to insert an element to a min-heap such that the heap property holds.
We represent a binary heap as an array of $n$ elements $A\left[1\dots n\right]$. The parent of any given element $i$ is $P\left(i\right)=\lfloor i/2\rfloor$, the left child is $L\left(i\right)=2i$, and the right child is $R\left(i\right)=2i+1$. To insert an element and maintain the heap property:
1. Insert element into A[n+1].
2. Sift-up from A[n+1]. Call SIFT-UP $\left(A,n+1\right)$.
```
def sift_up(A, j):
    i = j
    while i > 1 and A[i] has priority over A[P(i)]:
        swap A[i] and A[P(i)]
        i = P(i)
```
SIFT-UP has a runtime of $O\left(\mathrm{lg}n\right)$, since the element moves up at most one level per iteration. At the swapping step, we do not need to be concerned about the key of the sibling, as $P\left(i\right)$ has priority over the sibling. Hence, if $i$ has priority over $P\left(i\right)$, then $i$ must have priority over its sibling too.
3. Increment $n$.
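The steps above can be sketched as runnable code for a min-heap (where "has priority over" means "is smaller than"); the array is 1-indexed with an unused sentinel at index 0, matching the $A\left[1\dots n\right]$ convention, and the helper names are illustrative:

```python
def sift_up(A, j):
    """Restore the min-heap property by floating A[j] up toward the root.
    A is 1-indexed: A[0] is an unused sentinel, the parent of i is i // 2."""
    i = j
    while i > 1 and A[i] < A[i // 2]:
        A[i], A[i // 2] = A[i // 2], A[i]
        i = i // 2

def heap_insert(A, key):
    """Append key at position n + 1, then sift it up."""
    A.append(key)
    sift_up(A, len(A) - 1)

# Usage: inserting 5, 3, 8, 1 leaves the minimum at the root A[1].
A = [None]  # sentinel at index 0
for key in [5, 3, 8, 1]:
    heap_insert(A, key)
print(A[1:])  # -> [1, 3, 8, 5]
```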
