Huffman Coding
An Application of Binary Trees and Priority Queues
Giving credit where credit is due:
– Most of the slides for this lecture are based on slides taken from the Internet.
– I have modified a few of them and added some new slides.
Purpose of Huffman Coding
Proposed by Dr. David A. Huffman in 1952
– "A Method for the Construction of Minimum Redundancy Codes"
Applicable to many forms of data transmission
– Our example: text files
First Approach

Char  Code 1  Code 2
E     0       0
T     11      10
N     100     100
I     1010    0111
S     1011    1010

11010010010101011    100100101010
Prefix Codes
A = 0
B = 100
C = 1010
D = 1011
R = 11
Decoding is unique
and simple!
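To see why decoding is unique: with a prefix code, no code word is the start of another, so a single left-to-right scan can emit a symbol the moment the buffered bits match a code word. A minimal sketch in Python, using the code from this slide (the ABRACADABRA message is an illustrative example, not from the slides):

```python
# Prefix property: no code word is a prefix of another, so a single
# left-to-right scan decodes the stream with no separator characters.
CODES = {"A": "0", "B": "100", "C": "1010", "D": "1011", "R": "11"}

def decode(bits, codes):
    """Emit a symbol as soon as the buffered bits match a code word."""
    inverse = {code: ch for ch, code in codes.items()}
    out, buffer = [], ""
    for bit in bits:
        buffer += bit
        if buffer in inverse:        # a complete code word has been read
            out.append(inverse[buffer])
            buffer = ""
    if buffer:
        raise ValueError("bit string ended in the middle of a code word")
    return "".join(out)
```

For example, `decode("01001101010010110100110", CODES)` returns `"ABRACADABRA"`.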
How do we find the optimal coding tree?
It is clear that the two symbols with the smallest frequencies must be at the bottom of the optimal tree, as children of the lowest internal node.
A: 64
B: 13
C: 22
D: 32
E: 103
What is the Huffman
Encoding Tree?
Exercise
The Complete Algorithm
1. Scan text to be compressed and tally occurrences of all characters.
2. Sort or prioritize characters based on number of occurrences in text.
3. Build Huffman code tree based on prioritized list.
4. Perform a traversal of tree to determine all code words.
5. Scan text again and create new file using the Huffman codes.
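Steps 1 and 2 can be sketched in Python; the text is the example used later in these slides:

```python
from collections import Counter

# Step 1: scan the text and tally the occurrence of each character.
text = "Eerie eyes seen near lake."
tally = Counter(text)

# Step 2: prioritize characters by number of occurrences (ascending).
by_frequency = sorted(tally.items(), key=lambda kv: kv[1])
```

Here "e" occurs 8 times and the text is 26 characters long, matching the counts used in the tree-building example.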
Algorithm for Building a Huffman Tree
1. Construct a set of trees with root nodes that contain each of the individual symbols and their weights.
2. Place the set of trees into a priority queue.
3. while the priority queue has more than one item
4.     Remove the two trees with the smallest weights.
5.     Combine them into a new binary tree in which the weight of the tree root is the sum of the weights of its children.
6.     Insert the newly created tree back into the priority queue.
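The steps above map directly onto Python's `heapq` module. A minimal sketch, under the assumption that leaves are `(weight, symbol)` tuples and internal nodes are `(weight, left, right)` tuples (a representation chosen here for illustration):

```python
import heapq
from itertools import count

def build_huffman_tree(freqs):
    """Build a Huffman tree from a {symbol: weight} mapping; return the root.

    Leaves are (weight, symbol); internal nodes are (weight, left, right).
    A running counter breaks weight ties so the heap never has to compare
    node tuples against each other.
    """
    tick = count()
    heap = [(w, next(tick), (w, sym)) for sym, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:                                   # step 3
        w1, _, left = heapq.heappop(heap)                  # step 4: remove the
        w2, _, right = heapq.heappop(heap)                 # two smallest trees
        node = (w1 + w2, left, right)                      # step 5: combine
        heapq.heappush(heap, (w1 + w2, next(tick), node))  # step 6: reinsert
    return heap[0][2]
```

For the frequencies in the upcoming example, the returned root carries weight 26, the length of the text.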
Building a Tree
Scan the original text: "Eerie eyes seen near lake."
The distinct characters are: E, e, r, i, space, y, s, n, a, l, k, .

Building a Tree
Scan the original text

Char:  E  i  y  l  k  .  r  s  n  a  sp  e
Freq:  1  1  1  1  1  1  2  2  2  2  4   8
Building a Tree
While the priority queue contains two or more nodes
– Create new node
– Dequeue node and make it left subtree
– Dequeue next node and make it right subtree
– Frequency of new node equals sum of the frequencies of its left and right children
– Enqueue new node back into queue
Building a Tree
(Figures: the priority queue and partial trees after each step.)

Initial queue:
E  i  y  l  k  .  r  s  n  a  sp  e
1  1  1  1  1  1  2  2  2  2  4   8

Merges, combining the two smallest weights at each step:
E(1) + i(1) → 2
y(1) + l(1) → 2
k(1) + .(1) → 2
r(2) + s(2) → 4
n(2) + a(2) → 4
(E i)(2) + (y l)(2) → 4
(k .)(2) + sp(4) → 6
(r s)(4) + (n a)(4) → 8
(E i y l)(4) + (k . sp)(6) → 10
e(8) + (r s n a)(8) → 16
(E i y l k . sp)(10) + (e r s n a)(16) → 26
Building a Tree
After enqueueing this node there is only one node left in the priority queue.
(Figure: the completed tree, root weight 26.)
Building a Tree
Dequeue the single node left in the queue.
This tree contains the new code words for each character.
The frequency of the root node should equal the number of characters in the text:
"Eerie eyes seen near lake." — 26 characters.
Encoding the File
Traverse Tree for Codes
Perform a traversal of the tree to obtain the new code words.
Going left is a 0; going right is a 1.
A code word is complete only when a leaf node is reached.
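The traversal rule above (left appends 0, right appends 1, emit at a leaf) can be sketched as follows, assuming leaves are `(weight, symbol)` tuples and internal nodes are `(weight, left, right)` tuples (an assumed representation, not from the slides):

```python
def assign_codes(node, prefix="", table=None):
    """Traverse the tree; a code word is emitted only at a leaf."""
    if table is None:
        table = {}
    if len(node) == 2:                  # leaf: (weight, symbol)
        table[node[1]] = prefix or "0"  # a lone-symbol tree still gets a bit
        return table
    _, left, right = node               # internal: (weight, left, right)
    assign_codes(left, prefix + "0", table)   # going left is a 0
    assign_codes(right, prefix + "1", table)  # going right is a 1
    return table
```

Applied to the completed 26-weight tree, this produces the code table on the next slide.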
Encoding the File
Traverse Tree for Codes

Char   Code
E      0000
i      0001
y      0010
l      0011
k      0100
.      0101
space  011
e      10
r      1100
s      1101
n      1110
a      1111
Encoding the File
Rescan the text and encode the file using the new code words.

Eerie eyes seen near lake.

0000101100000110011    "Eerie "
100010101101011        "eyes "
110110101110011        "seen "
11101011111100011      "near "
001111110100100101     "lake."

Why is there no need for a separator character?
Because no code word is a prefix of any other.
Encoding the File
Results
Have we made things any better?
84 bits encode the text (the sum, over all characters, of frequency times code-word length).
ASCII would take 8 * 26 = 208 bits.
If a modified code used a fixed 4 bits per character, the total would be 4 * 26 = 104 bits; the savings would not be as great.
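The bit counts can be checked as a weighted sum of code lengths; the frequency and code tables below restate the slide's example:

```python
# Total compressed size = sum over characters of frequency * code length.
freqs = {"E": 1, "i": 1, "y": 1, "l": 1, "k": 1, ".": 1,
         "r": 2, "s": 2, "n": 2, "a": 2, " ": 4, "e": 8}
codes = {"E": "0000", "i": "0001", "y": "0010", "l": "0011",
         "k": "0100", ".": "0101", " ": "011", "e": "10",
         "r": "1100", "s": "1101", "n": "1110", "a": "1111"}

huffman_bits = sum(freqs[c] * len(codes[c]) for c in freqs)  # 84
ascii_bits = 8 * sum(freqs.values())                         # 208 (8 bits/char)
fixed4_bits = 4 * sum(freqs.values())                        # 104 (4 bits/char)
```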
Decoding the File
Use the Huffman tree to find the codeword for each character.
The sequence of zeros and ones on the arcs along the path from the root to each leaf node is that character's code.
character: a e l n o s t
If the message is sent uncompressed with an 8-bit ASCII representation for the characters, we have 261 * 8 = 2088 bits.
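Decoding is a bit-by-bit walk from the root. A sketch, again assuming internal nodes are `(weight, left, right)` tuples and leaves are `(weight, symbol)`:

```python
def decode_bits(bits, root):
    """Walk the tree: '0' follows the left arc, '1' the right arc;
    output the symbol and restart at the root whenever a leaf is hit."""
    out, node = [], root
    for bit in bits:
        node = node[1] if bit == "0" else node[2]  # internal: (w, left, right)
        if len(node) == 2:                         # leaf: (weight, symbol)
            out.append(node[1])
            node = root
    return "".join(out)
```

Because the code is prefix-free, the walk never needs to look ahead or back up.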
Encoding and decoding examples
Encode (compress) the message tenseas using the following codewords:
Answer: replace each character with its codeword:
001011101010110010
Decode:
(a) 0110011101000 => lost
(b) 11101110101011