What is the limit of compression? Huffman codes are introduced to demonstrate that entropy is the ultimate limit of compression.
Art of the Problem
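A minimal sketch of the idea (mine, not code from the video): the snippet below builds a Huffman code with Python's heapq for a made-up letter distribution and compares the average code length against the entropy of that distribution.

```python
import heapq
from math import log2

def huffman_code(probs):
    """Build a Huffman code (symbol -> bit string) for a probability dict."""
    # Heap entries: (weight, tie_breaker, {symbol: code_so_far}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)    # the two least likely subtrees
        w2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}          # prepend 0 on one side
        merged.update({s: "1" + c for s, c in right.items()})   # and 1 on the other
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

# Made-up letter probabilities, just for illustration.
probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}
code = huffman_code(probs)

avg_bits = sum(p * len(code[s]) for s, p in probs.items())
entropy = -sum(p * log2(p) for p in probs.values())
print(code)               # e.g. {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
print(avg_bits, entropy)  # 1.75 and 1.75 bits per symbol
```

For this particular (dyadic) distribution the average code length and the entropy coincide at 1.75 bits per symbol; for other distributions the Huffman average sits slightly above the entropy, which is the limit the video is about.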
10 thoughts on “Information Theory part 13: Data Compression via Huffman Coding”
What is the simulation shown at 3:22? Is it online?
YES YES YES YES! These are soo good!
Awesome videos, love 'em!
I cannot believe how well that was explained. I've not done much searching for explanations of Huffman coding, but this really makes it easy to understand.
Great video. Awesome to see a new upload from you.
Wow, this is super!
In this given problem we didn't have that information, but in practice you could get better compression by coding letter sequences (instead of individual letters), similar to what the other guy was doing for his English-speaking machines, correct?
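To put rough numbers on that idea (this example is mine, not from the video): for a source that emits A with probability 0.9 and B with probability 0.1, a per-letter code still spends 1 bit per letter, but Huffman-coding pairs of letters already gets down to about 0.645 bits per letter, much closer to the entropy of roughly 0.47 bits per letter.

```python
from math import log2

# Made-up two-letter source, chosen only to illustrate coding letter pairs.
p_a, p_b = 0.9, 0.1
entropy = -(p_a * log2(p_a) + p_b * log2(p_b))   # ~0.469 bits per letter

# Per-letter Huffman code is just A -> 0, B -> 1: one bit per letter, no savings.
bits_per_letter_single = 1.0

# One valid Huffman code on pairs: AA -> 0, AB -> 10, BA -> 110, BB -> 111.
pair_probs = {"AA": p_a * p_a, "AB": p_a * p_b, "BA": p_b * p_a, "BB": p_b * p_b}
pair_lens  = {"AA": 1, "AB": 2, "BA": 3, "BB": 3}
bits_per_letter_pairs = sum(pair_probs[s] * pair_lens[s] for s in pair_probs) / 2

print(entropy, bits_per_letter_single, bits_per_letter_pairs)  # ~0.469, 1.0, 0.645
```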
Hi. Could you please list the music you used as the soundtrack for your videos? I like dissonance :).
Wonderful video. I was trying to learn Huffman codes from a book but could not get my mind around it; this explanation is superb.
Could you do this?
A: 1
B: 01
C: 10
D: 00
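A quick check of the code proposed above (my sketch, not from the video): those codeword lengths violate Kraft's inequality, and the bit string 1001 can be read in two different ways, so this assignment is not uniquely decodable.

```python
# The proposed code from the comment above.
code = {"A": "1", "B": "01", "C": "10", "D": "00"}

# Kraft's inequality: any uniquely decodable code needs sum(2**-length) <= 1.
kraft_sum = sum(2 ** -len(word) for word in code.values())
print(kraft_sum)  # 1.25 > 1, so these lengths cannot form a uniquely decodable code

# Concretely, the bit string 1001 has two valid readings.
print("".join(code[s] for s in "CB"))   # '10' + '01'      -> 1001
print("".join(code[s] for s in "ADA"))  # '1' + '00' + '1' -> 1001
```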