
Ideal case for Huffman coding

Status
Not open for further replies.

BAT_MAN
Member level 5, joined Oct 9, 2006
Can anybody tell me what the ideal case is for Huffman encoding and Shannon-Fano encoding, and why the average code length is always greater than the entropy of the signal?
 

Hi BAT_MAN,
The ideal case for Huffman coding and Shannon-Fano coding is when the coding efficiency approaches 100%, i.e., when the average codeword length equals the entropy of the source. In most cases the average length is strictly greater than the entropy, because codeword lengths must be whole numbers of bits: a symbol with probability p ideally needs -log2(p) bits, and whenever that value is not an integer the code has to round up to the next whole bit, which adds redundancy.
Regards.
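To make the gap concrete, here is a minimal sketch (not from the thread; the probabilities are a made-up example) that builds Huffman codeword lengths for a small non-dyadic source and compares the average length against the entropy:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Return Huffman codeword lengths: repeatedly merge the two
    least-probable nodes; every symbol under a merged node gains one bit."""
    # heap entries: (probability, tie-breaker, symbols under this node)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]            # hypothetical non-dyadic source
H = -sum(p * log2(p) for p in probs)    # entropy, about 1.846 bits/symbol
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, L)                             # L = 1.9 bits/symbol, so L > H
```

Because -log2(0.4), -log2(0.3), and so on are not integers, the integer codeword lengths (1, 2, 3, 3) cannot match them exactly, and the average length 1.9 exceeds the entropy.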
 

Basically my question was: can the average length of a Huffman or Shannon-Fano code be equal to the entropy of a given input stream? If not, give reasons; if yes, give the conditions under which they are equal.
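They can be equal when every symbol probability is a negative power of two (a dyadic source), because then -log2(p) is already an integer for every symbol and no rounding is needed. A short sketch with an assumed dyadic distribution:

```python
from math import log2

# Assumed dyadic source: every probability is a negative power of two.
probs = [0.5, 0.25, 0.125, 0.125]

# For a dyadic source the optimal (Huffman) codeword lengths are exactly
# -log2(p), which are integers, so no rounding loss occurs.
lengths = [int(-log2(p)) for p in probs]        # [1, 2, 3, 3]
H = -sum(p * log2(p) for p in probs)            # entropy = 1.75 bits/symbol
L = sum(p * l for p, l in zip(probs, lengths))  # average length = 1.75
print(H == L)                                   # True: 100% efficiency
```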
 

