Hi BAT_MAN,
The ideal case for Huffman coding and Shannon-Fano coding is when the coding efficiency reaches 100%, i.e., when the average codeword length equals the entropy of the source. This happens exactly when every symbol probability is a negative integer power of 2 (a dyadic distribution, e.g. 1/2, 1/4, 1/8, 1/8), because then the ideal codeword lengths -log2(p_i) are all integers and the code can match them exactly. For any other distribution the average length is strictly greater than the entropy: Huffman coding guarantees H <= L < H + 1, so some redundancy remains even though entropy coding maximises the average information carried per bit.
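Here is a quick sketch (in Python, function names are my own) of the dyadic case: building Huffman code lengths for the distribution {1/2, 1/4, 1/8, 1/8} and checking that the average length equals the entropy.

```python
import heapq
import math

def huffman_lengths(probs):
    # Build a Huffman tree and return the code length of each symbol.
    # Heap items: (probability, tiebreak id, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    next_id = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every symbol inside them.
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, next_id, s1 + s2))
        next_id += 1
    return lengths

# Dyadic distribution: every probability is a negative power of 2.
probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)                      # [1, 2, 3, 3]
avg_len = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * math.log2(p) for p in probs)
print(avg_len, entropy)  # both 1.75 -> efficiency 100%
```

If you perturb the probabilities slightly (say [0.4, 0.3, 0.2, 0.1]), the code lengths stay integers while -log2(p_i) does not, so the average length ends up strictly above the entropy.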
Regards.
Basically my question was: can the average length of a Huffman or Shannon-Fano code be equal to the entropy of a given input stream? If no, give reasons; if yes, give the conditions under which they are equal.