What does relative entropy mean?

It is a measure of the distance between two discrete probability distributions. The same probability distribution would have a relative entropy of 0 with itself, while a large relative entropy indicates a large difference between two distributions. It is also known as the Kullback-Leibler distance between two distributions. I believe it is often used to show that one probability distribution is converging to another via a shrinking relative entropy.
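For reference, the standard definition (not quoted in the thread) for two discrete distributions P and Q on the same alphabet is

D(P \| Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

with the conventions 0 \log 0 = 0 and D(P \| Q) = \infty if Q(x) = 0 at some x where P(x) > 0. It is always \geq 0 and equals 0 only when P = Q, but it is not symmetric in P and Q, so "distance" is an informal description rather than a true metric.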
 

what does " converging to another via a shrinking relative entropy" mean?

shrinking ? any physical meanning of the relative entropy here?

"converging to another via a shrinking relative entropy" means that as you increase n, the relative entropy is decreasing (going to 0), so the distribution is converging. I do not know of a physical meaning to relative entropy; as I stated above, it is a measure of distance between two probability distributions.
 

I do not understand this "distance".

Thanks a lot, though.

"converging to another via a shrinking relative entropy" means that as you increase n, the relative entropy is decreasing (going to 0), so the distribution is converging. I do not know of a physical meaning to relative entropy; as I stated above, it is a measure of distance between two probability distributions.
 

Think about it this way: you define a distance metric for two numbers, say the absolute value of their difference. This one is straightforward to define. Another measure of distance is to take their difference squared; once again, this is pretty straightforward.

However, how do you define the distance between two vectors? One way to do it is to measure the 2-norm of the difference vector; once again, pretty straightforward.

Now, how do you define the distance between two distributions? One answer is to use the relative entropy. That is likely where the motivation for relative entropy comes from; people were trying to judge the distance between two distributions, and in order to do so they came up with relative entropy.
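A tiny hypothetical sketch (not from the thread) of the analogy, computing each kind of "distance" in turn:

```python
import math

# Numbers: absolute difference, or squared difference.
a, b = 3.0, 5.0
print(abs(a - b))      # 2.0
print((a - b) ** 2)    # 4.0

# Vectors: 2-norm (Euclidean length) of the difference vector.
u, v = [1.0, 2.0, 3.0], [2.0, 2.0, 1.0]
print(math.sqrt(sum((ui - vi) ** 2 for ui, vi in zip(u, v))))  # ~2.236

# Distributions: relative entropy D(P || Q), here in nats (natural log).
P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]
print(sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0))  # ~0.025
```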
 
Shug mentioned the meaning of relative entropy in statistics. You can review the topic at Wikipedia.

The original post doesn't mention a context, but apparently you are assuming relative entropy could have a meaning in thermodynamics, too. Any indications?

Curiously, you posted the question in the Digital Design and Programming section.
 

I think this is a digital communication question.
Maybe it's in the wrong place?



Dear W_H,

You should be more explicit. Where does the question come from?
I guess that the question is in the context of information theory, where the term has a meaning.
In that case, search for "relative entropy" "information theory".
Regards

Z
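One standard information-theoretic reading (a textbook fact, not stated in the thread): if data actually follow P but you compress them with a code designed for Q, you pay D(P \| Q) extra bits per symbol on average:

D(P \| Q) = \sum_x P(x) \log_2 \frac{1}{Q(x)} - \sum_x P(x) \log_2 \frac{1}{P(x)} = H(P, Q) - H(P)

which is one way to give the quantity an operational meaning in the communication context.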
 
Pro Answer! (re: the distance-metric explanation above)
 
