Definition
The joint entropy of two discrete random variables X and Y is defined as

H(X, Y) = -\sum_{x} \sum_{y} P(x, y) \log_2[P(x, y)]

where x and y are particular values of X and Y, respectively, P(x,y) is the probability of these values occurring together, and P(x,y) log2[P(x,y)] is defined to be 0 if P(x,y) = 0.
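As a minimal sketch of this computation in Python (the helper name joint_entropy and the example table are illustrative, not part of the article), the sum can be evaluated over a two-dimensional joint probability table, with zero-probability cells dropped to apply the 0 log 0 = 0 convention:

```python
import numpy as np

def joint_entropy(p_xy):
    """Joint entropy H(X, Y) in bits from a 2-D joint probability table.

    p_xy[i, j] = P(X = x_i, Y = y_j); zero cells contribute nothing,
    following the convention 0 * log2(0) = 0.
    """
    p = np.asarray(p_xy, dtype=float)
    p = p[p > 0]                      # drop zero-probability cells
    return -np.sum(p * np.log2(p))

# Example: a fair coin X and an exact copy of it Y (perfectly correlated).
p_xy = np.array([[0.5, 0.0],
                 [0.0, 0.5]])
print(joint_entropy(p_xy))            # 1.0 bit, the same as H(X) alone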
For more than two variables X1, ..., Xn this expands to

H(X_1, \ldots, X_n) = -\sum_{x_1} \cdots \sum_{x_n} P(x_1, \ldots, x_n) \log_2[P(x_1, \ldots, x_n)]

where x1, ..., xn are particular values of X1, ..., Xn, respectively, P(x1,...,xn) is the probability of these values occurring together, and P(x1,...,xn) log2[P(x1,...,xn)] is defined to be 0 if P(x1,...,xn) = 0.
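The same sum generalizes directly to any number of variables; a hedged sketch, assuming the joint distribution is supplied as a mapping from outcome tuples to probabilities (the name joint_entropy_n is ours):

```python
import math
from itertools import product

def joint_entropy_n(p):
    """Joint entropy H(X1, ..., Xn) in bits.

    `p` maps each outcome tuple (x1, ..., xn) to its joint probability;
    outcomes with probability 0 may simply be omitted.
    """
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Example: three independent fair bits carry 3 bits of joint entropy.
p = {bits: 1 / 8 for bits in product([0, 1], repeat=3)}
print(joint_entropy_n(p))             # 3.0
```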
Properties
Greater than individual entropies
The joint entropy of a set of variables is greater than or equal to every individual entropy of the variables in the set:

H(X_1, \ldots, X_n) \ge \max_i H(X_i)
Less than sum of individual entropies
The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set:

H(X_1, \ldots, X_n) \le H(X_1) + \cdots + H(X_n)

This is an example of subadditivity. The inequality is an equality if and only if X_1, ..., X_n are statistically independent.
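A quick numeric check of both properties, using a hypothetical joint table for two dependent binary variables; the marginal distributions are obtained by summing the table over rows and columns:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of any probability table (flattened)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Dependent binary X and Y: joint table P(x, y).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
h_xy = entropy(p_xy)                  # joint entropy H(X, Y) ~ 1.72 bits
h_x = entropy(p_xy.sum(axis=1))       # marginal H(X) = 1 bit
h_y = entropy(p_xy.sum(axis=0))       # marginal H(Y) = 1 bit

assert h_xy >= max(h_x, h_y)          # at least every individual entropy
assert h_xy <= h_x + h_y              # subadditivity; equality iff independent
```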
Relations to other entropy measures
Joint entropy is used in the definition of conditional entropy:

H(X \mid Y) = H(X, Y) - H(Y)

and of mutual information:

I(X; Y) = H(X) + H(Y) - H(X, Y)
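As an illustrative, self-contained sketch (reusing the same hypothetical joint table as above), both quantities follow directly from the joint and marginal entropies:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of any probability table (flattened)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_xy = np.array([[0.4, 0.1],          # illustrative joint table P(x, y)
                 [0.1, 0.4]])
h_xy = entropy(p_xy)                  # joint entropy H(X, Y)
h_x = entropy(p_xy.sum(axis=1))       # marginal H(X)
h_y = entropy(p_xy.sum(axis=0))       # marginal H(Y)

h_x_given_y = h_xy - h_y              # conditional entropy H(X | Y)
mutual_info = h_x + h_y - h_xy        # mutual information I(X; Y)
print(h_x_given_y, mutual_info)       # ~0.72 and ~0.28 bits
```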
In quantum information theory, the joint entropy is generalized into the joint quantum entropy.