# Multivariate

Multivariate measures of information generally attempt to capture some global property of a joint distribution. For example, they might quantify how much information is shared among the random variables, or quantify how "non-independent" the joint distribution is.

## Total Information

These quantities, currently just the Shannon entropy, measure the total amount of information contained in a set of jointly distributed random variables.
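The joint Shannon entropy is $H(X_{0:n}) = -\sum_x p(x) \log_2 p(x)$, summed over joint outcomes. A minimal standard-library sketch (not this library's API; the distribution and function names are illustrative):

```python
from math import log2

def joint_entropy(pmf):
    """Shannon entropy H(X_0, ..., X_{n-1}) = -sum p * log2(p).

    `pmf` maps joint-outcome tuples to probabilities.
    """
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# Giant bit on two variables: both bits always agree.
giant_bit = {('0', '0'): 0.5, ('1', '1'): 0.5}
print(joint_entropy(giant_bit))  # → 1.0
```

Only the two joint outcomes contribute, so the entropy is one bit even though two variables are involved.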

## Mutual Informations

These measures all reduce to the standard Shannon Mutual Information for bivariate distributions.
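For instance, the total correlation $T = \sum_i H(X_i) - H(X_{0:n})$ reduces, for two variables, to $H(X) + H(Y) - H(X,Y) = I(X;Y)$. A self-contained sketch of this reduction using only the standard library (the helper names and the example distribution are illustrative, not part of any library):

```python
from math import log2
from collections import defaultdict

def entropy(pmf):
    """Shannon entropy of a pmf given as a dict of probabilities."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, idx):
    """Marginal distribution of variable `idx` from a joint pmf over tuples."""
    m = defaultdict(float)
    for outcome, p in pmf.items():
        m[outcome[idx]] += p
    return dict(m)

def total_correlation(pmf):
    """T = sum_i H(X_i) - H(X_0, ..., X_{n-1})."""
    n = len(next(iter(pmf)))
    return sum(entropy(marginal(pmf, i)) for i in range(n)) - entropy(pmf)

# A correlated pair of bits (illustrative numbers).
joint = {('0', '0'): 0.4, ('0', '1'): 0.1,
         ('1', '0'): 0.1, ('1', '1'): 0.4}
mi = entropy(marginal(joint, 0)) + entropy(marginal(joint, 1)) - entropy(joint)
print(total_correlation(joint), mi)  # the two values coincide
```

With more than two variables the measures in this section generalize $I(X;Y)$ in different, generally inequivalent ways.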

It is perhaps illustrative to consider how each of these measures behaves on two canonical distributions: the giant bit and parity.

In the tables below, each column is a measure evaluated on the distribution: $I$ the co-information, $II$ the interaction information, $T$ the total correlation, $B$ the dual total correlation, and $J$ the CAEKL mutual information.

**giant bit**

| size | $I$ | $II$ | $T$ | $B$ | $J$ |
| --- | --- | --- | --- | --- | --- |
| 2 | 1 | 1 | 1 | 1 | 1 |
| 3 | 1 | $-1$ | 2 | 1 | 1 |
| 4 | 1 | 1 | 3 | 1 | 1 |
| 5 | 1 | $-1$ | 4 | 1 | 1 |
| $n$ | 1 | $(-1)^n$ | $n$ | 1 | 1 |

**parity**

| size | $I$ | $II$ | $T$ | $B$ | $J$ |
| --- | --- | --- | --- | --- | --- |
| 2 | 1 | 1 | 1 | 1 | 1 |
| 3 | $-1$ | 1 | 1 | 2 | $\frac{1}{2}$ |
| 4 | 1 | 1 | 1 | 3 | $\frac{1}{3}$ |
| 5 | $-1$ | 1 | 1 | 4 | $\frac{1}{4}$ |
| $n$ | $(-1)^n$ | 1 | 1 | $n$ | $\frac{1}{n-1}$ |
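The size-3 rows of the tables can be checked by hand. A standard-library sketch (not this library's API; names are illustrative) that builds both distributions and computes $T$ and $B$, where $B = H(X_{0:n}) - \sum_i H(X_i \mid X_{\text{rest}})$:

```python
from math import log2
from collections import defaultdict
from itertools import product

def entropy(pmf):
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, indices):
    """Marginal over the variables in `indices`."""
    m = defaultdict(float)
    for outcome, p in pmf.items():
        m[tuple(outcome[i] for i in indices)] += p
    return dict(m)

def total_correlation(pmf):
    # T = sum_i H(X_i) - H(X_{0:n})
    n = len(next(iter(pmf)))
    return sum(entropy(marginal(pmf, [i])) for i in range(n)) - entropy(pmf)

def dual_total_correlation(pmf):
    # B = H(X_{0:n}) - sum_i H(X_i | X_{rest}),
    # with H(X_i | X_{rest}) = H(X_{0:n}) - H(X_{rest}).
    n = len(next(iter(pmf)))
    h = entropy(pmf)
    cond = sum(h - entropy(marginal(pmf, [j for j in range(n) if j != i]))
               for i in range(n))
    return h - cond

# Size-3 giant bit: all three bits always agree.
giant_bit = {('0',) * 3: 0.5, ('1',) * 3: 0.5}
# Size-3 parity: uniform over the even-parity bit strings.
parity = {bits: 0.25 for bits in product('01', repeat=3)
          if bits.count('1') % 2 == 0}

print(total_correlation(giant_bit), dual_total_correlation(giant_bit))  # → 2.0 1.0
print(total_correlation(parity), dual_total_correlation(parity))        # → 1.0 2.0
```

These match the size-3 table rows: $T = 2$, $B = 1$ for the giant bit, and $T = 1$, $B = 2$ for parity.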

## Common Informations

These measures each capture a notion of shared information, but do not reduce to the mutual information in the bivariate case.

### Ordering

The common information measures (together with the Dual Total Correlation $B$ and the CAEKL Mutual Information $J$) form an ordering:

$$K[X_{0:n}] \leq J[X_{0:n}] \leq B[X_{0:n}] \leq C[X_{0:n}] \leq G[X_{0:n}] \leq F[X_{0:n}] \leq M[X_{0:n}]$$

where $K$ is the Gács-Körner common information, $C$ the Wyner common information, $G$ the exact common information, $F$ the functional common information, and $M$ the common information induced by the joint minimal sufficient statistic.

## Others

These measures quantify other aspects of a joint distribution.