
# medpy.metric.image.mutual_information

medpy.metric.image.mutual_information(i1, i2, bins=256)

Computes the mutual information (MI) (a measure of entropy) between two images.

MI is not a real metric, but a symmetric and nonnegative similarity measure that takes high values for similar images. Negative values are nevertheless possible in practice due to numerical effects.

Intuitively, mutual information measures the information that i1 and i2 share: it measures how much knowing one of these variables reduces uncertainty about the other.

The entropy is defined as:

$H(X) = - \sum_i p(g_i) \ln(p(g_i))$

with $p(g_i)$ being the intensity probability of the image's grey value $g_i$.
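As an illustrative sketch (not medpy's own code; the function name is hypothetical), this entropy can be estimated from a grey-value histogram with NumPy:

```python
import numpy as np

def image_entropy(image, bins=256):
    """Estimate H(X) = -sum_i p(g_i) * ln(p(g_i)) from a grey-value histogram."""
    hist, _ = np.histogram(np.asarray(image).ravel(), bins=bins)
    p = hist / hist.sum()  # intensity probabilities p(g_i)
    p = p[p > 0]           # ln(0) is undefined; empty bins contribute nothing
    return -np.sum(p * np.log(p))
```

A constant image yields zero entropy, while an image whose grey values spread evenly over the bins yields the maximum entropy for that bin count.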

Assuming two images $R$ and $T$, the mutual information is then computed by comparing the images' entropy values (i.e. a measure of how well-structured the joint histogram is). The measure is calculated as follows:

$MI(R,T) = H(R) + H(T) - H(R,T) = H(R) - H(R|T) = H(T) - H(T|R)$

Maximizing the mutual information is equivalent to minimizing the joint entropy.

Parameters:

i1 : array_like
    The first image.
i2 : array_like
    The second image.
bins : integer
    The number of histogram bins (squared for the joint histogram).

Returns:

mutual_information : float
    The mutual information between the supplied images.

Raises:

ArgumentError
    If the supplied arrays are of different shape.
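The computation described above can be sketched as a stand-alone NumPy re-implementation (a hypothetical sketch of the histogram-based approach, not medpy's actual source):

```python
import numpy as np

def mutual_information_sketch(i1, i2, bins=256):
    """MI(R, T) = H(R) + H(T) - H(R, T), estimated from intensity histograms."""
    i1 = np.asarray(i1).ravel()
    i2 = np.asarray(i2).ravel()
    if i1.shape != i2.shape:
        raise ValueError("the supplied arrays are of different shape")

    # joint histogram with bins * bins cells, normalised to probabilities
    joint, _, _ = np.histogram2d(i1, i2, bins=bins)
    p_joint = joint / joint.sum()

    # marginal intensity distributions of each image
    p1 = p_joint.sum(axis=1)
    p2 = p_joint.sum(axis=0)

    def entropy(p):
        p = p[p > 0]  # zero-probability bins contribute nothing
        return -np.sum(p * np.log(p))

    return entropy(p1) + entropy(p2) - entropy(p_joint.ravel())
```

For identical inputs the result reduces to the image's own entropy, since $H(R,R) = H(R)$; for images with independent intensity distributions it approaches zero.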