medpy.metric.histogram.kullback_leibler

medpy.metric.histogram.kullback_leibler(h1, h2)

Kullback-Leibler divergence.

Compute how inefficient it would be to code one histogram into another. Actually computes \(\frac{d_{KL}(h1, h2) + d_{KL}(h2, h1)}{2}\) to achieve symmetry.

The Kullback-Leibler divergence between two histograms \(H\) and \(H'\) of size \(M\) is defined as:

\[d_{KL}(H, H') = \sum_{m=1}^M H_m\log\frac{H_m}{H'_m}\]
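
The symmetrized computation can be sketched in plain NumPy as follows. This is a minimal illustration of the formula above, not medpy's actual implementation; the function name symmetric_kl is hypothetical:

    import numpy as np

    def symmetric_kl(h1, h2):
        """Sketch of the symmetrized Kullback-Leibler divergence.

        Assumes h1 and h2 are normalized histograms over the same bins,
        with h1[i] > 0 wherever h2[i] > 0 and vice versa.
        """
        h1 = np.asarray(h1, dtype=float)
        h2 = np.asarray(h2, dtype=float)
        # d_KL(H, H') = sum_m H_m * log(H_m / H'_m); restrict the sum to
        # bins where both histograms are non-zero (0 * log 0 is taken as 0).
        mask = (h1 != 0) & (h2 != 0)
        d12 = np.sum(h1[mask] * np.log(h1[mask] / h2[mask]))
        d21 = np.sum(h2[mask] * np.log(h2[mask] / h1[mask]))
        return (d12 + d21) / 2.0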

Attributes:

  • quasimetric (but made symmetric)

Attributes for normalized histograms:

  • \(d(H, H')\in[0, \infty)\)
  • \(d(H, H) = 0\)
  • \(d(H, H') = d(H', H)\)

Attributes for not-normalized histograms:

  • not applicable

Attributes for not-equal histograms:

  • not applicable

Parameters:

h1 : sequence

The first histogram, normalized. Must satisfy h1[i] > 0 for any i where h2[i] > 0.

h2 : sequence

The second histogram, normalized, defined over the same bins as h1. Must satisfy h2[i] > 0 for any i where h1[i] > 0.

Returns:

kullback_leibler : float

The symmetric Kullback-Leibler divergence between the two histograms.
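
A minimal usage example, with histogram values chosen purely for illustration:

    from medpy.metric.histogram import kullback_leibler

    # Two normalized histograms over the same four bins
    h1 = [0.1, 0.4, 0.3, 0.2]
    h2 = [0.2, 0.3, 0.3, 0.2]

    print(kullback_leibler(h1, h2))  # small positive float
    print(kullback_leibler(h1, h1))  # 0.0, since d(H, H) = 0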