medpy.metric.histogram.jensen_shannon

medpy.metric.histogram.jensen_shannon(h1, h2)

Jensen-Shannon divergence.

A symmetrized and smoothed variant of the Kullback-Leibler divergence that is numerically more stable, since each histogram is compared against the average \(H^*\) rather than directly against the other.

The Jensen-Shannon divergence between two histograms \(H\) and \(H'\) of size \(m\) is defined as:

\[d_{JSD}(H, H') = \frac{1}{2} d_{KL}(H, H^*) + \frac{1}{2} d_{KL}(H', H^*)\]

with \(H^*=\frac{1}{2}(H + H')\).
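
Written directly from this definition, a minimal NumPy sketch looks as follows. This is illustrative only, not medpy's actual implementation; it assumes the natural logarithm, for which normalized inputs yield values in \([0, \ln 2] \subseteq [0, 1]\).

```python
import numpy as np

def jsd_sketch(h1, h2):
    """Jensen-Shannon divergence per the definition above.

    Illustrative sketch only, not medpy's implementation.
    """
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    h_star = (h1 + h2) / 2.0  # H* = (H + H') / 2

    def kl(p, q):
        # Kullback-Leibler divergence with the 0 * log(0) := 0 convention;
        # wherever p > 0, h_star >= p / 2 > 0, so the division is safe.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    return 0.5 * kl(h1, h_star) + 0.5 * kl(h2, h_star)
```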

Attributes:

  • semimetric

Attributes for normalized histograms:

  • \(d(H, H')\in[0, 1]\)

  • \(d(H, H) = 0\)

  • \(d(H, H') = d(H', H)\)

Attributes for not-normalized histograms:

  • \(d(H, H')\in[0, \infty)\)

  • \(d(H, H) = 0\)

  • \(d(H, H') = d(H', H)\)

Attributes for not-equal histograms:

  • not applicable
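
The attributes listed above for normalized histograms can be spot-checked numerically. A small example with hypothetical histogram values, assuming medpy is installed:

```python
import numpy as np
from medpy.metric.histogram import jensen_shannon

h1 = np.array([0.05, 0.25, 0.40, 0.20, 0.10])  # normalized: sums to 1
h2 = np.array([0.10, 0.30, 0.30, 0.20, 0.10])  # same bins, sums to 1

assert np.isclose(jensen_shannon(h1, h1), 0.0)                     # d(H, H) = 0
assert np.isclose(jensen_shannon(h1, h2), jensen_shannon(h2, h1))  # symmetry
assert 0.0 <= jensen_shannon(h1, h2) <= 1.0                        # bounded range
```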

Parameters:

h1 : sequence

    The first histogram.

h2 : sequence

    The second histogram, same bins as h1.

Returns:

jensen_shannon : float

    Jensen-Shannon divergence.
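
A typical end-to-end use builds both histograms over the same bins before comparing them. A hedged sketch with hypothetical data:

```python
import numpy as np
from medpy.metric.histogram import jensen_shannon

# Hypothetical samples; any two data vectors binned identically will do.
rng = np.random.default_rng(42)
a = rng.integers(0, 5, 500)
b = rng.integers(0, 5, 500)

# Same bin edges for both inputs, as required by the h2 parameter.
h1, _ = np.histogram(a, bins=5, range=(0, 5))
h2, _ = np.histogram(b, bins=5, range=(0, 5))

# Normalize so the divergence falls in the [0, 1] range stated above.
h1 = h1 / h1.sum()
h2 = h2 / h2.sum()

print(jensen_shannon(h1, h2))  # 0.0 only if the histograms coincide
```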