
# medpy.metric.histogram.jensen_shannon

`medpy.metric.histogram.jensen_shannon(h1, h2)`

Jensen-Shannon divergence.

A symmetric and numerically more stable empirical extension of the Kullback-Leibler divergence.

The Jensen-Shannon divergence between two histograms $$H$$ and $$H'$$ of size $$m$$ is defined as:

$d_{JSD}(H, H') = \frac{1}{2} d_{KL}(H, H^*) + \frac{1}{2} d_{KL}(H', H^*)$

with $$H^*=\frac{1}{2}(H + H')$$.

Attributes:

• semimetric

Attributes for normalized histograms:

• $$d(H, H')\in[0, 1]$$
• $$d(H, H) = 0$$
• $$d(H, H') = d(H', H)$$

Attributes for not-normalized histograms:

• $$d(H, H')\in[0, \infty)$$
• $$d(H, H) = 0$$
• $$d(H, H') = d(H', H)$$

Attributes for not-equal histograms:

• not applicable
Parameters:

• h1 (sequence): The first histogram.
• h2 (sequence): The second histogram, same bins as h1.

Returns:

• jensen_shannon (float): Jensen-Shannon divergence.
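The definition above can be sketched directly from the formula. This is a hypothetical reimplementation for illustration, not medpy's actual source; it assumes a base-2 logarithm, which is what yields the $$[0, 1]$$ range for normalized histograms stated above (a natural-log variant would bound the value by $$\ln 2$$ instead).

```python
import numpy as np

def jensen_shannon(h1, h2):
    """Jensen-Shannon divergence between two histograms (illustrative sketch)."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    # H* = (H + H') / 2, the average histogram
    m = (h1 + h2) / 2.0

    def kl(p, q):
        # Kullback-Leibler divergence d_KL(p, q); bins with p == 0
        # contribute 0 by the convention 0 * log(0) = 0. Wherever
        # p > 0, m = (p + q) / 2 is also > 0, so the ratio is defined.
        mask = p > 0
        return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

    # d_JSD(H, H') = 1/2 d_KL(H, H*) + 1/2 d_KL(H', H*)
    return 0.5 * kl(h1, m) + 0.5 * kl(h2, m)
```

With base 2, two completely disjoint normalized histograms such as `[1, 0]` and `[0, 1]` reach the maximum value 1, identical histograms give 0, and swapping the arguments leaves the result unchanged, matching the symmetry attribute listed above.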