jax.scipy.stats.poisson.entropy


jax.scipy.stats.poisson.entropy(mu, loc=0)

Shannon entropy of the Poisson distribution.

JAX implementation of the entropy() method of scipy.stats.poisson.

The entropy \(H(X)\) of a Poisson random variable \(X \sim \text{Poisson}(\mu)\) is defined as:

\[H(X) = -\sum_{k=0}^\infty p(k) \log p(k)\]

where \(p(k) = e^{-\mu} \mu^k / k!\) for integer \(k \geq 0\). The loc parameter shifts the support to begin at loc, but because entropy is invariant under translation it does not change the result.
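As a check of the definition, the sum can be evaluated by truncating at a finite upper bound; a minimal sketch in JAX (the helper name and `kmax` cutoff are illustrative assumptions, not part of this API):

```python
import jax.numpy as jnp
from jax.scipy.special import gammaln

def poisson_entropy_direct(mu, kmax=200):
    """Direct truncated summation of -sum_k p(k) log p(k).

    Valid when kmax comfortably exceeds mu plus a few standard
    deviations, so the discarded tail mass is negligible.
    """
    k = jnp.arange(kmax)
    # log PMF computed via log-gamma for numerical stability
    log_p = -mu + k * jnp.log(mu) - gammaln(k + 1.0)
    p = jnp.exp(log_p)  # underflows to 0 for k far in the tail
    return -jnp.sum(p * log_p)
```

For \(\mu = 5\) this agrees with the value shown in the Examples below to roughly float32 precision.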

This implementation uses regime switching for numerical stability and performance:

  • Small \(\mu < 10\): Direct summation over PMF with adaptive upper bound \(k \leq \mu + 20\)

  • Medium \(10 \leq \mu < 100\): Summation with bound \(k \leq \mu + 10\sqrt{\mu} + 20\)

  • Large \(\mu \geq 100\): Asymptotic Stirling approximation: \(H(\mu) \approx \frac{1}{2} \log(2\pi e \mu) - \frac{1}{12\mu}\)

Matches SciPy to relative error \(< 10^{-5}\) across all regimes.
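The large-\(\mu\) branch and the switch between regimes can be sketched with jnp.where; the function names and the single threshold below are illustrative assumptions, not the library's internals:

```python
import jax.numpy as jnp
from jax.scipy.special import gammaln

def entropy_stirling(mu):
    # Asymptotic Stirling form for large mu:
    #   H(mu) ~ 0.5 * log(2*pi*e*mu) - 1/(12*mu)
    return 0.5 * jnp.log(2.0 * jnp.pi * jnp.e * mu) - 1.0 / (12.0 * mu)

def entropy_switched(mu, kmax=500):
    # Illustrative regime switch (assumed single cutoff at mu = 100):
    # direct truncated summation below it, Stirling asymptotics above.
    k = jnp.arange(kmax)
    log_p = -mu + k * jnp.log(mu) - gammaln(k + 1.0)
    direct = -jnp.sum(jnp.exp(log_p) * log_p)
    return jnp.where(mu >= 100.0, entropy_stirling(mu), direct)
```

Note that jnp.where evaluates both branches; this is safe here for \(\mu > 0\), and keeps the function compatible with jax.jit.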

Parameters:

  • mu – mean \(\mu\) of the distribution, array-like; must be positive.

  • loc – location offset of the support (default 0); does not affect the entropy.

Returns:

Array of entropy values with shape broadcast from mu and loc. Returns NaN for mu <= 0.

Return type:

Array

Examples

>>> import jax.numpy as jnp
>>> from jax.scipy.stats import poisson
>>> poisson.entropy(5.0)
Array(2.204394, dtype=float32)
>>> poisson.entropy(jnp.array([1, 10, 100]))
Array([1.3048419, 2.561407 , 3.7206903], dtype=float32)