Computes the symmetric Kullback-Leibler (KL) divergence between functional data objects treated as probability distributions. Curves are first shifted and normalized so that each is a valid probability density function.
Arguments
- fdataobj
An object of class 'fdata'.
- fdataref
An object of class 'fdata' to compare against. If NULL, pairwise self-distances among the curves of fdataobj are computed.
- eps
Small positive value added to the curves for numerical stability, avoiding zeros inside the logarithm (default 1e-10).
- normalize
Logical. If TRUE (default), curves are shifted to be non-negative and normalized to integrate to 1.
- ...
Additional arguments (ignored).
Details
The symmetric KL divergence is computed as: $$D_{KL}(f, g) = \frac{1}{2}[KL(f||g) + KL(g||f)]$$ where $$KL(f||g) = \int f(t) \log\frac{f(t)}{g(t)} dt$$
When normalize = TRUE, each curve is first shifted to be strictly positive
(by subtracting its minimum and adding eps), then normalized to integrate
to 1. This makes the curves valid probability density functions.
The symmetric KL divergence is always non-negative and equals zero only when the two distributions are identical. However, it does not satisfy the triangle inequality, so it is a semi-metric rather than a true metric.
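The procedure in Details can be sketched numerically as follows. This is an illustrative Python sketch, not the package's implementation: the helper names (sym_kl, trapz) and the test curves are hypothetical, and integration uses a simple trapezoidal rule over the sampling grid.

```python
import numpy as np

def trapz(y, x):
    # Trapezoidal rule: integrate samples y over grid x.
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def sym_kl(f, g, t, eps=1e-10, normalize=True):
    """Symmetric KL divergence between two curves sampled on grid t.

    Hypothetical helper mirroring the steps described above.
    """
    f = np.asarray(f, dtype=float)
    g = np.asarray(g, dtype=float)
    if normalize:
        # Shift each curve to be strictly positive, then scale so it
        # integrates to 1 (a valid probability density function).
        f = f - f.min() + eps
        g = g - g.min() + eps
        f = f / trapz(f, t)
        g = g / trapz(g, t)
    # KL(f||g) and KL(g||f), then average for the symmetric version.
    kl_fg = trapz(f * np.log(f / g), t)
    kl_gf = trapz(g * np.log(g / f), t)
    return 0.5 * (kl_fg + kl_gf)

t = np.linspace(0.0, 1.0, 201)
f = np.sin(np.pi * t)          # example curve 1
g = np.cos(np.pi * t / 2.0)    # example curve 2
d = sym_kl(f, g, t)
```

By construction the result is symmetric in its two arguments and is zero when a curve is compared with itself, matching the properties stated above.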