Convenience functions

Common entropy-based measures

We provide a few convenience functions under widely used names for entropy and "entropy-like" quantities. Analogous specialized convenience functions can easily be defined in a couple of lines of code.

We emphasize that these functions are nothing more than two-line wrappers that call information with the appropriate OutcomeSpace and InformationMeasure.
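As an illustration, a user-defined wrapper for the Shannon entropy over ordinal patterns could look like this (a minimal sketch; the name my_permutation_entropy is hypothetical, while OrdinalPatterns, Shannon and information are existing package API):

using ComplexityMeasures

# Hypothetical two-line wrapper, analogous to the convenience functions below.
my_permutation_entropy(x; m = 3, τ = 1, base = 2) =
    information(Shannon(base), OrdinalPatterns(; m, τ), x)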

ComplexityMeasures.entropy_wavelet - Function
entropy_wavelet(x; wavelet = Wavelets.WT.Daubechies{12}(), base = 2)

Compute the wavelet entropy. This function is just a convenience call to:

est = WaveletOverlap(wavelet)
information(Shannon(base), est, x)

See WaveletOverlap for more info.
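A minimal usage sketch (the random signal is purely illustrative):

using ComplexityMeasures

x = randn(1000)          # example timeseries
h = entropy_wavelet(x)   # Daubechies-12 wavelet and base-2 logarithm by default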

ComplexityMeasures.entropy_dispersion - Function
entropy_dispersion(x; base = 2, kwargs...)

Compute the dispersion entropy. This function is just a convenience call to:

est = Dispersion(; kwargs...)
information(Shannon(base), est, x)

See Dispersion for more info.
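A minimal usage sketch (the keyword values shown are illustrative and are forwarded to Dispersion, which we assume accepts them):

using ComplexityMeasures

x = randn(1000)                          # example timeseries
h = entropy_dispersion(x; m = 2, c = 5)  # m, c forwarded to Dispersion (assumed keywords)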

ComplexityMeasures.entropy_approx - Function
entropy_approx(x; m = 2, τ = 1, r = 0.2 * Statistics.std(x), base = MathConstants.e)

Convenience syntax for computing the approximate entropy (Pincus, 1991) for timeseries x.

This is just a wrapper for complexity(ApproximateEntropy(; m, τ, r, base), x) (see also ApproximateEntropy).
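A minimal usage sketch on a noisy periodic signal (the signal construction is illustrative):

using ComplexityMeasures

x = sin.(0.1 .* (1:2000)) .+ 0.1 .* randn(2000)  # example: noisy sine wave
h = entropy_approx(x; m = 2)                     # r defaults to 0.2 * std(x)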

ComplexityMeasures.entropy_distribution - Function
entropy_distribution(x; τ = 1, m = 3, n = 3, base = 2)

Compute the distribution entropy (Li et al., 2015) of x, using embedding dimension m with delay/lag τ. Distances between embedding vectors are computed with the Chebyshev metric, and probabilities are estimated from an n-element equally-spaced binning of the distribution of those distances.

This function is just a convenience call to:

o = SequentialPairDistances(x, n, m, τ, metric = Chebyshev())
h = information(Shannon(base), o, x)

See SequentialPairDistances for more info.
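A minimal usage sketch using the defaults from the signature above (the random signal is purely illustrative):

using ComplexityMeasures

x = randn(1000)                            # example timeseries
h = entropy_distribution(x; m = 3, n = 3)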
