TransferEntropy.mutualinfo — Function
mutualinfo(x, y, est; base = 2, q = 1)

Estimate the mutual information $I^{q}(x; y)$ between x and y, using the provided entropy/probability estimator est from Entropies.jl, or a specialized estimator from TransferEntropy.jl (e.g. Kraskov1), and the Rényi entropy of order q (defaults to q = 1, which is the Shannon entropy), with logarithms to the given base.

Both x and y can be vectors or (potentially multivariate) Datasets.
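
For instance, here is a minimal sketch of both call styles. The estimator choice is illustrative: it assumes Entropies.jl's VisitationFrequency probabilities estimator with a RectangularBinning, and that Dataset is available from DelayEmbeddings.jl.

```julia
using TransferEntropy, Entropies, DelayEmbeddings

# Scalar time series as plain vectors (purely illustrative noise).
x = rand(1000)
y = rand(1000)
est = VisitationFrequency(RectangularBinning(4)) # 4 intervals per dimension
mutualinfo(x, y, est)

# Multivariate inputs as Datasets.
X = Dataset(rand(1000, 2))
Y = Dataset(rand(1000, 2))
mutualinfo(X, Y, est; base = 2, q = 1)
```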

The nearest-neighbor based estimators, which compute entropies directly, deserve special mention. The choice is between naive estimation, where the KozachenkoLeonenko or Kraskov entropy estimators compute each marginal entropy separately, and the dedicated Kraskov1 and Kraskov2 $I$ estimators, which reduce bias relative to the naive approach.

Note: nearest-neighbor estimators can only estimate Shannon entropy, so for them the keyword q cannot be provided; it is hard-coded to q = 1.
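
A hedged sketch of the two approaches follows; the constructor arguments (such as the neighbor count passed to Kraskov1 and Kraskov2) are assumptions about the estimator APIs, not documented defaults.

```julia
using TransferEntropy, Entropies

x, y = rand(2000), rand(2000)

# Naive: each marginal entropy is estimated separately, then combined.
mutualinfo(x, y, Kraskov(k = 4))
mutualinfo(x, y, KozachenkoLeonenko())

# Dedicated estimators with reduced bias; Shannon only, so no q keyword.
mutualinfo(x, y, Kraskov1(4))
mutualinfo(x, y, Kraskov2(4))
```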

Description

Mutual information $I$ between $X$ and $Y$ is defined as

\[I(X; Y) = \sum_{y \in Y} \sum_{x \in X} p(x, y) \log \left( \dfrac{p(x, y)}{p(x)p(y)} \right)\]

Here, we rewrite this expression as the sum of the marginal entropies, and extend the definition of $I$ to use generalized Rényi entropies

\[I^{q}(X; Y) = H^{q}(X) + H^{q}(Y) - H^{q}(X, Y),\]

where $H^{q}(\cdot)$ is the generalized Rényi entropy of order $q$, i.e., the genentropy function from Entropies.jl.
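
To make the decomposition concrete, the sketch below computes $I^{q}$ by hand from the three genentropy terms and compares the result with mutualinfo. It assumes genentropy accepts Datasets with base and q keywords, and that Dataset(x, y) builds the joint dataset.

```julia
using TransferEntropy, Entropies, DelayEmbeddings

x, y = Dataset(rand(1000)), Dataset(rand(1000))
est = VisitationFrequency(RectangularBinning(0.2)) # boxes of side length 0.2
b, q = 2, 1

# I^q(X; Y) = H^q(X) + H^q(Y) - H^q(X, Y)
hx  = genentropy(x, est; base = b, q = q)
hy  = genentropy(y, est; base = b, q = q)
hxy = genentropy(Dataset(x, y), est; base = b, q = q)

hx + hy - hxy ≈ mutualinfo(x, y, est; base = b, q = q)
```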

source
TransferEntropy.conditional_mutualinfo — Function
conditional_mutualinfo(x, y, z, est; base = 2, q = 1)

Estimate the conditional mutual information $I^{q}(x; y | z)$ between x and y, given z, using the provided entropy/probability estimator est from Entropies.jl, or a specialized estimator from TransferEntropy.jl (e.g. Kraskov1), and the Rényi entropy of order q (defaults to q = 1, which is the Shannon entropy), with logarithms to the given base.

As for mutualinfo, the variables x, y and z can be vectors or (potentially multivariate) Datasets, and the keyword q cannot be provided for nearest-neighbor estimators (it is hard-coded to q = 1).
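
A brief usage sketch, with the same caveat as above that the estimator constructor arguments are assumptions:

```julia
using TransferEntropy, Entropies

x, y, z = rand(1000), rand(1000), rand(1000)

# Shannon CMI with a dedicated nearest-neighbor estimator (q fixed at 1).
conditional_mutualinfo(x, y, z, Kraskov1(3))

# Rényi CMI of order q = 2 with a probability estimator.
est = VisitationFrequency(RectangularBinning(5))
conditional_mutualinfo(x, y, z, est; base = 2, q = 2)
```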

source