Conditional mutual information
TransferEntropy.conditional_mutualinfo — Function

    conditional_mutualinfo(x, y, z, est; base = 2, q = 1)
Estimate $I^{q}(x; y | z)$, the conditional mutual information between x and y given z, using the provided entropy/probability estimator est from Entropies.jl, or a specialized estimator from TransferEntropy.jl (e.g. Kraskov1), and the Rényi entropy of order q (defaults to q = 1, which gives the Shannon entropy), with logarithms to the given base.
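A minimal usage sketch with scalar time series. The random data, the neighbor count passed to Kraskov1, and the assumption that Kraskov1 takes that count as a positional argument are illustrative, not taken from the docstring:

```julia
using TransferEntropy

# Three scalar time series (hypothetical data).
x, y, z = rand(1000), rand(1000), rand(1000)

# Shannon CMI (q = 1 is implied for nearest-neighbor estimators) with
# logarithms to base 2; the neighbor count 4 is an assumed argument.
cmi = conditional_mutualinfo(x, y, z, Kraskov1(4); base = 2)
```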
As for mutualinfo, the variables x, y and z can be vectors or (potentially multivariate) Datasets, and the keyword q cannot be provided for nearest-neighbor estimators (it is hard-coded to q = 1).
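A sketch with a multivariate marginal, assuming Dataset is available from DelayEmbeddings.jl and VisitationFrequency/RectangularBinning from Entropies.jl, as in those packages' documented APIs; the data and bin count are illustrative:

```julia
using TransferEntropy, Entropies, DelayEmbeddings

# A two-dimensional marginal as a Dataset, plus two scalar series.
x = Dataset(rand(1000, 2))
y, z = rand(1000), rand(1000)

# A binning-based probability estimator from Entropies.jl; for such
# (non-nearest-neighbor) estimators the Rényi order q may be varied.
est = VisitationFrequency(RectangularBinning(4))
cmi = conditional_mutualinfo(x, y, z, est; base = 2, q = 2)
```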