Conditional mutual information API

The conditional mutual information (CMI) API is defined by

CausalityTools.condmutualinfo — Function
condmutualinfo([measure::CMI], est::CMIEstimator, x, y, z) → cmi::Real

Estimate the conditional mutual information (CMI) of the kind specified by measure, between x and y given z, using the dedicated ConditionalMutualInformationEstimator est, which may be discrete, continuous, or mixed.

Estimators

| Estimator | Principle | CMIShannon | CMIRenyiPoczos |
|---|---|---|---|
| FPVP | Nearest neighbors | ✓ | x |
| MesnerShalizi | Nearest neighbors | ✓ | x |
| Rahimzamani | Nearest neighbors | ✓ | x |
| PoczosSchneiderCMI | Nearest neighbors | x | ✓ |
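For illustration, a minimal usage sketch of this method is given below (the data, seed, and parameter values are arbitrary; any estimator from the table above could be substituted for FPVP):

```julia
using CausalityTools
using Random

rng = MersenneTwister(1234)
n = 2000
# z drives both x and y, so the Shannon CMI I(x; y | z) should be near zero.
z = randn(rng, n)
x = z .+ 0.2 .* randn(rng, n)
y = z .+ 0.2 .* randn(rng, n)

# Dedicated nearest-neighbor CMI estimator (FPVP).
condmutualinfo(CMIShannon(), FPVP(k = 5), x, y, z)
```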
condmutualinfo([measure::CMI], est::ProbabilitiesEstimator, x, y, z) → cmi::Real ∈ [0, a)

Estimate the conditional mutual information (CMI) measure between x and y given z as a sum of entropy terms, each computed with the provided ProbabilitiesEstimator est, without any bias correction. If measure is not given, then the default is CMIShannon().

With a ProbabilitiesEstimator, the returned cmi is guaranteed to be non-negative.

Estimators

| Estimator | Principle | CMIShannon | CMIRenyiSarbu |
|---|---|---|---|
| CountOccurrences | Frequencies | ✓ | ✓ |
| ValueHistogram | Binning (histogram) | ✓ | ✓ |
| SymbolicPermutation | Ordinal patterns | ✓ | ✓ |
| Dispersion | Dispersion patterns | ✓ | ✓ |
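As a sketch of the discrete route, the example below bins each marginal with ValueHistogram and computes Shannon CMI from the resulting entropy terms (the binning choice is arbitrary):

```julia
using CausalityTools

x, y, z = randn(1000), randn(1000), randn(1000)

# Discretize each marginal into 4 bins per dimension; CMI is then a sum
# of discrete Shannon entropy terms (no bias correction).
est = ValueHistogram(RectangularBinning(4))
condmutualinfo(CMIShannon(), est, x, y, z)
```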
condmutualinfo([measure::CMI], est::DifferentialEntropyEstimator, x, y, z) → cmi::Real

Estimate the mutual information between x and y conditioned on z, using the differential version of the given conditional mutual information (CMI) measure. The DifferentialEntropyEstimator est must support multivariate data. No bias correction is performed. If measure is not given, then the default is CMIShannon().

Note

DifferentialEntropyEstimators have their own base field which is not used here. Instead, this method creates a copy of est internally, where est.base is replaced by measure.e.base. Therefore, use measure to control the "unit" of the mutual information.

Estimators

| Estimator | Principle | CMIShannon |
|---|---|---|
| Kraskov | Nearest neighbors | ✓ |
| Zhu | Nearest neighbors | ✓ |
| Gao | Nearest neighbors | ✓ |
| Goria | Nearest neighbors | ✓ |
| Lord | Nearest neighbors | ✓ |
| LeonenkoProzantoSavani | Nearest neighbors | ✓ |
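A minimal sketch of this entropy-decomposition route, here with the Kraskov nearest-neighbor entropy estimator (data and parameter choices are arbitrary):

```julia
using CausalityTools

x, y, z = randn(1000), randn(1000), randn(1000)

# Each differential entropy term in the CMI decomposition is estimated
# with the Kraskov k-nearest-neighbor estimator.
condmutualinfo(CMIShannon(), Kraskov(k = 5), x, y, z)
```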
condmutualinfo([measure::CMI], est::MutualInformationEstimator, x, y, z) → cmi::Real

Estimate the mutual information between x and y conditioned on z, using the given conditional mutual information (CMI) measure, computed as a difference of mutual information terms (via the chain rule of mutual information):

\[\hat{I}(X; Y | Z) = \hat{I}(X; Y, Z) - \hat{I}(X; Z).\]

The MutualInformationEstimator est may be continuous/differential, discrete, or mixed. No bias correction is performed, beyond whatever bias correction occurs within each individual mutual information term. If measure is not given, then the default is CMIShannon().

Estimators

| Estimator | Type | Principle | CMIShannon |
|---|---|---|---|
| KraskovStögbauerGrassberger1 | Continuous | Nearest neighbors | ✓ |
| KraskovStögbauerGrassberger2 | Continuous | Nearest neighbors | ✓ |
| GaoKannanOhViswanath | Mixed | Nearest neighbors | ✓ |
| GaoOhViswanath | Continuous | Nearest neighbors | ✓ |
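The following sketch illustrates the chain-rule decomposition with the KraskovStögbauerGrassberger1 mutual information estimator (data and parameters are arbitrary):

```julia
using CausalityTools

x, y, z = randn(1000), randn(1000), randn(1000)

# CMI as a difference of two mutual information terms,
# Î(X; Y, Z) - Î(X; Z), each estimated with KSG1.
condmutualinfo(CMIShannon(), KraskovStögbauerGrassberger1(k = 5), x, y, z)
```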

Definitions

ConditionalMutualInformationEstimators

GaussianCMI (parametric)

CausalityTools.GaussianCMI — Type
GaussianCMI <: ConditionalMutualInformationEstimator
GaussianCMI(; normalize::Bool = false)

GaussianCMI is a parametric estimator for Shannon conditional mutual information (CMI) (Vejmelka and Paluš, 2008).

Description

GaussianCMI estimates Shannon CMI as a difference of two mutual information terms, each estimated using GaussianMI (the normalize keyword is the same as for GaussianMI):

\[\hat{I}_{Gaussian}(X; Y | Z) = \hat{I}_{Gaussian}(X; Y, Z) - \hat{I}_{Gaussian}(X; Z)\]

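A minimal usage sketch on synthetic data (appropriate mainly when the variables are close to jointly Gaussian):

```julia
using CausalityTools

n = 2000
z = randn(n)
x = z .+ 0.5 .* randn(n)
y = z .+ 0.5 .* randn(n)

# Fast parametric Shannon CMI estimate under a Gaussian assumption.
condmutualinfo(CMIShannon(), GaussianCMI(), x, y, z)
```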

FPVP

CausalityTools.FPVP — Type
FPVP <: ConditionalMutualInformationEstimator
FPVP(k = 1, w = 0)

The Frenzel-Pompe-Vejmelka-Paluš (or FPVP for short) estimator is used to estimate the differential conditional mutual information using a k-th nearest neighbor approach that is analogous to that of the KraskovStögbauerGrassberger1 mutual information estimator (Frenzel and Pompe (2007); Vejmelka and Paluš (2008)).

w is the Theiler window, which controls the number of temporal neighbors that are excluded during neighbor searches.

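The sketch below shows how a nonzero Theiler window might be used for serially correlated time series (the AR(1) setup and the values of k and w are arbitrary):

```julia
using CausalityTools

# Hypothetical autocorrelated driver z with noisy responses x and y.
n = 3000
z = zeros(n)
for t in 2:n
    z[t] = 0.8 * z[t - 1] + randn()
end
x = z .+ randn(n)
y = z .+ randn(n)

# w = 10 excludes the 10 temporally closest samples from each neighbor search.
condmutualinfo(CMIShannon(), FPVP(k = 5, w = 10), x, y, z)
```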

MesnerShalizi

CausalityTools.MesnerShalizi — Type
MesnerShalizi <: ConditionalMutualInformationEstimator
MesnerShalizi(k = 1, w = 0)

The MesnerShalizi estimator estimates conditional mutual information for data that can be mixtures of discrete and continuous variables (Mesner and Shalizi, 2020).

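A sketch with a mixed discrete/continuous example (the data-generating setup is arbitrary):

```julia
using CausalityTools

n = 2000
z = float.(rand(0:2, n))      # discrete-valued conditioning variable
x = z .+ 0.5 .* randn(n)      # continuous
y = z .+ 0.5 .* randn(n)      # continuous

condmutualinfo(CMIShannon(), MesnerShalizi(k = 5), x, y, z)
```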

PoczosSchneiderCMI

CausalityTools.PoczosSchneiderCMI — Type
PoczosSchneiderCMI <: ConditionalMutualInformationEstimator
PoczosSchneiderCMI(k = 1, w = 0)

The PoczosSchneiderCMI estimator computes various (differential) conditional mutual informations using a k-th nearest neighbor approach (Póczos & Schneider, 2012).

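A sketch of estimating Rényi CMI with this estimator; this assumes CMIRenyiPoczos accepts its order through the q keyword:

```julia
using CausalityTools

x, y, z = randn(1000), randn(1000), randn(1000)

# Rényi CMI (Póczos-Schneider definition) of order q = 2.
condmutualinfo(CMIRenyiPoczos(q = 2), PoczosSchneiderCMI(k = 5), x, y, z)
```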

Rahimzamani

CausalityTools.Rahimzamani — Type
Rahimzamani <: ConditionalMutualInformationEstimator
Rahimzamani(k = 1, w = 0)

The Rahimzamani estimator, short for Rahimzamani-Asnani-Viswanath-Kannan, is an estimator for Shannon conditional mutual information for data that can be mixtures of discrete and continuous data (Rahimzamani et al., 2018).

This is very similar to the GaoKannanOhViswanath mutual information estimator, but has been expanded to the conditional case.

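A sketch analogous to the MesnerShalizi example above, with one discrete-valued marginal (the setup is arbitrary):

```julia
using CausalityTools

n = 2000
x = float.(rand(Bool, n))     # discrete-valued marginal
y = x .+ 0.5 .* randn(n)      # continuous
z = randn(n)                  # continuous conditioning variable

condmutualinfo(CMIShannon(), Rahimzamani(k = 5), x, y, z)
```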
  • Póczos, B., & Schneider, J. (2012). Nonparametric estimation of conditional information and divergences. In Artificial Intelligence and Statistics (pp. 914-923). PMLR.