Transfer entropy (TE) estimators

k Nearest neighbours (kNN) estimator

The kNN estimator computes transfer entropy as the sum of two mutual information terms, each estimated with the Kraskov mutual information estimator (Kraskov et al., 2004). The implementation was developed for Diego et al. (2018).

Documentation

TransferEntropy.transferentropy_kraskov — Function

transferentropy_kraskov(points::AbstractArray{T, 2}, k1::Int, k2::Int,
    v::TEVars; metric = Chebyshev()) where T

Compute transfer entropy decomposed as the sum of mutual informations, using an adapted version of the Kraskov estimator for mutual information [1].

Arguments

  • points: The set of points representing the embedding for which to compute transfer entropy. Must be provided as an array of size dim-by-npoints, with variables along the rows and points along the columns.
  • k1: The number of nearest neighbours for the highest-dimensional mutual information estimate. To minimize bias, choose $k_1 < k_2$ if $\min(k_1, k_2) < 10$ (see fig. 16 in [1]). Beyond dimension 5, choosing $k_1 = k_2$ results in fairly low bias, and a low number of nearest neighbours, say $k_1 = k_2 = 4$, will suffice.
  • k2: The number of nearest neighbours for the lowest-dimensional mutual information estimate. The same bias guidelines as for k1 apply.
  • v: A TEVars instance indicating how the variables of the embedding should be grouped when computing the marginal entropies that enter the transfer entropy expression.

Keyword arguments

  • metric: The distance metric. Must be a valid metric from Distances.jl.
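
A minimal usage sketch, assuming TEVars is constructed from four collections of row indices (target future, target present/past, source present/past, conditioned present/past); the data and the three-dimensional embedding are purely illustrative:

    using TransferEntropy, Distances

    # Two illustrative time series where x drives y at lag 1.
    n = 1000
    x = rand(n)
    y = [0.7 * x[t - 1] + 0.3 * rand() for t in 2:n]   # length n - 1
    x = x[2:end]                                       # align lengths

    # Delay embedding {y(t+1), y(t), x(t)} as a dim-by-npoints array (3 rows).
    pts = reduce(hcat, [[y[t + 1], y[t], x[t]] for t in 1:length(y) - 1])

    # Assumed TEVars constructor: row 1 is the target future, row 2 the target
    # present, row 3 the source present; no conditional variables.
    v = TEVars([1], [2], [3], Int[])

    # k1 < k2 since both are below 10 (cf. the bias guidelines above).
    te = transferentropy_kraskov(pts, 3, 4, v; metric = Chebyshev())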

References

  1. Kraskov, Alexander, Harald Stögbauer, and Peter Grassberger. "Estimating mutual information." Physical review E 69.6 (2004): 066138.
transferentropy_kraskov(points::AbstractArray{T, 2}, k1::Int, k2::Int,
    target_future::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}},
    target_presentpast::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}},
    source_presentpast::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}},
    conditioned_presentpast::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}};
    metric = Chebyshev()) where T

Compute transfer entropy decomposed as the sum of mutual informations, using an adapted version of the Kraskov estimator for mutual information [1].

Arguments

  • points: The set of points representing the embedding for which to compute transfer entropy. Must be provided as an array of size dim-by-npoints, with variables along the rows and points along the columns.
  • k1: The number of nearest neighbours for the highest-dimensional mutual information estimate. To minimize bias, choose $k_1 < k_2$ if $\min(k_1, k_2) < 10$ (see fig. 16 in [1]). Beyond dimension 5, choosing $k_1 = k_2$ results in fairly low bias, and a low number of nearest neighbours, say $k_1 = k_2 = 4$, will suffice.
  • k2: The number of nearest neighbours for the lowest-dimensional mutual information estimate. The same bias guidelines as for k1 apply.
  • target_future: The rows of points corresponding to future values of the target variable.
  • target_presentpast: The rows of points corresponding to present and past values of the target variable.
  • source_presentpast: The rows of points corresponding to present and past values of the source variable.
  • conditioned_presentpast: The rows of points corresponding to present and past values of any conditional variables.

Keyword arguments

  • metric: The distance metric. Must be a valid metric from Distances.jl.
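
A minimal sketch of the index-based call on illustrative data, assuming the rows of points have been arranged as target future, target present/past, source present/past and conditioning variable:

    using TransferEntropy, Distances

    n = 500
    x, y, z = rand(n), rand(n), rand(n)   # z plays the role of a conditioning variable

    # Rows: 1 = y(t+1), 2 = y(t), 3 = x(t), 4 = z(t).
    pts = reduce(hcat, [[y[t + 1], y[t], x[t], z[t]] for t in 1:n - 1])

    # Transfer entropy from x to y, conditioned on z: rows 1, 2, 3 and 4 take the
    # roles of target future, target present/past, source present/past and
    # conditioning variable, respectively.
    te_cond = transferentropy_kraskov(pts, 3, 4, 1, 2, 3, 4; metric = Chebyshev())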

References

  1. Kraskov, Alexander, Harald Stögbauer, and Peter Grassberger. "Estimating mutual information." Physical review E 69.6 (2004): 066138.
transferentropy_kraskov(points::AbstractArray{T, 2}, k1::Int, k2::Int,
    target_future::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}},
    target_presentpast::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}},
    source_presentpast::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}};
    metric = Chebyshev()) where T

Compute transfer entropy decomposed as the sum of mutual informations, using an adapted version of the Kraskov estimator for mutual information [1].

Arguments

  • points: The set of points representing the embedding for which to compute transfer entropy. Must be provided as an array of size dim-by-npoints, with variables along the rows and points along the columns.
  • k1: The number of nearest neighbours for the highest-dimensional mutual information estimate. To minimize bias, choose $k_1 < k_2$ if $\min(k_1, k_2) < 10$ (see fig. 16 in [1]). Beyond dimension 5, choosing $k_1 = k_2$ results in fairly low bias, and a low number of nearest neighbours, say $k_1 = k_2 = 4$, will suffice.
  • k2: The number of nearest neighbours for the lowest-dimensional mutual information estimate. The same bias guidelines as for k1 apply.
  • target_future: The rows of points corresponding to future values of the target variable.
  • target_presentpast: The rows of points corresponding to present and past values of the target variable.
  • source_presentpast: The rows of points corresponding to present and past values of the source variable.

This version of the function assumes there is no conditioning.

Keyword arguments

  • metric: The distance metric. Must be a valid metric from Distances.jl.
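
A corresponding sketch without conditioning, again on illustrative data:

    using TransferEntropy, Distances

    n = 500
    x, y = rand(n), rand(n)

    # Rows: 1 = y(t+1), 2 = y(t), 3 = x(t).
    pts = reduce(hcat, [[y[t + 1], y[t], x[t]] for t in 1:n - 1])

    # Unconditioned transfer entropy from x to y.
    te = transferentropy_kraskov(pts, 3, 4, 1, 2, 3; metric = Chebyshev())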

References

  1. Kraskov, Alexander, Harald Stögbauer, and Peter Grassberger. "Estimating mutual information." Physical review E 69.6 (2004): 066138.
transferentropy_kraskov(E::StateSpaceReconstruction.AbstractEmbedding,
        k1::Int, k2::Int, v::TEVars; metric = Chebyshev())

Compute transfer entropy decomposed as the sum of mutual informations, using an adapted version of the Kraskov estimator for mutual information [1].

Arguments

  • E: The embedding for which to compute transfer entropy.
  • k1: The number of nearest neighbours for the highest-dimensional mutual information estimate. To minimize bias, choose $k_1 < k_2$ if $\min(k_1, k_2) < 10$ (see fig. 16 in [1]). Beyond dimension 5, choosing $k_1 = k_2$ results in fairly low bias, and a low number of nearest neighbours, say $k_1 = k_2 = 4$, will suffice.
  • k2: The number of nearest neighbours for the lowest-dimensional mutual information estimate. The same bias guidelines as for k1 apply.
  • v: A TEVars instance indicating how the variables of the embedding should be grouped when computing the marginal entropies that enter the transfer entropy expression.

Keyword arguments

  • metric: The distance metric. Must be a valid metric from Distances.jl.
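
A hedged sketch of the embedding-based call. It assumes an embedding built with StateSpaceReconstruction's embed(timeseries, variables, lags) and a TEVars constructed from index collections; consult the StateSpaceReconstruction.jl documentation for the exact constructor available in your version:

    using TransferEntropy, StateSpaceReconstruction, Distances

    x = rand(1000)
    y = rand(1000)

    # Assumed embedding constructor: variable 2 (y) at lags 1 and 0, variable 1 (x)
    # at lag 0, giving the embedding {y(t+1), y(t), x(t)}.
    E = embed([x, y], [2, 2, 1], [1, 0, 0])

    # Assumed TEVars constructor: variable 1 of the embedding is the target future,
    # 2 the target present, 3 the source present; no conditional variables.
    v = TEVars([1], [2], [3], Int[])

    te = transferentropy_kraskov(E, 3, 4, v; metric = Chebyshev())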

References

  1. Kraskov, Alexander, Harald Stögbauer, and Peter Grassberger. "Estimating mutual information." Physical review E 69.6 (2004): 066138.

transferentropy_kraskov(E::AbstractEmbedding, k1::Int, k2::Int,
    target_future::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}},
    target_presentpast::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}},
    source_presentpast::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}},
    conditioned_presentpast::Union{Int, UnitRange{Int}, Vector{Int}, Tuple{Int}};
    metric = Chebyshev())

Compute transfer entropy decomposed as the sum of mutual informations, using an adapted version of the Kraskov estimator for mutual information [1].

Arguments

  • E: The embedding for which to compute transfer entropy.
  • k1: The number of nearest neighbours for the highest-dimensional mutual information estimate. To minimize bias, choose $k_1 < k_2$ if $\min(k_1, k_2) < 10$ (see fig. 16 in [1]). Beyond dimension 5, choosing $k_1 = k_2$ results in fairly low bias, and a low number of nearest neighbours, say $k_1 = k_2 = 4$, will suffice.
  • k2: The number of nearest neighbours for the lowest-dimensional mutual information estimate. The same bias guidelines as for k1 apply.
  • target_future: The variables of the embedding corresponding to future values of the target variable.
  • target_presentpast: The variables of the embedding corresponding to present and past values of the target variable.
  • source_presentpast: The variables of the embedding corresponding to present and past values of the source variable.
  • conditioned_presentpast: The variables of the embedding corresponding to present and past values of any conditional variables.

Keyword arguments

  • metric: The distance metric. Must be a valid metric from Distances.jl.
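
Analogously, a brief sketch of the index-based call on an embedding; the embed constructor is again an assumption from StateSpaceReconstruction.jl:

    using TransferEntropy, StateSpaceReconstruction, Distances

    x, y, z = rand(1000), rand(1000), rand(1000)

    # Assumed embedding constructor; embedding variables: 1 = y(t+1), 2 = y(t),
    # 3 = x(t), 4 = z(t).
    E = embed([x, y, z], [2, 2, 1, 3], [1, 0, 0, 0])

    # Transfer entropy from x to y, conditioned on z.
    te_cond = transferentropy_kraskov(E, 3, 4, 1, 2, 3, 4; metric = Chebyshev())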

References

  1. Kraskov, Alexander, Harald Stögbauer, and Peter Grassberger. "Estimating mutual information." Physical review E 69.6 (2004): 066138.

References

Kraskov, A., Stögbauer, H., & Grassberger, P. (2004). Estimating mutual information. Physical Review E - Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics. https://doi.org/10.1103/PhysRevE.69.066138.

Diego, D., Agasøster Haaga, K., & Hannisdal, B. (2018, November 1). Transfer entropy computation using the Perron-Frobenius operator. Eprint ArXiv:1811.01677. Retrieved from https://arxiv.org/abs/1811.01677.