Multivariate counts and probabilities API
For counting and probabilities, Associations.jl extends the single-variable machinery in ComplexityMeasures.jl to multiple variables. See the following types:
ComplexityMeasures.counts — Method

counts(o::UniqueElements, x₁, x₂, ..., xₙ) → Counts{N}
counts(encoding::CodifyPoints, x₁, x₂, ..., xₙ) → Counts{N}
counts(encoding::CodifyVariables, x₁, x₂, ..., xₙ) → Counts{N}
Construct an N-dimensional contingency table from the input iterables x₁, x₂, ..., xₙ, which are such that length(x₁) == length(x₂) == ⋯ == length(xₙ).
If x₁, x₂, ..., xₙ are already discrete, then use UniqueElements as the first argument to directly construct the joint contingency table.

If x₁, x₂, ..., xₙ need to be discretized, provide as the first argument

- CodifyPoints (encodes every point in each of the input variables xᵢ individually), or
- CodifyVariables (encodes every xᵢ individually using a sliding window encoding).

NB: If using different OutcomeSpaces for the different xᵢ, then total_outcomes must be the same for every outcome space.
Examples
using Associations

# Discretizing some non-discrete data using a sliding-window encoding for each variable
x, y = rand(100), rand(100)
c = CodifyVariables(OrdinalPatterns(m = 4))
counts(c, x, y)

# Discretizing the data by binning each individual data point
binning = RectangularBinning(3)
encoding = RectangularBinEncoding(binning, [x; y]) # give input values to ensure binning covers all data
c = CodifyPoints(encoding)
counts(c, x, y)

# Counts table for already discrete data
n = 50 # all variables must have the same number of elements
x = rand(["dog", "cat", "mouse"], n)
y = rand(1:3, n)
z = rand([(1, 2), (2, 1)], n)
counts(UniqueElements(), x, y, z)
See also: CodifyPoints, CodifyVariables, UniqueElements, OutcomeSpace, probabilities.
ComplexityMeasures.probabilities — Method

probabilities(o::UniqueElements, x₁, x₂, ..., xₙ) → Probabilities{N}
probabilities(encoding::CodifyPoints, x₁, x₂, ..., xₙ) → Probabilities{N}
probabilities(encoding::CodifyVariables, x₁, x₂, ..., xₙ) → Probabilities{N}
Construct an N-dimensional Probabilities array from the input iterables x₁, x₂, ..., xₙ, which are such that length(x₁) == length(x₂) == ⋯ == length(xₙ).
Description
Probabilities are computed by first constructing a joint contingency matrix in the form of a Counts instance.
If x₁, x₂, ..., xₙ are already discrete, then use UniqueElements as the first argument to directly construct the joint contingency table.

If x₁, x₂, ..., xₙ need to be discretized, provide as the first argument

- CodifyPoints (encodes every point in each of the input variables xᵢ individually), or
- CodifyVariables (encodes every xᵢ individually using a sliding window encoding).
Examples
using Associations

# Discretizing some non-discrete data using a sliding-window encoding for each variable
x, y = rand(100), rand(100)
c = CodifyVariables(OrdinalPatterns(m = 4))
probabilities(c, x, y)

# Discretizing the data by binning each individual data point
binning = RectangularBinning(3)
encoding = RectangularBinEncoding(binning, [x; y]) # give input values to ensure binning covers all data
c = CodifyPoints(encoding)
probabilities(c, x, y)

# Joint probabilities for already discretized data
n = 50 # all variables must have the same number of elements
x = rand(["dog", "cat", "mouse"], n)
y = rand(1:3, n)
z = rand([(1, 2), (2, 1)], n)
probabilities(UniqueElements(), x, y, z)
See also: CodifyPoints, CodifyVariables, UniqueElements, OutcomeSpace.
The utility function marginal is also useful.
Associations.marginal — Function

marginal(p::Probabilities; dims = 1:ndims(p))
marginal(c::Counts; dims = 1:ndims(c))
Given a set of counts c (a contingency table), or a multivariate probability mass function p, return the marginal counts/probabilities along the given dims.
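For instance, here is a minimal sketch of extracting marginals from a joint table built with UniqueElements (the variables below are illustrative, not from the examples above):

using Associations
x, y = rand(1:3, 100), rand(1:2, 100)
c = counts(UniqueElements(), x, y)         # joint contingency table over the observed outcomes
marginal(c, dims = 1)                      # marginal counts for x alone
p = probabilities(UniqueElements(), x, y)  # joint PMF
marginal(p, dims = 2)                      # marginal PMF for y alone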
Example: estimating Counts and Probabilities

Estimating multivariate counts (contingency matrices) and PMFs is simple. If the data are pre-discretized, then we can use UniqueElements to simply count the number of occurrences.
using Associations
n = 50 # the number of samples must be the same for each input variable
x = rand(["dog", "cat", "snake"], n)
y = rand(1:4, n)
z = rand([(2, 1), (0, 0), (1, 1)], n)
discretization = CodifyVariables(UniqueElements())
counts(discretization, x, y, z)
3×4×3 Counts{Int64,3}
[:, :, 1]
3 4 1 2
"snake" 1 2 3 2
"dog" 1 3 2 0
"cat" 4 0 0 2
[and 2 more slices...]
Probabilities are computed analogously, except counts are normalized to sum to 1.
discretization = CodifyVariables(UniqueElements())
probabilities(discretization, x, y, z)
3×4×3 Probabilities{Float64,3}
[:, :, 1]
3 4 1 2
"snake" 0.02 0.04 0.06 0.04
"dog" 0.02 0.06 0.04 0.0
"cat" 0.08 0.0 0.0 0.04
[and 2 more slices...]
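To make the relationship explicit, here is a minimal sanity-check sketch, assuming the default maximum-likelihood normalization (each probability is its count divided by the total count):

using Associations
n = 50
x, y = rand(["dog", "cat", "snake"], n), rand(1:4, n)
d = CodifyVariables(UniqueElements())
c = counts(d, x, y)
p = probabilities(d, x, y)
sum(p) ≈ 1.0            # the joint PMF sums to one
all(p .≈ c ./ sum(c))   # and equals the normalized counts (assumes default normalization)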
For numerical data, we can estimate both counts and probabilities using CodifyVariables with any count-based OutcomeSpace.
using Associations
x, y = rand(100), rand(100)
discretization = CodifyVariables(BubbleSortSwaps(m = 4))
probabilities(discretization, x, y)
7×6 Probabilities{Float64,2}
4 5 … 0
3 0.0618556701030928 0.02061855670103093 0.0
2 0.05154639175257733 0.0309278350515464 0.0
5 0.02061855670103093 0.0309278350515464 0.010309278350515465
1 0.010309278350515465 0.010309278350515465 0.0
4 0.10309278350515466 0.010309278350515465 … 0.010309278350515465
0 0.04123711340206186 0.0 0.0
6 0.010309278350515465 0.0 0.010309278350515465
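Counts are obtained with the same discretization; a minimal sketch with the same setup as above:

using Associations
x, y = rand(100), rand(100)
discretization = CodifyVariables(BubbleSortSwaps(m = 4))
counts(discretization, x, y)  # joint contingency table instead of a PMF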
For more fine-grained control, we can use CodifyPoints with one or several Encodings.
using Associations
x, y = StateSpaceSet(rand(1000, 2)), StateSpaceSet(rand(1000, 3))
# min/max of the `rand` call is 0 and 1
precise = true # precise bin edges
r = range(0, 1; length = 3)
binning = FixedRectangularBinning(r, dimension(x), precise)
encoding_x = RectangularBinEncoding(binning, x)
encoding_y = CombinationEncoding(RelativeMeanEncoding(0.0, 1, n = 2), OrdinalPatternEncoding(3))
discretization = CodifyPoints(encoding_x, encoding_y)
# now estimate probabilities
probabilities(discretization, x, y)
4×12 Probabilities{Float64,2}
8 11 9 7 … 4 2 5 10 3
2 0.021 0.013 0.023 0.019 0.018 0.02 0.03 0.028 0.017
4 0.019 0.023 0.024 0.027 0.012 0.023 0.023 0.014 0.022
1 0.03 0.023 0.026 0.021 0.018 0.026 0.023 0.018 0.017
3 0.016 0.02 0.02 0.009 0.02 0.024 0.029 0.012 0.019
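The resulting joint PMF is an ordinary Probabilities array, so marginal applies directly. A minimal self-contained sketch, using one ordinal-pattern encoding per variable purely for illustration:

using Associations
x, y = StateSpaceSet(rand(1000, 2)), StateSpaceSet(rand(1000, 3))
# one encoding per input variable; encoding dimensions must match the data dimensions
discretization = CodifyPoints(OrdinalPatternEncoding(2), OrdinalPatternEncoding(3))
p = probabilities(discretization, x, y)  # joint PMF over 2! × 3! ordinal patterns
marginal(p, dims = 2)                    # marginal PMF for the encoded y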