Library


Contractors

These are interval contractors implemented here; in the future they may be replaced by established libraries such as IntervalRootFinding.jl.

RigorousInvariantMeasures.preimage_monotonic — Function

Compute a guaranteed interval enclosure of the preimage f⁻¹(y) of a point y ∈ ℝ under the function f: [a,b] → ℝ. This preimage is guaranteed to lie inside the real interval x = hull(a, b).

f must be strictly monotonic on [a,b], at least in the sense of MonotonicBranch functions (allowing for uncertainty on the endpoints). f must be differentiable; zero and infinite derivative are allowed.

y1, y2 must be guaranteed to be equal to f(a), f(b) or "more outside", so that y ∉ hull(y1,f(z)) ⟹ f⁻¹(y) ∉ hull(a, z).

Stops when the interval reaches a fixed point, when the diameter is smaller than ε, or when max_iter iterations are reached (in which case an error is raised).
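For intuition, the contraction loop can be sketched in plain floating-point arithmetic (a hedged illustration with a hypothetical name `preimage_bisect`, not the package's rigorous interval implementation):

```julia
# Illustrative sketch (not the package's interval code): bisection-based
# preimage of y under a strictly increasing f on [a, b].
# Stops when the bracket stops shrinking (a fixed point of the iteration),
# when its width drops below ϵ, or with an error after max_iter iterations.
function preimage_bisect(f, y, a, b; ϵ = 1e-12, max_iter = 200)
    @assert f(a) <= y <= f(b) "y must lie in the range of f on [a, b]"
    for _ in 1:max_iter
        m = a + (b - a) / 2
        (m == a || m == b) && return (a, b)   # no representable midpoint left
        f(m) < y ? (a = m) : (b = m)
        b - a < ϵ && return (a, b)
    end
    error("max_iter reached")
end
```

The rigorous version replaces the floating-point comparisons with outward-rounded interval evaluations, so the returned bracket is guaranteed to contain f⁻¹(y).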


Preimages

Optimized methods to compute preimages of one-dimensional maps.

RigorousInvariantMeasures.first_overlapping — Method
first_overlapping(y, a)

Smallest possible i such that a is in the semi-open interval [y[i], y[i+1]).

This should work properly even if a, y are intervals; in this case it returns the smallest possible value of i over all possible "assignments" of a, y inside those intervals. Assumes y is sorted, i.e., map(x -> Interval(x).lo, y) and map(x -> Interval(x).hi, y) are sorted.

RigorousInvariantMeasures.last_overlapping — Method
last_overlapping(y, a)

Largest possible j such that a-ε is in the semi-open interval [y[j], y[j+1]).

This should work properly even if a, y are intervals; in this case it returns the largest possible value of j over all possible "assignments" of a, y inside those intervals. Assumes y is sorted, i.e., map(x -> Interval(x).lo, y) and map(x -> Interval(x).hi, y) are sorted.
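In the degenerate case where a and the entries of y are plain numbers rather than intervals, both lookups reduce to a sorted search. A sketch of that special case (with a hypothetical name, not the package code):

```julia
# Scalar sketch: the index i such that y[i] <= a < y[i+1].
# The package versions additionally account for interval uncertainty
# in both a and the entries of y, returning the smallest (resp. largest)
# index compatible with some assignment inside the intervals.
scalar_overlapping(y, a) = searchsortedlast(y, a)
```

With interval inputs, first_overlapping and last_overlapping bracket this scalar answer from below and above.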

RigorousInvariantMeasures.preimages — Function
preimages(y, D::Dynamic, ylabel = 1:length(y), ϵ = 0.0; progress = true)

Construct preimages of an increasing array y under a dynamic, propagating additional labels ylabel.
RigorousInvariantMeasures.preimages — Function
preimages(y, br::MonotonicBranch, ylabel = 1:length(y); ϵ, max_iter)

Construct preimages of a partition y under a monotonic branch br defined on X = (a, b), propagating additional labels ylabel.

It is assumed that it cannot happen that $f(x) < y[1]$.

Extended help

The sequence y subdivides the y-axis into semi-open intervals [y[l], y[l+1]); each of them is identified by the label ylabel[l]. We construct an increasing sequence x that splits X (in the x-axis) into semi-open intervals, each of them with f([x[k], x[k+1]) ⊂ [y[l], y[l+1]) for a certain l. We set xlabel[k] = ylabel[l], and return the pair (x, xlabel).

In the simplest case where D is full-branch, the points in x are preimages of the points in y, but in the general case they can also include D.endpoints: in general, there may be a certain number of points in y that have no preimage at the beginning and the end of the sequence, because they fall out of the range R = [f(a), f(b)]. In the worst case, no point has a preimage, because y[i] < R < y[i+1] for some i (or vice versa with orientations), and in this case we just return the 1-element vectors x = [branch.X[1]] and xlabel = [i].

x[begin] always coincides with branch.X[1], while branch.X[2] is "the point after x[end]", and is not stored explicitly in x, for easier composing. In this way x and xlabel have the same length.

This function fills the array using a bisection strategy to save computations: if y ∈ [a, b], then $f^{-1}(y) \in [f^{-1}(a), f^{-1}(b)]$ (paying attention to orientation). So we can fill x by computing first the entries x[k+1] for which k has higher dyadic valuation.
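The resulting order of computation can be illustrated with a small self-contained sketch (`bisection_order` is a hypothetical helper of ours, not the package code): it lists the interior indices of 1..n so that each index is visited only after the two indices bracketing it, which lets each preimage search start from an already-computed bracket.

```julia
# Visit interior indices of 1..n midpoint-first (decreasing dyadic
# valuation along each subdivision), so every index is enclosed by
# two previously computed ones when it is reached.
function bisection_order(n)
    order = Int[]
    function visit(lo, hi)
        hi - lo <= 1 && return
        mid = (lo + hi) ÷ 2
        push!(order, mid)
        visit(lo, mid)
        visit(mid, hi)
    end
    visit(1, n)
    return order
end

bisection_order(9)  # → [5, 3, 2, 4, 7, 6, 8]
```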

For a dynamic with multiple branches, preimages(y, D) is simply the concatenation of the x, xlabel pairs over all branches. These values still form an increasing sequence that splits X into intervals, each of which is mapped into a different semi-open interval [y[k], y[k+1]).

Example

julia> using RigorousInvariantMeasures

julia> D = mod1_dynamic(x->2x)
Piecewise-defined dynamic with 2 branches

julia> RigorousInvariantMeasures.preimages(0:0.1:1, D.branches[1]; ϵ = 10^(-15), max_iter = 100)
(Interval{Float64}[[0, 0], [0.05, 0.0500001], [0.1, 0.100001], [0.149999, 0.15], [0.2, 0.200001], [0.25, 0.25], [0.299999, 0.3], [0.349999, 0.35], [0.4, 0.400001], [0.45, 0.450001]], [1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
RigorousInvariantMeasures.preimages_and_derivatives — Function
preimages_and_derivatives(y, br::MonotonicBranch, ylabel = 1:length(y); ϵ, max_iter, left=true)

Compute preimages of y under the branch, together with the derivatives f'(x) at each point.

Returns: x, xlabel, x′

We combine the two computations in the same function because in this way it can be implemented more efficiently for composed dynamics.

This assumes that (1) the dynamic is full-branch, and (2) all branches have the same orientation. This is not restrictive because we'll need it only for the Hat assembler (at the moment).

If left==true, x′ contains the derivatives f'.(x), i.e., derivatives at the left endpoint of each interval of the partition. If left==false, x′ contains derivatives at the right endpoint of each interval, i.e., for each branch, [f'(x[2]), f'(x[3]), ... f'(x[end]), f'(br.X[2])]. In either case, length(x) == length(x′).


Differentiation interface

RigorousInvariantMeasures.check_derivatives — Function
check_derivatives(f, x=rand())

Checks (using assertions) that the derivatives of f agree (up to the square root of machine precision) with those computed by TaylorSeries.

This is a useful sanity check if you redefine derivatives.

The derivatives are checked at a point x (default: rand()), which should be in the domain of f.
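A generic finite-difference version of such a sanity check can be sketched as follows (our illustration; the package compares against TaylorSeries rather than finite differences):

```julia
# Compare a hand-written derivative df against a central finite difference
# of f at x, to roughly the square root of machine precision.
function check_derivative_fd(f, df, x)
    h = cbrt(eps(Float64))                # near-optimal step for central differences
    fd = (f(x + h) - f(x - h)) / (2h)
    @assert isapprox(df(x), fd; atol = sqrt(eps(Float64)), rtol = sqrt(eps(Float64)))
    return true
end
```

For example, `check_derivative_fd(x -> x^2, x -> 2x, 0.2)` passes, while a wrong derivative such as `x -> 3x` trips the assertion.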

```julia-repl
julia> f = @define_with_derivatives x->x^2 x->2x x->2;

julia> check_derivatives(f, 0.2); # passes: the declared derivatives are correct

julia> g = @define_with_derivatives x->x^2 x->2x x->3; # wrong second derivative

julia> check_derivatives(g, 0.2); # fails with an assertion error
```

RigorousInvariantMeasures.value_and_derivative — Method
value_and_derivative(f, x)

Evaluate f(x) and f'(x) at the same time. We define a single method to do this because often there are shared subexpressions between a function and its derivatives, or they are computed together by automatic differentiation packages, so this is often more efficient than having two separate functions for f and f' and calling them separately.
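As a self-contained illustration of the shared-subexpression point (an example of ours, not taken from the package):

```julia
# f(x) = exp(x)*sin(x): its derivative exp(x)*(sin(x) + cos(x)) reuses
# exp(x), so computing value and derivative together evaluates exp once.
function value_and_derivative_example(x)
    e = exp(x)            # shared subexpression, computed once
    s, c = sincos(x)      # sin and cos in a single call
    return (e * s, e * (s + c))
end
```

Evaluating f and f' separately would compute exp(x) (and sin(x)) twice; the joint version does each elementary evaluation once.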

The generic implementation uses TaylorSeries to compute derivatives. It can be overridden for individual functions:

f = x -> x^2
RigorousInvariantMeasures.value_and_derivative(::typeof(f), x) = (x^2, 2x)

See @define_with_derivatives to define derivatives with three separate expressions.

Remark: when specializing derivatives by hand, always specialize the two-argument functions (e.g., value_and_derivative(f, x)) rather than the one-argument ones (e.g., value_and_derivative(f)), since the latter are defined generically in terms of the former.

Remark: a tricky point that can cause subtle bugs: functions created on the same line of code often have the same type, so for instance

```jldoctest
julia> fs = [x -> k*x for k in 1:3]; # the three elements here have the same type

julia> typeof(fs[1]) == typeof(fs[3])
true

julia> RigorousInvariantMeasures.derivative(f::typeof(fs[1]), x) = 1;

julia> RigorousInvariantMeasures.derivative(f::typeof(fs[2]), x) = 2;

julia> RigorousInvariantMeasures.derivative(f::typeof(fs[3]), x) = 3;

julia> RigorousInvariantMeasures.derivative(fs[1], 0.5) # this returns 3, not 1
3
```

RigorousInvariantMeasures.@define_with_derivatives — Macro
f = @define_with_derivatives ex1 ex2 ex3

Declares a new function f, and redefines the various functions related to derivatives so that its derivatives are computed with the given expressions.

```julia-repl
julia> f = @define_with_derivatives x->x^2 x->2x x->2;

julia> value_and_derivative(f, 45); # returns f(45) = 2025 and f'(45) = 90
```

Note that the three macro arguments are separated just by spaces (no commas or parentheses):

```julia-repl
julia> g = @define_with_derivatives x->12 x->34 x->56;

julia> value_derivative_and_second_derivative(g, -23.5); # returns 12, 34, 56
```

This is provided for convenience, but note that in many cases a function and its derivatives share common subexpressions; hence it is often more efficient to define value_and_derivative (and related functions) directly.

If you define derivatives in this way, it is recommended to run check_derivatives to ensure that you did not make any mistakes (e.g., forgetting a factor 2). We do not run it automatically because that would require knowing a valid x in the domain of f.
