A translation of the first example in the talk "Emily Gorcenski - Polynomial Chaos: A technique for modeling uncertainty". The idea is to approximate a general, potentially empirical, probability distribution using a suitable polynomial expansion (Polynomial/Wiener Chaos). With Julia and the PolyChaos package this simplifies beautifully. For a full explanation see the related blog post.
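Concretely, we seek an expansion of the target random variable $X$ in Hermite polynomials of a standard Gaussian variable $\zeta$ (this is the general form; the coefficients $k_i$ are computed below):

$$X \approx \sum_i k_i H_i(\zeta), \qquad \zeta \sim \mathcal{N}(0,1).$$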

In [1]:
#]add PolyChaos, Distributions, Plots
In [2]:
using PolyChaos, Distributions, Plots

First, define our Hermite polynomials, $H_i(x)$ (called Gauss in PolyChaos because the name Hermite is reserved for the physicists' Hermite polynomials, which are slightly different).
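For reference, these are the (monic) probabilists' Hermite polynomials, satisfying the recurrence

$$H_0(x) = 1, \quad H_1(x) = x, \quad H_{n+1}(x) = x\,H_n(x) - n\,H_{n-1}(x).$$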

In [3]:
op_gauss = GaussOrthoPoly(20)                       # orthogonal basis w.r.t. the standard Gaussian measure
H = [x -> evaluate(i, x, op_gauss) for i in 0:5]    # H[i+1] evaluates the degree-i polynomial
plot(-1.5:0.01:1.5, H, leg=false)
Out[3]:
[Plot of the first six Hermite polynomials on $-1.5 \le x \le 1.5$]
In [4]:
inv_cdf(dist) = u -> quantile(dist, u)   # inverse CDF as a function of u ∈ (0, 1)

h = inv_cdf(Exponential())               # inverse CDF of the target (Exponential) distribution
l = inv_cdf(Normal())                    # inverse CDF of the standard Normal
Out[4]:
#7 (generic function with 1 method)

Conveniently, PolyChaos calculates the inner (scalar) products for us.

The integrand is just the Galerkin projection with the transformed functions, $h(u)$ and $l(u)$.
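Writing $h$ for the inverse CDF of the target distribution and $l$ for the inverse CDF of the standard Normal, the change of variables $u = \Phi(\zeta)$ gives the coefficients as

$$k_i = \frac{\langle X, H_i \rangle}{\langle H_i, H_i \rangle} = \frac{1}{\langle H_i, H_i \rangle} \int_0^1 h(u)\, H_i(l(u))\, \mathrm{d}u.$$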

In [5]:
sp = computeSP2(op_gauss)              # squared norms ⟨H_i, H_i⟩ of the basis polynomials
integrand(i) = u -> h(u)*H[i](l(u))    # projection integrand in the uniform variable u
Out[5]:
integrand (generic function with 1 method)

Perform the integration for the first $p$ polynomials. We can use PolyChaos for this too, although any integration scheme works.

In [6]:
p = 5
int_op = Uniform01OrthoPoly(1000, addQuadrature=true)          # quadrature rule on (0, 1)
ki = [integrate(integrand(i), int_op) / sp[i] for i in 1:p]    # Galerkin coefficients k_i
Out[6]:
5-element Array{Float64,1}:
  0.9999993692484951
  0.9031921803407967
  0.29780023616476886
  0.03334230632910607
 -0.0024169819236717414
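
As a quick sanity check, the first entry is the degree-zero coefficient, which should equal the mean of the target distribution: $k_0 = \int_0^1 h(u)\,\mathrm{d}u = \mathbb{E}[X] = 1$ for Exponential(1), and indeed the value above is $\approx 1$.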

Now that we have the $k_i$s for the transform, we can reconstitute the approximated distribution by adding up a load of transformed Gaussian random variables, $\zeta$.

In [9]:
ζ = randn(5000)                          # samples of a standard Gaussian random variable
Σ = sum(ki[i] * H[i](ζ) for i in 1:p)    # evaluate the polynomial chaos expansion
histogram(Σ, normed=true, leg=false)
Out[9]:
[Histogram of the reconstructed samples $\Sigma$, approximating the Exponential distribution]
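
To judge the fit, one option is to overlay the exact Exponential density on the histogram; a minimal sketch, reusing the variables defined above:

In [ ]:
histogram(Σ, normed=true, leg=false)
plot!(0:0.01:10, x -> pdf(Exponential(), x), lw=2)  # exact Exponential(1) density for comparison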