You are seeing the notebook output generated by Literate.jl from the Julia source file. The rendered HTML can be viewed in the docs.
The kernels defined in this package can also be used to specify the covariance of a Gaussian process prior. A Gaussian process (GP) is defined by its mean function $m(\cdot)$ and its covariance function, or kernel, $k(\cdot, \cdot')$:

$$f \sim \mathcal{GP}\big(m(\cdot), k(\cdot, \cdot')\big)$$
# Load required packages
using KernelFunctions, LinearAlgebra
using Plots, Plots.PlotMeasures
default(; lw=1.0, legendfontsize=8.0)
using Random: seed!
seed!(42); # reproducibility
The function values $\mathbf{f} = \{f(x_n)\}_{n=1}^N$ of the GP at a finite number $N$ of points $X = \{x_n\}_{n=1}^N$ follow a multivariate normal distribution $\mathbf{f} \sim \mathcal{MVN}(\mathbf{m}, \mathrm{K})$ with mean vector $\mathbf{m}$ and covariance matrix $\mathrm{K}$, where

$$m_i = m(x_i), \qquad K_{i,j} = k(x_i, x_j).$$
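To make this concrete, here is a small stdlib-only sketch (the zero mean function, the hand-written squared-exponential kernel, and the three input points are illustrative choices, not part of the example that follows):

```julia
using LinearAlgebra

# Illustrative zero mean and hand-written squared-exponential kernel
m_fn(x) = 0.0
k_se(x, y) = exp(-(x - y)^2 / 2)

xs = [-1.0, 0.0, 2.0]                  # N = 3 input points
m = [m_fn(x) for x in xs]              # mean vector, mᵢ = m(xᵢ)
K = [k_se(x, y) for x in xs, y in xs]  # kernel matrix, Kᵢⱼ = k(xᵢ, xⱼ)

K == K'                                # kernel matrices are symmetric
all(K[i, i] == 1.0 for i in eachindex(xs))  # k(x, x) = 1 for this kernel
```

For this kernel the diagonal entries are all one and the off-diagonal entries decay with the squared distance between inputs.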
We can visualize the infinite-dimensional GP by evaluating it on a fine grid to approximate the dense real line:
num_inputs = 101
xlim = (-5, 5)
X = range(xlim...; length=num_inputs);
Given a kernel k, we can compute the kernel matrix as K = kernelmatrix(k, X).
To sample from the multivariate normal distribution $p(\mathbf{f}) = \mathcal{MVN}(0, \mathrm{K})$, we could make use of Distributions.jl and call rand(MvNormal(K)).
Alternatively, we could use the AbstractGPs.jl package and construct a GP object, which we evaluate at the points of interest and from which we can then sample: rand(GP(k)(X)).
Here, we will explicitly construct samples using the Cholesky factorization $\mathrm{K} = \mathrm{L}\mathrm{L}^\top$ (computed by cholesky(K)), setting $\mathbf{f} = \mathrm{L}\mathbf{v}$, where $\mathbf{v} \sim \mathcal{N}(0, \mathbf{I})$ is a vector of standard-normal random variables.
We will use the same randomness v to generate comparable samples across different kernels.
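As a quick sanity check of this identity (a stdlib-only sketch with a hand-written kernel, independent of the code below): if $\mathrm{K} = \mathrm{L}\mathrm{L}^\top$, then $\mathbf{f} = \mathrm{L}\mathbf{v}$ has covariance $\mathrm{L}\,\mathrm{Cov}(\mathbf{v})\,\mathrm{L}^\top = \mathrm{L}\mathrm{L}^\top = \mathrm{K}$.

```julia
using LinearAlgebra, Random

k_se(x, y) = exp(-(x - y)^2 / 2)       # hand-written SE kernel (illustrative)
xs = range(-1, 1; length=5)
K = [k_se(x, y) for x in xs, y in xs]

C = cholesky(K + 1e-12 * I)            # tiny jitter for numerical safety
L = C.L
L * L' ≈ K                             # the lower-triangular factor reproduces K

Random.seed!(1)
v = randn(length(xs))
f = L * v                              # one sample whose covariance is K
```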
num_samples = 7
v = randn(num_inputs, num_samples);
Mathematically, a kernel matrix is by definition positive semi-definite, but due to finite-precision inaccuracies, the computed kernel matrix might not be exactly positive definite. To avoid Cholesky errors, we add a small "nugget" term on the diagonal:
function mvn_sample(K)
    L = cholesky(K + 1e-6 * I)  # jittered Cholesky factorization of K
    f = L.L * v                 # f = Lv with v ~ N(0, I)
    return f
end;
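To see what can go wrong without the nugget, here is a stdlib-only sketch (the duplicated-input setup and the hand-written kernel are illustrative, not part of this example): duplicated inputs make the kernel matrix exactly singular, so the plain Cholesky factorization fails, while the jittered one succeeds.

```julia
using LinearAlgebra

k_se(x, y) = exp(-(x - y)^2 / 2)   # hand-written SE kernel (illustrative)
xs = [0.0, 0.0, 1.0]               # duplicated input ⇒ K is rank-deficient
K = [k_se(x, y) for x in xs, y in xs]

issuccess(cholesky(K; check=false))             # false: K is singular
issuccess(cholesky(K + 1e-6 * I; check=false))  # true: jitter restores positive definiteness
```

Passing check=false makes cholesky record failure in the returned factorization instead of throwing a PosDefException, which is convenient for this demonstration.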
We now define a function that visualizes a kernel for us.
function visualize(k::Kernel)
    K = kernelmatrix(k, X)
    f = mvn_sample(K)

    p_kernel_2d = heatmap(
        X,
        X,
        K;
        yflip=true,
        colorbar=false,
        ylabel=string(nameof(typeof(k))),
        ylim=xlim,
        yticks=([xlim[1], 0, xlim[end]], ["\u22125", raw"$x'$", "5"]),
        clims=(0, 1),
        title=raw"$k(x, x')$",
        aspect_ratio=:equal,
        left_margin=5mm,
    )

    p_kernel_cut = plot(
        X,
        k.(X, 0.0);
        title=string(raw"$k(x, x_\mathrm{ref})$"),
        label=raw"$x_\mathrm{ref}=0.0$",
        legend=:topleft,
        foreground_color_legend=nothing,
    )
    plot!(X, k.(X, 1.5); label=raw"$x_\mathrm{ref}=1.5$")

    p_samples = plot(X, f; c="blue", title=raw"$f(x)$", ylim=(-3, 3), label=nothing)

    return plot(
        p_kernel_2d,
        p_kernel_cut,
        p_samples;
        layout=(1, 3),
        xlabel=raw"$x$",
        xlim=xlim,
        xticks=collect(xlim),
    )
end;
We can now visualize a kernel and show samples from a Gaussian process with a given kernel:
plot(visualize(SqExponentialKernel()); size=(800, 210), bottommargin=5mm, topmargin=5mm)
This also allows us to compare different kernels:
kernels = [
Matern12Kernel(),
Matern32Kernel(),
Matern52Kernel(),
SqExponentialKernel(),
WhiteKernel(),
ConstantKernel(),
LinearKernel(),
compose(PeriodicKernel(), ScaleTransform(0.2)),
NeuralNetworkKernel(),
GibbsKernel(; lengthscale=x -> sum(exp ∘ sin, x)),
]
plot(
[visualize(k) for k in kernels]...;
layout=(length(kernels), 1),
size=(800, 220 * length(kernels) + 100),
)