This example shows you how to run a global optimisation with Pints.
First, we import pints:
import pints
Next, we need a model: any class that implements the pints.ForwardModel interface.
Usually, you'd write a class for this purpose (one that wraps around whatever simulation package you want to use to generate your time series data). But you could also use a pure-Python model.
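For instance, a minimal pure-Python model might look like the sketch below. This only mirrors the shape of the interface (a simulate(parameters, times) method and an n_parameters() method); real code would subclass pints.ForwardModel, and the ConstantModel class here is purely illustrative.

```python
import numpy as np

class ConstantModel:
    """Illustrative toy model with one parameter: it predicts the same
    value at every requested time point. Mirrors the shape of the
    pints.ForwardModel interface; a real model would subclass it."""

    def n_parameters(self):
        # The number of parameters this model expects
        return 1

    def simulate(self, parameters, times):
        # Return one simulated value per time point
        return parameters[0] * np.ones(len(times))
```

Calling ConstantModel().simulate([3.0], [0, 1, 2]) then returns an array of three 3.0 values.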
In this example, we use a logistic model, provided by the pints.toy module.
import pints.toy as toy
model = toy.LogisticModel()
This model has two parameters: a growth rate (which determines the steepness of the curve) and a carrying capacity (the value the curve converges to). For the example, we simply pick some nice values:
real_parameters = [0.015, 500]
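To see what these two parameters do, we can sketch the familiar closed-form logistic growth curve in plain NumPy. Note that the initial population p0 = 2.0 below is an illustrative assumption, not necessarily the toy model's default:

```python
import numpy as np

def logistic(r, k, times, p0=2.0):
    """Closed-form logistic growth curve: growth rate r, carrying
    capacity k, initial population p0 (p0=2 is an assumption here)."""
    times = np.asarray(times, dtype=float)
    return k / (1.0 + (k / p0 - 1.0) * np.exp(-r * times))

t = np.linspace(0, 1000, 1000)
y = logistic(0.015, 500, t)
```

The curve starts at p0, rises at a pace set by the growth rate, and levels off at the carrying capacity.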
Finally, we create a list of times (in a real experiment, these would be the times at which the time series was sampled):
import numpy as np
times = np.linspace(0, 1000, 1000)
We now have everything we need to run a simulation and generate some toy data:
values = model.simulate(real_parameters, times)
We can use Matplotlib (or any other plotting package) to have a look at the generated data:
import matplotlib.pyplot as plt
plt.figure(figsize=(14, 6))
plt.xlabel('Time')
plt.ylabel('Values')
plt.plot(times, values)
plt.show()
If you like, you can make it more realistic at this point by adding some noise:
values += np.random.normal(size=values.shape) * 10
plt.figure(figsize=(14, 6))
plt.xlabel('Time')
plt.ylabel('Values')
plt.plot(times, values)
plt.show()
We now set up an optimisation, to see if we can recover our original parameters from this data.
First, we define a problem (in this case a single-valued time series fitting problem):
problem = pints.SingleOutputProblem(model, times, values)
We then define a score function on this problem:
score = pints.SumOfSquaresError(problem)
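The sum-of-squares error simply adds up the squared differences between the simulated values and the data; lower is better, and zero means a perfect fit. A minimal sketch of that idea (the LineModel class here is illustrative, not part of Pints):

```python
import numpy as np

def sum_of_squares_error(parameters, model, times, data):
    """Illustrative sum-of-squares score: simulate the model at the
    given parameters and sum the squared residuals against the data."""
    simulated = model.simulate(parameters, times)
    return float(np.sum((np.asarray(simulated) - np.asarray(data)) ** 2))

class LineModel:
    """Trivial model: values are slope * time."""
    def simulate(self, parameters, times):
        return parameters[0] * np.asarray(times, dtype=float)

times = np.array([0.0, 1.0, 2.0])
data = np.array([0.0, 2.0, 4.0])
perfect = sum_of_squares_error([2.0], LineModel(), times, data)  # 0.0
```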
A lot of real problems have physical constraints on the values the parameters can take, so in this example we add them in the form of boundaries:
boundaries = pints.RectangularBoundaries([0, 200], [1, 1000])
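Rectangular boundaries simply define a lower and an upper limit per parameter, and a point is accepted if it lies inside that box. A minimal sketch of the check (assuming a half-open interval per dimension; Pints' actual class offers more, such as sampling initial points):

```python
import numpy as np

def in_rectangular_boundaries(x, lower, upper):
    """True if every coordinate of x lies within [lower, upper)."""
    x = np.asarray(x, dtype=float)
    return bool(np.all(x >= lower) and np.all(x < upper))

in_rectangular_boundaries([0.5, 500], [0, 200], [1, 1000])  # inside the box
```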
Finally, we define an initial position to start the search from:
x0 = np.array([0.5, 500])
found_parameters, found_value = pints.optimise(
score,
x0,
boundaries=boundaries,
method=pints.XNES,
)
Minimising error measure
Using Exponential Natural Evolution Strategy (xNES)
Running in sequential mode.
Population size: 6
Iter. Eval. Best      Time m:s
0     6     4.19e+07   0:00.0
1     12    4.05e+07   0:00.0
2     18    4.05e+07   0:00.0
3     24    4.05e+07   0:00.0
20    126   1.14e+07   0:00.0
40    246   769621.3   0:00.1
60    366   769621.3   0:00.1
80    486   104078.7   0:00.1
100   606   104074.4   0:00.1
120   726   104074     0:00.2
140   846   104074     0:00.2
160   966   104074     0:00.2
180   1086  104074     0:00.2
200   1206  104074     0:00.3
220   1326  104074     0:00.3
240   1446  104074     0:00.3
260   1566  104074     0:00.3
280   1686  104074     0:00.3
300   1806  104074     0:00.4
320   1926  104074     0:00.4
340   2046  104074     0:00.4
360   2166  104074     0:00.4
380   2286  104074     0:00.5
400   2406  104074     0:00.5
408   2448  104074     0:00.5
Halting: No significant change for 200 iterations.
In our toy model example, we can compare the parameters with the known true parameters:
print('Score at true solution: ')
print(score(real_parameters))
print('Found solution: True parameters:')
for k, x in enumerate(found_parameters):
print(pints.strfloat(x) + ' ' + pints.strfloat(real_parameters[k]))
Score at true solution:
104325.21584268715
Found solution:           True parameters:
 1.50228643870493877e-02   1.49999999999999994e-02
 5.00147686247309082e+02   5.00000000000000000e+02
And we can also plot how the score varies along a line between the found solution and the true parameters.
import pints.plot
fig, axes = pints.plot.function_between_points(score, point_1=found_parameters, point_2=real_parameters)
axes.set_ylabel('score')
plt.show()
In real life, we might compare the fit by running a simulation and comparing with the data:
values2 = model.simulate(found_parameters, times)
plt.figure(figsize=(14, 6))
plt.xlabel('Time')
plt.ylabel('Values')
plt.plot(times, values, label='Experimental data')
plt.plot(times, values2, label='Simulated data')
plt.legend()
plt.show()