by Jeffrey Kantor (jeff at nd.edu). The latest version of this notebook is available at https://github.com/jckantor/CBE30338.

Linear first order plus time delay (FOPTD) models are often good approximations to process dynamics for process control applications. This notebook demonstrates the fitting of FOPTD models to step response data.

In [1]:

```
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
import control
from scipy.optimize import minimize
```

A linear first-order plus time-delay model is a good approximation for many process control applications. Assume the manipulated process input, $u$, and measured process output, $y$, are initially at steady-state $u_0, y_0$.

Without loss of generality, the response of a linear first-order system without time delay can be written as a differential equation

$$\tau\frac{d(y-y_0)}{dt} + (y-y_0) = K(u-u_0)$$

or

$$\tau\frac{dy}{dt} + (y-y_0) = K(u-u_0)$$

At time $t_0$, the input $u$ is changed to a new constant value $u_\infty$. Provided the system is stable (i.e., $\tau > 0$), the new steady-state value of $y$ will be

$$y_\infty = y_0 + K(u_\infty - u_0)$$

The solution to the differential equation can be written in a number of useful forms.

\begin{align*} y(t) & = y_0 + K(u_\infty - u_0) (1 - e^{-(t-t_0)/\tau}) \\ \\ & = y_0 + (y_\infty - y_0) (1 - e^{-(t-t_0)/\tau}) \\ \\ & = y_\infty + (y_0 - y_\infty)e^{-(t-t_0)/\tau} \end{align*}
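
The three forms are algebraically equivalent; as a quick numerical check, they can be evaluated with arbitrary illustrative values (the numbers below are made up for the check, not taken from any example in this notebook):

```python
import math

# arbitrary illustrative parameters for the check
K, tau, t0 = 2.0, 10.0, 0.0
y0, u0, u_inf = 1.0, 0.0, 1.0
y_inf = y0 + K*(u_inf - u0)        # new steady state

t = 7.5                            # any time after t0
form1 = y0 + K*(u_inf - u0)*(1 - math.exp(-(t - t0)/tau))
form2 = y0 + (y_inf - y0)*(1 - math.exp(-(t - t0)/tau))
form3 = y_inf + (y0 - y_inf)*math.exp(-(t - t0)/tau)

print(form1, form2, form3)         # identical to rounding
```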

Chemical processes are frequently encumbered with time delays associated with the transport of materials, chemical measurement, or simply sluggish response to control inputs. A pure time delay is modeled by a single parameter, $\tau_d$, such that

$$y(t) = u(t-\tau_d)$$
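
A pure delay simply shifts the input in time. A minimal sketch with a unit step input and a hypothetical delay value:

```python
def u(t):
    """Unit step input applied at t = 0."""
    return 1.0 if t >= 0 else 0.0

tau_d = 3.0                 # illustrative delay, minutes

def y(t):
    """Pure time delay: the output reproduces the input tau_d earlier."""
    return u(t - tau_d)

print(y(2.0), y(5.0))       # 0.0 before the delay elapses, 1.0 after
```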

If we add the time delay feature to the first order process described above, then for $t \geq t_0 + \tau_d$ (with $y(t) = y_0$ before the delay elapses)

\begin{align*} y(t) & = y_0 + K(u_\infty - u_0) (1 - e^{-(t-\tau_d - t_0)/\tau}) \\ \\ & = y_0 + (y_\infty - y_0) (1 - e^{-(t-\tau_d-t_0)/\tau}) \\ \\ & = y_\infty + (y_0 - y_\infty)e^{-(t-\tau_d-t_0)/\tau} \end{align*}

First we write a function to compute the response of a first order system with time delay to a unit step input where $u_0 = 0$ and $u_\infty = 1$.

In [2]:

```
def foptd(t, K=1, tau=1, tau_d=0):
    tau_d = max(0, tau_d)
    tau = max(1e-8, tau)    # avoid division by zero
    return np.array([K*(1 - np.exp(-(ti - tau_d)/tau)) if ti >= tau_d else 0 for ti in t])
```
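
As an aside, the list comprehension above evaluates the response element by element in Python; the same result can be computed in vectorized form with `np.where` (an equivalent sketch, not part of the original notebook):

```python
import numpy as np

def foptd_vec(t, K=1, tau=1, tau_d=0):
    """Vectorized FOPTD unit step response, equivalent to foptd above."""
    tau_d = max(0, tau_d)
    tau = max(1e-8, tau)           # guard against division by zero
    t = np.asarray(t, dtype=float)
    return np.where(t >= tau_d, K*(1 - np.exp(-(t - tau_d)/tau)), 0.0)

print(foptd_vec([0, 3, 50], K=2, tau=10, tau_d=3))
```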

In [3]:

```
t = np.linspace(0,50,200)
tau = 10
tau_delay = 3
K = 2
y = foptd(t,K,tau,tau_delay)
plt.plot(t,y)
plt.xlabel('Time [min]')
plt.ylabel('Response')
plt.title('FOPTD Step Response')
```

Out[3]:

A distillation column is initially at steady state with a cooling water flow to the condenser of 110 kg/hr and a vapor phase mole fraction of the volatile compound of 0.87. At t = 60 min, the cooling water flow is raised to 120 kg/hr. The vapor phase mole fraction increases as shown in the following chart.

The task is to fit an FOPTD model to this experimental result.

In [4]:

```
# create the hypothetical problem data (step_response returns (t, y))
t, delta_y = control.step_response(0.05*control.tf([-2, 1], [25, 10, 1]))
y = 0.87 + delta_y
t = t + 60
plt.plot(t,y)
plt.xlabel('Time [min]')
plt.ylabel('mole fraction')
plt.title('Vapor mole fraction response to 10 kg/hr increase in condenser cooling')
```

Out[4]:

The first step is to scale the experimental data to fit the framework of an FOPTD model. This generally involves three steps:

- Shift the time scale so that t = 0 corresponds to the start of the experiment

$$ t_s = t - t_0$$

- Shift the response so y = 0 is the initial steady state
- Rescale the response to a unit change in input

$$ y_s = \frac{y(t) - y_0}{u_\infty - u_0}$$

In [5]:

```
ts = t - t[0]
ys = (y - y[0])/(120 - 110)
plt.plot(ts, ys)
plt.title('Shifted and Scaled Data')
```

Out[5]:

For a given list of times $t$ and parameters $K$, $\tau$, and $\tau_d$, the `foptd` function returns the response of an FOPTD system to a unit change in input at $t = 0$.

In [6]:

```
def foptd(t, K=1, tau=1, tau_d=0):
    tau_d = max(0, tau_d)
    tau = max(1e-8, tau)    # avoid division by zero
    return np.array([K*(1 - np.exp(-(ti - tau_d)/tau)) if ti >= tau_d else 0 for ti in t])

z = foptd(ts, 0.005, 10, 3)
z = foptd(ts, 0.005, 10, 3)
plt.plot(ts,ys,ts,z)
plt.legend(['Experiment','FOPTD'])
```

Out[6]:

Let the step response of the fitted model be $\hat{y}_s$. We seek to minimize

$$\min_{K,\tau,\tau_d} \int_0^T \|\hat{y}_s - y_s\|\,dt$$

for some suitable norm $\|\cdot\|$. A common choice of norm for process control is the absolute value of the difference called Integral Absolute Error (IAE)

$$\text{IAE} = \int_0^T |\hat{y}_s - y_s|\,dt$$

The advantage of IAE over squared-error measures is that it is less sensitive to occasional large errors, which makes the fit more robust to outliers in the data.
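
To see the difference, compare how IAE and the integral squared error (ISE) weight one large residual against many small ones (a toy illustration with made-up numbers):

```python
# one residual of 1.0 versus ten residuals of 0.1
outlier = [0.0]*9 + [1.0]
spread  = [0.1]*10

iae = lambda e: sum(abs(x) for x in e)
ise = lambda e: sum(x*x for x in e)

print(iae(outlier), iae(spread))   # both approximately 1.0 -- IAE treats them alike
print(ise(outlier), ise(spread))   # 1.0 vs ~0.1 -- ISE amplifies the outlier
```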

In [7]:

```
def err(X, t, y):
    K, tau, tau_d = X
    z = foptd(t, K, tau, tau_d)
    iae = sum(abs(z - y))*(max(t) - min(t))/len(t)
    return iae

X = [0.005, 10, 3]
err(X, ts, ys)
```

Out[7]:

In [8]:

```
K,tau,tau_d = minimize(err,X,args=(ts,ys)).x
print("K = {:.5f}".format(K))
print("tau = {:.2f}".format(tau))
print("tau_d = {:.2f}".format(tau_d))
```
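
One caveat: the IAE objective is not smooth (the absolute value has kinks, and the delay introduces a threshold), so the gradient-based method that `minimize` uses by default can stall. A derivative-free method such as Nelder-Mead is often more reliable for this objective. A self-contained sketch on synthetic data (the "true" parameters below are made up for the demonstration):

```python
import numpy as np
from scipy.optimize import minimize

# synthetic step response from a known FOPTD model: K=2, tau=10, tau_d=3
t = np.linspace(0, 50, 200)
y = np.where(t >= 3, 2*(1 - np.exp(-(t - 3)/10)), 0.0)

def foptd(t, K, tau, tau_d):
    tau, tau_d = max(tau, 1e-8), max(tau_d, 0)
    return np.where(t >= tau_d, K*(1 - np.exp(-(t - tau_d)/tau)), 0.0)

def err(X, t, y):
    K, tau, tau_d = X
    return np.mean(np.abs(foptd(t, K, tau, tau_d) - y))*(t[-1] - t[0])

# Nelder-Mead uses only function values, so the kinks in the
# objective do not upset it the way they can a quasi-Newton method
res = minimize(err, [1.0, 5.0, 1.0], args=(t, y), method='Nelder-Mead')
print(res.x)   # should recover approximately [2, 10, 3]
```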

In [9]:

```
z = foptd(ts,K,tau,tau_d)
ypred = y[0] + z*(120 - 110)
plt.plot(t,y,t,ypred)
plt.xlabel('Time [min]')
plt.ylabel('vapor mole fraction')
plt.legend(['Experiment','FOPTD'])
```

Out[9]: