As you may have noticed, many quantities in the real world are random and many problems that we face involve optimizing in the presence of uncertainty. In this lab, we will look at some optimization methods where some of the constraints are random.

For a motivating example, we can look at scheduling in healthcare. Hospitals lose large amounts of money each year due to scheduling problems. One of the biggest culprits for this loss of money is the scheduling of surgeries. Recent studies estimate that it costs $\sim \$62$ per minute to keep an operating room running$^1$. It is easy to imagine a scenario in which a seemingly decent schedule incurs huge costs for the hospital: for example, scheduling $8$ surgeries for an hour each and having each surgery finish $10$ minutes early. This seems like a rather innocuous schedule, but by the end of the day it has cost the hospital $\$4960$!

The hardness of this problem stems from the random nature of surgery times (which we will call processing times), and hospitals today don't have an efficient way of dealing with these processing times. For example, many hospitals have scheduling departments that construct weekly schedules by hand. In this lab, we are going to use what we have learned about probability and try to help the hospitals out. We will focus on creating offline schedules (constructed before we know anything about the realization of job durations).

Linear programs are defined by a set of decision variables, a linear cost function (called the objective), and a set of linear inequalities involving the decision variables. Often, when we model problems as linear programs, we consider a cost vector $c=(c_1,c_2,\dots,c_n)$, a decision vector $x=(x_1,x_2,\dots,x_n)$, and a constraint matrix $A=(A_1,A_2,\dots,A_n)$, where $A_i$ is the $i$th column. We additionally consider a "goal" vector $b$ whose dimension equals the number of constraints. The "standard form" linear program is:

$$ \begin{equation} \begin{split} & \underset{x}{\text{min }} c'x \\ & \text{subject to } Ax \le b \\ & x \ge 0 \end{split} \end{equation} $$

In a general optimization problem, we need not restrict ourselves to linear objective functions (although linear objectives are much nicer to deal with); we simply have an objective function $F(x)$ whose domain is the set of decision vectors $x$.

In this lab, we will be using the Python library PuLP as our LP-solver, as we have found it is the fastest and easiest to use. For your projects, you are free to use any optimization software you want (such as cvxopt, etc.).
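To make sure PuLP is working, here is a minimal sanity check on a toy standard-form LP (the problem data below is made up purely for illustration):

```python
from pulp import LpProblem, LpMinimize, LpVariable, value

# Toy LP: minimize x + 2y subject to x + y >= 3, x, y >= 0.
prob = LpProblem("toy", LpMinimize)
x = LpVariable("x", lowBound=0)
y = LpVariable("y", lowBound=0)
prob += x + 2 * y   # objective: x is the cheaper variable
prob += x + y >= 3  # the single covering constraint
prob.solve()
print(value(x), value(y))  # 3.0 0.0 -- all weight goes on the cheaper x
```

Since $x$ has the smaller cost coefficient, the optimum pushes everything onto $x$, giving $x=3$, $y=0$ with objective value $3$.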

A hospital wants to make a weekly night shift (12am-8am) schedule for its nurses. The demand for nurses on the night shift of day $j$ is an integer $d_j$, $j=1,\dots,7$. Every nurse works $5$ days in a row on the night shift. The problem is to find the minimum number of nurses the hospital needs to hire.

Let's define our decision variables to be $x=(x_1,x_2,\dots,x_7)$, where $x_j$ denotes the number of nurses starting their five-day stretch on day $j$ (do you see why this is a good model?). This gives us the following linear program:

$$ \begin{equation} \begin{split} & \text{minimize } \sum_{i=1}^{7} x_i \\ \text{subject to } & x_1 + x_4 + x_5 + x_6 + x_7 \ge d_1 \\ & x_1 + x_2 + x_5 + x_6 + x_7 \ge d_2 \\ & x_1 + x_2 + x_3 + x_6 + x_7 \ge d_3 \\ & x_1 + x_2 + x_3 + x_4 + x_7 \ge d_4 \\ & x_1 + x_2 + x_3 + x_4 + x_5 \ge d_5 \\ & x_2 + x_3 + x_4 + x_5 + x_6 \ge d_6 \\ & x_3 + x_4 + x_5 + x_6 + x_7 \ge d_7 \\ & x_j \ge 0 \end{split} \end{equation} $$

Now, let's write a function to solve this LP given a demand vector $d$:

In [2]:

```
%matplotlib inline
import numpy as np
from pulp import *
import matplotlib.pyplot as plt

def night_shifts(d):
    """
    Determine the optimal night schedule.
    Inputs:  d - a vector of 7 integers giving how many nurses
                 must be available to work on each day
    Outputs: the solved LpProblem; also prints the x vector of how many
             nurses start working on each day
    """
    prob = LpProblem("schedule", LpMinimize)  ### Create the linear program
    ### Variable Creation ###
    x = np.empty(7, 'object')  ## number of nurses starting on each day
    for i in range(7):
        # lowBound=0 already enforces x_i >= 0, so no separate constraints needed
        x[i] = LpVariable("x_{0}".format(i), lowBound=0, cat='Continuous')
    ### Objective Formulation ###
    prob += lpSum(x)
    ### Constraint Formulation ###
    prob += (x[0] + x[3] + x[4] + x[5] + x[6] >= d[0])
    prob += (x[0] + x[1] + x[4] + x[5] + x[6] >= d[1])
    prob += (x[0] + x[1] + x[2] + x[5] + x[6] >= d[2])
    prob += (x[0] + x[1] + x[2] + x[3] + x[6] >= d[3])
    prob += (x[0] + x[1] + x[2] + x[3] + x[4] >= d[4])
    prob += (x[1] + x[2] + x[3] + x[4] + x[5] >= d[5])
    prob += (x[2] + x[3] + x[4] + x[5] + x[6] >= d[6])
    ### Solve the LP ###
    prob.solve()
    ### Extract ###
    print("x =", [value(p) for p in x])
    rounded_x = [np.ceil(value(p)) for p in x]
    print("The total number of nurses needed is: {}".format(sum(rounded_x)))
    return prob
```

In [3]:

```
d_vals = np.array([5,20,10,17,10,10,19])
schedule = night_shifts(d_vals)
```

Note that one can't have a non-integral number of nurses start work on a given day, so this is really an integer linear programming problem, which is NP-hard in general. In the code above, we "relax" the integrality constraints and round the $x$-values up to get a feasible answer (rounding up can only help the covering constraints), though the result is no longer guaranteed to be optimal.
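For comparison, PuLP can also enforce integrality directly by declaring the variables with `cat='Integer'`, in which case the solver returns a provably optimal integer schedule. This standalone sketch rebuilds the same model with the coverage constraints written as a loop (the function name is ours, not part of the lab's API):

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

def night_shifts_integer(d):
    # Same model as night_shifts, but with integrality enforced by the solver.
    prob = LpProblem("schedule_int", LpMinimize)
    x = [LpVariable("y_{0}".format(i), lowBound=0, cat="Integer") for i in range(7)]
    prob += lpSum(x)
    for day in range(7):
        # A nurse starting on day s covers days s, s+1, ..., s+4 (mod 7).
        prob += lpSum(x[s] for s in range(7) if (day - s) % 7 < 5) >= d[day]
    prob.solve()
    return int(sum(value(v) for v in x))

n_nurses = night_shifts_integer([5, 20, 10, 17, 10, 10, 19])
print(n_nurses)  # at least ceil(91 / 5) = 19, since each nurse covers 5 shifts
```

Since total demand is $91$ nurse-shifts and each nurse covers $5$, no integer schedule can use fewer than $\lceil 91/5 \rceil = 19$ nurses; the solver's answer can be compared against the rounded LP relaxation above.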

Now that we know what linear programs are, we can increase the difficulty a bit with stochastic optimization. Linear programming for scheduling is useful when we know exactly what the job durations are and there is no variation, but what happens when the job durations are random variables?

This is where stochastic optimization comes in. Say that, as in linear programming, we have a cost function $F(x,P)$ which gives the cost of a schedule whose starting times are $x=(x_1,x_2,\dots,x_n)$ and processing times are $P=(P_1,P_2,\dots,P_n)$. In stochastic programming, the $P$ vector is a random vector, so our objective becomes $\underset{x}{\text{min }}E_P\bigl[F(x,P)\bigr]$. Thus, we consider the problem: $$ \begin{equation} \begin{split} &\underset{x}{\text{min }}E_P\bigl[F(x,P)\bigr] \\ & \text{subject to }Ax \le b \\ & x \ge 0 \end{split} \end{equation} $$ This is the stochastic programming equivalent of the standard form linear optimization problem.

Of course, most linear program solvers don't take random variables as input; they only solve deterministic linear programs. To deal with this, we consider letting the objective be: $$ \begin{equation} \begin{split} &\underset{x}{\text{min }}\frac{1}{L}\sum_{i=1}^{L}F(x,P_i) \\ & Ax \le b \\ & x \ge 0 \end{split} \end{equation} $$ That is, we give the linear program $L$ realizations $P_1,\dots,P_L$ of the processing-times vector and minimize the average of the objective evaluated at each realization (this approach is known as the sample average approximation).
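To see why the sample average is a reasonable stand-in for the expectation, here is a quick numerical check on a made-up one-dimensional cost: $F(x,P)=\max(P-x,0)$, the overtime beyond a reserved block of $x$ minutes, with $P$ exponential with mean $\mu$. For a fixed $x$, the sample average converges to the closed-form expectation $\mu e^{-x/\mu}$ (which follows from memorylessness):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, x, L = 60.0, 45.0, 200_000       # hypothetical mean duration, reserved block, sample count
P = rng.exponential(mu, L)           # L realizations of the processing time
overtime = np.maximum(P - x, 0.0)    # F(x, P_i) for each realization
sample_avg = overtime.mean()         # (1/L) * sum_i F(x, P_i)
exact = mu * np.exp(-x / mu)         # E[max(P - x, 0)] for exponential P
print(sample_avg, exact)             # the two agree closely for large L
```

The same logic underlies the scheduling formulation below: we cannot hand the solver $E_P[F(x,P)]$ directly, but we can hand it the average of $F$ over many sampled $P$ vectors.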

Let's return to scheduling surgery cases. In some hospitals around the world, the accepted version of scheduling is to tell all patients to show up at the beginning of the day and wait until the hospital can perform their operation. Assume the hospital you are working with has decided to try to increase patient happiness but is unwilling to change this method of scheduling. Thus, they want to minimize the total waiting time of the patients. We will model this by minimizing the total $\textit{flow time}$. Intuitively, if there are $n$ cases scheduled on a day, the flow time of the first case is the processing time of its surgery, after which it exits the system; the $i$th case waits in the system for all preceding cases, so its flow time is the sum of their processing times plus its own: $F_i = p_1 + p_2 + \dots + p_i$. The hospital has found that its surgeons and patients are happiest when the total flow time is smallest. Thus, the objective we would like to look at is $\underset{\pi}{\text{min }}\sum_{i=1}^{n}F_i$, where $F_i$ is the flow time of the $i$th case performed. Note that here the *sequence* of cases is the decision: if the set $N=\{1,2,\dots,n\}$ of cases is to be scheduled, the order of the cases will be some permutation $\pi$ of $N$. Now that we have the preliminaries, we will write a function to minimize the total flow time of the system. We will assume that each case duration has an exponential distribution with some passed-in parameters.
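Concretely, the flow times of a realized sequence are just a cumulative sum of the processing times (the durations below are hypothetical, in minutes):

```python
import numpy as np

# Flow times for one realized order of processing times: F_i = p_1 + ... + p_i.
p = np.array([30.0, 45.0, 20.0])
F = np.cumsum(p)                  # [30., 75., 95.]
total_flowtime = F.sum()          # 30 + 75 + 95 = 200
print(F, total_flowtime)

# Reordering the same cases shortest-first shrinks the total flow time:
p_spt = np.sort(p)                # [20., 30., 45.]
print(np.cumsum(p_spt).sum())     # 20 + 50 + 95 = 165
```

Notice that earlier cases contribute to more flow times, which is why the order of cases matters even though the set of durations is fixed.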

In [4]:

```
def generate_samples(n, K, parameters):
    """
    Generate K samples of n exponential random variables with the given parameters.
    Inputs:  n - the number of cases to be scheduled
             K - the number of samples to be taken
             parameters - a list of length n of mean case durations
                          (the scale parameter of each exponential)
    Outputs: an (n, K) array whose row i holds K sampled durations of case i
    """
    assert len(parameters) == n
    ret = np.empty((n, K))
    for i in range(n):
        ret[i] = np.random.exponential(parameters[i], K)
    return ret
```

Let's outline the variables:

- $x_{ij}$ is the indicator variable that case $i$ is scheduled in slot $j$
- $D_{li}$ is the duration of case $i$ in sample $l$

The objective: $$ \frac{1}{L}\sum_{l=1}^{L}\biggl(\sum_{k=0}^{n-1}\sum_{j=0}^{k}\sum_{i=0}^{n-1}x_{ij}D_{li}\biggr) $$ This is an intimidating sum with a simple explanation. The outer sum averages everything in parentheses over the $L$ samples. Inside the parentheses, we sum over slots with a temporary variable $k$; the flow time of the case in slot $k$ is the total duration of slots $0$ through $k$, which we obtain by summing $j$ from $0$ to $k$. Finally, the innermost sum picks out the appropriate duration: $\sum_{i=0}^{n-1}x_{ij}D_{li} = P_{lj}$, where $P_{lj}$ can be considered the processing time of the case in slot $j$ in sample $l$.

The constraints:

- $\sum_{i=0}^{n-1}x_{ij} = 1$ for each slot $j$: exactly one case is scheduled in each slot
- $\sum_{j=0}^{n-1}x_{ij} = 1$ for each case $i$: each case is scheduled in exactly one slot

In this formulation, the $x_{ij}$ are the decision variables, and the $D_{li}$ are known.
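To see the identity $\sum_{i}x_{ij}D_{li} = P_{lj}$ concretely, here is a quick numpy check with a hypothetical permutation; note that $D$ is indexed $(l, i)$ below, matching the text, whereas `generate_samples` stores durations as $(i, l)$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, L = 3, 4
D = rng.exponential(10.0, (L, n))    # D[l, i]: duration of case i in sample l
perm = [2, 0, 1]                     # hypothetical order: slot j holds case perm[j]
x = np.zeros((n, n))
for j, i in enumerate(perm):
    x[i, j] = 1.0                    # x[i, j] = 1 iff case i is in slot j
P = D @ x                            # P[l, j] = sum_i D[l, i] * x[i, j]
print(np.allclose(P, D[:, perm]))    # True: slot j's time is case perm[j]'s time
```

Multiplying by a permutation matrix simply reorders the columns of $D$, which is exactly what "assigning cases to slots" means.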

In [5]:

```
import itertools

def stochastic_flowtime(samples):
    """
    Find the case ordering minimizing the average total flow time
    over the given samples.
    Inputs:  samples - an (n, L) array; samples[i, l] is the duration
                       of case i in sample l
    Outputs: a list giving the case assigned to each slot
    """
    (n, L) = samples.shape
    prob = LpProblem("schedule", LpMinimize)  ### Create the (integer) linear program
    ### Variable Creation ###
    D = samples
    x = np.empty((n, n), 'object')  # indicator: case i scheduled in slot j
    for i, j in itertools.product(range(n), range(n)):
        x[i, j] = LpVariable("x_{0}_{1}".format(i, j), cat='Binary')
    ### Objective Formulation ###
    obj = []
    for l in range(L):
        inner_obj = []
        for k in range(n):
            for j in range(k + 1):  # flow time of slot k includes its own duration
                for i in range(n):
                    inner_obj.append(x[i, j] * D[i, l])
        obj.append(lpSum(inner_obj))
    prob += lpSum(obj) * 1.0 / L
    ### Constraint Formulation ###
    for j in range(n):
        prob += lpSum(x[:, j]) == 1  # exactly one case per slot
    for i in range(n):
        prob += lpSum(x[i, :]) == 1  # each case in exactly one slot
    ### Solve the integer program ###
    prob.solve()
    ### Extract ###
    schedule = []
    for j in range(n):
        for i in range(n):
            if value(x[i, j]) == 1:
                print(x[i, j], value(x[i, j]))
                schedule.append(i)
    print("\n")
    for j in range(n):
        print("The case in slot {0} is case {1}".format(j, schedule[j]))
        print("The average duration for this case is: {} \n"
              .format(np.sum(samples[schedule[j], :]) / L))
    return schedule
```

In [6]:

```
### Example ###
case_parameters = [10,1,9]
samples = generate_samples(3, 100, case_parameters)
stochastic_flowtime(samples)
```


Notice that the order of cases is in increasing order of their expectations. It turns out that for this specific example, we can show that ordering by expectation is optimal! Unfortunately, in many scheduling problems this is not true, and such simple heuristics turn out to give poor results.
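Since the expected total flow time depends on the durations only through their means (by linearity of expectation), the optimality claim for this example can be checked by brute force over all $3! = 6$ orders:

```python
import itertools
import numpy as np

means = [10.0, 1.0, 9.0]  # expected durations from the example above

def expected_total_flowtime(order):
    # E[sum of flow times] = sum of cumulative sums of the expected durations.
    return np.cumsum([means[i] for i in order]).sum()

best = min(itertools.permutations(range(3)), key=expected_total_flowtime)
print(best)  # (1, 2, 0): cases sorted by increasing expected duration
```

The order $(1, 2, 0)$ has expected total flow time $1 + 10 + 20 = 31$, and no other permutation does better, matching the solver's output.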

Feel free to play around with different parameters, number of samples and number of cases in the previous example. You won't be expected to do so, but it will help you get a feel for these types of problems.

Hopefully, this lab was a gentle introduction to linear programming, scheduling, and stochastic scheduling. For the project this semester, you have the choice of formulating and solving a stochastic optimization model. Within this scope, you may choose to do one of the following:

where $W_i = (C_{i} - A_{i+1})^{+}$ and $I_i=(A_{i+1} - C_{i})^{+}$, $A_i$ gives the appointment time of case $i$ and $C_{i}$ gives the completion time of case $i$.

[1] A. Macario. What does one minute of operating room time cost? Journal of Clinical Anesthesia, 2010.