Originally Contributed by: Arpit Bhatia
While the previous tutorial introduced the basics of JuMP code, this tutorial goes into depth on how to work with the different parts of a JuMP program.
using JuMP
model = Model();
All of the variables we have created so far have had a bound. We can also create a free variable.
@variable(model, free_x)
When creating a variable, instead of using the <= and >= syntax, we can also use the lower_bound and upper_bound keyword arguments.
@variable(model, keyword_x, lower_bound = 1, upper_bound = 2)
We can query whether a variable has a bound using the has_lower_bound and has_upper_bound functions. The values of the bounds can be obtained using the lower_bound and upper_bound functions.
has_upper_bound(keyword_x)
true
upper_bound(keyword_x)
2.0
Note that querying the value of a bound that does not exist will result in an error.
lower_bound(free_x)
Variable free_x does not have a lower bound.
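To avoid this error, we can guard the query with the has_lower_bound check from above (a minimal sketch):

```julia
# Only read the bound if it actually exists.
if has_lower_bound(free_x)
    println(lower_bound(free_x))
else
    println("free_x has no lower bound")
end
```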
JuMP also allows us to change the bounds on a variable. We will learn more about this in the problem modification tutorial.
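As a preview, a sketch of modifying bounds with the set_lower_bound and set_upper_bound functions:

```julia
# Add a lower bound to the previously free variable,
# and tighten the upper bound on keyword_x.
set_lower_bound(free_x, 0)
set_upper_bound(keyword_x, 3)

lower_bound(free_x)    # now returns 0.0
upper_bound(keyword_x) # now returns 3.0
```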
We have already seen how to add a single variable to a model using the @variable macro. Let's now look at more ways to add variables to a JuMP model. JuMP provides data structures for adding collections of variables to a model. These data structures are referred to as Containers and come in three types: Arrays, DenseAxisArrays, and SparseAxisArrays.
JuMP arrays are created in a similar syntax to Julia arrays with the addition of specifying that the indices start with 1. If we do not tell JuMP that the indices start at 1, it will create a DenseAxisArray instead.
@variable(model, a[1:2, 1:2])
2×2 Array{VariableRef,2}: a[1,1] a[1,2] a[2,1] a[2,2]
An n-dimensional variable $x \in \mathbb{R}^n$ having a bound $l \preceq x \preceq u$ ($l, u \in \mathbb{R}^n$) is added in the following manner.
n = 10
l = [1; 2; 3; 4; 5; 6; 7; 8; 9; 10]
u = [10; 11; 12; 13; 14; 15; 16; 17; 18; 19]
@variable(model, l[i] <= x[i = 1:n] <= u[i])
10-element Array{VariableRef,1}: x[1] x[2] x[3] x[4] x[5] x[6] x[7] x[8] x[9] x[10]
Note that when working with Containers, we can also create variable bounds that depend upon the indices:
@variable(model, y[i = 1:2, j = 1:2] >= 2i + j)
2×2 Array{VariableRef,2}: y[1,1] y[1,2] y[2,1] y[2,2]
DenseAxisArrays are used when the required indices are not one-based integer ranges. The syntax is similar except with an arbitrary vector as an index as opposed to a one-based range.
An example where the indices are integers but do not start with one.
@variable(model, z[i = 2:3, j = 1:2:3] >= 0)
2-dimensional DenseAxisArray{VariableRef,2,...} with index sets: Dimension 1, 2:3 Dimension 2, 1:2:3 And data, a 2×2 Array{VariableRef,2}: z[2,1] z[2,3] z[3,1] z[3,3]
Another example where the indices are an arbitrary vector.
@variable(model, w[1:5,["red", "blue"]] <= 1)
2-dimensional DenseAxisArray{VariableRef,2,...} with index sets: Dimension 1, Base.OneTo(5) Dimension 2, ["red", "blue"] And data, a 5×2 Array{VariableRef,2}: w[1,red] w[1,blue] w[2,red] w[2,blue] w[3,red] w[3,blue] w[4,red] w[4,blue] w[5,red] w[5,blue]
SparseAxisArrays are created when the indices do not form a rectangular set. For example, this applies when indices have a dependence upon previous indices (called triangular indexing).
@variable(model, u[i = 1:3, j = i:5])
JuMP.Containers.SparseAxisArray{VariableRef,2,Tuple{Int64,Int64}} with 12 entries: [2, 3] = u[2,3] [2, 2] = u[2,2] [2, 5] = u[2,5] [1, 4] = u[1,4] [3, 3] = u[3,3] [1, 3] = u[1,3] [2, 4] = u[2,4] [1, 1] = u[1,1] [1, 2] = u[1,2] [1, 5] = u[1,5] [3, 4] = u[3,4] [3, 5] = u[3,5]
We can also conditionally create variables by adding a comparison check that depends upon the named indices and is separated from the indices by a semi-colon (;).
@variable(model, v[i = 1:9; mod(i, 3) == 0])
JuMP.Containers.SparseAxisArray{VariableRef,1,Tuple{Int64}} with 3 entries: [9] = v[9] [3] = v[3] [6] = v[6]
The last argument to the @variable macro is usually the variable type. Here we'll look at how to specify the variable type.
Integer optimization variables are constrained to the set $x \in \mathbb{Z}$.
@variable(model, integer_x, Int)
or
@variable(model, integer_z, integer = true)
Binary optimization variables are constrained to the set $x \in \{0, 1\}$.
@variable(model, binary_x, Bin)
or
@variable(model, binary_z, binary = true)
JuMP also supports modeling with semidefinite variables. A square symmetric matrix X is positive semidefinite if all eigenvalues are nonnegative.
@variable(model, psd_x[1:2, 1:2], PSD)
2×2 LinearAlgebra.Symmetric{VariableRef,Array{VariableRef,2}}: psd_x[1,1] psd_x[1,2] psd_x[1,2] psd_x[2,2]
We can also impose a weaker constraint that the square matrix is only symmetric (instead of positive semidefinite) as follows:
@variable(model, sym_x[1:2, 1:2], Symmetric)
2×2 LinearAlgebra.Symmetric{VariableRef,Array{VariableRef,2}}: sym_x[1,1] sym_x[1,2] sym_x[1,2] sym_x[2,2]
model = Model()
@variable(model, x)
@variable(model, y)
@variable(model, z[1:10]);
When calling the @constraint macro, we can also set up a constraint reference. Such a reference is useful for obtaining additional information about the constraint, such as its dual.
@constraint(model, con, x <= 4)
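A sketch of what the reference con gives us access to; the dual query assumes the model has already been optimized with a solver that supports duals:

```julia
# The reference `con` lets us inspect the constraint later.
name(con)           # "con"
normalized_rhs(con) # 4.0

# After optimize!(model) with a solver that supports duals,
# dual(con) returns the constraint's dual value.
```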
Just as we had containers for variables, JuMP also provides Arrays, DenseAxisArrays, and SparseAxisArrays for storing collections of constraints. Examples for each container type are given below.
@constraint(model, [i = 1:3], i * x <= i + 1)
3-element Array{ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.LessThan{Float64}},ScalarShape},1}: x ≤ 2.0 2 x ≤ 3.0 3 x ≤ 4.0
@constraint(model, [i = 1:2, j = 2:3], i * x <= j + 1)
2-dimensional DenseAxisArray{ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.LessThan{Float64}},ScalarShape},2,...} with index sets: Dimension 1, Base.OneTo(2) Dimension 2, 2:3 And data, a 2×2 Array{ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.LessThan{Float64}},ScalarShape},2}: x ≤ 3.0 x ≤ 4.0 2 x ≤ 3.0 2 x ≤ 4.0
@constraint(model, [i = 1:2, j = 1:2; i != j], i * x <= j + 1)
JuMP.Containers.SparseAxisArray{ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.LessThan{Float64}},ScalarShape},2,Tuple{Int64,Int64}} with 2 entries: [1, 2] = x ≤ 3.0 [2, 1] = 2 x ≤ 2.0
We can add constraints using regular Julia loops
for i in 1:3
@constraint(model, 6x + 4y >= 5i)
end
or use for-each loops inside the @constraint macro.
@constraint(model, [i in 1:3], 6x + 4y >= 5i)
3-element Array{ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}},ScalarShape},1}: 6 x + 4 y ≥ 5.0 6 x + 4 y ≥ 10.0 6 x + 4 y ≥ 15.0
We can also create constraints such as $\sum _{i = 1}^{10} z_i \leq 1$
@constraint(model, sum(z[i] for i in 1:10) <= 1)
While the recommended way to set the objective is with the @objective macro, the functions set_objective_sense and set_objective_function provide an equivalent lower-level interface.
using GLPK
model = Model(GLPK.Optimizer)
@variable(model, x >= 0)
@variable(model, y >= 0)
set_objective_sense(model, MOI.MIN_SENSE)
set_objective_function(model, x + y)
optimize!(model)
@show objective_value(model);
objective_value(model) = 0.0
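The two lower-level calls above are equivalent to a single macro call on the same model:

```julia
# Equivalent to set_objective_sense(model, MOI.MIN_SENSE)
# followed by set_objective_function(model, x + y):
@objective(model, Min, x + y)
```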
To query the objective function from a model, we use the objective_sense, objective_function, and objective_function_type functions.
objective_sense(model)
MIN_SENSE::OptimizationSense = 0
objective_function(model)
objective_function_type(model)
GenericAffExpr{Float64,VariableRef}
We can also add constraints and an objective to JuMP using vectorized linear algebra. We'll illustrate this by solving an LP in standard form, i.e. $\min \; c^\top x$ subject to $A x = b$, $x \geq 0$.
vector_model = Model(GLPK.Optimizer)
A = [1 1 9 5;
     3 5 0 8;
     2 0 6 13]
b = [7; 3; 5]
c = [1; 3; 5; 2]
@variable(vector_model, x[1:4] >= 0)
@constraint(vector_model, A * x .== b)
@objective(vector_model, Min, c' * x)
optimize!(vector_model)
@show objective_value(vector_model);
objective_value(vector_model) = 4.9230769230769225
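Since the model has been optimized, we can also query the optimal solution of the vector variable elementwise with the broadcasted value function:

```julia
# value.(x) broadcasts `value` over the vector variable,
# returning a Vector{Float64} of optimal values.
@show value.(x);
```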