Consider the symmetric polynomial matrix (taken from [BPT12, Example 3.77]) $$P(x) = \begin{bmatrix} x^2 - 2x + 2 & x\\ x & x^2 \end{bmatrix}.$$ We would like to know whether $P(x)$ is positive semidefinite for all $x \in \mathbb{R}$.

[BPT12] Blekherman, G.; Parrilo, P. A. & Thomas, R. R.
*Semidefinite Optimization and Convex Algebraic Geometry*.
Society for Industrial and Applied Mathematics, **2012**.

In [1]:

```
using DynamicPolynomials
@polyvar x
P = [x^2 - 2x + 2  x
     x             x^2]
```

Out[1]:

A sufficient condition for a symmetric polynomial matrix $P(x)$ to be positive semidefinite for all $x$ is the existence of a polynomial matrix $M(x)$ such that $P(x) = M^\top(x) M(x)$. If such a matrix $M$ exists, we say that $P$ is an *sos matrix* (see [BPT12, Definition 3.76]). While determining whether $P(x)$ is positive semidefinite for all $x$ is NP-hard (checking nonnegativity of a polynomial reduces to this problem for $1 \times 1$ matrices), checking whether $P(x)$ is an sos matrix is an sos program.
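To see why this condition is sufficient, note that for any $x \in \mathbb{R}$ and any vector $z$,
$$z^\top P(x)\, z = z^\top M^\top(x) M(x)\, z = \lVert M(x)\, z \rVert_2^2 \ge 0,$$
so each matrix $P(x)$ is positive semidefinite.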

In [2]:

```
using SumOfSquares
```

We first need to pick an SDP solver; see here for a list of the available choices.

In [3]:

```
using CSDP
factory = with_optimizer(CSDP.Optimizer, printlevel=0);
```

In [ ]:

```
using MosekTools
factory = with_optimizer(Mosek.Optimizer, QUIET=true);
```

In [4]:

```
model = SOSModel(factory)
mat_cref = @constraint(model, P in PSDCone())
optimize!(model)
termination_status(model)
```

Out[4]:

While the reformulation from sos matrix to sos polynomial is rather simple, as explained in the "Sum-of-Squares reformulation" section below, there is a technical subtlety about the Newton polytope that, if not handled correctly, may result in an SDP of large size with bad numerical behavior. For this reason, it is recommended to model sos *matrix* constraints as such, as shown in this notebook, and not to do the reformulation manually unless there is a specific reason to do so.

As we can verify as follows, only 4 monomials are used with the sos *matrix* constraint.

In [5]:

```
certificate_monomials(mat_cref)
```

Out[5]:

One way to obtain the reduction to an sos program is to create an intermediate vector of variables $y$ and check whether the polynomial $p(x, y) = y^\top P(x) y$ is sos. However, special care is required when approximating the Newton polytope of $p(x, y)$. For instance, if the entries of $P(x)$ are quadratic forms, then the Newton polytope of $p(x, y)$ is the Cartesian product of the Newton polytope of $y^\top y$ and the Newton polytope of $x^\top x$. In other words, $p(x, y)$ belongs to a family of quartic forms called biquadratic forms. This fact is important when generating the semidefinite program, so that only bilinear monomials are used. Hence, if the cheap outer approximation is used for the Newton polytope (instead of the exact polyhedral computation), it is important to use a multipartite approximation.

The multipartite exact approach may, however, perform worse than the unipartite exact approach in certain cases. Consider for instance the polynomial matrix $\mathrm{Diag}(x_1^2, x_2^2)$, for which $p(x, y) = x_1^2y_1^2 + x_2^2y_2^2$. For this polynomial, only the monomials $x_1y_1$ and $x_2y_2$ are needed in the SDP reformulation, while the multipartite approach, as it computes the Newton polytope as a Cartesian product, will not see the dependence between $x$ and $y$ in the monomials and will also select the monomials $x_1y_2$ and $x_2y_1$.
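Concretely, for the matrix $P(x)$ of this example, expanding $y^\top P(x) y$ term by term gives
$$p(x, y) = (x^2 - 2x + 2)\,y_1^2 + 2x\,y_1 y_2 + x^2\,y_2^2,$$
which is homogeneous of degree 2 in $y$, so an sos certificate for $p$ only needs monomials that are linear in $y$.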

In [6]:

```
@polyvar y[1:2]
p = vec(y)' * P * vec(y)
```

Out[6]:

In [7]:

```
model = SOSModel(factory)
cref = @constraint(model, p in SOSCone())
```

Out[7]:

In [8]:

```
optimize!(model)
termination_status(model)
```

Out[8]:

As we can see below, 6 monomials (instead of the 4 monomials with the sos *matrix* constraint) have been used, as the unipartite cheap outer approximation of the Newton polytope was used.

In [9]:

```
certificate_monomials(cref)
```

Out[9]:
