# Second-order semidefinite relaxation of a commutative polynomial optimization problem

Posted on 17 May 2013

Consider the following polynomial optimization problem:

$$\min_{x\in \mathbb{R}^2}2x_1x_2$$

such that

$$-x_2^2+x_2+0.5\geq 0$$

$$x_1^2-x_1=0.$$

This is the commutative toy example in Pironio et al., 2010, which extends the semidefinite relaxations of constrained polynomial optimization problems (Lasserre, 2001) to noncommutative variables. The number of variables in the corresponding semidefinite programming (SDP) problem grows quickly as the number of variables in the original problem increases. To achieve a scalable solution, we would like to solve the SDP with SDPARA, a distributed solver that also has a new GPU-accelerated variant (Fujisawa et al., 2012). Working towards this goal, we look at how the second-order relaxation can be exported with PICOS. The toy example in Pironio et al., 2010 applies a small trick, eliminating the equality constraint from the monomial basis. This is a non-trivial problem in general. We look at two solutions: the first follows the toy example word for word; the second deals with the equality by adding a localizing matrix.

## Equality constraint eliminated manually

The monomial basis for a second-order relaxation would be $\{1, x_1, x_2, x_1^2, x_1x_2, x_2^2\}$, but the constraint $x_1^2-x_1=0$ makes $x_1$ idempotent, so the corresponding relaxation variables reduce to $\{y_0, y_1, y_2, y_{12}, y_{22}\}$. The relaxation is written as

$$\min_{y}2y_{12}$$

such that $$\left[ \begin{array}{c|cc|cc}1 & y_{1} & y_{2} & y_{12} & y_{22}\\\hline{}y_{1} & y_{1} & y_{12} & y_{12} & y_{122}\\y_{2} & y_{12} & y_{22} & y_{122} & y_{222}\\\hline{}y_{12} & y_{12} & y_{122} & y_{122} & y_{1222}\\y_{22} & y_{122} & y_{222} & y_{1222} & y_{2222}\end{array} \right] \succeq{}0$$

$$\left[ \begin{array}{c|cc}-y_{22}+y_{2}+0.5 & -y_{122}+y_{12}+0.5y_{1} & -y_{222}+y_{22}+0.5y_{2}\\\hline{}-y_{122}+y_{12}+0.5y_{1} & -y_{122}+y_{12}+0.5y_{1} & -y_{1222}+y_{122}+0.5y_{12}\\-y_{222}+y_{22}+0.5y_{2} & -y_{1222}+y_{122}+0.5y_{12} & -y_{2222}+y_{222}+0.5y_{22}\end{array}\right]\succeq{}0.$$
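The basis reduction used above can be sketched in a few lines of plain Python. This helper is ours, not part of the original post; it represents monomials as exponent tuples and applies the substitution $x_1^k \to x_1$:

```python
from itertools import combinations_with_replacement

def reduced_basis(nvars=2, degree=2, idempotent=(1,)):
    """Monomials of degree <= `degree` in x_1..x_nvars, as exponent tuples,
    after substituting x_i^k -> x_i for the idempotent variables."""
    basis = set()
    for d in range(degree + 1):
        for combo in combinations_with_replacement(range(nvars), d):
            exps = [combo.count(i) for i in range(nvars)]
            for i in idempotent:                 # x_i^k collapses to x_i
                exps[i - 1] = min(exps[i - 1], 1)
            basis.add(tuple(exps))
    return sorted(basis, key=lambda e: (sum(e), e))
```

For the toy problem, `reduced_basis()` returns five exponent tuples, matching $\{1, x_1, x_2, x_1x_2, x_2^2\}$; without the idempotency substitution it returns the full six-element basis.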

Apart from the matrices being symmetric, notice the other regular patterns among the entries: for instance, $y_{112}=y_{12}$ wherever idempotency applies. These identifications must be encoded explicitly.

```python
import picos
```


The description of the problem is straightforward from the matrices above:

```python
prob = picos.Problem()
# Moment matrix over the reduced basis {1, x1, x2, x1x2, x2^2}, so that
# M[0,1] = y_1, M[0,2] = y_2, M[0,3] = y_12, M[0,4] = y_22, and so on.
M = prob.add_variable('M', (5, 5), vtype='symmetric')
prob.add_constraint(M[0, 0] == 1)
# Identify the entries repeated across the moment matrix:
for a, b in [((1, 1), (0, 1)),   # y_11 = y_1
             ((1, 2), (0, 3)),   # y_12
             ((1, 3), (0, 3)),   # y_112 = y_12
             ((1, 4), (2, 3)),   # y_122
             ((2, 2), (0, 4)),   # y_22
             ((3, 3), (2, 3))]:  # y_1122 = y_122
    prob.add_constraint(M[a] == M[b])
prob.add_constraint(M >> 0)
# Localizing matrix of -x2^2 + x2 + 0.5 over the basis {1, x1, x2}:
g = lambda u, v, w: -M[u] + M[v] + 0.5 * M[w]
L11 = g((0, 4), (0, 2), (0, 0)); L12 = g((2, 3), (0, 3), (0, 1))
L13 = g((2, 4), (0, 4), (0, 2)); L23 = g((3, 4), (2, 3), (0, 3))
L33 = g((4, 4), (2, 4), (0, 4))
prob.add_constraint(((L11 & L12 & L13) // (L12 & L12 & L23) //
                     (L13 & L23 & L33)) >> 0)

prob.set_objective('min', 2 * M[1, 2])
```

We can solve the problem directly with CVXOPT, a dependency of PICOS:

```python
sol = prob.solve(solver='cvxopt', verbose=0)
```


This yields the correct optimum, $1-\sqrt{3}\approx-0.7321$. To keep in line with the original goal, we can also export the problem in SDPA format:

```python
prob.write_to_file('example.dat-s')
```

```
writing problem in example.dat-s...
done.
```
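The exported file can then be handed to SDPA or SDPARA on the cluster. The exact binary names and MPI launcher depend on the installation, so the following invocation is only an assumed sketch:

```shell
# Serial solve with SDPA (binary name assumed to be `sdpa`):
sdpa example.dat-s example.out

# Distributed solve with SDPARA over MPI (4 processes; adjust to your cluster):
mpirun -np 4 sdpara example.dat-s example.out
```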


Solving it on your favourite cluster, the result should be the same.
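As a quick sanity check (plain Python, not part of the original post), the optimum can also be derived by hand: $x_1$ is binary, and for $x_1=1$ the constraint $-x_2^2+x_2+0.5\geq 0$ restricts $x_2$ to $[(1-\sqrt{3})/2,\,(1+\sqrt{3})/2]$:

```python
import math

# x1^2 = x1 forces x1 into {0, 1}; check both branches of the objective 2*x1*x2.
x2_lo = (1 - math.sqrt(3)) / 2        # smallest feasible x2
optimum = min(0.0,                    # x1 = 0: objective is 0
              2 * x2_lo)              # x1 = 1: objective is 2*x2, minimized at x2_lo
print(round(optimum, 4))              # -0.7321, i.e. 1 - sqrt(3)
```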

## Equality constraint dealt with algorithmically

In this case, we do not eliminate the idempotent variable from the monomial basis; for more complex equalities, elimination would prove difficult. Instead, we define a new localizing matrix for the equality. Behind the scenes, the equality constraint on this localizing matrix is converted to a pair of positive semidefinite constraints, which increases the size of the problem. While seemingly undesirable, this solution is efficient. Other methods of dealing with equalities, for instance a QR factorization followed by a projection onto the new basis, would run on a single node, whereas if we increase the complexity of the SDP problem, we can use multiple nodes and, potentially, GPU resources to compute the optimum. Generating the SDP problem quickly and solving it on a cluster leads to an efficient solution in terms of computational time, and also in terms of human effort.
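To see why this conversion is sound, recall that a symmetric matrix $E$ satisfies $E=0$ exactly when both $E\succeq 0$ and $-E\succeq 0$ hold, since all eigenvalues must then vanish. A tiny plain-Python illustration on $2\times 2$ symmetric matrices (the helper names are ours, not PICOS's):

```python
import math

def eig2(a, b, c):
    """Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, c]]."""
    s = math.sqrt((a - c) ** 2 + 4 * b ** 2)
    return ((a + c - s) / 2, (a + c + s) / 2)

def psd2(a, b, c, tol=1e-12):
    """True if [[a, b], [b, c]] is positive semidefinite (up to tol)."""
    return min(eig2(a, b, c)) >= -tol

# A >= 0 together with -A >= 0 forces every eigenvalue to zero, i.e. A = 0:
A = (1e-13, 0.0, -1e-13)              # numerically zero matrix: both tests pass
assert psd2(*A) and psd2(*(-x for x in A))
B = (0.5, 0.2, 0.1)                   # genuinely nonzero PSD matrix: -B fails
assert psd2(*B) and not psd2(*(-x for x in B))
```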

The relaxation hence changes to the following:

$$\min_{y}2y_{12}$$

such that

$$\left[ \begin{array}{c|cc|ccc}1 & y_{1} & y_{2} & y_{11} & y_{12} & y_{22}\\\hline{}y_{1} & y_{11} & y_{12} & y_{111} & y_{112} & y_{122}\\y_{2} & y_{12} & y_{22} & y_{112} & y_{122} & y_{222}\\\hline{}y_{11} & y_{111} & y_{112} & y_{1111} & y_{1112} & y_{1122}\\y_{12} & y_{112} & y_{122} & y_{1112} & y_{1122} & y_{1222}\\y_{22} & y_{122} & y_{222} & y_{1122} & y_{1222} & y_{2222}\end{array} \right] \succeq{}0$$

$$\left[ \begin{array}{c|cc}-y_{22}+y_{2}+0.5 & -y_{122}+y_{12}+0.5y_{1} & -y_{222}+y_{22}+0.5y_{2}\\\hline{}-y_{122}+y_{12}+0.5y_{1} & -y_{1122}+y_{112}+0.5y_{11} & -y_{1222}+y_{122}+0.5y_{12}\\-y_{222}+y_{22}+0.5y_{2} & -y_{1222}+y_{122}+0.5y_{12} & -y_{2222}+y_{222}+0.5y_{22}\end{array}\right]\succeq{}0.$$

$$\left[ \begin{array}{c|cc}y_{11}-y_{1} & y_{111}-y_{11} & y_{112}-y_{12}\\\hline{}y_{111}-y_{11} & y_{1111}-y_{111} & y_{1112}-y_{112}\\y_{112}-y_{12} & y_{1112}-y_{112} & y_{1122}-y_{122}\end{array}\right]=0.$$

Notice that the inequality matrix changed in only one element. The Python code corresponding to this problem is as follows:

```python
prob = picos.Problem()
# Moment matrix over the full basis {1, x1, x2, x1^2, x1x2, x2^2}, so that
# M[0,1] = y_1, M[0,3] = y_11, M[0,4] = y_12, M[0,5] = y_22, and so on.
M = prob.add_variable('M', (6, 6), vtype='symmetric')
prob.add_constraint(M[0, 0] == 1)
# Identify the entries repeated across the moment matrix:
for a, b in [((1, 1), (0, 3)),   # y_11
             ((1, 2), (0, 4)),   # y_12
             ((2, 2), (0, 5)),   # y_22
             ((1, 4), (2, 3)),   # y_112
             ((1, 5), (2, 4)),   # y_122
             ((3, 5), (4, 4))]:  # y_1122
    prob.add_constraint(M[a] == M[b])
prob.add_constraint(M >> 0)
# Localizing matrix of -x2^2 + x2 + 0.5 over the basis {1, x1, x2}:
g = lambda u, v, w: -M[u] + M[v] + 0.5 * M[w]
L11 = g((0, 5), (0, 2), (0, 0)); L12 = g((2, 4), (0, 4), (0, 1))
L13 = g((2, 5), (0, 5), (0, 2)); L22 = g((4, 4), (2, 3), (0, 3))
L23 = g((4, 5), (2, 4), (0, 4)); L33 = g((5, 5), (2, 5), (0, 5))
prob.add_constraint(((L11 & L12 & L13) // (L12 & L22 & L23) //
                     (L13 & L23 & L33)) >> 0)
# Localizing matrix of the equality x1^2 - x1 = 0, constrained to vanish:
h = lambda u, v: M[u] - M[v]
E11 = h((0, 3), (0, 1)); E12 = h((1, 3), (0, 3)); E13 = h((2, 3), (0, 4))
E22 = h((3, 3), (1, 3)); E23 = h((3, 4), (2, 3)); E33 = h((4, 4), (2, 4))
prob.add_constraint(((E11 & E12 & E13) // (E12 & E22 & E23) //
                     (E13 & E23 & E33)) == 0)

prob.set_objective('min', 2 * M[0, 4])
```