SuperSCS 1.3.2
In what follows we give a few code snippets in MATLAB and compare SuperSCS with SCS.
This is a preliminary study showing that SuperSCS outperforms SCS; a more thorough analysis is still necessary.
We compare the two algorithms on six types of problems: (i) a LASSO problem ( \ell_1-regularized least squares), (ii) a semidefinite program (SDP) and, in particular, a minimum-norm problem and an LMI-constrained problem, (iii) a logistic regression problem, (iv) a minimum p-norm problem, (v) a 2-norm-constrained minimum-norm problem and, last, (vi) a matrix completion problem involving the nuclear norm. We additionally report results on a portfolio selection problem and a sparse PCA problem.
SuperSCS was compiled with gcc v4.8.5 and accessed via MATLAB® using a MEX interface.

We solve a simple LASSO problem of the form
\mathrm{Minimize}_x\ \textstyle\frac{1}{2} \|Ax-b\|^2 + \mu \|x\|_1,
with x\in\mathbb{R}^n, A\in\mathbb{R}^{m\times n}.
Here n=10000, m=2000.
We define the problem data as follows:
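The snippet below is a sketch of how such data might be generated; the random seed, the scaling of the ground truth and the noise level are assumptions, not the exact benchmark setup.

```matlab
rng(1);                           % fix the seed for reproducibility (assumption)
n = 10000; m = 2000;
density = 0.1;                    % density of the sparse matrix A
rcA = 0.1;                        % reciprocal condition number of A
A = sprandn(m, n, density, rcA);  % sparse m-by-n data matrix
x_true = (rand(n,1) > 0.5) .* randn(n,1) / sqrt(n);  % sparse ground truth
b = A * x_true + 0.1 * randn(m,1);                   % noisy measurements
mu = 1;                           % regularization weight
```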
Note that A is a sparse matrix with density 10\% and reciprocal condition number 0.1.
We then formulate the problem in CVX with accuracy \epsilon=10^{-4}.
We solve it using SuperSCS with restarted Broyden directions, with memory 100 and k_0=0:
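A sketch of the CVX formulation follows. The settings do_super_scs, direction, memory and k0 are SuperSCS options passed through cvx_solver_settings, with direction equal to 100 selecting restarted Broyden directions; treat the exact option names as assumptions that may vary across SuperSCS versions.

```matlab
tic;
cvx_begin
    cvx_solver scs
    cvx_solver_settings('eps', 1e-4, ...  % accuracy
        'do_super_scs', 1, ...            % enable SuperSCS
        'direction', 100, ...             % restarted Broyden directions
        'memory', 100, ...                % memory of the direction
        'k0', 0)                          % disable K0 steps
    variable x(n)
    minimize(0.5 * sum_square(A*x - b) + mu * norm(x, 1))
cvx_end
toc
```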
Notice that we may pass any parameters to SCS using CVX's cvx_solver_settings.
SuperSCS with the above options terminates after 169 iterations and 85.1 s.
Instead, SCS requires 3243 iterations, which correspond to 802 s.
SuperSCS exhibits a speed-up of \times 9.4.
All other parameters in cvx_solver_settings are set to their default values.

Here we solve the problem
\begin{eqnarray*} &&\mathrm{Minimize}_{Z}\ \|Z-P\|_{\mathrm{fro}}\\ &&Z \succeq 0\\ &&Z: \mathrm{Hermitian},\ \mathrm{Toeplitz} \end{eqnarray*}
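A plausible CVX formulation is sketched below; the dimension of Z and the random complex matrix P are assumptions.

```matlab
n = 800;
P = randn(n) + 1i * randn(n);    % random complex matrix to be approximated

cvx_begin sdp
    cvx_solver scs
    cvx_solver_settings('eps', 1e-4, 'do_super_scs', 1, ...
        'direction', 100, 'memory', 100)
    variable Z(n,n) hermitian toeplitz
    minimize(norm(Z - P, 'fro'))
    Z >= 0                       % positive semidefiniteness (sdp mode)
cvx_end
```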
SuperSCS solves this problem in 218 s (103 iterations), whereas SCS requires 500 s (392 iterations) to terminate.
Another interesting SDP problem is that of determining a symmetric positive definite matrix P which solves
\begin{eqnarray*} &&\mathrm{Minimize}_{P}\ \mathrm{trace}(P)\\ &&P=P'\\ &&P \succeq I\\ &&A'P + PA \preceq -I \end{eqnarray*}
For this example we chose a (stable) matrix A with eigenvalues logarithmically spaced between -10^{-1} and -10^{1}.
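The sketch below constructs such a matrix by rotating a diagonal matrix with log-spaced negative eigenvalues; the dimension and the random orthogonal transform are assumptions.

```matlab
n = 100;
A = -diag(logspace(-1, 1, n));   % eigenvalues log-spaced in [-10, -0.1]
U = orth(randn(n));              % random orthogonal matrix
A = U' * A * U;                  % similarity transform preserves eigenvalues

cvx_begin sdp
    cvx_solver scs
    cvx_solver_settings('eps', 1e-3, 'do_super_scs', 1, 'memory', 100)
    variable P(n,n) symmetric
    minimize(trace(P))
    P >= eye(n)
    A' * P + P * A <= -eye(n)
cvx_end
```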
SuperSCS converges after 40.7 s (314 iterations), whereas SCS converges after 415 s (4748 iterations).
Here we solve the following \ell_1-regularized logistic regression problem:
\mathrm{Minimize}_w\ \lambda \|w\|_1 + \textstyle\sum_{i}\log(1+\exp(a_i' w + b))
We formulate the problem as follows:
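A sketch follows, using the log-sum-exp reformulation \log(1+\exp(z)) = \mathrm{log\_sum\_exp}([0; z]); the data dimensions and sparsity are assumptions, and the intercept b is omitted for brevity.

```matlab
p = 1000;                        % number of features
q = 10 * p;                      % number of samples
X = 3 * sprandn(p, q, 0.1);      % sparse data, one sample per column
lam = 2;                         % regularization weight

cvx_begin
    cvx_solver scs
    cvx_solver_settings('eps', 1e-3, 'do_super_scs', 1, 'memory', 50)
    variable w(p)
    % log(1+exp(z)) == log_sum_exp([0; z]), applied column-wise
    minimize(sum(log_sum_exp([sparse(1, q); w' * X])) + lam * norm(w, 1))
cvx_end
```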
For \epsilon=10^{-3}, SuperSCS terminates after 131 s (134 iterations), while SCS requires 329 s (675 iterations).
For \epsilon=10^{-4}, SuperSCS requires 179 s (183 iterations), whereas SCS still requires as much as 497 s (983 iterations).
In both cases, SuperSCS is about 2.5 times faster.
Consider the problem
\begin{eqnarray*} &&\mathrm{Minimize}_x\ \|x\|_p\\ &&Gx = f \end{eqnarray*}
This is the problem formulation in CVX:
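The sketch below assumes p = 1.5 and randomly generated sparse data.

```matlab
n = 2000;
m = ceil(n / 4);
G = sprandn(m, n, 0.1);          % underdetermined linear system
f = randn(m, 1);                 % right-hand side
pow = 1.5;                       % the p of the p-norm

cvx_begin
    cvx_solver scs
    cvx_solver_settings('eps', 1e-3, 'do_super_scs', 1, 'memory', 50)
    variable x(n)
    minimize(norm(x, pow))
    subject to
        G * x == f
cvx_end
```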
For \epsilon=10^{-3}, SuperSCS converged after 11.3 s (1355 iterations), while SCS required 39 s (4911 iterations).
For \epsilon=10^{-4}, SuperSCS terminated after 17 s (4909 iterations), while SCS failed to terminate within 10^4 iterations.
Here we solve a constrained problem of the form
\begin{eqnarray*} &&\mathrm{Minimize}_x\ \|Ax-b\|\\ &&\|Gx\| \leq 1 \end{eqnarray*}
with x\in\mathbb{R}^n, A\in\mathbb{R}^{m\times n} and G\in\mathbb{R}^{2n\times n}.
The problem data are:
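A sketch of how such data might be produced; the dimensions, sparsity and conditioning are assumptions.

```matlab
m = 3000;
n = ceil(m / 3);
A = sprandn(m, n, 0.5, 0.01);    % sparse, ill-conditioned A
b = 10 * randn(m, 1);
G = 2 * sprandn(2 * n, n, 0.1);  % constraint matrix
```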
We solve it to an accuracy of \epsilon=10^{-3}.
The problem is formulated in CVX as follows:
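A sketch, with solver options as in the previous examples:

```matlab
cvx_begin
    cvx_solver scs
    cvx_solver_settings('eps', 1e-3, 'do_super_scs', 1, ...
        'direction', 100, 'memory', 50)
    variable x(n)
    minimize(norm(A * x - b))
    subject to
        norm(G * x) <= 1
cvx_end
```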
SuperSCS converges after 37.1 s (116 iterations).
On the other hand, SCS did not converge after 242 s (5000 iterations).
Note that SuperSCS can attain much higher precision; for instance, it converges with \epsilon=10^{-4} in 41.2 s (131 iterations) and with \epsilon=10^{-6} after 58 s (194 iterations).
Let M be a given matrix whose entries M_{ij}, with i\in I and j\in J, are missing.
Here, we formulate the matrix completion problem as a nuclear norm minimization problem.
\begin{eqnarray*} &&\mathrm{Minimize}_{X}\ \|X-M\|_{\mathrm{fro}}^2 + \lambda \|X\|_{*}\\ &&X_{i',j'} = M_{i',j'},\ \forall i'\notin I,\ j'\notin J \end{eqnarray*}
Problem formulation:
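The snippet below is a sketch of plausible problem data; the dimensions, the fraction of missing entries and \lambda are assumptions.

```matlab
rng(1);
m = 200; n = 200;
M = sprandn(m, n, 0.4);          % ground-truth matrix
idx = randperm(m * n);
n_miss = ceil(0.8 * m * n);      % 80% of the entries are missing
idx_obs = idx(n_miss+1:end);     % linear indices of observed entries
lam = 0.5;                       % nuclear-norm weight
```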
Problem solution:
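In the sketch below the equality constraints are encoded by penalizing the misfit on the observed entries (an exact-equality variant would instead constrain X(idx_obs) == M(idx_obs)); norm_nuc is CVX's nuclear norm.

```matlab
cvx_begin
    cvx_solver scs
    cvx_solver_settings('eps', 1e-3, 'do_super_scs', 1, ...
        'direction', 100, 'memory', 50)
    variable X(m, n)
    minimize(sum_square(X(idx_obs) - M(idx_obs)) + lam * norm_nuc(X))
cvx_end
```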
SuperSCS (with Broyden directions and memory 50) converges in 22 s (128 iterations), whereas SCS takes 43 s (677 iterations).
Here we solve the following portfolio selection problem:
\begin{eqnarray*} &&\mathrm{Maximize}_z\ \mu'z - \gamma z'\Sigma z\\ &&1'z = 1\\ &&z \geq 0, \end{eqnarray*}
where z is the portfolio of assets, \mu is the vector of expected returns, \gamma is a risk-aversion parameter, and \Sigma=F'F+D is the asset-return covariance matrix, where F is the factor-loading matrix and D is a diagonal matrix of asset-specific risk.
The problem is generated as follows:
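A sketch of plausible problem data; all values are assumptions.

```matlab
n = 100000;                      % number of assets
m = 100;                         % number of factors
mu = exp(0.01 * randn(n, 1)) - 1;    % expected returns
D = rand(n, 1) / 10;             % asset-specific (diagonal) risk
F = sprandn(n, m, 0.1, 0.5) / 10;    % factor-loading matrix
gamma = 1;                       % risk-aversion parameter
```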
The problem is solved using CVX as follows:
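In the sketch below F is stored as an n-by-m factor-loading matrix, so that z'\Sigma z = \|F'z\|^2 + \sum_i D_i z_i^2; solver options as before.

```matlab
cvx_begin
    cvx_solver scs
    cvx_solver_settings('eps', 1e-4, 'do_super_scs', 1, ...
        'direction', 100, 'memory', 100)
    variable z(n)
    maximize(mu' * z - gamma * (sum_square(F' * z) + sum_square(sqrt(D) .* z)))
    subject to
        sum(z) == 1
        z >= 0
cvx_end
```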
The above problem is solved in 46.7 s (292 iterations) by SuperSCS with restarted Broyden directions and memory equal to 100.
The respective results for SCS are 157 s and 3050 iterations (approximately 3.3 times slower).
For a looser tolerance of \epsilon=10^{-3}, SuperSCS with memory 100 terminates after 23.5 s (162 iterations), while SCS converges after 43.9 s (787 iterations).
Here we solve the following sparse PCA problem (using an \ell_1-regularization).
The problem has the form
\begin{eqnarray*} &&\mathrm{Maximize}_X\ \mathrm{trace}(SX) - \lambda \|X\|_1\\ && \mathrm{trace}(X) = 1\\ && X = X' \succeq 0 \end{eqnarray*}
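A sketch of a possible CVX formulation; the construction of S and the value of \lambda are assumptions.

```matlab
d = 200;                         % dimension of S
S = cov(randn(10 * d, d));       % empirical covariance of synthetic data
lam = 2;                         % sparsity-promoting weight

cvx_begin sdp
    cvx_solver scs
    cvx_solver_settings('eps', 1e-3, 'do_super_scs', 1, ...
        'direction', 100, 'memory', 100)
    variable X(d, d) symmetric
    maximize(trace(S * X) - lam * norm(X(:), 1))
    trace(X) == 1
    X >= 0                       % positive semidefiniteness (sdp mode)
cvx_end
```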
- SuperSCS: 7.6 s, 95 iterations
- SCS: 65 s, 2291 iterations
- Speed-up: \times 8.6
Instead of using cvx_solver_settings to provide the SuperSCS configuration options, you may use scs_set_options, to which you may pass a SuperSCSConfig object constructed from the namesake factory class.
Here is an example:
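A minimal sketch, assuming scs_set_options accepts the SuperSCSConfig object directly:

```matlab
% Use a ready-made configuration instead of individual settings
solver_options = SuperSCSConfig.andersonConfig();
scs_set_options(solver_options);
```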
Ready-to-use sets of configuration options are the following:

- SuperSCSConfig.andersonConfig(): Anderson's acceleration with memory equal to 10
- SuperSCSConfig.broydenConfig(): restarted Broyden method with memory equal to 50
- SuperSCSConfig.fprDirectionConfig(): directions based on the fixed-point residual (FPR)
- SuperSCSConfig.douglasRachfordConfig(): simple Douglas-Rachford algorithm
- SuperSCSConfig.scsConfig(): original SCS implementation

You may use SuperSCSConfig and override some of the default parameters as follows:
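A minimal sketch, assuming the factory methods accept name-value overrides:

```matlab
% Start from Anderson's acceleration, but override its memory
solver_options = SuperSCSConfig.andersonConfig('memory', 12);
```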