Examples

In the following, we show how ForBES can be used to model and solve some well-known optimization problems.

Support vector machines

SVMs can be cast as the following convex optimization problem:

\[
\begin{aligned}
\text{minimize} \quad & \frac{\lambda}{2}\|x\|_2^2 + \sum_{i=1}^{m} \max\{0,\, 1 - b_i z_i\} \\
\text{subject to} \quad & Ax = z
\end{aligned}
\]

Therefore we have

$x_1 = x$, $f(x) = \tfrac{\lambda}{2}\|x\|_2^2$, $g(z) = \sum_{i=1}^{m}\max\{0,\, 1 - b_i z_i\}$, $A_1 = A$, $B = -I$

The function g is known as the hinge loss and is provided in the library (see Functions for more information). The problem can therefore be defined as:

f = squaredNorm(lambda);  % f(x) = (lambda/2)*||x||_2^2
g = hingeLoss(1, b);      % vector b contains the labels
out = forbes(f, g, [], [], {A, -1, zeros(m, 1)}); % constraint: A*x - z = 0, i.e., A*x = z
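As a quick end-to-end illustration, here is a minimal sketch that instantiates the problem above on synthetic data. It assumes ForBES is on the MATLAB path and uses only the library calls shown above; the sizes and the value of lambda are purely illustrative.

m = 200; n = 50;                 % illustrative problem size
A = randn(m, n);                 % data matrix, one example per row
b = sign(randn(m, 1));           % random labels in {-1, +1}
lambda = 0.1;                    % illustrative regularization weight
f = squaredNorm(lambda);         % f(x) = (lambda/2)*||x||_2^2
g = hingeLoss(1, b);             % hinge loss with the random labels
out = forbes(f, g, [], [], {A, -1, zeros(m, 1)});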

Sparse logistic regression

Consider the following problem:

\[
\text{minimize} \quad \frac{1}{m}\sum_{i=1}^{m} \log\bigl(1 + \exp(-b_i \langle a_i, x \rangle)\bigr) + r\|x\|_1
\]

The smooth term in this case is the logistic loss, and the nonsmooth term is the $\ell_1$ regularization. We then have

\[
f(x) = \frac{1}{m}\sum_{i=1}^{m} \log(1 + \exp(-x_i)), \qquad g(x) = r\|x\|_1 = r\sum_i |x_i|
\]
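It helps to spell out why the linear map $C = \mathrm{diag}(b)X$ used in the code below reproduces the smooth term of the objective (assuming, consistently with the notation above, that the rows of the design matrix $X$ are the feature vectors $a_i^T$):

\[
(Cx)_i = b_i \langle a_i, x \rangle
\qquad\Longrightarrow\qquad
f(Cx) = \frac{1}{m}\sum_{i=1}^{m} \log\bigl(1 + \exp(-b_i \langle a_i, x \rangle)\bigr),
\]

which is exactly the smooth part of the problem above.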

This problem can be defined using the functions in the library (see Functions for more information) as follows:

f = logLoss(1/m);        % f(x) = (1/m)*sum_i log(1+exp(-x_i))
g = l1Norm(r);           % g(x) = r*||x||_1
C = diag(sparse(b))*X;   % vector b contains the labels, X is the design matrix
out = forbes(f, g, [], C); % solves: minimize f(C*x) + g(x)
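Analogously, a minimal synthetic instance of this problem can be set up as follows. Again, this assumes ForBES is on the MATLAB path and uses only the library calls shown above; the sizes, sparsity density, and the weight r are purely illustrative.

m = 500; n = 1000;               % illustrative problem size
X = sprandn(m, n, 0.05);         % random sparse design matrix
b = sign(randn(m, 1));           % random labels in {-1, +1}
r = 0.01;                        % illustrative l1 regularization weight
f = logLoss(1/m);                % logistic loss, scaled by 1/m
g = l1Norm(r);                   % l1 regularization
C = diag(sparse(b))*X;           % compose f with x -> diag(b)*X*x
out = forbes(f, g, [], C);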