SuperSCS  1.3.2
Contributing to SuperSCS


Thank you for considering contributing to SuperSCS!

Please read the following guide before forking this repository or filing a new issue.

Style guide

C coding style

SuperSCS adheres to the following naming convention:

In particular, structures should be written like this:

struct scs_structure {
    /* fields go here */
};

typedef struct scs_structure ScsStructure;

API Documentation in C

All non-static functions in C must be documented using the Doxygen style.

/**
 * \brief Give a brief description here
 *
 * Give a more detailed description here
 *
 * @param data explain what this parameter is about
 * @param cone explanation...
 *
 * @return returns 0 if ... and 1 if ...
 */
scs_int scs_function(ScsData * data, ScsCone * cone);

Installation instructions, mathematical documentation and other non-API documentation should not be part of function/variable documentation.

Instead, you should contribute to an existing page in pages/.

Unit testing in C

Unit tests are supported and executed by a simple in-house framework.

C tests are stored in tests/c/ and supporting data files are found in tests/c/data/.

The main test runner file is tests/c/test_runner_dir.c.

Tests are added as follows:

r += scs_test(&test_problem_metadata, "Metadata");

The above line adds the test test_problem_metadata which has the name Metadata.

Test functions are of type typedef bool (*unitTest_t)(char**); for example:

bool test_broyden(char** str) {
    /* test logic goes here */
    return TEST_SUCCESS;
}

Unit tests contain assertions which are documented in unit_test_util.h.

For example:

bool test_example(char** str) {
    ASSERT_TRUE_OR_FAIL(x > 0, str, "variable `x` should be positive");
    ASSERT_EQUAL_INT_OR_FAIL(i, i_expected, str, "the value of `i` is wrong");
    ASSERT_EQUAL_FLOAT_OR_FAIL(t, t_expected, 1e-9, str, "the value of `t` is wrong");
    ASSERT_EQUAL_ARRAY_OR_FAIL(arr, arr_expected, 10, 1e-9, str, "array `arr` is not correct");
    return TEST_SUCCESS;
}

All test functions start with test_.

Comments in C

Single-line comments in C should look like this:

/* comment goes here */

Multi-line comments should look like this:

/*
 * this is a multiple-line comment
 * which continues into the next line
 */

Version Numbers

SuperSCS uses a three-part version number consisting of a major, a minor and a build number (e.g., in version 1.3.2 the major number is 1, the minor 3 and the build 2).

Version numbers are updated every time dev is merged into master.

Git

Using/creating branches

We use a simple collaboration model with two branches:

Experimental branches can be created, branching out of dev.

If you need to provide some new functionality, or solve an issue:

Creating an issue

Issues: code of conduct

Before creating a new issue, please make sure that the same or a very similar issue has not already been filed.

In general, you shouldn't file an issue to ask a question unless:

If you simply need to ask a question, use the project's Gitter chat instead.

Reporting an issue

You may report your issue using the project's issue tracker on GitHub.

In your issue report, please include the following information:

Alongside these, provide any additional information that will help us reproduce and resolve the issue.

If possible, write a test that reproduces the error.

Labels of issues

Labels:

Committing to github

The following commit guidelines were inspired by the guidelines of Atom...

Contributing to SuperSCS

In order to contribute and actively participate in the development of SuperSCS, you first need to fork the repository on GitHub:

This way you will obtain a copy of SuperSCS in your user account.

You will be able to work there and submit a pull request once you complete your changes.

Before merging into master

Before merging into master, use the following checklist:

After the pull request has been merged:

Copy this into your pull request!

Benchmarking

Additional benchmarks are always welcome.

The standard way to report benchmarking results in SuperSCS is the Dolan-Moré plot.

Benchmarking in MATLAB

Benchmarking scripts are found in tests/profiling_matlab.

The files are organised in three main subfolders:

Helpers

Profile helpers are functions of the following form:

out = profile_lasso(problem, solver_options);

These accept two arguments: a problem structure with the problem parameters and a solver_options structure with the solver configuration (an instance of SuperScsConfig).

Runners

Runners are saved in profile_runners/. Profile runners execute a diverse collection of problems by invoking profile helper functions.

Profile runners are functions of the form:

profile_runner_logreg(solver_options, id, runner_options);

The first argument is an instance of SuperSCSConfig, which the runner forwards to every profile helper it invokes.

The second argument is an (integer) identifier of the runner. The results will be stored in a .mat file at profile_results/{id}.mat.

These mat files are not put under version control.

The third argument is a structure with runner configuration parameters.

These are typically ranges of the problem parameters to be tested, the number of repetitions of each run, and more.

Profile runners store some general statistics in the CSV file register.csv, which can be used for look-up purposes.

Experimenters

Lastly, we have the experimenters. These are stored in experimenters/.

An experimenter is a MATLAB script that calls a profile runner with different solver parameters.

Plotting results

Once an experimenter has completed successfully, we may create performance plots using perf_profile_plot.

Benchmarking in Python

Docker

Making a Docker image

To build a new Docker image, run the following command from within the base directory of SuperSCS:

docker image build -t kulforbes/superscs:v{version-name} .
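For example, assuming the released version is 1.3.2 (the version number shown on this page), the {version-name} placeholder is substituted as follows; the echo merely previews the command, so remove it to actually run the build:

```shell
# Substitute {version-name} with the actual release number.
VERSION=1.3.2
TAG="kulforbes/superscs:v${VERSION}"
echo "docker image build -t ${TAG} ."
```

Keeping the tag in a variable makes it easy to reuse for a subsequent `docker run` with the same version.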

The Docker configuration is given in the Dockerfile.