Support vector machines

Given a set of vectors \(\{a_i\}_{i=0}^{n-1} \subset \mathbb{R}^m\) and binary labels \(\{\eta_i\} \subset \{-1,+1\}\), a soft-margin Support Vector Machine solves the problem

\[\min_{w,\beta} \frac{1}{n} \sum_{i=0}^{n-1} h(1-\eta_i(w^H a_i - \beta)) + \gamma \| w \|_2^2,\]

where \(h(t) = \max(t,0)\) is the hinge loss, or unit ramp function, which is zero when \(t \le 0\) and equal to \(t\) otherwise; its purpose is to linearly penalize each point by the amount it lies on the wrong side of the margin of the chosen hyperplane separator.

The pair \((w,\beta)\) can be interpreted as an unnormalized description of the hyperplane which approximately separates the labeled data. In particular, the direction of \(w\) is the normal of the hyperplane, \(\beta/\|w\|_2\) is the offset of the plane from the origin in the direction of \(w\), and \(1/\|w\|_2\) is the half-margin of the separator, i.e., the minimum distance a correctly classified point can be from the separator without being penalized.


While sometimes functional, this implementation is still very much a work-in-progress.


Int SVM(const Matrix<Real> &G, const Matrix<Real> &q, Matrix<Real> &z, Real gamma, Real rho = 1, Int maxIter = 500, bool inv = true, bool progress = true)
Int SVM(const AbstractDistMatrix<Real> &G, const AbstractDistMatrix<Real> &q, AbstractDistMatrix<Real> &z, Real gamma, Real rho = 1, Int maxIter = 500, bool inv = true, bool progress = true)


ElError ElSVM_s(ElConstMatrix_s G, ElConstMatrix_s q, ElMatrix_s z, float gamma, ElInt* numIts)
ElError ElSVM_d(ElConstMatrix_d G, ElConstMatrix_d q, ElMatrix_d z, double gamma, ElInt* numIts)
ElError ElSVMDist_s(ElConstDistMatrix_s G, ElConstDistMatrix_s q, ElDistMatrix_s z, float gamma, ElInt* numIts)
ElError ElSVMDist_d(ElConstDistMatrix_d G, ElConstDistMatrix_d q, ElDistMatrix_d z, double gamma, ElInt* numIts)