As you saw, support vector machines are another classification tool to add to your repertoire. While computationally expensive, they can be powerful tools providing substantial performance gains in many instances.
Probably the most important information worth reviewing is the set of kernel functions you can apply, along with their hyperparameters:
- Radial Basis Function (RBF)
    - $\gamma$, which can be specified using `gamma` in scikit-learn
- Polynomial Kernel
    - $\gamma$, which can be specified using `gamma` in scikit-learn
    - $r$, which can be specified using `coef0` in scikit-learn
    - $d$, which can be specified using `degree` in scikit-learn
- Sigmoid Kernel
    - $\gamma$, which can be specified using `gamma` in scikit-learn
    - $r$, which can be specified using `coef0` in scikit-learn
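To make the mapping concrete, here is a minimal sketch showing how each kernel and its hyperparameters are passed to scikit-learn's `SVC`. The specific dataset and hyperparameter values are illustrative choices, not recommendations:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Each kernel exposes its parameters through SVC's keyword arguments:
kernels = {
    "rbf": SVC(kernel="rbf", gamma=0.1),                       # gamma only
    "poly": SVC(kernel="poly", gamma=0.1, coef0=1, degree=3),  # gamma, r, d
    "sigmoid": SVC(kernel="sigmoid", gamma=0.1, coef0=0.0),    # gamma, r
}

for name, clf in kernels.items():
    clf.fit(X_train, y_train)
    print(name, clf.score(X_test, y_test))
```

Note that `coef0` and `degree` are silently ignored by kernels that don't use them, so it's worth double-checking which parameters actually apply to the kernel you've chosen.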
Also recall that, in general, `C` is the parameter that balances standard accuracy metrics against the width of the margin (the distance from the decision boundary) when tuning classifiers.
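In practice, `C` is usually tuned with a cross-validated grid search rather than set by hand. A minimal sketch (the candidate values and synthetic dataset here are arbitrary assumptions for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, random_state=0)

# Smaller C -> more regularization (wider margin, more misclassifications
# tolerated); larger C -> fewer training errors but a narrower margin.
grid = GridSearchCV(SVC(kernel="rbf"),
                    param_grid={"C": [0.01, 0.1, 1, 10, 100]},
                    cv=5)
grid.fit(X, y)
print(grid.best_params_)
```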
While this section may appear brief, Support Vector Machines are a powerful algorithm that deserves attention, so make sure you investigate them properly. Moreover, learning to properly tune SVMs using kernels and an appropriate `C` value is critical.