A second example that we consider is again a classification task, but this time with a kernelized version of the Passive Aggressive algorithm (KernelizedPassiveAggressiveClassification). The full code of this example can be found in the GitHub repository kelp-full, in particular in the source file HelloKernelLearning.java.
The dataset used here is again the svmlight page dataset, in the KeLP format.
As in the previous example, the dataset consists of:
- Training set (2000 examples, 1000 of class “+1” (positive), and 1000 of class “-1” (negative))
- Test set (600 examples, 300 of class “+1” (positive), and 300 of class “-1” (negative))
Again, loading the dataset is simply a call to KeLP methods:
// Read a dataset into a trainingSet variable
SimpleDataset trainingSet = new SimpleDataset();
trainingSet.populate(trainingSetFilePath);
// Read a dataset into a test variable
SimpleDataset testSet = new SimpleDataset();
testSet.populate(testsetFilePath);
// define the positive class
StringLabel positiveClass = new StringLabel("+1");
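As a quick refresher on the underlying format (this is illustrative plain Java, not KeLP code; KeLP's dataset loading handles parsing internally), an svmlight-style line pairs a class label with sparse `index:value` features, and can be parsed roughly as follows:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SvmlightLineParser {
    // Extracts the label token from an svmlight-style line, e.g. "+1 1:0.5 3:1.2"
    static String parseLabel(String line) {
        return line.trim().split("\\s+")[0];
    }

    // Extracts the sparse features (index -> value) from the remaining tokens
    static Map<Integer, Double> parseFeatures(String line) {
        String[] tokens = line.trim().split("\\s+");
        Map<Integer, Double> features = new LinkedHashMap<>();
        for (int i = 1; i < tokens.length; i++) {
            String[] kv = tokens[i].split(":");
            features.put(Integer.parseInt(kv[0]), Double.parseDouble(kv[1]));
        }
        return features;
    }

    public static void main(String[] args) {
        String line = "+1 1:0.5 3:1.2";
        System.out.println(parseLabel(line));    // +1
        System.out.println(parseFeatures(line)); // {1=0.5, 3=1.2}
    }
}
```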
Using a kernel function within KeLP is very simple: it is sufficient to declare a kernel function, specify the representation on which it operates, and tell the algorithm that it must use that kernel function to compute similarity scores.
For example, to apply a Polynomial kernel on top of a linear kernel, it is sufficient to do the following:
// instantiate a passive aggressive algorithm
KernelizedPassiveAggressiveClassification kPA = new KernelizedPassiveAggressiveClassification();
// indicate to the learner what the positive class is
kPA.setLabel(positiveClass);
// set an aggressiveness parameter (the value here is just an example)
kPA.setC(2f);
// use the first (and, here, only) representation
Kernel linear = new LinearKernel("0");
// Normalize the linear kernel
NormalizationKernel normalizedKernel = new NormalizationKernel(linear);
// Apply a Polynomial kernel on the score (normalized) computed by the linear kernel
Kernel polyKernel = new PolynomialKernel(2f, normalizedKernel);
// tell the algorithm that the kernel to be used in learning is the polynomial kernel
kPA.setKernel(polyKernel);
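To make the composition concrete, the following self-contained sketch (plain Java, independent of KeLP) computes the same kind of score by hand: kernel normalization divides the raw kernel score by the geometric mean of the self-similarities, k(x,y)/sqrt(k(x,x)k(y,y)), and a degree-d polynomial kernel is then applied on top of the normalized score. The additive constant +1 in the polynomial form is an assumption of this sketch, not necessarily the library's default.

```java
public class KernelCompositionDemo {
    // linear kernel: plain dot product
    static double linear(double[] x, double[] y) {
        double s = 0;
        for (int i = 0; i < x.length; i++) s += x[i] * y[i];
        return s;
    }

    // kernel normalization: k(x,y) / sqrt(k(x,x) * k(y,y))
    static double normalized(double[] x, double[] y) {
        return linear(x, y) / Math.sqrt(linear(x, x) * linear(y, y));
    }

    // polynomial kernel of degree d applied to the normalized score,
    // using the common form (s + 1)^d (the +1 constant is an assumption)
    static double polyOnNormalized(double[] x, double[] y, int d) {
        return Math.pow(normalized(x, y) + 1.0, d);
    }

    public static void main(String[] args) {
        double[] a = {1, 0};
        double[] b = {1, 1};
        System.out.println(normalized(a, b));          // ~0.7071 (cosine of 45 degrees)
        System.out.println(polyOnNormalized(a, b, 2)); // ~2.9142, i.e. (1 + 0.7071)^2
    }
}
```

Note that normalizing before the polynomial keeps the base score in [-1, 1], so the polynomial expansion is not dominated by vectors with large norms.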
The rest of the Java code is very similar to that of the Hello (linear) Learning example.
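For readers curious about what the kernelized Passive Aggressive algorithm does internally, here is a minimal self-contained sketch (again plain Java, not KeLP's implementation). The model keeps a list of support vectors with weights alpha_i; the score of an example x is the weighted sum of kernel evaluations, and on each mistake (hinge loss > 0) the example is added with a weight capped by the aggressiveness parameter C (the PA-I update rule):

```java
import java.util.ArrayList;
import java.util.List;

public class KernelPASketch {
    final List<double[]> supportVectors = new ArrayList<>();
    final List<Double> alphas = new ArrayList<>();
    final double c; // aggressiveness: caps the magnitude of each update

    KernelPASketch(double c) { this.c = c; }

    // a linear kernel keeps the sketch simple; any kernel could be plugged in
    static double kernel(double[] x, double[] y) {
        double s = 0;
        for (int i = 0; i < x.length; i++) s += x[i] * y[i];
        return s;
    }

    // score(x) = sum_i alpha_i * k(sv_i, x)
    double score(double[] x) {
        double s = 0;
        for (int i = 0; i < supportVectors.size(); i++)
            s += alphas.get(i) * kernel(supportVectors.get(i), x);
        return s;
    }

    // PA-I update: if hinge loss > 0, store x as a support vector
    // with weight y * min(C, loss / k(x, x))
    void learn(double[] x, int y) {
        double loss = Math.max(0, 1 - y * score(x));
        if (loss > 0) {
            double tau = Math.min(c, loss / kernel(x, x));
            supportVectors.add(x);
            alphas.add(y * tau);
        }
    }

    public static void main(String[] args) {
        KernelPASketch pa = new KernelPASketch(1.0);
        pa.learn(new double[]{1, 0}, 1);   // positive example
        pa.learn(new double[]{0, 1}, -1);  // negative example
        System.out.println(pa.score(new double[]{1, 0}) > 0); // true
        System.out.println(pa.score(new double[]{0, 1}) < 0); // true
    }
}
```

This also shows why the kernelized variant can be memory-hungry compared to the linear one: every mistake adds a support vector that must be kept around for future kernel evaluations.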