February 13, 2017

Hello (linear) Learning!

Let’s start with a very simple classification example based on a linear version of the Passive Aggressive algorithm (LinearPassiveAggressiveClassification). The full code of this example can be found in the GitHub repository kelp-full, in particular in the source file HelloLearning.java.

The dataset used here is the same as the one from the svmlight page; each example has only been modified to be readable by KeLP. In fact, each row in KeLP must indicate what kind of vectors you are using, Sparse or Dense. The svmlight dataset contains sparse vectors, so if you open the train.dat and test.dat files you will notice that each vector is enclosed between BeginVector (|BV|) and EndVector (|EV|) tags.
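For instance, a sparse example row might look like the following (the class label comes first, then the vector between the two tags; the feature indices and values here are purely illustrative):

```
+1 |BV| 1:0.43 3:0.12 9845:0.2 |EV|
```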

The classification task consists of classifying an example with respect to the “+1” and “-1” classes. The dataset is thus composed of examples of these two classes:

  • Training set (2000 examples, 1000 of class “+1” (positive), and 1000 of class “-1” (negative))
  • Test set (600 examples, 300 of class “+1” (positive), and 300 of class “-1” (negative))

Let’s start writing some Java code.

First of all, we need to load the dataset into memory and define the positive class of the classification problem.
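A minimal sketch of this step, using KeLP’s SimpleDataset and StringLabel classes (the train.dat and test.dat file paths are assumptions based on the dataset description above):

```java
import it.uniroma2.sag.kelp.data.dataset.SimpleDataset;
import it.uniroma2.sag.kelp.data.label.StringLabel;

// Load the training and the test set from file
// (populate reads the KeLP format described above)
SimpleDataset trainingSet = new SimpleDataset();
trainingSet.populate("train.dat");
SimpleDataset testSet = new SimpleDataset();
testSet.populate("test.dat");

// Define the positive class of the binary classification problem
StringLabel positiveClass = new StringLabel("+1");
```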

If you want, you can print some statistics about the dataset through some useful built-in methods.
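For example, something along these lines, assuming the counting methods exposed by KeLP’s SimpleDataset:

```java
// Print how many examples the training set contains in total,
// and how many belong to the positive and the negative class
System.out.println("Examples: " + trainingSet.getNumberOfExamples());
System.out.println("Positive: " + trainingSet.getNumberOfPositiveExamples(positiveClass));
System.out.println("Negative: " + trainingSet.getNumberOfNegativeExamples(positiveClass));
```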

Then, instantiate a new Passive Aggressive algorithm and set some parameters on it.
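A sketch of this step; the aggressiveness value and the representation name ("0") are assumptions here, not values prescribed by the tutorial:

```java
import it.uniroma2.sag.kelp.learningalgorithm.classification.passiveaggressive.LinearPassiveAggressiveClassification;

LinearPassiveAggressiveClassification learner =
        new LinearPassiveAggressiveClassification();
// Aggressiveness parameter of the Passive Aggressive algorithm
// (illustrative value)
learner.setC(1f);
// Name of the vector representation to learn on
// (assumed default name of the unnamed representation in the dataset)
learner.setRepresentation("0");
// The class the algorithm has to learn
learner.setLabel(positiveClass);
```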

Learn a model on the training set, obtaining a Classifier.
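In code, this step can be sketched as follows (assuming the learning algorithm exposes its learned prediction function as a Classifier):

```java
import it.uniroma2.sag.kelp.predictionfunction.classifier.Classifier;

// Learn on the training set; the resulting prediction function
// is the classifier we will apply to the test set
learner.learn(trainingSet);
Classifier classifier = learner.getPredictionFunction();
```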

Finally, we classify each example in the test set and compute some performance measures.
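A sketch of the evaluation loop, assuming KeLP’s BinaryClassificationEvaluator is used to accumulate the predictions:

```java
import it.uniroma2.sag.kelp.data.example.Example;
import it.uniroma2.sag.kelp.predictionfunction.classifier.ClassificationOutput;
import it.uniroma2.sag.kelp.utils.evaluation.BinaryClassificationEvaluator;

// Evaluator for a binary task with respect to the positive class
BinaryClassificationEvaluator evaluator =
        new BinaryClassificationEvaluator(positiveClass);

// Classify every test example and count the outcome
for (Example e : testSet.getExamples()) {
    ClassificationOutput prediction = classifier.predict(e);
    evaluator.addCount(e, prediction);
}

System.out.println("Accuracy: " + evaluator.getAccuracy());
```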

At the end of the training, the program in the HelloLearning.java file will output an accuracy of 97.16%.