March 13, 2017

Regression Algorithms

Algorithms that learn a regression function from labeled data.

The following regression learning algorithms are divided into:

  • Kernel Methods: methods operating in the Reproducing Kernel Hilbert Space
  • Linear Methods: methods operating in the explicit primal space

 

Kernel Methods


EpsilonSvmRegression

Java class: EpsilonSvmRegression

Source code: EpsilonSvmRegression.java

Maven Project: kelp-core

JSON type: epsilonSvmRegression

Description: It implements the ε-Support Vector Regression (ε-SVR) learning algorithm, a regression method based on Support Vector Machines [Vapnik(1998)]. Through a kernel function it learns a linear regression function in the Reproducing Kernel Hilbert Space. It is a Java port of the LIBSVM library v3.17, originally written in C++ [Chang and Lin(2011)].

Parameters:

  • kernel: The kernel function
  • pReg: The ε of the ε-insensitive loss function of SVR
  • c: The regularization parameter
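
A minimal usage sketch follows. The import paths, the (kernel, label, c, pReg) constructor and the names used for the dataset file, the representation and the target label are illustrative assumptions; refer to the EpsilonSvmRegression Javadoc for the exact API.

import it.uniroma2.sag.kelp.data.dataset.SimpleDataset;
import it.uniroma2.sag.kelp.data.label.Label;
import it.uniroma2.sag.kelp.data.label.StringLabel;
import it.uniroma2.sag.kelp.kernel.Kernel;
import it.uniroma2.sag.kelp.kernel.vector.LinearKernel;
import it.uniroma2.sag.kelp.learningalgorithm.regression.libsvm.EpsilonSvmRegression;
import it.uniroma2.sag.kelp.predictionfunction.PredictionFunction;

public class EpsilonSvmRegressionSketch {

    public static void main(String[] args) throws Exception {
        // Load a training set in the KeLP dataset format (file name is illustrative)
        SimpleDataset trainingSet = new SimpleDataset();
        trainingSet.populate("train.klp");

        // Target property to regress and a linear kernel over the vector
        // representation named "features" (both names are illustrative)
        Label targetLabel = new StringLabel("value");
        Kernel kernel = new LinearKernel("features");

        // Instantiate the learner with the parameters described above;
        // the (kernel, label, c, pReg) constructor is assumed
        float c = 1.0f;
        float pReg = 0.1f;
        EpsilonSvmRegression learner = new EpsilonSvmRegression(kernel, targetLabel, c, pReg);

        // Learn the regression function from the labeled examples
        learner.learn(trainingSet);
        PredictionFunction regressor = learner.getPredictionFunction();
    }
}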

KernelizedPassiveAggressiveRegression

Java class: KernelizedPassiveAggressiveRegression

Source code: KernelizedPassiveAggressiveRegression.java

Maven Project: kelp-additional-algorithms

JSON type: kernelizedPA-R

Description: Online Passive-Aggressive Learning Algorithm for regression tasks (kernel-based version, proposed in [Crammer et al.(2006)]).

Parameters:

  • kernel: The kernel function
  • policy: The updating policy applied by the Passive Aggressive algorithm when a misprediction occurs
  • c: The aggressiveness parameter
  • eps: The accepted distance between the predicted and the real regression values
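
As an illustration, the learner construction in the previous sketch could be replaced as follows; the argument order (c, eps, policy, kernel, label) and the Policy enum values are assumptions derived from the parameters listed above, not verbatim API documentation.

// Sketch: constructor argument order and the Policy enum values are assumed
KernelizedPassiveAggressiveRegression paLearner =
        new KernelizedPassiveAggressiveRegression(
                1.0f,                               // c: aggressiveness
                0.1f,                               // eps: tolerated prediction error
                PassiveAggressive.Policy.PA_II,     // updating policy
                kernel,                             // kernel, as in the previous sketch
                targetLabel);                       // property to regress

paLearner.learn(trainingSet);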

 

Linear Methods


LibLinearRegression

Java class: LibLinearRegression

Source code: LibLinearRegression.java

Maven Project: kelp-additional-algorithms

JSON type: liblinearregression

Description: This class implements linear SVM regression trained using a coordinate descent algorithm [Fan et al.(2008)]. It operates in an explicit feature space (i.e., it does not rely on any kernel). This code has been adapted from the Java port of the original LIBLINEAR C++ sources.

Parameters:

  • p: The ε in the loss function of SVR (default 0.1)
  • c: The regularization parameter
  • representation: The identifier of the representation to be considered for the training step
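
A corresponding sketch for the linear case is reported below; it reuses the dataset and target label of the first example and assumes a (label, c, p, representation) constructor, which should be verified against the LibLinearRegression Javadoc.

// Sketch: purely linear learner, no kernel; it operates directly on the vector
// representation named "features". The constructor signature is assumed.
LibLinearRegression linearSvr =
        new LibLinearRegression(targetLabel,   // property to regress
                1.0f,                          // c: regularization parameter
                0.1f,                          // p: epsilon of the SVR loss
                "features");                   // representation identifier

linearSvr.learn(trainingSet);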

LinearPassiveAggressiveRegression

Java class: LinearPassiveAggressiveRegression

Source code: LinearPassiveAggressiveRegression.java

Maven Project: kelp-additional-algorithms

JSON type: linearPA-R

Description: Online Passive-Aggressive Learning Algorithm for regression tasks (linear version, proposed in [Crammer et al.(2006)]).

Parameters:

  • policy: The updating policy applied by the Passive Aggressive algorithm when a misprediction occurs
  • c: The aggressiveness parameter
  • eps: The accepted distance between the predicted and the real regression values
  • representation: The identifier of the representation to be considered for the training step
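
Analogously, a sketch of the linear Passive Aggressive variant follows; the (c, eps, policy, representation, label) constructor mirrors the kernelized sketch above and is likewise an assumption.

// Sketch: constructor argument order is assumed, mirroring the kernelized variant
LinearPassiveAggressiveRegression linearPa =
        new LinearPassiveAggressiveRegression(
                1.0f,                               // c: aggressiveness
                0.1f,                               // eps: tolerated prediction error
                PassiveAggressive.Policy.PA_II,     // updating policy
                "features",                         // representation identifier
                targetLabel);                       // property to regress

linearPa.learn(trainingSet);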

 


References

Koby Crammer, Ofer Dekel, Joseph Keshet, Shai Shalev-Shwartz, and Yoram Singer. On-line passive-aggressive algorithms. Journal of Machine Learning Research, 7:551–585, December 2006. ISSN 1532-4435.

Cho-Jui Hsieh, Kai-Wei Chang, Chih-Jen Lin, S. Sathiya Keerthi, and S. Sundararajan. A dual coordinate descent method for large-scale linear SVM. In Proceedings of the 25th International Conference on Machine Learning (ICML '08), pages 408–415, New York, NY, USA, 2008. ACM.

Rong-En Fan, Kai-Wei Chang, Cho-Jui Hsieh, Xiang-Rui Wang, and Chih-Jen Lin. LIBLINEAR: A library for large linear classification. Journal of Machine Learning Research, 9:1871–1874, 2008.
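
Chih-Chung Chang and Chih-Jen Lin. LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2(3):27:1–27:27, 2011.

Vladimir N. Vapnik. Statistical Learning Theory. Wiley-Interscience, New York, 1998.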