The Java Applets allow you to construct toy pattern recognition and regression estimation problems. When you submit your problem, it is solved by the Support Vector Machine Learning Algorithm, and the solution is displayed in the Applet.

Pattern Recognition Applet

This Applet allows you to construct two-dimensional pattern recognition problems by placing example points of each class in the view window, which is a visualization of input space. The two classes are represented as blue and yellow dots respectively. Switching between the two classes is achieved by clicking the two radio buttons in the top left-hand corner of the Applet, labelled "POS" and "NEG".

When you have created the problem, click "Submit" to see the Support Vector solution.

You should see the space you were drawing in subdivided by (the projection of) a higher-dimensional hyperplane into two areas, blue and yellow. This shows the learnt function across the whole input space, not just at the example points.
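
The Applet runs its own SVM solver, but the same workflow can be sketched in a few lines of Python. The sketch below uses the scikit-learn library (an assumption made purely for illustration; it is not the Applet's implementation) with hypothetical point coordinates:

    import numpy as np
    from sklearn.svm import SVC

    # Hand-placed example points in the two-dimensional view window
    # (hypothetical coordinates standing in for clicks in the Applet).
    X = np.array([[1.0, 2.0], [2.0, 1.5], [1.5, 1.0],    # "POS" (blue) class
                  [4.0, 4.5], [5.0, 4.0], [4.5, 5.0]])   # "NEG" (yellow) class
    y = np.array([+1, +1, +1, -1, -1, -1])

    # "Submit": train the SVM (here with a Gaussian RBF kernel).
    clf = SVC(kernel="rbf", C=10.0)
    clf.fit(X, y)

    # The learnt function is defined over the whole input space, not just
    # at the example points: evaluate it on a grid to colour the window.
    xx, yy = np.meshgrid(np.linspace(0.0, 6.0, 200), np.linspace(0.0, 6.0, 200))
    grid = np.c_[xx.ravel(), yy.ravel()]
    regions = clf.predict(grid).reshape(xx.shape)    # +1 -> blue, -1 -> yellow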

There are two parameters you can set to control the learning process of the Support Vector Machine: The Kernel type and the Penalty on Error.

  1. The Kernel:

    This describes the set of decision functions. In this applet you can choose:
    1. Hyperplanes (Simple dot product)
    2. Polynomials (of degree d)
    3. Gaussian Radial Basis Functions (of width sigma)

    Both the Polynomial and Radial Basis Function kernels have an extra parameter which can be set by entering a value in the input box. For the polynomial kernel, the (integer) degree may be set. For the radial basis function kernel, the spread of the Gaussian may be set by entering the value of sigma.

  2. Penalty on Error:

    This describes how much an incorrectly classified example is punished -- if this parameter is set to a very high value, the SV machine will attempt to find a complex surface that separates the data perfectly, and if it is set to a low value, a simple surface will be found that may separate the data with many errors.
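
As a rough guide, here is how these settings might map onto the constructor arguments of scikit-learn's SVC (an assumed correspondence, not the Applet's own parameter names; note that scikit-learn parametrises the Gaussian by gamma = 1/(2*sigma^2) rather than by sigma directly):

    from sklearn.svm import SVC

    sigma = 1.0
    linear = SVC(kernel="linear", C=1.0)             # 1. simple dot product
    poly   = SVC(kernel="poly", degree=3, C=1.0)     # 2. polynomial of degree d
    rbf    = SVC(kernel="rbf",                       # 3. Gaussian of width sigma
                 gamma=1.0 / (2.0 * sigma**2), C=1.0)

    # Penalty on Error: a large C demands a surface that separates the
    # training data almost perfectly (possibly complex); a small C accepts
    # some training errors in exchange for a simpler surface.
    strict = SVC(kernel="rbf", C=1000.0)
    loose  = SVC(kernel="rbf", C=0.1)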

Regression Estimation Applet

This Applet allows you to construct one-dimensional regression estimation problems by placing example points in the view window. The window represents the equation y = f(x), where the horizontal axis represents the values of x and the vertical axis the values of y.

Placing an example specifies a value y observed at some point x; together the examples describe the relationship between y and x, potentially with noise.

When you have created the problem, click "Submit" to see the Support Vector solution.

You should see a plot of the regression function learnt by the Support Vector Machine.
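
As with the pattern recognition Applet, the workflow can be sketched with scikit-learn (again an illustrative assumption, with hypothetical data rather than the Applet's implementation):

    import numpy as np
    from sklearn.svm import SVR

    # Hand-placed (x, y) examples: noisy samples of some underlying y = f(x)
    # (hypothetical values standing in for clicks in the Applet).
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0]).reshape(-1, 1)
    y = np.sin(x).ravel() + np.array([0.10, -0.05, 0.08, -0.10, 0.05, -0.02])

    # "Submit": fit the SV regression. epsilon is the "Max. Deviation"
    # and C the "Penalty on Error" described below.
    reg = SVR(kernel="rbf", C=10.0, epsilon=0.1)
    reg.fit(x, y)

    # The plotted curve: the learnt regression function over the x axis.
    xs = np.linspace(0.0, 5.0, 200).reshape(-1, 1)
    fs = reg.predict(xs)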

There are three parameters you can set to control the learning process of the Support Vector Machine in this Applet: The Kernel type, the Penalty on Error, and "Max. Deviation". (The latter two may be known to you as the parameters C and epsilon).

  1. The Kernel:

    This describes the set of approximating functions. In this applet you can choose:

    1. Hyperplanes (Simple dot product)
    2. Polynomials (of degree d)
    3. Radial Basis Functions (of width sigma)
    4. Linear Splines with an infinite number of nodes.

  2. Penalty on Error:

    This describes how much an example that deviates from the estimated function by more than the allowed tolerance is punished -- the higher the value, the more it is punished. Very high values of C result in functions that approximate the data very well, but may be complex (non-smooth). Typically, the value of this parameter is set to a value that allows some outliers.

  3. Max. Deviation (width of zero-loss zone):

    The regression implemented here minimises the sum of those absolute deviations that are greater than a tolerance value, here termed the max. deviation. That is, if y is a data value and ŷ is the estimated value, then the loss is max(|y - ŷ| - e, 0), where e is the width of the "zero-loss zone". The SV optimization minimises the penalty on error times the sum of these losses, plus a measure of the complexity of the approximating function.
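
    This is commonly known as the epsilon-insensitive loss. For concreteness, here is a small sketch of the loss with hypothetical numbers (the helper name is made up for illustration):

    import numpy as np

    def zero_loss_zone(y, y_hat, e):
        # Zero inside the zone, linear in the deviation outside it.
        return np.maximum(np.abs(y - y_hat) - e, 0.0)

    y     = np.array([1.00, 2.00, 3.00])
    y_hat = np.array([1.05, 2.50, 2.00])
    print(zero_loss_zone(y, y_hat, e=0.1))   # -> [0.  0.4 0.9]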

25 June 2013