Step 1: Load the data. Reshape the data as follows (a minimal loading sketch appears after this list):
X is the matrix of input data with dimensions N-by-p, where N is the number of instances and p is the number of features. For ease of visualization, we set p = 2 here;
Y is the column vector of output data with dimensions N-by-1;
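A minimal sketch of this step, assuming the data sits in a hypothetical file named data.csv whose first two columns are the features and whose last column is the label (adapt the file name and column layout to your own source):

```matlab
% Minimal loading sketch -- 'data.csv' and its column layout are assumptions.
raw = readmatrix('data.csv');   % N-by-3 numeric matrix: 2 features + 1 label

X = raw(:, 1:2);                % N-by-p input matrix, here p = 2
Y = raw(:, 3);                  % N-by-1 column vector of labels
Y = Y(:);                       % force a column vector in case labels came in as a row
```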
Step 2: Define the parameters in the define_parameters.m file (an illustrative sketch follows the list):
poly_con is the parameter for the Polynomial Kernel,
gamma is the parameter for the Gaussian Kernel,
kappa1 & kappa2 are the parameters for the Sigmoid Kernel,
precision is the numerical tolerance,
Cost is the penalty hyperparameter for the SVM.
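The exact contents of define_parameters.m are repository-specific; the sketch below only uses the parameter names listed above with placeholder values, so treat every value (and each comment about how a parameter is used) as an assumption:

```matlab
% define_parameters.m -- illustrative sketch; all values are placeholders.
poly_con  = 2;       % parameter of the Polynomial Kernel (e.g., its constant/degree term)
gamma     = 0.5;     % width parameter of the Gaussian (RBF) Kernel
kappa1    = 1;       % scale parameter of the Sigmoid Kernel
kappa2    = -1;      % offset parameter of the Sigmoid Kernel
precision = 1e-6;    % numerical tolerance used by the solver
Cost      = 10;      % penalty hyperparameter of the SVM
```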
Step 3: Fit the model with the SVM.m file. Choose the kernel you want and fit the model to your data.
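The real interface of SVM.m may differ from the hypothetical signature used here; this is only a sketch of how choosing a kernel and fitting might look:

```matlab
% Hypothetical call -- the actual argument list of SVM.m may differ.
kernel = 'gaussian';            % e.g. 'linear', 'polynomial', 'gaussian', 'sigmoid'
model  = SVM(X, Y, kernel);     % fit the chosen kernel SVM to (X, Y)
```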
Step 4: Visualize the 2D plot. If your data has exactly two features, you can visualize the result with the SVM_plot.m file.
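Similarly, a hypothetical call to SVM_plot.m (the actual argument list may differ) could look like:

```matlab
% Hypothetical call -- only meaningful when p = 2.
SVM_plot(model, X, Y);          % draw the data points, decision boundary, and margins
```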
Demo of binary classification with hard-margin models
Dataset
The MATLAB sample data set Fisher's 1936 iris data (fisheriris) consists of measurements of the sepal length, sepal width, petal length, and petal width for 150 iris specimens. There are 50 specimens from each of three species; a minimal data-preparation sketch for the demo follows the species list:
Setosa,
Versicolor,
Virginica.
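For a hard-margin binary demo, a linearly separable pair of classes can be extracted from fisheriris. The variable names below match the conventions of Step 1; the particular choice of classes and features (setosa vs. versicolor on the petal measurements) is an assumption, picked because it is linearly separable, which a hard margin requires:

```matlab
% Build a two-class, two-feature subset of fisheriris for the hard-margin demo.
% The chosen classes/features are an assumption, not necessarily the demo's exact setup.
load fisheriris                          % provides meas (150x4) and species (150x1 cell)

idx = strcmp(species, 'setosa') | strcmp(species, 'versicolor');
X   = meas(idx, 3:4);                    % petal length and petal width (p = 2)
Y   = double(strcmp(species(idx), 'setosa'));
Y(Y == 0) = -1;                          % labels in {-1, +1}, as an N-by-1 column
```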
SVM
Support Vector Machine (SVM) [Cortes & Vapnik, 1995] is a supervised learning model.
The following is a demo of SVM:
Transductive SVM (TSVM)
Transductive SVM (TSVM) [Joachims, 1999] is a semi-supervised learning model.
semisup-learn/methods/scikitTSVM.py: semi-supervised learning frameworks for Python that allow fitting scikit-learn classifiers to partially labeled data.