This toolbox offers 9 machine learning methods, including KNN, SVM, DA, and DT, that are simple and easy to implement.

Jx-MLT : A Machine Learning Toolbox for Classification

"Toward Talent Scientist: Sharing and Learning Together" --- Jingwei Too


Wheel

  • This toolbox contains 9 widely used machine learning algorithms

  • The A_Main file provides examples of how to use these methods on a benchmark dataset

  • The main goals of this toolbox are:

    • To share knowledge of machine learning
    • To help others with machine learning projects

Usage

The main function jml is used to perform the classification. You can switch the algorithm by simply changing 'da' to another abbreviation from the table below.

  • If you wish to use the discriminant analysis ( DA ) classifier, then you may write
ML = jml('da',feat,label,opts); 
  • If you want to use the Naive Bayes ( NB ) classifier, then you may write
ML = jml('nb',feat,label,opts); 

Input

  • feat : feature vector matrix ( Instance x Features )
  • label : label matrix ( Instance x 1 )
  • opts : parameter settings
    • tf : type of validation: 1 = hold-out, 2 = k-fold, 3 = leave-one-out
    • ho : ratio of testing data in hold-out validation
    • kfold : number of folds in k-fold cross-validation
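
The bundled iris.mat already provides feat and label in this shape. As a rough sketch of preparing your own inputs, here using MATLAB's built-in fisheriris data (variable names mirror the examples below):

% Build feat / label from MATLAB's built-in Fisher iris data
load fisheriris;            % gives meas (150x4) and species (150x1 cell array)
feat  = meas;               % Instance x Features matrix
label = grp2idx(species);   % map class names to numeric labels ( Instance x 1 )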

Output

  • ML : machine learning model ( it contains the following results )
    • acc : classification accuracy
    • con : confusion matrix
    • t : computational time (s)
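
A quick sketch of reading these results after a run:

% Inspect the trained model's results
fprintf('Accuracy : %.4f\n', ML.acc);
fprintf('Time (s) : %.4f\n', ML.t);
disp(ML.con);               % print the confusion matrix
% On R2018b or newer, the matrix can also be plotted:
% confusionchart(ML.con);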

How to choose the validation scheme?

There are three types of performance validation, listed below. If you do not select one of them, the algorithm automatically performs 10-fold cross-validation by default (see the sketch after this list).

  • Hold-out validation
opts.tf    = 1;
opts.ho    = 0.3;   % 30% data for testing 
  • K-fold cross-validation
opts.tf    = 2;
opts.kfold = 10;    % 10-fold cross-validation
  • Leave-one-out validation
opts.tf    = 3;
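
For example, when opts.tf is left unset, jml falls back to the 10-fold default. A minimal sketch:

% No opts.tf set: 10-fold cross-validation is used by default
opts   = struct();
opts.k = 3;                       % k-value in KNN
load iris.mat;
ML = jml('knn',feat,label,opts);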

Example 1 : K-nearest neighbor ( KNN ) with k-fold cross-validation

% Parameter settings
opts.tf    = 2;     
opts.kfold = 10;    
opts.k     = 3;     % k-value in KNN

% Load data
load iris.mat;

% Classification
ML = jml('knn',feat,label,opts);

% Accuracy
accuracy = ML.acc; 

% Confusion matrix
confmat  = ML.con;
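
Beyond the overall accuracy, per-class performance can be read from the confusion matrix. A brief sketch, assuming rows of ML.con correspond to true classes and columns to predictions (verify the orientation on your install):

% Per-class recall from the confusion matrix
recall = diag(ML.con) ./ sum(ML.con, 2);
disp(recall);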

Example 2 : Multi-class support vector machine ( MSVM ) with hold-out validation

% Parameter settings
opts.tf    = 1;     
opts.ho    = 0.3;       
opts.fun   = 'r';     % radial basis kernel function in SVM

% Load data
load iris.mat;

% Classification
ML = jml('msvm',feat,label,opts);

% Accuracy
accuracy = ML.acc; 

% Confusion matrix
confmat  = ML.con;

Example 3 : Decision Tree ( DT ) with leave-one-out validation

% Parameter settings
opts.tf     = 3;          
opts.nSplit = 50;    % number of splits in DT

% Load data
load iris.mat;

% Classification
ML = jml('dt',feat,label,opts);

% Accuracy
accuracy = ML.acc; 

% Confusion matrix
confmat  = ML.con;

Requirement

  • MATLAB 2014 or above
  • Statistics and Machine Learning Toolbox

List of available machine learning methods

  • Click on the name of an algorithm to check its parameters
  • Use opts to set the algorithm-specific parameters ( a comparison sketch follows the table )
No.  Abbreviation  Name                                Support
09   'gmm'         Gaussian Mixture Model              Multi-class
08   'knn'         K-nearest Neighbor                  Multi-class
07   'msvm'        Multi-class Support Vector Machine  Multi-class
06   'svm'         Support Vector Machine              Binary class
05   'dt'          Decision Tree                       Multi-class
04   'da'          Discriminant Analysis Classifier    Multi-class
03   'nb'          Naive Bayes                         Multi-class
02   'rf'          Random Forest                       Multi-class
01   'et'          Ensemble Tree                       Multi-class
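
Since every method shares the same jml interface, the abbreviations above can be looped over to compare classifiers on one dataset. A rough sketch ( 'svm' is skipped because iris has three classes and 'svm' only supports binary problems; unspecified parameters are assumed to take the toolbox defaults ):

% Compare the multi-class methods on the iris benchmark
methods = {'gmm','knn','msvm','dt','da','nb','rf','et'};
opts.tf     = 2;
opts.kfold  = 10;
opts.k      = 3;     % used by 'knn'
opts.fun    = 'r';   % used by 'msvm'
opts.nSplit = 50;    % used by 'dt'
load iris.mat;
for i = 1:numel(methods)
  ML = jml(methods{i},feat,label,opts);
  fprintf('%-5s : accuracy = %.4f, time = %.2f s\n', methods{i}, ML.acc, ML.t);
end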