MultilayerPerceptron

Theano wrapper for creating a simple multi-layer perceptron neural network.

Usage

Using the MLP class is straightforward: the network must first be instantiated, and then its training method invoked, i.e.:

# Create the classifier
classifier = MLP(number_of_features, number_of_neurons_in_hidden_layer)
# Train with mini-batch gradient descent
classifier.minibatch_gradient_descent(features, labels, batch_size)

The constructor parameters are the number of neurons in the input layer and in the hidden layer. For training, besides the features and labels, the batch size must also be supplied. Other training parameters that can be set include the following (a sketch using them appears after the list):

  • the learning rate
  • the L2 regularization lambda value
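
The exact keyword names for these options are not given above, so learning_rate and l2_lambda in the following sketch are assumptions rather than the class's confirmed API; it simply extends the usage snippet with explicit hyperparameters:

# Continuation of the usage snippet above; the keyword names
# learning_rate and l2_lambda are assumptions, not confirmed API.
classifier.minibatch_gradient_descent(features, labels, batch_size,
                                      learning_rate=0.01, l2_lambda=0.001)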

Toy Example

A toy example is also provided that trains the multi-layer perceptron on a real binary classification task. The dataset used is a randomly selected subset of the "banknote authentication" dataset available at the UCI Machine Learning repository; the task is to distinguish genuine from forged banknote-like specimens using wavelet features extracted from images. The dataset is not normalized, so the example also includes code that performs z-score normalization on the train and test sets.
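
Z-score normalization rescales each feature to zero mean and unit variance using statistics computed on the training set only, so no information from the test set leaks into the transform. A minimal NumPy sketch, where X_train and X_test are illustrative names rather than variables taken from the repository:

import numpy as np

# Illustrative placeholder data; in the toy example these would be
# the banknote wavelet features split into train and test sets.
X_train = np.random.rand(100, 4)
X_test = np.random.rand(30, 4)

# Per-feature statistics are computed on the training set only.
mean = X_train.mean(axis=0)
std = X_train.std(axis=0)

# The same transform, z = (x - mean) / std, is applied to both splits.
X_train_norm = (X_train - mean) / std
X_test_norm = (X_test - mean) / std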
