I implemented the gradient descent algorithm for a perceptron designed for the NOR binary logic function, using it to update the perceptron's weights.

mahsawz/Perceptron-Algorithm

Perceptron-Algorithm

The Perceptron algorithm is a two-class (binary) classification machine learning algorithm.

It is a type of neural network model, perhaps the simplest type of neural network model.

It consists of a single node or neuron that takes a row of data as input and predicts a class label. This is achieved by calculating the weighted sum of the inputs plus a bias term (whose input is fixed at 1). This weighted sum is called the activation.

  • Activation = Weights * Inputs + Bias

If the activation is above 0.0, the model will output 1.0; otherwise, it will output 0.0.

  • Predict 1: If Activation > 0.0
  • Predict 0: If Activation <= 0.0
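The activation and prediction rule above can be sketched in a few lines of Python (the function and parameter names here are my own, not taken from the repository):

```python
def predict(inputs, weights, bias):
    """Predict a class label (0 or 1) for one row of data."""
    # Activation = Weights * Inputs + Bias
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Predict 1 if the activation is above 0.0, otherwise predict 0
    return 1 if activation > 0.0 else 0
```

For example, with weights `[0.5, 0.5]` and bias `-0.7`, the input `[1, 1]` gives an activation of 0.3 and is classified as 1, while `[0, 0]` gives -0.7 and is classified as 0.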

Given that the inputs are multiplied by model coefficients, as in linear regression and logistic regression, it is good practice to normalize or standardize the data before using the model.
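A small illustration of what standardization means here (this helper is my own sketch, not part of the repository): each feature column is rescaled to zero mean and unit variance.

```python
import numpy as np

def standardize(X):
    """Rescale each feature column to zero mean and unit variance."""
    X = np.asarray(X, dtype=float)
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    std[std == 0] = 1.0  # leave constant columns unchanged (avoid division by zero)
    return (X - mean) / std
```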

Here, the file "gradientdescentforperceptron_nor.py" implements the gradient descent algorithm for a perceptron designed for the NOR binary logic function, using it to update the perceptron's weights.
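As a rough sketch of what such a training loop might look like (the function name, learning rate, and epoch count below are my assumptions, not the repository's actual code), the perceptron update rule adjusts each weight by the prediction error scaled by the learning rate:

```python
def train_nor(epochs=20, lr=0.1):
    """Train a perceptron on the NOR truth table (a hypothetical sketch)."""
    # NOR outputs 1 only when both inputs are 0
    data = [((0, 0), 1), ((0, 1), 0), ((1, 0), 0), ((1, 1), 0)]
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            activation = w[0] * x[0] + w[1] * x[1] + b
            y_hat = 1 if activation > 0.0 else 0
            error = y - y_hat
            # Gradient-descent-style perceptron update
            w[0] += lr * error * x[0]
            w[1] += lr * error * x[1]
            b += lr * error
    return w, b
```

Because NOR is linearly separable, this loop converges to weights that classify all four input combinations correctly.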

Another file, "perceptron_algorithm.py", implements the Perceptron algorithm and runs it on the attached data.

The classification result:

The error rate per iteration:
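An "error rate per iteration" plot like the one referenced above would track the fraction of misclassified examples after each pass over the data. A minimal sketch of that measurement (my own helper, not the repository's code):

```python
def error_rate(data, w, b):
    """Fraction of (inputs, label) pairs the current weights misclassify."""
    wrong = 0
    for x, y in data:
        y_hat = 1 if w[0] * x[0] + w[1] * x[1] + b > 0.0 else 0
        wrong += (y_hat != y)
    return wrong / len(data)
```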
