
Multi-layer Neural Network

Implementing a multi-layer neural network WITHOUT using external deep learning libraries such as Keras, Caffe, Theano, TensorFlow, or PyTorch.

  • Considering a neural network as shown in Figure 1

    • The width of layer 1 is 2, and the width of layer 2 is 1.
    • The activation function of layer 1 is the hyperbolic tangent.
    • The activation function of layer 2 is the sigmoid.
    • The loss function is the binary cross-entropy.
  • Calculating the forward pass and the gradients of the loss by hand (derivation figure; a sketch of these equations follows this list)

  • Implementing the model without using any deep learning libraries (a minimal numpy sketch is shown at the end of this section)

  • Using numpy

    • X = np.array([[0,0],[0,1],[1,0],[1,1]])
    • y = np.array([0,1,1,0])
  • Optimized using the gradient descent method
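
For reference, the forward pass and loss for the architecture described above take the standard form below (the notation $W^{(1)}, \mathbf{b}^{(1)}, W^{(2)}, b^{(2)}$ is ours, not taken from the repository):

```math
\begin{aligned}
\mathbf{a}^{(1)} &= \tanh\!\bigl(W^{(1)}\mathbf{x} + \mathbf{b}^{(1)}\bigr),\\
\hat{y} &= \sigma\!\bigl(W^{(2)}\mathbf{a}^{(1)} + b^{(2)}\bigr),\qquad
\sigma(z) = \frac{1}{1 + e^{-z}},\\
\mathcal{L}(y,\hat{y}) &= -\bigl[\,y\log\hat{y} + (1-y)\log(1-\hat{y})\,\bigr].
\end{aligned}
```

For gradient descent, the sigmoid output paired with binary cross-entropy gives the convenient identity $\partial\mathcal{L}/\partial z^{(2)} = \hat{y} - y$, and the tanh layer back-propagates through the factor $1 - \bigl(\mathbf{a}^{(1)}\bigr)^2$.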

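A minimal end-to-end sketch of the implementation, assuming the architecture above (the learning rate, weight initialization, and epoch count here are illustrative choices, not values from the repository):

```python
import numpy as np

# XOR inputs and targets, as given in the README
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float).reshape(-1, 1)

rng = np.random.default_rng(0)

# Layer 1: width 2, tanh activation. Layer 2: width 1, sigmoid activation.
W1 = rng.normal(size=(2, 2))
b1 = np.zeros((1, 2))
W2 = rng.normal(size=(2, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate (illustrative choice)
for epoch in range(10000):
    # Forward pass
    a1 = np.tanh(X @ W1 + b1)        # (4, 2) hidden activations
    y_hat = sigmoid(a1 @ W2 + b2)    # (4, 1) predictions

    # Binary cross-entropy, averaged over the four samples
    eps = 1e-12
    loss = -np.mean(y * np.log(y_hat + eps) + (1 - y) * np.log(1 - y_hat + eps))
    if epoch % 2000 == 0:
        print(f"epoch {epoch}: loss {loss:.4f}")

    # Backward pass (plain chain rule, no autograd)
    dz2 = (y_hat - y) / len(X)           # dL/dz2 for sigmoid + BCE
    dW2 = a1.T @ dz2
    db2 = dz2.sum(axis=0, keepdims=True)
    dz1 = (dz2 @ W2.T) * (1 - a1 ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0, keepdims=True)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(y_hat.ravel(), 3))  # should approach [0, 1, 1, 0] once training succeeds
```

Because the sigmoid output is paired with binary cross-entropy, the output-layer error simplifies to `y_hat - y`, which keeps the backward pass short.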
