Generic L-layer 'straight in Python' fully connected Neural Network implementation using numpy.
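Since the topic here is the tanh activation, a minimal sketch of what such a "straight in Python" L-layer fully connected forward pass with tanh might look like in NumPy (function and parameter names are my own, not taken from any of the listed repos):

```python
import numpy as np

def init_params(layer_dims, seed=0):
    # layer_dims, e.g. [4, 8, 3]: input size, hidden size(s), output size
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        # Small random weights and zero biases per layer
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * 0.01
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

def forward(X, params):
    # X has shape (features, examples); tanh is applied at every layer,
    # squashing each activation into (-1, 1)
    A = X
    L = len(params) // 2
    for l in range(1, L + 1):
        A = np.tanh(params[f"W{l}"] @ A + params[f"b{l}"])
    return A
```

A real implementation would add a task-appropriate output activation (e.g. softmax) and a backward pass; this only illustrates the layer loop.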
Deep Learning model for predicting success after donation coded in Google Colab
Some AI projects implemented from scratch, without explicit use of built-in libraries, added to this repo.
Artificial Neural Networks Activation Functions
GAAF implementation on Keras
Feed-forward neural network to classify FB post likes into low, moderate, or high classes; backpropagation is implemented with a decaying learning rate.
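A decaying learning rate, as mentioned above, typically shrinks the step size as training progresses. One common form is inverse time decay; a minimal sketch (the function name and formula choice are my own, not necessarily what that repo uses):

```python
def decayed_lr(lr0, decay_rate, epoch):
    # Inverse time decay: lr = lr0 / (1 + decay_rate * epoch)
    # Epoch 0 returns the initial rate; later epochs shrink it smoothly.
    return lr0 / (1.0 + decay_rate * epoch)
```

The update rule in backpropagation then becomes `w -= decayed_lr(lr0, decay_rate, epoch) * grad`, so early epochs take large steps and later epochs fine-tune.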
Neural network implementation from scratch, with analysis of different activation functions and of variations in hidden layer size and depth.
Time series forecast using RNN and LSTM
Second project of the 'Machine Learning' course of the SMARTNET programme, taken at the National and Kapodistrian University of Athens.
This repo is created for learning about computer vision and pattern recognition
A data classification using MLP
Neural-Net-Numpy (NNN) is a simple Python package for training neural networks using only NumPy components.
Exploration of teamwork in neural networks
The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks.
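As an illustration, the activation functions that come up repeatedly in these repos (tanh, sigmoid, ReLU) and the tanh derivative used in backpropagation can be written in a few lines of NumPy (a generic sketch, not code from any listed repository):

```python
import numpy as np

def sigmoid(z):
    # Logistic function: squashes inputs to (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Rectified linear unit: zero for negative inputs, identity otherwise
    return np.maximum(0.0, z)

def tanh_derivative(z):
    # d/dz tanh(z) = 1 - tanh(z)^2, the term needed in backpropagation
    return 1.0 - np.tanh(z) ** 2
```

Note that tanh is zero-centered while sigmoid is not, which is one common reason tanh is preferred in hidden layers of small fully connected networks.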
Comparison of common activation functions on MNIST dataset using PyTorch.
Simple self-written ANN powered by NumPy to classify handwritten digits of the famous MNIST Dataset. ✍️