# mini-batch-gradient-descent

Here are 60 public repositories matching this topic...

Gradient descent is an optimization technique for fitting machine learning models with differentiable loss functions. It works iteratively: at each step it computes the first-order derivative (the gradient) of the loss with respect to the parameters and adjusts the parameters a small step in the opposite direction (see the sketch below).

  • Updated Apr 26, 2024
  • Jupyter Notebook
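
As a quick illustration of the topic itself (not code from the repository above), here is a minimal NumPy sketch of mini-batch gradient descent on a least-squares problem; the synthetic data, learning rate, and batch size are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data: y = X @ w_true + noise
X = rng.normal(size=(1000, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=1000)

w = np.zeros(3)      # parameters to fit
lr = 0.1             # learning rate (illustrative)
batch_size = 32

for epoch in range(20):
    perm = rng.permutation(len(X))          # reshuffle every epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Gradient of the mean-squared-error loss on this mini-batch only
        grad = 2.0 / len(Xb) * Xb.T @ (Xb @ w - yb)
        w -= lr * grad                      # first-order parameter update

print(w)  # should be close to w_true
```

The batch size interpolates between the two extremes: `batch_size = len(X)` recovers full-batch gradient descent, while `batch_size = 1` is stochastic gradient descent.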

🐚 Abalone Age Prediction: Dive into Data, Surf on Insights! 📊 Unleash the power of predictive analytics on abalone age estimation! From meticulous data exploration to a head-to-head comparison of optimization methods, this repo is your gateway to accurate age predictions from physical measurements using PySpark. 🌊🔮

  • Updated Nov 14, 2023
  • Jupyter Notebook

This GitHub repository explores the importance of MLP components using the MNIST dataset, experimenting with techniques such as Dropout, Batch Normalization, and different optimization algorithms to improve MLP performance. Gain a deeper understanding of MLP components and learn to fine-tune them for optimal classification performance on MNIST (a sketch of such a model follows this entry).

  • Updated Jun 12, 2023
  • Jupyter Notebook
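
For context (this is not the repository's code), a minimal Keras sketch of an MLP that combines Dropout and Batch Normalization and trains on MNIST with mini-batches might look like this; the layer sizes, dropout rate, and batch size are assumptions for illustration:

```python
import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256),
    tf.keras.layers.BatchNormalization(),  # normalize activations over each mini-batch
    tf.keras.layers.ReLU(),
    tf.keras.layers.Dropout(0.3),          # randomly drop 30% of units during training
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# batch_size=128 is what makes this mini-batch (rather than full-batch) training
model.fit(x_train, y_train, epochs=5, batch_size=128,
          validation_data=(x_test, y_test))
```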

I implemented a CNN to train and test a handwritten digit recognition system on the MNIST dataset (a minimal example follows this entry). I also read the paper “Backpropagation Applied to Handwritten Zip Code Recognition” (LeCun et al., 1989) for more detail, though my architecture does not mirror everything mentioned in the paper. I also carried out a few experiments such as adding…

  • Updated Jul 7, 2022
  • Jupyter Notebook
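
As a hedged illustration rather than this author's actual architecture, a small Keras CNN trained on MNIST with mini-batch SGD could be sketched like this; the filter counts, epoch count, and batch size are illustrative assumptions:

```python
import tensorflow as tf

# Load MNIST, add a channel dimension, and scale to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),   # 8 3x3 filters
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Plain SGD updated on mini-batches of 64 examples
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=64)
model.evaluate(x_test, y_test)
```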
