homemade-machine-learning

Implement some ML algorithms from scratch.

Content

  • Linear Regression
    • Model
    • Least Square Method
    • Test Your Model
  • Gradient with Linear Regression
    • Gradient Formula
    • Implement
    • Apply Model
  • Regression and Sigmoid
  • Cost function and Gradient
    • Cost function
    • Partial derivative of $J(\theta)$
    • Update the weights
    • Implement gradient descent function
  • Training model
  • Test model
    • Your model
    • Sklearn model
  • Why Need to Reduce Dimensionality?
  • Main Approaches for Dimensionality Reduction
    • Projection
    • Manifold Learning
  • Principal Component Analysis
    • Preserving the Variance
    • Principal Components
    • Projecting Down to d Dimensions
  • PCA in Sklearn
  • What is k-means clustering?
  • Pros and cons of k-means
  • K-means algorithm and implementation
  • K-means algorithm in sklearn
  • What is a Decision Tree?
  • Important Terminology related to Decision Trees
  • How do Decision Trees work?
  • ID3 Algorithm
    • Entropy
    • Information Gain
    • Split a node into branches
  • Model from scratch and predict
  • Decision Tree in sklearn and some notes
    • CART cost function for classification
    • CART cost function for regression
  • Pros and cons
  • SVMs - Gradient Descent
    • What are SVMs?
    • Primal Support Vector Machine
      • Distance between Two Parallel Lines
      • Optimal Hyperplane
    • Hard Margin
    • Soft Margin
    • Solve SVMs by Gradient Descent
      • Hard Margin by Gradient Descent
      • Soft Margin by Gradient Descent
  • SVMs - Lagrange Method
    • Review of the Primal Support Vector Machine
    • Dual Support Vector Machine
    • Hard - Soft Margin
      • Implementation
      • Sklearn
    • Kernel SVM
      • Implementation
      • Sklearn
    • Visualization
    • References
  • Gaussian Naive Bayes
    • Naive Bayes Rule
    • Gaussian Naive Bayes
    • Gaussian Naive Bayes Model from Scratch
    • Gaussian Naive Bayes Model in Sklearn
  • Multinomial Naive Bayes
    • Multinomial Naive Bayes
    • Multinomial Naive Bayes Model from Scratch
    • Multinomial Naive Bayes Model in Sklearn
    • Multinomial Naive Bayes for Out of Vocabulary
  • Bagging
  • What is a Random Forest?
  • Important Features of Random Forest
  • Important Hyperparameters
  • Pseudo-code
  • Implementation
  • Advantages and Disadvantages
  • References
  • Introduction
    • Definition of Gradient Boosting
    • Overview of ensemble learning
  • Theoretical foundations of Gradient Boosting
    • Introduction to decision trees
    • Mathematical formulation of Gradient Boosting
  • Implementation
  • Pros and cons
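
To give a flavor of the from-scratch style the contents above follow, here is a minimal sketch of the first entry, linear regression fitted by the least-squares method. The function names and NumPy usage are illustrative assumptions of mine, not code taken from the repository:

```python
import numpy as np

def fit_least_squares(X, y):
    """Closed-form least squares: theta = argmin ||X_b theta - y||^2,
    with a bias (intercept) column prepended to X."""
    X_b = np.c_[np.ones(len(X)), X]           # add intercept column of ones
    theta, *_ = np.linalg.lstsq(X_b, y, rcond=None)
    return theta

def predict(X, theta):
    """Apply the fitted weights to new inputs."""
    X_b = np.c_[np.ones(len(X)), X]
    return X_b @ theta

# Noise-free data from y = 2x + 1, so the fit recovers it exactly.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
theta = fit_least_squares(X, y)                # ~ [1.0, 2.0]
```

The same `fit`/`predict` split mirrors the sklearn API that the "Test Your Model" sections compare against.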

Other Algorithms

Besides the algorithms above, I have researched several others and will publish them as soon as possible:

  • SVD
  • LDA
  • DBSCAN
  • ...

Contact

If you have any problems, please contact me: