Custom Loss and Activation Functions

This repository contains custom implementations of common loss functions and activation functions used in machine learning. The implementations avoid external libraries such as PyTorch, which makes the underlying principles of each function explicit.

Implemented Loss Functions

  • L1 Loss (Mean Absolute Error): Computes the mean absolute error between predicted and target values.
  • L2 Loss (Mean Squared Error): Computes the mean squared error between predicted and target values.
  • Binary Cross-Entropy Loss: Computes the binary cross-entropy loss between predicted probabilities and target labels.
  • Categorical Cross-Entropy Loss: Computes the categorical cross-entropy loss between predicted probabilities and one-hot encoded target labels.
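
For reference, the snippet below is a minimal sketch of how these four losses can be written in pure Python (standard library only, consistent with the repository's no-external-libraries approach). The function names and list-based signatures are illustrative assumptions, not necessarily those used in the notebook.

```python
import math

def l1_loss(preds, targets):
    """Mean absolute error: mean of |y_hat - y| over the batch."""
    return sum(abs(p - t) for p, t in zip(preds, targets)) / len(preds)

def l2_loss(preds, targets):
    """Mean squared error: mean of (y_hat - y)^2 over the batch."""
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

def binary_cross_entropy(preds, targets, eps=1e-12):
    """BCE: -mean(y*log(p) + (1-y)*log(1-p)); eps guards against log(0)."""
    total = 0.0
    for p, t in zip(preds, targets):
        p = min(max(p, eps), 1.0 - eps)  # clamp probabilities away from 0 and 1
        total += t * math.log(p) + (1 - t) * math.log(1 - p)
    return -total / len(preds)

def categorical_cross_entropy(preds, targets, eps=1e-12):
    """CE: preds and targets are lists of per-class lists, targets one-hot
    encoded; returns -mean over samples of sum_k y_k * log(p_k)."""
    total = 0.0
    for p_row, t_row in zip(preds, targets):
        total += sum(t * math.log(max(p, eps)) for p, t in zip(p_row, t_row))
    return -total / len(preds)
```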

Implemented Activation Functions

  • ReLU (Rectified Linear Unit): Applies the rectified linear unit activation function element-wise.
  • Sigmoid Function: Computes the sigmoid activation function element-wise.
  • Hyperbolic Tangent (Tanh) Function: Computes the hyperbolic tangent activation function element-wise.
  • Softmax Function: Computes the softmax activation function across classes.
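
The sketch below shows one way to implement these activations over plain Python lists; again, the names and signatures are assumptions for illustration rather than the notebook's exact API.

```python
import math

def relu(xs):
    """ReLU applied element-wise: max(0, x)."""
    return [max(0.0, x) for x in xs]

def sigmoid(xs):
    """Sigmoid applied element-wise: 1 / (1 + exp(-x))."""
    return [1.0 / (1.0 + math.exp(-x)) for x in xs]

def tanh(xs):
    """Hyperbolic tangent applied element-wise."""
    return [math.tanh(x) for x in xs]

def softmax(xs):
    """Softmax across a list of class scores; the max is subtracted
    before exponentiation for numerical stability."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```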

Usage

The implementations are provided as a Jupyter notebook for easy experimentation and study.
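
As a quick usage example, assuming function names like those in the sketches above (the notebook's actual names may differ):

```python
preds   = [0.9, 0.2, 0.7]
targets = [1.0, 0.0, 1.0]

print(l1_loss(preds, targets))              # mean absolute error
print(binary_cross_entropy(preds, targets)) # BCE over the batch

print(softmax([2.0, 1.0, 0.1]))             # class probabilities summing to 1
```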
