mynet

My implementation of an MLP for self-education purposes using raw NumPy.

Layers implemented: Linear, ReLU activation, BatchNorm.
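For illustration, here is a minimal raw-NumPy sketch of what such layers might look like. The class and method names (`Linear`, `ReLU`, `BatchNorm`, `forward`, `backward`) are assumptions for this sketch and may not match the repository's actual API; BatchNorm shows only the training-mode forward pass for brevity.

```python
import numpy as np


class Linear:
    """Fully connected layer: y = x @ W + b."""
    def __init__(self, in_features, out_features):
        # He-style initialization; the repository may use a different scheme.
        self.W = np.random.randn(in_features, out_features) * np.sqrt(2.0 / in_features)
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x                      # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out   # gradient w.r.t. weights
        self.db = grad_out.sum(axis=0)  # gradient w.r.t. bias
        return grad_out @ self.W.T      # gradient w.r.t. the layer input


class ReLU:
    """Elementwise max(0, x) activation."""
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out):
        return grad_out * self.mask


class BatchNorm:
    """Batch normalization over the feature axis (training-mode forward only)."""
    def __init__(self, num_features, eps=1e-5):
        self.gamma = np.ones(num_features)   # learnable scale
        self.beta = np.zeros(num_features)   # learnable shift
        self.eps = eps

    def forward(self, x):
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        x_hat = (x - mean) / np.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta
```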

Optimization algorithm: Adam with mini-batch training.
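A minimal sketch of the Adam update and mini-batch iteration in raw NumPy. The function names (`adam_step`, `iterate_minibatches`) and their signatures are hypothetical and are not taken from the repository.

```python
import numpy as np


def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter array; returns (param, m, v)."""
    m = beta1 * m + (1 - beta1) * grad        # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # biased second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v


def iterate_minibatches(X, y, batch_size, rng):
    """Yield shuffled (inputs, targets) mini-batches for one epoch."""
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]
```

In a typical training loop, each parameter array keeps its own `m` and `v` buffers, and `t` is incremented once per `adam_step` call.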
