fedorovarthur/KerasAddons

Keras Add-on

Keras implementations of a variety of recent layers, losses, activations, and more from research papers.

The following are currently implemented:

  1. Neural Arithmetic Logic Unit (NALU) and Neural Accumulator (NAC), https://arxiv.org/pdf/1808.00508.pdf;
  2. Gaussian Error Linear Units (GELU), https://arxiv.org/pdf/1606.08415.pdf, extended with a reparametrization trick so that mu and sigma are learnable;
  3. Relational Loss, https://arxiv.org/pdf/1802.03145.pdf;
  4. Swish activation function, https://arxiv.org/pdf/1710.05941.pdf, provided in two variants: one with a constant beta and one with a learnable beta;
  5. Layer normalization, https://arxiv.org/pdf/1607.06450.pdf.
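The repository's own layers are not shown here; as an illustration of item 1, here is a NumPy sketch of the NAC/NALU forward pass from the paper. The function names (`nac_forward`, `nalu_forward`) and argument conventions are hypothetical, not the repository's API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nac_forward(x, w_hat, m_hat):
    # NAC: W = tanh(W_hat) * sigmoid(M_hat) biases each weight toward
    # {-1, 0, 1}, so the unit learns exact addition/subtraction.
    # x: (batch, in_dim); w_hat, m_hat: (out_dim, in_dim).
    W = np.tanh(w_hat) * sigmoid(m_hat)
    return x @ W.T

def nalu_forward(x, w_hat, m_hat, g_weight, eps=1e-7):
    # NALU gates between the additive NAC path and a multiplicative path
    # computed in log-space: m = exp(W @ log(|x| + eps)).
    W = np.tanh(w_hat) * sigmoid(m_hat)
    a = x @ W.T                                  # add/subtract path
    m = np.exp(np.log(np.abs(x) + eps) @ W.T)    # multiply/divide path
    g = sigmoid(x @ g_weight.T)                  # learned gate in (0, 1)
    return g * a + (1.0 - g) * m
```

In a Keras layer, `w_hat`, `m_hat`, and `g_weight` would be trainable weights created in `build`, with this arithmetic in `call`.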
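For item 2, the standard (non-reparametrized) GELU can be sketched with the tanh approximation given in the paper; the learnable mu/sigma variant in this repository would replace the fixed standard-normal CDF with a parametrized one. The function name is illustrative:

```python
import numpy as np

def gelu(x):
    # GELU(x) = x * Phi(x), approximated as in Hendrycks & Gimpel (2016):
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))
```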
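Item 4's constant-beta variant is a one-liner; in the learnable variant, beta would be a trainable scalar weight of the layer rather than a fixed argument as sketched here:

```python
import numpy as np

def swish(x, beta=1.0):
    # Swish(x) = x * sigmoid(beta * x); beta = 1 recovers SiLU.
    return x / (1.0 + np.exp(-beta * x))
```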
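Item 5 normalizes each sample over its feature axis (unlike batch normalization, which normalizes over the batch), then applies a learned scale `gamma` and shift `beta`. A minimal NumPy sketch, with illustrative names:

```python
import numpy as np

def layer_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Per-sample statistics over the last (feature) axis.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta
```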
