Plato-GPT-2

A Transformer implementation written from scratch in PyTorch, optimized for the CUDA runtime and designed to integrate seamlessly with Azure ML workspaces.
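The README does not show the model code itself, so the following is only a minimal sketch of the kind of decoder-only transformer the description suggests. The hyperparameters (n_embd, n_head, n_layer, vocab_size, block_size) are placeholders and do not reproduce this repository's actual configuration or its 85.04M parameter count; the device fallback at the end mirrors the "optimized for CUDA runtime" note.

```python
import torch
import torch.nn as nn

# Placeholder hyperparameters for illustration only; not taken from this repository.
n_embd, n_head, n_layer, vocab_size, block_size = 512, 8, 8, 32000, 256

class Block(nn.Module):
    """One pre-norm transformer block: causal self-attention followed by an MLP."""
    def __init__(self):
        super().__init__()
        self.ln1 = nn.LayerNorm(n_embd)
        self.attn = nn.MultiheadAttention(n_embd, n_head, batch_first=True)
        self.ln2 = nn.LayerNorm(n_embd)
        self.mlp = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd),
            nn.GELU(),
            nn.Linear(4 * n_embd, n_embd),
        )

    def forward(self, x):
        T = x.size(1)
        # Boolean causal mask: True marks positions a token may not attend to.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device), diagonal=1)
        h = self.ln1(x)
        x = x + self.attn(h, h, h, attn_mask=mask, need_weights=False)[0]
        x = x + self.mlp(self.ln2(x))
        return x

class TinyGPT(nn.Module):
    """Token + positional embeddings, a stack of blocks, and a language-model head."""
    def __init__(self):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, n_embd)
        self.pos_emb = nn.Embedding(block_size, n_embd)
        self.blocks = nn.Sequential(*[Block() for _ in range(n_layer)])
        self.ln_f = nn.LayerNorm(n_embd)
        self.head = nn.Linear(n_embd, vocab_size, bias=False)

    def forward(self, idx):
        pos = torch.arange(idx.size(1), device=idx.device)
        x = self.tok_emb(idx) + self.pos_emb(pos)
        return self.head(self.ln_f(self.blocks(x)))

# Run on the CUDA runtime when available, otherwise fall back to CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = TinyGPT().to(device)
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.2f}M parameters")

# Quick smoke test on a random batch of token ids.
idx = torch.randint(0, vocab_size, (2, block_size), device=device)
logits = model(idx)          # shape: (2, block_size, vocab_size)
```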

Modularized version, currently clocked at 85.04M parameters, with an automatic training pipeline configured for Azure ML.
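The automatic Azure ML training pipeline is not detailed in this README, so the snippet below is only a rough illustration of submitting such a training run with the Azure ML Python SDK v2. The source folder, entry script, compute target, and environment name are all assumed placeholders, not names from this repository.

```python
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

# Connect to the workspace described by a local config.json (workspace/resource group/subscription).
ml_client = MLClient.from_config(credential=DefaultAzureCredential())

# All names below are hypothetical placeholders for illustration.
job = command(
    code="./src",                              # assumed folder containing the training code
    command="python train.py",                 # assumed entry point
    environment="plato-gpt-pytorch-env@latest",  # assumed registered PyTorch environment
    compute="gpu-cluster",                     # assumed GPU compute target
    display_name="plato-gpt-2-training",
)

# Submit the job and print a link to monitor it in Azure ML Studio.
returned_job = ml_client.jobs.create_or_update(job)
print(returned_job.studio_url)
```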
