# pre-training

Here are 143 public repositories matching this topic...

A comprehensive project on training and fine-tuning transformer models using PyTorch and the Hugging Face Transformers library. Aimed at enthusiasts and researchers, it offers an accessible yet deep dive into the practical aspects of working with transformers for NLP tasks.

  • Updated Mar 13, 2024
  • Jupyter Notebook
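
To give a flavor of what such notebooks cover, here is a minimal sketch of fine-tuning a pretrained transformer for a classification task with the Hugging Face `Trainer` API. The checkpoint (`distilbert-base-uncased`), the `imdb` dataset, and all hyperparameters are illustrative assumptions, not details taken from the repository itself.

```python
# Minimal fine-tuning sketch with Hugging Face Transformers (assumed settings).
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "distilbert-base-uncased"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Small public sentiment dataset, used here purely as an example.
dataset = load_dataset("imdb")

def tokenize(batch):
    # Tokenize and pad to a fixed length so batches can be stacked directly.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="finetune-out",
    num_train_epochs=1,
    per_device_train_batch_size=16,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    # Subsets keep the example quick to run; a real run would use the full splits.
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].select(range(500)),
)

trainer.train()
```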

Pre-training and fine-tuning transformer models using PyTorch and the Hugging Face Transformers library. Whether you're delving into pre-training with custom datasets or fine-tuning for specific classification tasks, these notebooks offer explanations and code for implementation.

  • Updated Mar 13, 2024
  • Jupyter Notebook
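
As a rough sketch of the "pre-training with custom datasets" workflow mentioned above, the snippet below runs masked-language-model training with Hugging Face Transformers over a local text corpus. The file path `corpus/train.txt`, the `bert-base-uncased` starting point, and the hyperparameters are placeholders assumed for illustration; the actual notebooks may differ.

```python
# Minimal MLM pre-training sketch on a custom corpus (assumed paths/settings).
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Continued (domain-adaptive) pre-training from an existing checkpoint;
# training from scratch would instead build the model from a config.
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Load a custom corpus from a local text file (hypothetical path).
corpus = load_dataset("text", data_files={"train": "corpus/train.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = corpus.map(tokenize, batched=True, remove_columns=["text"])

# Dynamically mask 15% of tokens for the masked-language-modeling objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="pretrain-out",
    num_train_epochs=1,
    per_device_train_batch_size=32,
    learning_rate=5e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)

trainer.train()
```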
