EdgeQAT

Official repo for the paper: EdgeQAT: Entropy and Distribution Guided Quantization-Aware Training for the Acceleration of Lightweight LLMs on the Edge

Implementation

Follow the BabyLLaMA instructions to set up the training environment, and the BabyLM Challenge instructions to set up the evaluation environment.

Usage

  1. Download the dataset from the BabyLM Challenge.
  2. Clean the dataset following BabyLLaMA.
  3. Pretrain the teacher model.
  4. Download the FP16 LLaMA-58M model from BabyLLaMA.
  5. Run QAT with the scripts in distill_train/scripts/ (a minimal sketch of this step follows the list).
  6. Run evaluation with the scripts in evaluation_pipeline/.
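
The core recipe behind step 5 is quantization-aware training of a quantized student under distillation from the FP16 teacher. Below is a minimal PyTorch sketch of that idea; the 4-bit width, the toy layer sizes, the names fake_quantize and QuantLinear, and the plain KL loss are illustrative assumptions, not the repo's actual scripts or the paper's entropy- and distribution-guided objective.

import torch
import torch.nn.functional as F

def fake_quantize(x: torch.Tensor, n_bits: int = 4) -> torch.Tensor:
    # Symmetric uniform fake quantization (4-bit assumed for illustration).
    qmax = 2 ** (n_bits - 1) - 1
    scale = x.abs().max().clamp(min=1e-8) / qmax
    q = torch.round(x / scale).clamp(-qmax - 1, qmax) * scale
    # Straight-through estimator: forward uses q, backward treats it as identity.
    return x + (q - x).detach()

class QuantLinear(torch.nn.Linear):
    # Linear layer whose weights are fake-quantized during training.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.linear(x, fake_quantize(self.weight), self.bias)

# Toy stand-ins: the actual pipeline distills logits from the FP16 LLaMA-58M teacher.
teacher = torch.nn.Linear(16, 8)
student = QuantLinear(16, 8)
opt = torch.optim.AdamW(student.parameters(), lr=1e-4)

x = torch.randn(4, 16)
with torch.no_grad():
    t_logits = teacher(x)
s_logits = student(x)
# Plain KL distillation loss; the paper's objective additionally guides
# quantization with entropy and distribution information (see the paper).
loss = F.kl_div(F.log_softmax(s_logits, dim=-1),
                F.softmax(t_logits, dim=-1), reduction="batchmean")
opt.zero_grad()
loss.backward()
opt.step()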

Citation

@article{shen2024edgeqat,
  title={EdgeQAT: Entropy and Distribution Guided Quantization-Aware Training for the Acceleration of Lightweight LLMs on the Edge},
  author={Shen, Xuan and Kong, Zhenglun and Yang, Changdi and Han, Zhaoyang and Lu, Lei and Dong, Peiyan and others},
  journal={arXiv preprint arXiv:2402.10787},
  year={2024}
}
