---
license: cc-by-nc-4.0
datasets:
- meta-math/MetaMathQA
language:
- en
pipeline_tag: text-generation
tags:
- Math
- merge
base_model:
- Q-bert/MetaMath-Cybertron
- berkeley-nest/Starling-LM-7B-alpha
---
# MetaMath-Cybertron-Starling

This model merges Q-bert/MetaMath-Cybertron and berkeley-nest/Starling-LM-7B-alpha using a slerp (spherical linear interpolation) merge.
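For intuition, a slerp merge interpolates between the two parent models' weight tensors along the arc of a sphere rather than along a straight line, which preserves the norm structure of the weights better than plain averaging. The sketch below is illustrative only; it is not the exact implementation used by merge tooling such as mergekit, and the example matrices are hypothetical.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Falls back to linear interpolation when the vectors are nearly
    colinear, where the slerp formula is numerically unstable.
    """
    v0_flat = v0.ravel()
    v1_flat = v1.ravel()
    # Angle between the two flattened parameter vectors.
    cos_theta = np.dot(v0_flat, v1_flat) / (
        np.linalg.norm(v0_flat) * np.linalg.norm(v1_flat)
    )
    cos_theta = np.clip(cos_theta, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if np.abs(np.sin(theta)) < eps:
        return (1.0 - t) * v0 + t * v1  # lerp fallback
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

# Merge two hypothetical weight matrices with equal weighting (t=0.5).
a = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(0.5, a, b)
```

At `t=0` the result is exactly the first model's weights and at `t=1` the second's; intermediate values blend the two.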

You can use the ChatML prompt format.
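A minimal sketch of assembling a ChatML prompt by hand (the helper name and example messages are illustrative; in practice a tokenizer's chat template can do this for you):

```python
def build_chatml_prompt(system, user):
    """Assemble a ChatML prompt with system and user turns, ending
    with the assistant header so the model continues from there."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful math assistant.",
    "What is 12 * 7?",
)
```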

Detailed evaluation results can be found Here.

| Metric | Value |
|---|---|
| Avg. | 71.35 |
| ARC (25-shot) | 67.75 |
| HellaSwag (10-shot) | 86.23 |
| MMLU (5-shot) | 65.24 |
| TruthfulQA (0-shot) | 55.94 |
| Winogrande (5-shot) | 81.45 |
| GSM8K (5-shot) | 71.49 |