Inference on tiny-llama #562
Need some help here! I have fine-tuned tiny-llama on my personal books. How do I perform inference on it? Command I used: ` `
Answered by abhishekkrthakur, Mar 28, 2024
The code to run inference can be found in the README.md of the output repo/folder.
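For reference, inference on a fine-tuned causal LM like tiny-llama typically looks like the sketch below using the `transformers` library. This is an illustration, not the exact code from the output README: the output path and the prompt template are placeholders, and you should match whatever template your fine-tuning run actually used.

```python
def build_prompt(question: str) -> str:
    """Wrap the user question in a simple instruction template.
    Assumption: adjust this to match the template used during fine-tuning."""
    return f"### Instruction:\n{question}\n\n### Response:\n"

if __name__ == "__main__":
    # transformers is imported here so the prompt helper above works
    # even without the library installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Placeholder: point this at your AutoTrain output folder or Hub repo.
    model_path = "path/to/output-folder"

    tokenizer = AutoTokenizer.from_pretrained(model_path)
    model = AutoModelForCausalLM.from_pretrained(model_path)

    inputs = tokenizer(build_prompt("Summarize chapter one."), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the fine-tune produced a LoRA/PEFT adapter rather than merged weights, you would additionally load the adapter on top of the base model (e.g. with the `peft` library) before calling `generate`.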
Answer selected by MrAnayDongre