incorrect model_max_length #289
Not sure if this is in effect here. However, it seems odd that this happens, and at the same time the tokenizer in train.py is specified using `training_args.model_max_length`, not any attribute of the model itself.
I don't understand why the default `model_max_length` is set to 512, while the example training bash script in the main README doesn't pass an argument to raise it to 2048 (the context size for LLaMA). What's going on here? Thanks.
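For reference, a minimal sketch of why the default matters. The class and field names below are assumed to mirror the Alpaca-style training arguments, not copied from the repository; the point is that with a default of 512, any example longer than that is silently truncated unless the launch command overrides the value:

```python
from dataclasses import dataclass, field

# Hypothetical mirror of the training-arguments field in question: the default
# model_max_length is 512, so tokenized examples are cut to 512 tokens unless
# the launch script passes e.g. --model_max_length 2048 for LLaMA.
@dataclass
class TrainingArguments:
    model_max_length: int = field(
        default=512,
        metadata={"help": "Maximum sequence length; longer inputs are truncated."},
    )

def truncate(token_ids: list, args: TrainingArguments) -> list:
    """Truncate a tokenized example to the configured maximum length."""
    return token_ids[: args.model_max_length]

# With the default, a 2048-token example loses three quarters of its tokens.
example = list(range(2048))
print(len(truncate(example, TrainingArguments())))      # 512
print(len(truncate(example, TrainingArguments(2048))))  # 2048
```

So unless the README's example command is updated to pass `--model_max_length 2048`, training would run against the shorter default.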