Blocksize for pretrained models #4
@Quaa1205 Hi, both of them should work, but if you need longer sequences, you should retrain with a block size of 200. I set it to 64 because of a memory issue I had. Personally, I prefer models with longer sequence capacity.
So is this the reason I got a worse result? Running inference with the pretrained model for 3 variables, expressions with log-MSE below -10 accounted for about 23%, rather than over 30% as reported in your paper.
Hello,
The block size for the 3-variable model in the config file is 200, but the block size in the pretrained model might be 64. Which block size should be used to reproduce the results? I tested the pretrained model for three variables and got a result slightly worse than yours. Do I need to set the block size to 200 and retrain to reproduce your results?
Thanks.
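One way to check which block size a pretrained checkpoint was actually trained with is to inspect the shape of its learned positional embedding rather than trusting the config file. The sketch below assumes a GPT-style model that stores its positional embedding under a key such as `pos_emb`; the key name and shapes are assumptions for illustration, not this repo's actual layout.

```python
def infer_block_size(state_dict_shapes):
    """Return the sequence length baked into a checkpoint.

    GPT-style models keep a positional embedding of shape
    (1, block_size, n_embd), so its second dimension reveals the
    block size the checkpoint was trained with.
    """
    # 'pos_emb' is a hypothetical key; adjust to the real checkpoint.
    shape = state_dict_shapes["pos_emb"]  # (1, block_size, n_embd)
    return shape[1]


# Simulated checkpoint metadata (tensor shapes only, for illustration):
ckpt_shapes = {"pos_emb": (1, 64, 512)}
config_block_size = 200

trained_block_size = infer_block_size(ckpt_shapes)
if trained_block_size != config_block_size:
    print(f"config says {config_block_size}, but the checkpoint was "
          f"trained with {trained_block_size}; retrain with the larger "
          f"block size or truncate inputs to {trained_block_size} tokens")
```

With a real PyTorch checkpoint, the same check can be done on `torch.load(path)["pos_emb"].shape` (or the equivalent key). A mismatch like the one above would explain degraded results: sequences longer than the trained block size either get truncated or use positions the model never saw.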