
pretrain and finetune loss and lr #52

Open
1073521013 opened this issue Mar 11, 2022 · 0 comments


@1073521013

Two quick questions, thanks:
1. How does the loss decrease during pretraining? What is the initial value of each loss term, and what does each eventually converge to? And roughly what does this look like during finetuning?
2. The learning rate is usually closely tied to the batch size, so roughly what batch size and learning rate did you use during finetuning? (See the sketch of the common linear scaling rule after this list.)
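The batch/lr coupling in item 2 is commonly handled with the linear scaling rule (lr grows proportionally with batch size relative to a reference pair). A minimal sketch, assuming hypothetical reference values; the repo's actual numbers are exactly what this issue asks for:

```python
def scale_lr(base_lr: float, base_batch: int, new_batch: int) -> float:
    """Linear scaling rule: lr_new = lr_base * (new_batch / base_batch)."""
    return base_lr * new_batch / base_batch

# Hypothetical reference values, not the authors' numbers:
# suppose lr = 1e-4 at batch size 256, and finetuning uses batch size 64.
print(scale_lr(1e-4, 256, 64))  # -> 2.5e-05
```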
