
How to pretrain bge m3 #754

Open
adol001 opened this issue May 8, 2024 · 2 comments

Comments

adol001 commented May 8, 2024

Is there a pretraining script like the one for bge 1.5?

staoxiao (Collaborator) commented May 8, 2024

bge-m3 and bge-1.5 share the same pretraining script: https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain
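The linked pretrain example reads a jsonl corpus where each line is a record with a single `text` field. Below is a minimal sketch for producing that format; the input layout (one document per line of a plain-text file) and the file paths are assumptions, not part of the repo:

```python
import json

# Minimal sketch: convert a plain-text corpus (assumed one document per line)
# into the jsonl format consumed by the examples/pretrain script,
# i.e. one {"text": ...} object per line.
def build_pretrain_jsonl(corpus_path: str, output_path: str) -> None:
    with open(corpus_path, encoding="utf-8") as src, \
         open(output_path, "w", encoding="utf-8") as dst:
        for line in src:
            doc = line.strip()
            if doc:  # skip empty lines
                dst.write(json.dumps({"text": doc}, ensure_ascii=False) + "\n")

if __name__ == "__main__":
    # "corpus.txt" and "pretrain_data.jsonl" are placeholder paths.
    build_pretrain_jsonl("corpus.txt", "pretrain_data.jsonl")
```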

adol001 (Author) commented May 9, 2024

@staoxiao
Is the pretraining done on top of https://huggingface.co/BAAI/bge-m3-retromae as the base model?
Then fine-tuning is done first without the unify approach, and finally with the unify approach?
Is the only difference between BAAI/bge-m3-unsupervised and BAAI/bge-m3 during fine-tuning that joint (unified) training is enabled?
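For reference, my understanding of "unify" is that the unified fine-tuning stage makes a single checkpoint serve dense, sparse (lexical), and multi-vector (ColBERT) retrieval at once. A minimal sketch exercising all three outputs with BGEM3FlagModel (the call follows the repo README; the exact output dict keys are my assumption):

```python
from FlagEmbedding import BGEM3FlagModel

# BAAI/bge-m3 is the unified checkpoint; one encode() call can return
# dense, sparse (lexical), and ColBERT multi-vector representations.
model = BGEM3FlagModel("BAAI/bge-m3", use_fp16=True)

sentences = ["什么是BGE M3?", "BGE M3 is a multilingual embedding model."]
output = model.encode(
    sentences,
    return_dense=True,
    return_sparse=True,
    return_colbert_vecs=True,
)

# Output is a dict; the key names documented in the README are
# 'dense_vecs', 'lexical_weights', and 'colbert_vecs'.
print(output.keys())
```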
