
purpose of batch norm controller #65

Open
hzhz2020 opened this issue Jan 6, 2023 · 0 comments

Comments


hzhz2020 commented Jan 6, 2023

Dear Authors,

Thanks for providing this great repo!
You mentioned in your FlexMatch paper that a batch norm controller is introduced in the codebase to prevent performance crashes for some algorithms. Specifically, you noted that Mean Teacher, Pi-Model, and MixMatch can become unstable if BatchNorm statistics are updated for the labeled and unlabeled batches in turn. Does this have to do with multi-GPU training? (If I use a single GPU, will this instability persist?)

Alternatively, can I simply freeze the BatchNorm layers when forwarding the unlabeled batch?
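For concreteness, the freezing I have in mind would look something like the sketch below. This is my own hypothetical helper (`freeze_batchnorm` is not part of the TorchSSL codebase), assuming a standard PyTorch model: it temporarily puts all BatchNorm layers into eval mode so their running statistics are not updated by the unlabeled forward pass, then restores their previous mode.

```python
import contextlib

import torch
import torch.nn as nn


@contextlib.contextmanager
def freeze_batchnorm(model: nn.Module):
    """Temporarily switch every BatchNorm layer to eval mode.

    In eval mode BatchNorm uses (and does not update) its running
    statistics, so a forward pass on unlabeled data inside this
    context leaves running_mean / running_var untouched.
    """
    bn_layers = [
        m for m in model.modules()
        if isinstance(m, nn.modules.batchnorm._BatchNorm)
    ]
    saved_modes = [m.training for m in bn_layers]
    for m in bn_layers:
        m.eval()
    try:
        yield
    finally:
        # Restore each layer's original train/eval mode.
        for m, was_training in zip(bn_layers, saved_modes):
            m.train(was_training)


if __name__ == "__main__":
    model = nn.Sequential(nn.Linear(4, 4), nn.BatchNorm1d(4))
    model.train()

    # Labeled batch: BN statistics update as usual.
    logits_x = model(torch.randn(8, 4))

    # Unlabeled batch: BN statistics are frozen.
    with freeze_batchnorm(model):
        logits_u = model(torch.randn(8, 4))
```

Whether this is equivalent to what the batch norm controller does, or whether it would hurt the algorithms that rely on unlabeled statistics, is exactly what I am unsure about.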

Hope to hear from you. Thanks!
