Why skip last batch? #2

Open
Tiamo666 opened this issue Oct 11, 2018 · 6 comments

Comments

@Tiamo666

Hi Zedong, thanks a lot for your work.
I was a little confused by the following code:
if now_batch_size < opt.batchsize: # next epoch
    continue
Will it affect the performance?

@layumi
Owner

layumi commented Oct 11, 2018

Hi @Tiamo666
Generally it does not compromise the final performance, but in some cases it does.

  1. For example, after you have trained on most images in the training set, only one image may be left in the final batch.
    With batch normalisation, a batch of a single image normalises to zero and yields an abnormal prediction.
  2. A small batch size also distorts the running mean and std tracked by the batch-norm layer.
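The first point can be sketched in a few lines. This is a minimal, dependency-free illustration of training-mode batch normalisation (a hypothetical `batchnorm_train` helper, not the repo's code): a batch normalised by its own statistics collapses to zero when it contains only one sample.

```python
def batchnorm_train(batch, eps=1e-5):
    # training-mode BN: normalise using the batch's own statistics
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    return [(x - mean) / (var + eps) ** 0.5 for x in batch]

# a full batch normalises to a spread of values centred on zero
print(batchnorm_train([1.0, 2.0, 3.0, 4.0]))

# a leftover batch of one image: its own mean equals itself,
# so the normalised output is exactly zero
print(batchnorm_train([5.0]))  # → [0.0]
```

Every feature of that last image becomes zero before the classifier, which is why the prediction is abnormal and why skipping the last, smaller batch is a safe default.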

@Tiamo666
Author

@layumi Thanks for your kind explanation, I got it.
Another question: in your implementation of the triplet loss, the way you obtain the hard negative samples is a little different.
Your input includes sample, target, pos, pos_target, and you take the hard negative from "pos" only, excluding the sample itself. I was confused about that.

@layumi
Owner

layumi commented Oct 11, 2018

@Tiamo666
Sure, you can sample another batch as the negative pool.
But it may be more efficient to directly reuse the other positive data in the batch as negative samples.
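The idea above, reusing other identities' positives already in the batch as the negative pool, can be sketched as follows. `hardest_negative` is a hypothetical helper for illustration, not the repo's implementation: it picks the closest sample whose label differs from the anchor's.

```python
def hardest_negative(anchor, anchor_label, pool, pool_labels):
    """Return the pool sample nearest to the anchor (squared Euclidean
    distance) whose label differs from anchor_label."""
    best, best_dist = None, float("inf")
    for feat, label in zip(pool, pool_labels):
        if label == anchor_label:
            continue  # same identity: a positive, not a valid negative
        dist = sum((a - f) ** 2 for a, f in zip(anchor, feat))
        if dist < best_dist:
            best, best_dist = feat, dist
    return best

anchor = [0.0, 0.0]
# two samples of identity 2 and one of identity 1 (the anchor's own class)
neg = hardest_negative(anchor, 1,
                       [[0.1, 0.0], [5.0, 5.0], [0.2, 0.2]],
                       [2, 2, 1])
print(neg)  # → [0.1, 0.0], the nearest different-identity sample
```

Because every mini-batch already contains positives of several identities, mining negatives this way costs no extra forward pass, which is the efficiency layumi mentions.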

@Tiamo666
Author

@layumi OK. And is the random permutation of "nf_data" (in train_new.py, around line ~217) necessary? Since your default opt.poolsize is 128 (equal to the batch size of nf_data), the result after sorting the scores will be the same.

@layumi
Copy link
Owner

layumi commented Oct 11, 2018

@Tiamo666
It is designed for a small pool size.
If you use the largest pool size, it is not necessary.
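The role of that permutation can be sketched like this, assuming `sample_pool` is a hypothetical stand-in for the code around train_new.py line ~217: the shuffle only matters when the pool is smaller than the candidate set, because it then decides which candidates enter the pool at all.

```python
import random

def sample_pool(nf_data, poolsize):
    # shuffle indices so that, when poolsize < len(nf_data),
    # the pool is a random subset rather than always the first entries
    idx = list(range(len(nf_data)))
    random.shuffle(idx)
    return [nf_data[i] for i in idx[:poolsize]]

candidates = list(range(10))
full  = sample_pool(candidates, 10)  # every candidate kept, order irrelevant
small = sample_pool(candidates, 4)   # a random 4-element subset
```

With the full pool size the later score sorting cancels the shuffle, matching Tiamo666's observation; with a smaller pool the permutation is what makes the subset random.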

@Tiamo666
Author

Hi, @layumi Thanks a lot
