Model overfitting, smooth triplet margin loss #690
Comments
Do you know what settings were used to get 0.8 accuracy?
Not exactly, but I checked the open-metric-learning library. They achieved 0.9 P@1 (max 1000 epochs) with the above parameters, and using OML I was able to achieve 0.8 P@1 on validation (220 epochs), while on train it was 0.9. Training for more epochs would probably reach the 0.9 benchmark. The main thing I was forgetting was augmentation; I could try again with pytorch-metric-learning this week.
Hi there!
Thank you for the awesome library!
I'm currently working on training a model using the CARS196 dataset with the following parameters:
Initially, I didn't include the smooth_loss parameter, which caused the loss to get stuck at the margin of 0.2. However, after setting smooth_loss=True, I ran into an overfitting issue. Do you think the margin and the type of triplets I've used are appropriate to begin with? Should I consider adjusting them?
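For context on why the loss plateaus at the margin: the standard triplet loss is a hinge, so once a triplet satisfies the margin its gradient is exactly zero, and the batch loss can sit at the margin value when only hard triplets remain. As I understand it, smooth_loss=True in pytorch-metric-learning replaces the hinge with a softplus relaxation. A minimal sketch of the two variants (plain Python, my own hypothetical function names):

```python
import math

def hinge_triplet(d_ap, d_an, margin=0.2):
    # Standard triplet margin loss: max(d_ap - d_an + margin, 0).
    # Gradient is exactly zero once d_an - d_ap >= margin.
    return max(d_ap - d_an + margin, 0.0)

def smooth_triplet(d_ap, d_an, margin=0.2):
    # Softplus relaxation: log(1 + exp(d_ap - d_an + margin)).
    # Never exactly zero, so "solved" triplets still contribute
    # a small gradient instead of going silent.
    return math.log(1.0 + math.exp(d_ap - d_an + margin))

# A triplet that already satisfies the margin (d_ap=0.5, d_an=1.0):
print(hinge_triplet(0.5, 1.0))   # 0.0 -> no gradient
print(smooth_triplet(0.5, 1.0))  # small positive value -> still learns
```

This is a sketch of the idea, not the library's exact implementation; check the TripletMarginLoss source for the authoritative behavior.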
Additionally, I'm using the ViT-Base/16 (224) model and freezing the early layers to reduce the number of trainable parameters. Do you see any mistakes in my approach, or do you have any suggestions on what I should try next? I think it is possible to achieve at least a Precision@1 of 0.8. Currently, I'm at 0.6 with some overfitting.
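The freezing setup I mean can be sketched as below. The 12-block stand-in and the choice of freezing the first 6 blocks are assumptions for illustration (the issue doesn't state how many layers were frozen); with timm you would load the real model via timm.create_model("vit_base_patch16_224", pretrained=True) and freeze model.blocks[:k] the same way:

```python
import torch.nn as nn

# Hypothetical stand-in for ViT-Base/16: 12 transformer "blocks",
# here replaced by small Linear layers so the sketch is self-contained.
blocks = nn.ModuleList([nn.Linear(8, 8) for _ in range(12)])

# Freeze the first 6 blocks so they are excluded from the optimizer's
# gradient updates, cutting the trainable parameter count.
for block in blocks[:6]:
    for p in block.parameters():
        p.requires_grad = False

trainable = sum(p.numel() for p in blocks.parameters() if p.requires_grad)
total = sum(p.numel() for p in blocks.parameters())
print(trainable, total)  # half of the blocks remain trainable
```

When building the optimizer, pass only the unfrozen parameters, e.g. filter(lambda p: p.requires_grad, model.parameters()).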