Question about loss_functions.py #10
Comments

Hi, thank you for the great implementation. I appreciate your work, as well as your generosity in opening it up.
As mentioned in the title, I have a question about line 35 of loss_functions.py, given below:
Angular-Penalty-Softmax-Losses-Pytorch/loss_functions.py
Line 35 in c41d599
To my understanding, this would initialize a new fully connected layer in each epoch of training.
I don't understand how this layer can be optimized via backpropagation if it is re-initialized each time.
It would be a great help if anyone could explain why this inference is wrong.

Hi @junikkoma, nn.Linear is used to initialize the weights of the fc layer, and during model training the forward() function is called. The layer only gets initialized once, in the constructor.
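A minimal sketch of the point being made (class and variable names here are hypothetical, not the repository's actual code): the `nn.Linear` layer is constructed once in `__init__`, while training only calls `forward()`, which reuses the same weight tensor, so the optimizer can keep updating it across steps.

```python
# Sketch showing that a layer created in __init__ is initialized once and
# reused on every forward pass, so backpropagation updates its weights
# rather than having them re-created each epoch.
import torch
import torch.nn as nn

class AngularPenaltyHead(nn.Module):  # hypothetical name for illustration
    def __init__(self, in_features, out_features):
        super().__init__()
        # Runs exactly once, when the module is constructed.
        self.fc = nn.Linear(in_features, out_features, bias=False)

    def forward(self, x):
        # Runs every training step, reusing the same self.fc object.
        return self.fc(x)

head = AngularPenaltyHead(4, 3)
w_before = head.fc.weight.clone()

opt = torch.optim.SGD(head.parameters(), lr=0.1)
for _ in range(3):  # three "epochs": forward() is called, __init__ is not
    loss = head(torch.randn(2, 4)).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Same Linear object throughout; its weights have changed via SGD.
assert not torch.equal(w_before, head.fc.weight)
```

If the layer really were re-created inside `forward()`, its freshly initialized parameters would never accumulate the optimizer's updates; placing it in `__init__` is what makes it trainable.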