
Question about loss_functions.py #10

Open
junikkoma opened this issue Feb 3, 2021 · 1 comment


Hi, thank you for the great implementation. I appreciate your work, as well as your generosity in open-sourcing it.

As mentioned in the title, I have a question about line 35 of loss_functions.py, given below:

self.fc = nn.Linear(in_features, out_features, bias=False)

To my understanding, this would initialize a new fully connected layer on each epoch of training.
I don't understand how this layer can be optimized via backpropagation if it is re-initialized each time.

It would be a great help if anyone could explain where this reasoning goes wrong.


zhekang commented Feb 18, 2021

Hi @junikkoma, nn.Linear initializes the weights of the fc layer once, when the module is constructed; during training, only the forward() function is called on each step. The layer gets initialized a single time, here:

self.adms_loss = AngularPenaltySMLoss(3, num_classes, loss_type=loss_type)
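To make the one-time-initialization point concrete, here is a minimal sketch in the spirit of AngularPenaltySMLoss. The class name, dimensions, and structure below are illustrative assumptions, not the repository's exact code; the point is only that `nn.Linear` is constructed once in `__init__`, while training repeatedly calls `forward()` on the same persistent layer.

```python
import torch
import torch.nn as nn

class ToyAngularLoss(nn.Module):
    """Illustrative stand-in for AngularPenaltySMLoss (not the repo's code)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        # Created exactly once, when the module is instantiated.
        self.fc = nn.Linear(in_features, out_features, bias=False)

    def forward(self, x):
        # Called on every training step; the SAME self.fc (and its
        # weights) is reused — nothing is re-initialized here.
        return self.fc(x)

loss_module = ToyAngularLoss(3, 5)
w0 = loss_module.fc.weight.data.clone()

# Two "training steps": forward passes alone do not re-create the layer
# or change its weights; only an optimizer step would update them.
for _ in range(2):
    _ = loss_module(torch.randn(4, 3))

assert torch.equal(w0, loss_module.fc.weight.data)
```

Because `self.fc` is registered as a submodule, its weights show up in `loss_module.parameters()` and can be passed to an optimizer, which is how they get updated by backpropagation across epochs.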
