Hi @jaggernaut007, are you referring to this line? If so: the loss inside the test loop is a single scalar per batch (the batch mean, with the default reduction), and that is what test_loss accumulates. That's why you divide after the loop by the number of batches, num_batches, to get the average.
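A minimal sketch (plain Python, no PyTorch; the numbers are made up) of why dividing by num_batches works: when the loss function returns the *mean* loss per batch, as PyTorch loss functions do with their default reduction="mean", the average of those per-batch means equals the per-sample mean over the whole test set whenever all batches have the same size.

```python
# Each inner list stands in for the per-sample losses of one batch
# (equal batch sizes, so the two averages coincide).
batches = [[0.2, 0.4], [0.6, 0.8], [1.0, 1.2]]

# What the test loop accumulates: one mean-reduced scalar per batch.
test_loss = sum(sum(b) / len(b) for b in batches)
num_batches = len(batches)
avg_over_batches = test_loss / num_batches  # what the tutorial computes

# Per-sample average over the entire test set.
all_samples = [x for b in batches for x in b]
avg_over_samples = sum(all_samples) / len(all_samples)

# The two agree (up to floating-point noise) because batches are equal-sized.
print(avg_over_batches, avg_over_samples)
```

With a ragged final batch the two values differ slightly, which is the usual caveat with this style of averaging.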
Add Link
https://pytorch.org/tutorials/beginner/basics/optimization_tutorial.html
Describe the bug
The loss calculation seems to be wrong. Should we divide the total loss by the number of samples in X (the dataset size) rather than by the number of batches?
The optimiser step also looks wrong.
Describe your environment
Running on Google Colab
cc @subramen @albanD