Difference between calling self.log(..., on_step=False, on_epoch=True) in training_step() and training_epoch_end()
#485 · asked by janghyuk-choi in Q&A · answered by ashleve
> lightning-hydra-template/src/models/mnist_module.py, lines 75 to 76 in 75b44ff

In src/models/mnist_module.py, I wonder what the difference is between logging in training_step() and logging in training_epoch_end(). It seems that the resulting values are the same either way. Is there any difference in the internal mechanism?
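For context, the two patterns being compared look roughly like this. This is a hedged sketch, not the actual mnist_module.py code: the LightningModule machinery is replaced by a tiny mock so the logging calls can be inspected directly, and all names here are hypothetical.

```python
# Hypothetical sketch of the two logging patterns being compared.
# Lightning is replaced by a minimal mock that records self.log(...) calls.

class MockModule:
    def __init__(self):
        self.logged = {}

    def log(self, name, value, on_step=True, on_epoch=False):
        # Record what would be handed to the logger.
        self.logged.setdefault(name, []).append((value, on_step, on_epoch))

    # Pattern 1: log inside training_step with epoch-aggregation flags.
    def training_step(self, loss):
        self.log("train/loss", loss, on_step=False, on_epoch=True)

    # Pattern 2: aggregate yourself and log once in training_epoch_end.
    def training_epoch_end(self, step_losses):
        self.log("train/loss_epoch", sum(step_losses) / len(step_losses))

m = MockModule()
for loss in [1.0, 0.5]:
    m.training_step(loss)
m.training_epoch_end([1.0, 0.5])
print(m.logged)
```

With pattern 1, the framework sees every per-step value plus the flags and does the epoch reduction itself; with pattern 2, it sees only the single pre-aggregated value.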
Answered by ashleve, Dec 18, 2022
Replies: 1 comment 1 reply
No difference as far as I'm aware, as long as we're logging torchmetrics object directly. Not sure if there won't be slight differences when we log through value though.
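The caveat about logging "through value" is worth unpacking: a torchmetrics object accumulates state (e.g. correct and total counts) and computes the exact epoch metric, while logging a raw per-step value makes Lightning take a mean of the step values, i.e. a mean of means. Those can diverge when batch sizes differ. A minimal pure-Python sketch with no Lightning or torchmetrics dependency (class and variable names hypothetical):

```python
# Sketch: why logging a metric object vs. a per-step value can differ.

class RunningAccuracy:
    """Accumulates correct/total counts, torchmetrics-style."""
    def __init__(self):
        self.correct = 0
        self.total = 0

    def update(self, preds, targets):
        self.correct += sum(p == t for p, t in zip(preds, targets))
        self.total += len(targets)

    def compute(self):
        return self.correct / self.total

# Two batches with different sizes.
batches = [
    (["a", "a", "a", "a"], ["a", "a", "b", "b"]),  # batch acc = 0.5, size 4
    (["a"], ["a"]),                                 # batch acc = 1.0, size 1
]

metric = RunningAccuracy()
step_values = []
for preds, targets in batches:
    metric.update(preds, targets)
    step_values.append(sum(p == t for p, t in zip(preds, targets)) / len(targets))

exact = metric.compute()                              # 3 correct / 5 total = 0.6
mean_of_steps = sum(step_values) / len(step_values)   # (0.5 + 1.0) / 2 = 0.75

print(exact, mean_of_steps)
```

With equal batch sizes the two numbers coincide, which is likely why the question observed identical results; the metric-object path stays exact even when they don't.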
1 reply
Answer selected by janghyuk-choi
In training_epoch_end(), the flags on_step=False, on_epoch=True don't matter, as you're always logging only once per epoch. In training_step(), setting on_epoch=True will make Lightning average the given value over the logs from all steps.
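The two paths described in the reply can be sketched side by side. This is plain Python standing in for Lightning's mean reduction (an assumption about the default reduce behavior, which is a mean), with steps weighted equally:

```python
# Per-step metric values logged during one epoch.
step_values = [1.0, 0.5, 0.5, 0.0]

# Path A: self.log(..., on_step=False, on_epoch=True) in training_step:
# Lightning buffers each step's value and reduces with a mean at epoch end.
path_a = sum(step_values) / len(step_values)

# Path B: compute the epoch aggregate yourself and log it once from
# training_epoch_end; on_step/on_epoch are moot with a single log call.
path_b = sum(step_values) / len(step_values)

print(path_a, path_b)  # identical: both are the equally-weighted mean
```

Since both paths reduce to the same equally-weighted mean here, the logged epoch values match, consistent with what the question observed.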