Difference between calling self.log(..., on_step=False, on_epoch=True) in training_step() and training_epoch_end() #485

Answered by ashleve
janghyuk-choi asked this question in Q&A

As far as I'm aware, there is no difference, as long as you log the torchmetrics object directly. I'm not sure whether slight differences can appear when you log a plain value, though.

  • Inside training_epoch_end(), the flags on_step=False, on_epoch=True don't matter, since you only log once per epoch anyway.
  • Inside training_step(), setting on_epoch=True makes Lightning reduce the value logged at each step over the whole epoch (by default, averaging it).
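To see why logging a plain value might differ slightly from logging a torchmetrics object, here is a minimal pure-Python simulation (no Lightning or torchmetrics required; the batch numbers are made up for illustration). Logging a raw per-batch accuracy with on_epoch=True gives an unweighted mean of the step values, while a metric object accumulates counts and computes over all samples. With unequal batch sizes the two can disagree:

```python
# Hypothetical per-batch results: (num_correct, batch_size).
# The last batch is smaller, which is where the discrepancy comes from.
batches = [(9, 10), (9, 10), (1, 2)]

# What self.log("acc", correct / size, on_epoch=True) would reduce to:
# the unweighted mean of per-step accuracies (default reduce_fx is mean).
step_accs = [correct / size for correct, size in batches]
mean_of_batch_accs = sum(step_accs) / len(step_accs)

# What a torchmetrics-style object computes: total correct over
# total samples, accumulated across the whole epoch.
epoch_acc = sum(c for c, _ in batches) / sum(s for _, s in batches)

print(mean_of_batch_accs)  # ≈ 0.767
print(epoch_acc)           # 19/22 ≈ 0.864
```

When every batch has the same size the two numbers coincide, which is why the difference is easy to miss; logging the metric object directly sidesteps the issue entirely.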

Answer selected by janghyuk-choi