What if I wanted to compute something on every batch output (so every iteration), but I only wanted to do this every 20 epochs? How could I do this? Is there a sort of “detach_event_handler” as a counterpart to attach_event_handler? Or is there some other way to do this? Thanks!
I can't just attach the handler inside an `if epoch % 20 == 0` check, because the first time the epoch reaches 20, the handler would be attached to the trainer, and from then on foo() would be called on every single iteration, whether `epoch % 20 == 0` or not.
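If the trainer here is a PyTorch Ignite `Engine` (a guess on my part, since Ignite's methods are actually named `add_event_handler` / `remove_event_handler` rather than `attach_event_handler`), here is a minimal sketch of two ways to get this behavior. `foo` and the dummy update function are hypothetical stand-ins for the real per-batch computation and training step:

```python
from ignite.engine import Engine, Events

# Dummy update function purely for illustration; replace with your real training step.
trainer = Engine(lambda engine, batch: batch)

def foo(engine):
    # engine.state.output holds whatever the update function returned for this batch
    print(engine.state.epoch, engine.state.iteration, engine.state.output)

# Option 1: attach once, and guard inside the handler, so no detaching is needed.
@trainer.on(Events.ITERATION_COMPLETED)
def maybe_foo(engine):
    if engine.state.epoch % 20 == 0:
        foo(engine)

# Option 2 (an alternative to Option 1 -- pick one, or foo runs twice):
# attach at the start of every 20th epoch and detach again at its end.
# remove_event_handler is Ignite's counterpart to add_event_handler.
@trainer.on(Events.EPOCH_STARTED)
def attach_foo(engine):
    if engine.state.epoch % 20 == 0:
        engine.add_event_handler(Events.ITERATION_COMPLETED, foo)

@trainer.on(Events.EPOCH_COMPLETED)
def detach_foo(engine):
    if engine.has_event_handler(foo, Events.ITERATION_COMPLETED):
        engine.remove_event_handler(foo, Events.ITERATION_COMPLETED)

trainer.run(range(8), max_epochs=40)  # any iterable of "batches" works for this demo
```

Option 1 is simpler and avoids attach/detach bookkeeping; Option 2 shows that `remove_event_handler` is the detach counterpart being asked about.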
Either this. Or, as far as I understand, you want to call foo() on every iteration of every 20th epoch, right?
So this:
```python
for epoch in range(n_epochs):  # your epoch loop
    for iteration, data in enumerate(your_dataloader):  # your iteration loop
        if (epoch + 1) % 20 == 0:
            foo()  # called on every iteration of every 20th epoch
```
should work. @pytorchnewbie Is this not the behavior you asked for?