Hi, here again,
I have a question: in my training code, I want to save the output of a module or layer for use in another layer later on (while training), making sure the output of that layer does not change during training. How do I do that? Do I define it in the forward pass? In the code below, say I want to reuse the output at D: do I save that output somewhere, or do I just call self.block4 again? Apologies if this question is stupid, I am just learning. Thank you!
class block1(nn.Module):
    ...

class block2(nn.Module):
    ...

class block3(nn.Module):
    ...

class block4(nn.Module):
    ...

class exped(nn.Module):
    ...
class RDCDH(nn.Module):
    def __init__(self):
        super(RDCDH, self).__init__()
        self.conv = conv
        self.block1 = block1()
        self.block2 = block2()
        self.block3 = block3()
        self.block4 = block4()
        self.trans = exped()
    def forward(self, x):
        out = self.conv(x)
        out = self.block1(out, before_trans=False)  #A
        out = self.block2(out, before_trans=False)  #B
        out = self.block3(out, before_trans=False)  #C
        out = self.block4(out, before_trans=False)  #D
        out = self.upsam_trans(out)                 #P
        out = self.block4(out, before_trans=True)   #Q
        out = self.trans(out)
        return out
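To make the question concrete, here is a minimal sketch of the pattern I mean, with made-up names (TinyNet, layer_a, layer_b are stand-ins, not my real blocks): save the tensor once in forward and reuse it later, instead of calling the module a second time.

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """Made-up stand-in for my real model, just to show the pattern."""
    def __init__(self):
        super().__init__()
        self.layer_a = nn.Linear(8, 8)  # plays the role of block4 (point D)
        self.layer_b = nn.Linear(8, 8)  # plays the role of upsam_trans (point P)

    def forward(self, x):
        saved = self.layer_a(x)   # save the output at D once...
        out = self.layer_b(saved)
        # ...and reuse the same saved tensor here (point Q), rather than
        # calling self.layer_a again. If I also need gradients NOT to flow
        # back through layer_a via this path, I believe saved.detach() is
        # what I would use here, but I am not sure.
        out = out + saved
        return out

net = TinyNet()
y = net(torch.randn(2, 8))
print(y.shape)  # torch.Size([2, 8])
```

Is this the right way to do it, or is there a more standard mechanism?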
Many thanks.