Copy a block of layers from a model

Hi everyone,

I’m new to the forums, so correct me if I have not framed the problem correctly.

I’m in a situation where I have to make a copy of a specific block of the network for some further calculations (like computing a Jacobian).

My network looks like this:

import torch

class AutoCanonical(torch.nn.Module):
    def __init__(self, d_in, d_hidden, d_out, activation_fn, device, experiment):
        super(AutoCanonical, self).__init__()
        self.layers = torch.nn.ModuleList()
        self.nonlinear_fn = activation_fn  # activation used between layers

        ## Block for converting provided coordinates to canonical coordinates
        self.layer1 = torch.nn.Linear(d_in, 50)
        self.layers.append(self.layer1)

        self.layer2 = torch.nn.Linear(50, 50)
        self.layers.append(self.layer2)

        self.layer3 = torch.nn.Linear(50, d_in)
        self.layers.append(self.layer3)
        ## -------------------------------------------------

        ## Hamiltonian block
        self.layer4 = torch.nn.Linear(d_in, 200)
        self.layers.append(self.layer4)

        self.layer5 = torch.nn.Linear(200, 200)
        self.layers.append(self.layer5)

        self.layer6 = torch.nn.Linear(200, 1)
        self.layers.append(self.layer6)
        ## -------------------------------------------------

    def to_canonical(self, x):
        y1 = self.nonlinear_fn(self.layer1(x))
        y2 = self.nonlinear_fn(self.layer2(y1))
        out = self.layer3(y2)  # no activation on the last layer of this block
        return out

    def Ham(self, x):
        y1 = self.nonlinear_fn(self.layer4(x))
        y2 = self.nonlinear_fn(self.layer5(y1))
        out = self.layer6(y2)
        return out

    def forward(self, x):
        can_coords = self.to_canonical(x)  # convert to canonical variables
        H = self.Ham(can_coords)
        return H

I would like to make a copy of the self.to_canonical block and use it to calculate its Jacobian. How do I make a copy of that?

Hi,

Given that all the layers are defined on the same module, you will have to work with the whole Module.
You can simply make a copy of it by doing:

new_model = AutoCanonical(d_in, d_hidden, d_out, activation_fn, device, experiment)
new_model.load_state_dict(current_model.state_dict())
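
Once you have the copy, you can compute the Jacobian of just the to_canonical block with torch.autograd.functional.jacobian. A minimal sketch, assuming current_model holds your trained weights and d_in matches your input dimension:

import torch

# Evaluate the Jacobian of the to_canonical block at a single (unbatched) input x
x = torch.randn(d_in)
jac = torch.autograd.functional.jacobian(new_model.to_canonical, x)
# jac has shape [d_in, d_in] since to_canonical maps d_in -> d_in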

Thanks!

Last night, I stumbled upon this issue. Looks like one can also make a copy using new_model = copy.deepcopy(current_model). Am I right?

Yes that would work as well.
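
For completeness, a minimal sketch of the deepcopy route (again assuming current_model is your trained instance):

import copy

# Deep copy of the existing model, parameters and buffers included;
# no need to re-specify the constructor arguments.
new_model = copy.deepcopy(current_model)

Compared to load_state_dict, this copies everything attached to the instance in one call, whereas load_state_dict requires constructing a new model with the same arguments first.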
