Are weights frozen in nn.Identity?

Quick question - I want to use nn.Identity as a placeholder.

layer_one = nn.Linear(input_size, hidden_size) if use_linear else nn.Identity()

However, I am worried that the Identity weights will be updated during backprop. Will they?


The identity layer shouldn’t have any weights that could be updated.

>>> import torch
>>> a = torch.nn.Linear(100, 100)
>>> a.weight
Parameter containing:
tensor([[-0.0092, -0.0083, -0.0101,  ..., -0.0416,  0.0169, -0.0232],
        [ 0.0707,  0.0684, -0.0826,  ..., -0.0583, -0.0801, -0.0349],
        [ 0.0531,  0.0917, -0.0934,  ...,  0.0632, -0.0696, -0.0597],
        [-0.0661,  0.0780,  0.0926,  ...,  0.0099, -0.0024, -0.0690],
        [ 0.0313,  0.0154, -0.0628,  ...,  0.0512,  0.0821, -0.0196],
        [ 0.0760,  0.0127, -0.0037,  ..., -0.0742, -0.0545, -0.0989]],
       requires_grad=True)
>>> b = torch.nn.Identity()
>>> b.weight
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/pytorch/torch/nn/modules/", line 1130, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'Identity' object has no attribute 'weight'

What if I declare a nn.Parameter() and attach it to Identity() as ‘weight’ using register_parameter()? Will the weight for Identity() then be updated by backpropagation?

No, since the newly added weight attribute won’t be used in nn.Identity's forward method, it will never receive a gradient and the optimizer won’t change it.
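You can verify this directly. The sketch below (variable names are my own) registers a parameter on nn.Identity, runs a backward pass, and checks that the parameter received no gradient:

```python
import torch
import torch.nn as nn

identity = nn.Identity()
# Attach a parameter named 'weight' to the module.
identity.register_parameter("weight", nn.Parameter(torch.ones(1)))

x = torch.randn(4, requires_grad=True)
out = identity(x).sum()
out.backward()

# The parameter never participates in forward, so its grad stays None,
# while the input itself does get a gradient.
print(identity.weight.grad)
```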

How do I make sure that it is?
I know it is a bit weird. I’m doing an architecture search, and each edge holds a MixedOp containing Identity(), Zero(), Conv2d(), etc.
Conv2d() has its own weights, which are used in the forward pass and given to the optimizer. For Identity() and Zero() I’m registering a ‘weight’ parameter with register_parameter(), and the optimizer receives those parameters as well; the MixedOp output is basically the sum of the outputs of all the operations on the edge.

If you want to create a new nn.Parameter and use it, write a custom nn.Module: initialize the parameter in the __init__ method and use it in forward. nn.Identity does not use any parameters or buffers and just returns its input unchanged. Adding attributes to this module won’t change its forward implementation, so those attributes will never be used (and will never get gradients).
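A minimal sketch of such a custom module; the name ScaledIdentity and the scalar initialization are my own choices, not from the thread:

```python
import torch
import torch.nn as nn

class ScaledIdentity(nn.Module):
    """Identity scaled by a learnable scalar weight."""
    def __init__(self):
        super().__init__()
        # Created in __init__ and used in forward, so autograd tracks it.
        self.weight = nn.Parameter(torch.ones(1))

    def forward(self, x):
        return self.weight * x

m = ScaledIdentity()
x = torch.randn(3)
m(x).sum().backward()
# Unlike an attribute bolted onto nn.Identity, this weight gets a gradient.
print(m.weight.grad)
```

The same pattern works for a learnable Zero() replacement; since the parameter appears in forward, any optimizer that receives m.parameters() will update it.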


Thank you, this is helpful.