Why doesn't PyTorch provide properties for class names?

Hi everyone.
I come from a C++/C# background, and this has had me thinking for some time now. Here is my question:
Since PyTorch is dynamic by nature, there are many situations in which you'd be dealing with different classes on the fly and checking which class you are dealing with.

Currently, one way to access the class name is through __class__ and __class__.__name__ (class attributes).

My question is: since names with __ are supposed to be private and not accessed by the developer, why doesn't PyTorch provide a simple property to access them?

Simple things such as:

for m in model.modules():
    if (m.__class__.__name__ == torch.nn.Linear.__class__.__name__):
         ...

becomes needlessly long and ugly, let alone more complex checks.

I could ask this about the Python language itself, but I thought that maybe, for compatibility reasons, they cannot do that there; still, nothing stops a library from providing its own set of utility methods and properties.
So what is stopping you from providing such things that would make life easier and code much more readable?

Thanks in advance

Hi,

I think the point is that you shouldn't need to compare names?
For what you want, isinstance(m, torch.nn.Linear) is the pythonic way to do it, I think.
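For instance, your loop from above could become the following (a minimal sketch; the tiny nn.Sequential model is just a hypothetical stand-in, since your actual model isn't defined here):

import torch.nn as nn

# hypothetical stand-in model, just for illustration
model = nn.Sequential(nn.Linear(4, 2), nn.ReLU())

for m in model.modules():
    if isinstance(m, nn.Linear):
        print(m)  # only the nn.Linear layers match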


Hi, thank you very much.
The example was given in the Udacity PyTorch course and I just used it as an example.
But how do you get the type of any class?
It seems type() only works on tensors, and I couldn't find any way to query and get a class type!

I was actually trying to fine-tune a resnet18. I wanted to disable gradients for all layers except the last linear layer! The isinstance check simply doesn't work!
I wrote:

from torchvision import models
import torch.nn as nn

resnet18 = models.resnet18(pretrained=True)
resnet18.fc = nn.Linear(512, 10)

# try to freeze every module that is not an nn.Linear
for module in resnet18.modules():
    if not isinstance(module, nn.Linear):
        for param in module.parameters():
            param.requires_grad = False

What seems to work is comparing the names, like this:

for module in resnet18.modules():
    if module._get_name() != 'Linear':
        print('layer: ', module._get_name())
        for param in module.parameters():
            param.requires_grad_(False)
    elif module._get_name() == 'Linear':
        # explicitly re-enable gradients for the linear layer
        for param in module.parameters():
            param.requires_grad_(True)

Something like nn.Linear.__class__.__name__ doesn't give the proper name! It contains type as the name, which is weird!
That's why I'm asking.

On a side note:
Something weird happens: if I omit the second elif block, requires_grad of the resnet18.fc parameters ends up False, even though it is initially True and the first if clause clearly checks for all layers except 'Linear'.
I'd like to know why this is happening as well, and whether this is a bug!?

The isinstance comparison is working:

from torchvision import models
import torch.nn as nn

resnet18 = models.resnet18(pretrained=True)
resnet18.fc = nn.Linear(512, 10)

for module in resnet18.modules():
    if isinstance(module, nn.Linear):
        print(module)

> Linear(in_features=512, out_features=10, bias=True)

as it returns the only nn.Linear layer in the model.

However, note that modules() is applied recursively to your main model.
E.g. the first result will be the parent ResNet module itself.
If you call module.parameters() on it, you will get all parameters of the model, including the parameters of the linear layer.
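That recursion is also why your first loop ended up freezing the fc parameters: the parent ResNet module is not an nn.Linear, but its .parameters() already include those of fc. A minimal sketch of one common workaround (assuming you only want to train the new fc layer) is to freeze everything first and then re-enable gradients for fc:

from torchvision import models
import torch.nn as nn

resnet18 = models.resnet18(pretrained=True)
resnet18.fc = nn.Linear(512, 10)

# freeze every parameter in the model...
for param in resnet18.parameters():
    param.requires_grad = False

# ...then re-enable gradients only for the final linear layer
for param in resnet18.fc.parameters():
    param.requires_grad = True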


Also, __name__ and __class__ are special attributes defined by Python (not specific to PyTorch); they aren't really private, and you can expect them to always be there: https://docs.python.org/3/reference/datamodel.html


Thanks, but do you know why nn.Linear.__class__.__name__ doesn't have a proper name in it? It returns type; shouldn't it return 'Linear'?

Because obj.__class__ accesses the class (type) of the object. Classes are also objects in Python. Most classes are just instances of the type metaclass. Hence, if you do A_Class.__class__ or type(A_Class) you (often) get the type metaclass.

On the other hand, an instance of nn.Linear, e.g., nn.Linear(3, 4) has class nn.Linear so you get nn.Linear(3, 4).__class__ == nn.Linear and type(nn.Linear(3, 4)) == nn.Linear.

You probably now realize that if you want to get the name of nn.Linear, you should use nn.Linear.__name__ rather than nn.Linear.__class__.__name__, which returns the name of the metaclass, type.
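To make that concrete, here's a small sketch of the three cases:

import torch.nn as nn

layer = nn.Linear(3, 4)

print(type(layer).__name__)          # 'Linear' -> class of the instance
print(nn.Linear.__name__)            # 'Linear' -> name of the class itself
print(nn.Linear.__class__.__name__)  # 'type'   -> name of the metaclass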


Aha, that was great! Thanks a gazillion times sir, your explanation was great 🙂