torch.tensor doesn't work; it raises the following error: "TypeError: 'module' object is not callable"

torch.tensor([])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'module' object is not callable

torch.tensor([0, 2])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'module' object is not callable

What are possible reasons why this would happen?


I had the same issue in version 0.3.
Upgrading to version 0.4 solved it for me.


Ah yes, updating to the newest version did solve the issue. Thank you!

Hi,
I'm also facing the same issue. Can you please tell me how you updated the package? I tried

conda update pytorch

but it still shows the same version.

Thanks in advance

Try uninstalling pytorch and torchvision first, then update conda and reinstall them using the instructions from the website.
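As a rough sketch, assuming a conda environment and the packages are named pytorch and torchvision, the reinstall could look like this (the exact final command, including channel and CUDA options, should come from the selector on pytorch.org):

```shell
# Remove the old packages first
conda uninstall pytorch torchvision
# Update conda itself
conda update -n base conda
# Reinstall from the pytorch channel (adjust per the pytorch.org selector)
conda install pytorch torchvision -c pytorch
```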


Ahh. That worked. Thank you.

I had the same problem. Upgrading to 0.4 works, but I have to use 0.3 for other reasons. Is there any way to work around this issue while still using version 0.3? (The reason to avoid 0.4 is that the custom function seems to be very slow, so…) Can anyone help me with this? Thank you.

In older versions you could use:

torch.FloatTensor([])
torch.FloatTensor([1., 2.])

Any other supported type also works, e.g. torch.LongTensor.
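A minimal sketch of these typed constructors; they were the way to build tensors before torch.tensor existed in 0.4, and they still run on current PyTorch, producing float32 and int64 tensors respectively:

```python
import torch

# Typed constructors work in 0.3, where torch.tensor() does not exist yet
a = torch.FloatTensor([])          # empty float32 tensor
b = torch.FloatTensor([1., 2.])    # 1-D float32 tensor
c = torch.LongTensor([0, 2])       # 1-D int64 tensor

print(a.numel())   # 0
print(b.tolist())  # [1.0, 2.0]
print(c.tolist())  # [0, 2]
```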

For version 0.3, I want to add a learnable parameter like "self.scale = nn.Parameter(torch.tensor([10.0]), requires_grad=True)" in an nn.Module. But it throws an error: TypeError: 'module' object is not callable. If I use a tensor, torch.FloatTensor([10.0]), it is not learnable, right?

class PreActBottleneck(nn.Module):
    def __init__(self, inplanes, planes, stride=1, downsample=None):
        super(PreActBottleneck, self).__init__()
        self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False)
        self.scale = nn.Parameter(torch.tensor([10.0]), requires_grad=True)

You could still wrap the FloatTensor into a Parameter:

self.scale = nn.Parameter(torch.FloatTensor([10.]))
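A minimal check (using a hypothetical toy module, not the PreActBottleneck above) that the wrapped tensor really is learnable: nn.Parameter defaults to requires_grad=True and is registered by the module, so it shows up in module.parameters() and receives gradients on backward:

```python
import torch
import torch.nn as nn

class Scaled(nn.Module):
    """Toy module for illustration: multiplies input by a learnable scale."""
    def __init__(self):
        super(Scaled, self).__init__()
        # FloatTensor wrapped in Parameter works on 0.3 as well as newer versions
        self.scale = nn.Parameter(torch.FloatTensor([10.]))

    def forward(self, x):
        return self.scale * x

m = Scaled()
print(m.scale.requires_grad)        # True (Parameter default)
print(len(list(m.parameters())))    # 1 (the scale was registered)

out = m(torch.FloatTensor([1., 2.])).sum()
out.backward()
print(m.scale.grad)  # d(sum(scale * x))/d(scale) = sum(x) = tensor([3.])
```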

Thank you. It works.

Could you post your code so that we can have a look at it?

Sorry my apologies, I was making a stupid mistake. I am not facing this issue. Thanks for your help.