But my input is indeed a Variable, since I create it as input = Variable(torch.randn(2, 1, 32, 32)). I still can't figure out what's wrong with my script.
And my script is the following:

import numpy as np
import torch
import torch.nn as nn
from torch.autograd import Function, Variable

class DHTtransform(Function):
    @staticmethod
    def forward(ctx, input, center):
        ctx.center = center
        ctx.save_for_backward(input)
        dht = DHT2d(input.data.numpy())  # DHT2d is defined elsewhere in my script
        if ctx.center:
            dht = np.fft.fftshift(dht, axes=(2, 3))
        dht = torch.from_numpy(dht)
        return Variable(dht)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_variables
        grad_input = None
        if ctx.needs_input_grad[0]:
            dht = DHT2d(grad_output.data.numpy())
            if ctx.center:
                dht = np.fft.fftshift(dht, axes=(2, 3))
            dht = torch.from_numpy(dht)
            grad_input = Variable(dht)
        return grad_input

class DHTlayer(nn.Module):
    def __init__(self, center=False):
        super(DHTlayer, self).__init__()
        self.center = center

    def forward(self, input):
        return DHTtransform.apply(input, self.center)

Could someone help me out? Any advice would be appreciated.

I would suggest reading about Variables and Tensors. Basically, only Variables have the .data attribute (Tensors do not); .data is the underlying Tensor of a given Variable.
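To make the distinction concrete, here is a minimal sketch (in current PyTorch the Variable wrapper still exists as a thin compatibility alias, so this runs, but in the old 0.3-era API Variables and Tensors were genuinely distinct types):

```python
import torch
from torch.autograd import Variable

# Wrap a Tensor in a Variable, as in the original script.
v = Variable(torch.randn(2, 1, 32, 32))

# .data unwraps the Variable and gives you its underlying Tensor,
# which is what NumPy-based code (like .numpy()) needs.
t = v.data
```
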

Well, actually I tried that before, but it didn't work at all; a different error was raised saying that the input data must be a Tensor. What I want to figure out is the format of the data flowing through the network: are things like the data and the gradients Tensors or Variables?

Same line number? Different error? I can't fully debug it with just the code you have here, but if that's the case then my guess is that you need to get rid of the .numpy() call. Most PyTorch functionality requires either a Variable or a Tensor.

Can you explain what you mean by that question a bit more?

I have fixed the problem. It turns out that the data flowing through a neural network consists of Tensors, which means the forward and backward methods should return Tensors. So I removed the Variable wrapping, and it works fine!
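For anyone finding this thread later, a self-contained sketch of what the fix looks like. The DHT2d below is a stand-in I wrote so the snippet runs (a discrete Hartley transform over the last two axes via DHT(x) = Re(FFT(x)) − Im(FFT(x))); substitute your own implementation. The key change from the original script is that forward and backward return plain Tensors instead of Variables (and backward returns one entry per forward argument):

```python
import numpy as np
import torch
from torch.autograd import Function

def DHT2d(a):
    # Stand-in for the 2-D discrete Hartley transform over axes (2, 3):
    # DHT(x) = Re(FFT(x)) - Im(FFT(x)). Replace with your own DHT2d.
    F = np.fft.fft2(a, axes=(2, 3))
    return (F.real - F.imag).astype(a.dtype)

class DHTtransform(Function):
    @staticmethod
    def forward(ctx, input, center):
        ctx.center = center
        dht = DHT2d(input.data.numpy())
        if ctx.center:
            dht = np.fft.fftshift(dht, axes=(2, 3))
        return torch.from_numpy(dht)  # plain Tensor, no Variable(...) wrapping

    @staticmethod
    def backward(ctx, grad_output):
        grad_input = None
        if ctx.needs_input_grad[0]:
            # The DHT kernel is symmetric, so the transpose Jacobian
            # is the DHT itself: grad_input = DHT(grad_output).
            dht = DHT2d(grad_output.data.numpy())
            if ctx.center:
                dht = np.fft.fftshift(dht, axes=(2, 3))
            grad_input = torch.from_numpy(dht)
        return grad_input, None  # no gradient for the `center` flag
```

The trailing None matters: backward must return one gradient per argument of forward, so the non-Tensor `center` flag gets None.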

You’re welcome! One thing, though: you can’t get the gradients (after calling .backward()) if you only use Tensors. PyTorch builds an acyclic computation graph that is associated with leaf (user-created) Variables that have requires_grad=True.
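A minimal illustration of that point (using the old Variable wrapper, which current PyTorch keeps as a compatibility alias):

```python
import torch
from torch.autograd import Variable

# A user-created ("leaf") Variable with requires_grad=True accumulates
# gradients when .backward() is called on something computed from it.
x = Variable(torch.ones(2, 2), requires_grad=True)
y = (3 * x).sum()
y.backward()

# d(sum(3*x))/dx is 3 everywhere, stored in x.grad with x's shape.
print(x.grad)
```

Without requires_grad=True, x.grad would stay None and there would be nothing to read back after .backward().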

I would recommend checking out the documentation further. It’s quite thorough and you should be able to use Variables and Tensors to get the job done (although some functions only take Tensors).

I would recommend setting up one of the basic examples that PyTorch provides, like MNIST, and then looking through it to understand how all of the components fit together. Use print statements to see whether data is a Variable or a Tensor. Try breaking it to understand how it works and why it sometimes doesn’t work.

Also, make sure you read and understand this. There’s a bunch of good documentation on the site.

What do you mean by “why the input variable can not be recognized as a variable”? Please post examples and errors when you run into issues.