PyTorch custom function RuntimeError

Hi, I have written a PyTorch custom function to implement a mathematical transform, but I came across a RuntimeError:

My input is indeed a Variable, since I create it as input = Variable(torch.randn(2, 1, 32, 32)), so I still can't figure out what's wrong with my script.
My script is the following:

import numpy as np
import torch
import torch.nn as nn
from torch.autograd import Function, Variable

class DHTtransform(Function):
	def forward(ctx, input, center):
		ctx.center = center
		ctx.save_for_backward(input)
		dht = DHT2d(input.data.numpy())
		if center:
			dht = np.fft.fftshift(dht, axes=(2, 3))
		dht = torch.from_numpy(dht)
		return Variable(dht)

	def backward(ctx, grad_output):
		input, = ctx.saved_variables
		grad_input = None
		if ctx.needs_input_grad[0]:
			dht = DHT2d(grad_output.data.numpy())
			if ctx.center:
				dht = np.fft.fftshift(dht, axes=(2, 3))
			dht = torch.from_numpy(dht)
			grad_input = Variable(dht)
		return grad_input

class DHTlayer(nn.Module):
	def __init__(self, center=False):
		super(DHTlayer, self).__init__()
		self.center = center

	def forward(self, input):
		return DHTtransform.apply(input, self.center)

Could someone help me out? Any advice would be appreciated.



I would suggest reading about Variables and Tensors. Basically, only Variables have the .data attribute (not Tensors); .data is the Tensor wrapped by a given Variable.

So you should be able to just do

dht = DHT2d(input.numpy())

Lemme know if that fixes your issue.

Well, actually I tried this before, but it didn't work at all: another error was raised saying that the input data must be a tensor. What I want to figure out is whether the data flowing through the network, such as activations and gradients, consists of Tensors or Variables.

Same line number? Different error? With the code you have here I can't fully debug it. But if that's the case, then my guess is that you need to get rid of the .numpy(). Most PyTorch functionality requires either a Variable or a Tensor.

Can you explain what you mean by that question a bit more?

I have fixed the problem. In fact, the data flowing through a neural network is Tensors, which means that the forward and backward methods should return Tensors. So I eliminated the Variable wrapping, and it works fine! :wink:
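To illustrate the pattern, here is a minimal sketch of a custom Function whose forward and backward operate on Tensors only, with no Variable wrapping. Since DHT2d isn't shown in the thread, np.fft.fftshift stands in as a placeholder transform; the names here are hypothetical, not the original code.

```python
import numpy as np
import torch
from torch.autograd import Function

class ShiftTransform(Function):
    # Sketch: a NumPy-backed transform. Both methods take and
    # return Tensors; nothing is wrapped in a Variable.
    @staticmethod
    def forward(ctx, input):
        # .detach() so .numpy() works even if input requires grad
        out = np.fft.fftshift(input.detach().numpy(), axes=(2, 3))
        return torch.from_numpy(out)

    @staticmethod
    def backward(ctx, grad_output):
        # The adjoint of fftshift along the same axes
        grad = np.fft.ifftshift(grad_output.detach().numpy(), axes=(2, 3))
        return torch.from_numpy(grad)

x = torch.randn(2, 1, 4, 4, requires_grad=True)
y = ShiftTransform.apply(x)
y.sum().backward()  # x.grad is now populated
```

Note that because the transform round-trips through NumPy, it only works on CPU tensors.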

Isaac, thank you for taking the time to help with this issue :blush:

You're welcome! One thing, though: you can't get the gradients (after calling .backward()) if you only use Tensors. PyTorch builds an acyclic graph rooted at leaf (user-created) Variables that have requires_grad=True.

I would recommend checking out the documentation further. It’s quite thorough and you should be able to use Variables and Tensors to get the job done (although some functions only take Tensors).
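A small sketch of that point (in current PyTorch, Variable still works but is just an alias for Tensor; the leaf/requires_grad idea is the same):

```python
import torch
from torch.autograd import Variable

# A leaf (user-created) Variable with requires_grad=True:
# gradients accumulate in its .grad attribute after .backward().
x = Variable(torch.ones(2, 2), requires_grad=True)
y = (3 * x).sum()
y.backward()
print(x.grad)  # 2x2 tensor filled with 3.0

t = torch.ones(2, 2)  # requires_grad defaults to False
# t.grad stays None: no graph is built for it
```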


You are quite right. I am still confused about why the input Variable could not be recognized as a Variable.

I would recommend setting up one of the basic examples that PyTorch ships, like MNIST, and then looking through it to understand how all of the components fit together. Use print statements to see whether data is a Variable or a Tensor. Try breaking it to understand how it works and why it sometimes doesn't work.

Also, make sure you read and understand this. There’s a bunch of good documentation on the site.

What do you mean by “why the input variable can not be recognized as a variable”? Please post examples and errors when you run into issues.