Linear Algebra Gradients

Hi,

I’m looking to back-propagate gradients through a singular value decomposition for regularisation purposes.

I know that I could write my own custom function that operates on a Variable: in the forward pass it takes the .data tensor, applies torch.svd to it, wraps the singular values in a Variable and returns them; in the backward pass it applies the appropriate Jacobian matrix to the incoming gradients.
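
Concretely, I imagine something roughly like the sketch below (the class name is mine, and it assumes distinct singular values, for which the gradient of s_k with respect to A is u_k v_k^T):

```python
import torch
from torch.autograd import Function

class SingularValues(Function):
    """Forward: singular values of A. Backward: assumes the singular
    values are distinct, so that d s_k / d A = u_k v_k^T."""

    @staticmethod
    def forward(ctx, A):
        U, S, V = torch.svd(A)        # A = U diag(S) V^T
        ctx.save_for_backward(U, V)
        return S

    @staticmethod
    def backward(ctx, grad_S):
        U, V = ctx.saved_tensors
        # Chain rule: dL/dA = U diag(dL/dS) V^T
        return U.mm(torch.diag(grad_S)).mm(V.t())

# Usage: S = SingularValues.apply(A)
```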

However, I was wondering whether there is a more elegant (and potentially faster) solution, where I could override the “Type Variable doesn’t implement stateless method svd” error directly, call LAPACK, etc.?

If someone could guide me through the appropriate steps and source files I need to look at, I’d be very grateful. I suppose these steps would apply similarly to other linear algebra operations that currently have no associated backward method.

Many thanks,
Max

You can use scipy/numpy to write a custom forward and backward! It’s quite straightforward.
Check out this tutorial: http://pytorch.org/tutorials/advanced/numpy_extensions_tutorial.html
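
For SVD specifically, the pattern from that tutorial would look roughly like this (an untested sketch; it again assumes distinct singular values, and it stays on CPU because of the NumPy round trip):

```python
import numpy as np
import torch
from torch.autograd import Function

class NumpySingularValues(Function):
    """Singular values with the forward pass routed through NumPy,
    following the numpy-extensions tutorial pattern (CPU only)."""

    @staticmethod
    def forward(ctx, A):
        U, S, Vt = np.linalg.svd(A.detach().numpy(), full_matrices=False)
        ctx.save_for_backward(torch.from_numpy(U), torch.from_numpy(Vt))
        return torch.from_numpy(S)

    @staticmethod
    def backward(ctx, grad_S):
        U, Vt = ctx.saved_tensors
        # Assumes distinct singular values: dL/dA = U diag(dL/dS) V^T
        return U.mm(torch.diag(grad_S)).mm(Vt)
```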

Hi! Thanks for your reply. I know I can write my own custom function. However, I’m looking for a solution that directly overrides the existing implementation of torch.svd, mainly for educational purposes, and because I’m wondering whether it might be quicker if I could call LAPACK directly, for example.

You cannot override torch.svd with your own function in a simple way.

Hello,

I encountered the same problem you described and could not find an answer.

Would you be able to tell me what you did in the end?

Did you write a new function, and if so would you be able to share it?

Hello Daniel,

Yes. In the end I did not change any of the backend C files. I just used PyTorch’s custom Function interface to define a new function and implemented backward myself in Python. It worked okay, I think. I can probably dig it out for you later today. Which linear algebra operation do you need? SVD?

That would be fantastic. Thanks!

Yes, I require SVD.

In fact, I am also searching for an implementation of svds (only the k largest singular values).
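
My current idea for svds is to wrap scipy.sparse.linalg.svds in the same kind of custom Function, roughly like this untested sketch (the class name is mine, and it assumes the k returned singular values are distinct):

```python
import torch
from scipy.sparse.linalg import svds
from torch.autograd import Function

class TopKSingularValues(Function):
    """Sketch: k largest singular values via scipy.sparse.linalg.svds,
    with the same rank-one gradient as a full SVD (distinct values
    assumed, k < min(A.shape), CPU only)."""

    @staticmethod
    def forward(ctx, A, k):
        U, S, Vt = svds(A.detach().numpy(), k=k)   # which="LM" is the default
        ctx.save_for_backward(torch.from_numpy(U.copy()),
                              torch.from_numpy(Vt.copy()))
        return torch.from_numpy(S.copy())

    @staticmethod
    def backward(ctx, grad_S):
        U, Vt = ctx.saved_tensors
        # Gradient w.r.t. A; None for the non-tensor argument k.
        return U.mm(torch.diag(grad_S)).mm(Vt), None

# Usage: S = TopKSingularValues.apply(A, 5)
```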

FYI, svd is now available in PyTorch master with full Variable support (forward + backward).
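
Usage is then just the ordinary autograd flow, roughly:

```python
import torch
from torch.autograd import Variable

# Needs a build where svd has a backward (master at the time of writing).
A = Variable(torch.randn(5, 3), requires_grad=True)
U, S, V = torch.svd(A)
loss = S.sum()
loss.backward()
print(A.grad)  # gradient of the sum of singular values w.r.t. A
```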

Does Simon’s comment help you, or do you still require help? I wasn’t aware that this functionality had already been added, but it’s good to know! Thanks, Simon!

Simon’s comment helped me out, but thanks for the offer!

Hi Simon,

I updated to torch 0.3 but am still getting the error “the derivative for ‘svd’ is not implemented” when calling loss.backward().

Am I doing something wrong?

Daniel, the derivative of svd was implemented after we released 0.3.
You can wait for the next release, or install from source using the instructions at: https://github.com/pytorch/pytorch#from-source

When you say “install from source”, is there a source to install from? (I ask since you said svd is implemented.)

Maybe I don’t understand your question, but you should use soumith’s link and follow the install guide.

Apologies, I am new to Torch; I just understood the previous comment.

Thanks for the help!
