Hi all,
What is the reshape layer in PyTorch?
In Torch7 it seems to be nn.View, but what is it in PyTorch?
What I want is to add a reshape layer to an nn.Sequential.
Thanks.
We don’t recommend that. Use nn.Sequential only for trivial sequences; if you need to insert some reshaping or views, wrap it in a container. You can see how the torchvision models are implemented.
Hi, good example. Thanks.
What is not recommended? My network is a bit complex, so I use nn.Sequential.
BTW, if I do not use nn.Sequential, what is the reshape layer in PyTorch?
Thank you.
There’s no reshape layer. You just call .view on the output you want to reshape in the forward function of your custom model.
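A minimal sketch of that pattern (the layer sizes here are made up for illustration):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(12, 4)

    def forward(self, x):
        # reshape with .view instead of a dedicated layer:
        # (N, 3, 4) -> (N, 12)
        x = x.view(x.size(0), -1)
        return self.fc(x)
```

Here the flattening happens inline in forward, so no extra module is needed.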
I didn’t find how reshape is wrapped in a container in that example. Could you elaborate a little more?
Thanks!
Thanks for your reply, but that is still in the forward function.
How could I do something like self.node = nn.Sequential(*layers), where layers contains a reshape, so that later I only need to call self.node(input)?
If you really want a reshape layer, maybe you can wrap it in an nn.Module like this:
import torch.nn as nn

class Reshape(nn.Module):
    def __init__(self, *args):
        super(Reshape, self).__init__()
        self.shape = args

    def forward(self, x):
        return x.view(self.shape)
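For example, such a module can then sit inside an nn.Sequential (the class is repeated here so the snippet runs standalone, and the sizes are made up):

```python
import torch
import torch.nn as nn

class Reshape(nn.Module):
    def __init__(self, *args):
        super().__init__()
        self.shape = args

    def forward(self, x):
        return x.view(self.shape)

model = nn.Sequential(
    nn.Linear(16, 64),
    Reshape(-1, 4, 4, 4),  # (N, 64) -> (N, 4, 4, 4)
    nn.Flatten(),          # back to (N, 64)
)
out = model(torch.randn(2, 16))  # out.shape == (2, 64)
```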
Thanks~ but that is still a lot of code; a lambda layer like the one used in Keras would be very helpful.
We are not big on layers; in fact, you can avoid the entire Sequential and use a for loop over the layers instead.
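A sketch of that loop-based pattern, assuming an nn.ModuleList holds the layers and the reshape happens inline:

```python
import torch
import torch.nn as nn

class LoopNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = nn.ModuleList([nn.Linear(12, 12), nn.Linear(12, 4)])

    def forward(self, x):
        x = x.view(x.size(0), -1)  # reshape once, then loop over the layers
        for layer in self.layers:
            x = layer(x)
        return x
```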
Knowing that the Flatten() layer was recently added, how about adding a Reshape layer as well, for the very same reason Flatten() was added?
It just makes life easier, especially for newcomers from Keras. Also, it can come in handy in normal Sequential models as well.
I have to ask why reshaping does not count as “trivial”? The current way of working forces me to split the logic of the data flow across two places: the definition of the nn.Sequential and forward().
I think in PyTorch the way of thinking, differently from TF/Keras, is that layers are generally used for some process that requires gradients; Flatten(), Reshape(), Add(), etc. are just formal operations with no gradients involved, so you can just use helper functions like the ones in torch.nn.functional.*
…
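For example, reshaping can be done with plain tensor-level functions rather than modules (the shapes below are illustrative):

```python
import torch

x = torch.randn(2, 3, 4)
flat = torch.flatten(x, start_dim=1)   # (2, 3, 4) -> (2, 12)
back = torch.reshape(flat, (2, 3, 4))  # (2, 12) -> (2, 3, 4)
```

These are ordinary tensor operations, so they compose freely inside any forward function.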
There are some use cases where a Reshape() layer can come in handy, like in embedded systems: you add a reshape as the first layer of your model so that the whole model can be compacted and flashed to the device, and the reshape adjusts the incoming data from the sensors…
For high-level DL, those layers are more confusing than beneficial…
I think the nn.Unflatten() layer may do the job. It can be inserted into a Sequential model.
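A quick sketch of that: nn.Flatten and nn.Unflatten can both sit directly inside an nn.Sequential (the shapes here are made up):

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Flatten(),             # (N, 3, 4) -> (N, 12)
    nn.Linear(12, 12),
    nn.Unflatten(1, (3, 4)),  # (N, 12) -> (N, 3, 4)
)
out = model(torch.randn(2, 3, 4))  # out.shape == (2, 3, 4)
```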