The 1th argument dim does not have a specified value or default value


I’m trying to deploy an image-to-image model on Android. The model consists of an encoder and a decoder. The encoder works fine, but this issue comes up in the decoder. The model takes one tensor as input and should return a tensor of size [1, 3, 1024, 1024] as output, but instead the app crashes with a fatal error.

// Build a 1-D style tensor of shape [3] from three float values.
final long[] styleShape = new long[]{3};
final FloatBuffer styledBuffer = Tensor.allocateFloatBuffer(3);
styledBuffer.put(-0.4951f);
styledBuffer.put(-1.0065f);
styledBuffer.put(-1.0413f);

Tensor styledTensor = Tensor.fromBlob(styledBuffer, styleShape);
IValue style = IValue.from(styledTensor);
final Tensor styledOutputTensor = decoderModule.forward(style).toTensor();
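To rule out a shape mismatch before going to the device, it may help to check off-device that the scripted decoder accepts a flat [3] tensor at all. A minimal sketch with a stand-in module (`TinyDecoder` is a placeholder, not the real decoder):

```python
import torch
import torch.nn as nn

class TinyDecoder(nn.Module):
    # Stand-in for the real decoder: reshapes the flat 3-value style
    # vector the same way forward() does, then emits a fixed-size image.
    def forward(self, style_tensor: torch.Tensor) -> torch.Tensor:
        style = style_tensor.view(1, 1, 3, 1)  # needs exactly 3 elements
        return torch.tanh(style.new_zeros(1, 3, 8, 8))

scripted = torch.jit.script(TinyDecoder())

# Same three values the Java side writes into the FloatBuffer.
out = scripted(torch.tensor([-0.4951, -1.0065, -1.0413]))
print(out.shape)  # torch.Size([1, 3, 8, 8])
```

If the scripted real decoder also rejects the flat [3] tensor here, the problem is in the export step rather than in the Android code.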

As a test, I hard-coded random tensors of similar shapes inside forward() (shown below), and then I didn’t get the error.

def forward(self, style_tensor):
    spade_input = None
    pure_generation = False

    # Hard-coded random inputs of the expected shapes (for debugging):
    content_list = []
    content_list.append(torch.rand([1, 192, 256, 256]))
    content_list.append(torch.rand([1, 5, 512, 512]))
    content_list.append(torch.rand([1, 5, 1024, 1024]))
    style_tensor = torch.randn(1, 1, 3, 1)
    style_tensor = style_tensor.view(1, 1, 3, 1)

    if self.pred_adain_params:
        adain_params = self.adain_net(style_tensor)
        self.assign_adain_params(adain_params, self)

    if self.pred_conv_kernel:
        assert style_tensor.size(0) == 1, 'prediction of convolution does not work with batch size > 1'
        self.assign_style(style_tensor.view(1, -1), self)

    tensor = module_list_forward(self.body, content_list[0], spade_input)
    for skip_content, up_layer, up_postprocess_layer, skip_preprocess_layer in zip(
            content_list[1:], self.upsample_head, self.upsample_postprocess, self.skip_preprocess):
        tensor = up_layer(tensor)
        skip_tensor = skip_preprocess_layer(skip_content)
        tensor = torch.cat([tensor, skip_tensor], 1)
        tensor = up_postprocess_layer(tensor)
    for layer in self.model_postprocess:
        tensor = layer(tensor)

    # Keep the first 3 channels and squash values to [-1, 1].
    tensor = torch.tanh(tensor[:, :3])

    # ret = torch.randn([1, 3, 1024, 1024])
    return tensor
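For reference, the final line before the return only slices channels and applies tanh; this standalone check (with small illustrative shapes, not the real ones) confirms it preserves the spatial size and keeps values in [-1, 1]:

```python
import torch

# Pretend post-processing output with 5 channels (illustrative shape).
tensor = torch.randn(1, 5, 4, 4)

# Same pattern as in forward(): keep the first 3 channels, tanh-squash.
out = torch.tanh(tensor[:, :3])
print(out.shape)  # torch.Size([1, 3, 4, 4])
```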

I’m expecting a Tensor back, but instead I get the error above.