Too many indices for tensor of dimension 1

Good day all,

I was trying to do some signal processing in PyTorch. I have written the same code in TensorFlow and it worked, but the PyTorch version below does not.
Thank you.

import os
import torch
from torch.autograd import Variable
import numpy as np

class Modulator(object):

    def __init__(self, mod_type, K):
        # Set modulation type
        if mod_type not in ['BPSK', '4PAM']:
            raise Exception('Modulator: Unknown modulation format')
        self.mod_type = mod_type
        self.K = K

        # Create constellation
        if self.mod_type == 'BPSK':
            self.constellation = np.array([-1.0, 1.0])
            self.constellation = torch.from_numpy(self.constellation)
        elif self.mod_type == '4PAM':
            self.constellation = np.array([-3.0, -1.0, 1.0, 3.0])
            self.constellation = torch.from_numpy(self.constellation)

        self.constellation_size = self.constellation[:,0].shape

        # Normalize constellation to unit power and convert to tensor
        self.constellation /= torch.sqrt(torch.mean(torch.abs(self.constellation)**2))
        self.constellation = torch.Variable(self.constellation, requires_grad = False)
        return

    def random_indices(self, batch_size=4):
        '''Generate random constellation symbol indices'''
        indices = torch.FloatTensor(batch_size, self.K).uniform_(0, self.constellation_size).int()

        return indices

indices = Modulator('BPSK', 10)

When the code is run, the following error is returned:

IndexError                                Traceback (most recent call last)
in <module>
     33
     34         return indices
---> 35 indices = Modulator('BPSK', 10)

in __init__(self, mod_type, K)
     21         self.constellation = torch.from_numpy(self.constellation)
     22
---> 23         self.constellation_size = self.constellation[:,0].shape
     24
     25         # Normalize constellation to unit power and convert to tensor

IndexError: too many indices for tensor of dimension 1

Hi,

I have figured out the problem. Instead of:

self.constellation_size = self.constellation[:,0].shape

I now write:

self.constellation_size = self.constellation.shape[0]
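
As a sanity check, this minimal snippet (names chosen just for illustration) reproduces the difference: the constellation is a 1-D tensor, so it only accepts a single index.

import torch
import numpy as np

constellation = torch.from_numpy(np.array([-1.0, 1.0]))   # 1-D tensor of shape (2,)
print(constellation.shape[0])     # 2 -> number of constellation points
# constellation[:, 0]             # raises IndexError: too many indices for tensor of dimension 1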

However, another error still appears:
AttributeError                            Traceback (most recent call last)
in <module>
     33
     34         return indices
---> 35 indices = Modulator('BPSK', 10)

in __init__(self, mod_type, K)
     25         # Normalize constellation to unit power and convert to tensor
     26         self.constellation /= torch.sqrt(torch.mean(torch.abs(self.constellation)**2))
---> 27         self.constellation = torch.Variable(self.constellation, requires_grad = False)
     28         return
     29

AttributeError: module 'torch' has no attribute 'Variable'

But I have actually imported Variable from torch.autograd.

If you have already imported the Variable class, you don't need to prefix it with the torch namespace.
However, Variables have been deprecated since 0.4.0, so you should skip this step entirely and just use tensors.
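
Something along these lines should work; a minimal sketch with just the BPSK constellation (not tested against your full class):

import torch
import numpy as np

# Constellation as a plain tensor; no Variable wrapper needed since 0.4.0
constellation = torch.from_numpy(np.array([-1.0, 1.0]))
constellation_size = constellation.shape[0]

# Normalize to unit power; tensors created via from_numpy have
# requires_grad=False by default, so nothing else is required
constellation /= torch.sqrt(torch.mean(torch.abs(constellation)**2))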

Thank you very much. It worked.