LSTM TypeError: invalid combination of arguments

My LSTM implementation is below.
x_train has a shape of torch.Size([129, 252, 4]).

class LSTM(nn.Module):
  def __init__(self, input_size, hidden_size, num_layers, batch_size, bias):
    super(LSTM, self).__init__()
    self.input_size = input_size
    self.hidden_size = hidden_size
    self.num_layers = num_layers
    self.bias = bias
    self.batch_size = batch_size

    self.lstm = nn.LSTM(input_size, hidden_size, num_layers, bias)
    self.logsoftmax = torch.nn.LogSoftmax(dim=None)

  def forward(self, input):
    hidden = (torch.zeros(self.num_layers, self.batch_size, self.hidden_size),
              torch.zeros(self.num_layers, self.batch_size, self.hidden_size))
    lstm_out, hidden = self.lstm(input, hidden)
    y_pred = logsoftmax(lstm_out[-1, :, :])
    return y_pred

When I run this:

model = LSTM(4, 2, 1, 252, bias="True")
y_pred = model(x_train)

Colab gives a TypeError:

lstm() received an invalid combination of arguments - got (Tensor, tuple, list, str, int, float, bool, bool, bool), but expected one of:
 * (Tensor data, Tensor batch_sizes, tuple of Tensors hx, tuple of Tensors params, bool has_biases, int num_layers, float dropout, bool train, bool bidirectional)
      didn't match because some of the arguments have invalid types: (Tensor, !tuple!, !list!, !str!, !int!, !float!, !bool!, bool, bool)
 * (Tensor input, tuple of Tensors hx, tuple of Tensors params, bool has_biases, int num_layers, float dropout, bool train, bool bidirectional, bool batch_first)
      didn't match because some of the arguments have invalid types: (Tensor, !tuple!, !list!, !str!, int, float, bool, bool, bool)

Sorry for my grammatical mistakes; I am new to the forum.

Pass the bias argument as the bool value True instead of the string "True". nn.LSTM stores whatever you pass as bias and forwards it straight into the backend lstm() call, where it must be a bool (the has_biases slot); the string is exactly the !str! flagged in the invalid-combination error above:

model = LSTM(4, 2, 1, 252, bias=True)
x = torch.randn(129, 252, 4)
out = model(x)
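
That resolves the reported TypeError, but forward will then raise a NameError because it calls logsoftmax instead of self.logsoftmax, and LogSoftmax(dim=None) relies on deprecated behavior and needs an explicit dimension. Here is a minimal corrected sketch, assuming you want log-probabilities over the hidden features of the last time step (dim=1 is my assumption about your intent):

import torch
import torch.nn as nn

class LSTM(nn.Module):
  def __init__(self, input_size, hidden_size, num_layers, batch_size, bias):
    super(LSTM, self).__init__()
    self.input_size = input_size
    self.hidden_size = hidden_size
    self.num_layers = num_layers
    self.bias = bias
    self.batch_size = batch_size

    self.lstm = nn.LSTM(input_size, hidden_size, num_layers, bias=bias)
    # dim=1 is an assumption: normalize over the hidden features
    self.logsoftmax = nn.LogSoftmax(dim=1)

  def forward(self, input):
    # input: (seq_len, batch, input_size), since batch_first=False by default
    hidden = (torch.zeros(self.num_layers, self.batch_size, self.hidden_size),
              torch.zeros(self.num_layers, self.batch_size, self.hidden_size))
    lstm_out, hidden = self.lstm(input, hidden)
    # lstm_out[-1, :, :] is the last time step: (batch, hidden_size)
    return self.logsoftmax(lstm_out[-1, :, :])

model = LSTM(4, 2, 1, 252, bias=True)
y_pred = model(torch.randn(129, 252, 4))
print(y_pred.shape)  # torch.Size([252, 2])

Note that with the default batch_first=False, the 252 in x_train's shape [129, 252, 4] is the batch dimension (seq_len=129, batch=252, input_size=4), which is why batch_size=252 matches your hidden-state shapes here.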