How to transform a recall@top-k metric into regular recall?

Could anybody help me out by translating this recall@top-k computation into a "regular" recall?
It concerns a classification task with between 1027 and 3087 classes.
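To clarify what I mean by "regular" recall: true positives divided by all actual positives, i.e. TP / (TP + FN), with no top-k cutoff. Below is a minimal sketch of what I think that looks like on a single prediction tensor; the 0.5 threshold for turning probabilities into labels is just my assumption, and recall_regular is a name I made up.

import torch

def recall_regular(y_true, y_pred, threshold=0.5):
  # Binarize the predictions, then compute TP / (TP + FN) over all positions.
  pred_labels = (y_pred.detach() >= threshold)
  true_labels = (y_true == 1)
  TP = torch.sum(torch.logical_and(true_labels, pred_labels)).float()
  actual_positives = torch.sum(true_labels).float()   # TP + FN
  return (TP / actual_positives).item() if actual_positives > 0 else 0.0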

Current code for recall@10,20,30

import numpy as np
import torch

def recallTop(y_true, y_pred, rank=[10, 20, 30]):
  # y_true[x] and y_pred[x] are expected to be 2D tensors (e.g. visits x classes).
  outer = []
  for x in range(len(y_pred)):
    pred_value = torch.round(y_pred[x]).detach()   # binarize the predictions at 0.5
    true_value = y_true[x]
    # True positives over the full prediction (used as the denominator below).
    TP = torch.sum(torch.logical_and(true_value == 1, pred_value))
    inner = []
    for i in rank:
      # True positives restricted to the first i columns (top 10, 20, 30).
      TP_k = torch.sum(torch.logical_and(pred_value[:, :i] == 1, true_value[:, :i]))
      inner.append(TP_k)
    # Stack the per-rank counts and divide by the full true-positive count.
    avg = torch.stack(inner).float() / TP
    avg[torch.isnan(avg)] = 0   # guard against division by zero when TP == 0
    outer.append(avg.tolist())

  return np.array(outer).mean(axis=0)   # mean recall@10, @20, @30 over the batch
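For context, this is roughly how I call it on a dummy batch. The shapes below (2 samples, each a 2D tensor of 4 visits x 50 classes) and the random data are only for illustration; my real data has far more classes.

import torch

# Hypothetical dummy batch, just to show the expected shapes:
# each y_pred[x] / y_true[x] is a 2D tensor (visits x classes).
y_pred = torch.sigmoid(torch.randn(2, 4, 50))
y_true = (torch.rand(2, 4, 50) > 0.9).float()

print(recallTop(y_true, y_pred))   # e.g. array([recall@10, recall@20, recall@30])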

Current code in the training loop

train_recall.append(recallTop(y, output))
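If the recall_regular sketch above is the right idea, I assume I would track it in the same loop like this (train_recall_regular being a second list I would initialise next to train_recall):

train_recall_regular.append(
    np.mean([recall_regular(y[x], output[x]) for x in range(len(output))]))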

Current code at the end of every epoch

avg_train_recall = (np.array(train_recall)).mean(axis=0)

print("Epoch: {}/{}...".format(e + 1, epochs),
"Train Recall@10, Recall@20, Recall@30", avg_train_recall)