GPU and GPU memory usage are very low

GPU utilization is very low (1% to 10%, occasionally up to 30%) and GPU memory usage is also very low (less than 10%).
I am using a Transformer model with reinforcement learning to solve graph-related problems, so my data is basically a list of nodes with their features.

I feel the resources are not being used efficiently; kindly help me resolve this.

Here is what my DataLoader looks like:
training_dataloader = DataLoader(training_dataset, batch_size=opts.batch_size, num_workers=32, pin_memory=True)
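One quick way to check whether the input pipeline (rather than the model) is what starves the GPU is to time data loading and compute separately inside the training loop. A minimal, self-contained sketch, with a stand-in dataset and a placeholder compute step since I don't have your real model (`num_workers=0` so it runs anywhere; use your real settings):

```python
import time

import torch
from torch.utils.data import DataLoader, Dataset


class RandomNodeDataset(Dataset):
    """Stand-in for the real dataset: each sample is (size, 2) node coordinates."""
    def __init__(self, num_samples, size=20):
        self.num_samples = num_samples
        self.size = size

    def __len__(self):
        return self.num_samples

    def __getitem__(self, idx):
        return torch.rand(self.size, 2)


loader = DataLoader(RandomNodeDataset(2048), batch_size=512, num_workers=0)

load_time = 0.0
compute_time = 0.0
end = time.perf_counter()
for batch in loader:
    load_time += time.perf_counter() - end      # time spent waiting for data
    start = time.perf_counter()
    _ = (batch @ batch.transpose(1, 2)).sum()   # placeholder for the model step
    compute_time += time.perf_counter() - start
    end = time.perf_counter()

print(f"waiting on data: {load_time:.4f}s, compute: {compute_time:.4f}s")
```

If the data-waiting time dominates, the GPU is starved by the input pipeline; if compute dominates, the per-step work may simply be too small to keep the GPU busy.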
My data is generated randomly, as below:

import torch

def generate_instance(size, prize_type):
    # For details, see the paper
    MAX_LENGTHS = {
        20: 2.,
        50: 3.,
        100: 4.
    }

    loc = torch.FloatTensor(size, 2).uniform_(0, 1)
    depot = torch.FloatTensor(2).uniform_(0, 1)
    # Methods taken from Fischetti et al. 1998
    if prize_type == 'const':
        prize = torch.ones(size)
    elif prize_type == 'unif':
        prize = (1 + torch.randint(0, 100, size=(size,))) / 100.
    else:  # Based on distance to depot
        assert prize_type == 'dist'
        prize_ = (depot[None, :] - loc).norm(p=2, dim=-1)
        prize = (1 + (prize_ / prize_.max(dim=-1, keepdim=True)[0] * 99).int()).float() / 100.

    return {
        'loc': loc,
        'prize': prize,
        'depot': depot,
        'max_length': torch.tensor(MAX_LENGTHS[size])
    }
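If generate_instance is called inside the dataset's __getitem__, the CPU workers regenerate every instance on each pass, which can keep the GPU waiting. One option is to generate the whole dataset once up front so __getitem__ only indexes into memory. A sketch, assuming a num_samples parameter and showing only the 'const' prize branch for brevity:

```python
import torch
from torch.utils.data import Dataset

MAX_LENGTHS = {20: 2., 50: 3., 100: 4.}


def generate_instance(size, prize_type='const'):
    # Only the 'const' prize branch, for brevity
    return {
        'loc': torch.FloatTensor(size, 2).uniform_(0, 1),
        'prize': torch.ones(size),
        'depot': torch.FloatTensor(2).uniform_(0, 1),
        'max_length': torch.tensor(MAX_LENGTHS[size]),
    }


class PregeneratedDataset(Dataset):
    """Generate every instance once at construction; __getitem__ only indexes."""
    def __init__(self, num_samples, size=20, prize_type='const'):
        self.data = [generate_instance(size, prize_type) for _ in range(num_samples)]

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return self.data[idx]


dataset = PregeneratedDataset(num_samples=64)
```

This trades memory for CPU time; since each instance is only a few small tensors, even millions of samples fit comfortably in RAM.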

Thanks!
Anil

What is your batch size? You could try increasing it and see what your GPU usage is after that.
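It can also help to profile a few steps to see where the time actually goes. A generic sketch using torch.profiler, with a toy model standing in for yours:

```python
import torch
from torch import nn, profiler

# Toy model standing in for the real Transformer
model = nn.Sequential(nn.Linear(128, 128), nn.ReLU(), nn.Linear(128, 128))
x = torch.rand(512, 128)

# Add profiler.ProfilerActivity.CUDA to the list when running on a GPU
with profiler.profile(activities=[profiler.ProfilerActivity.CPU]) as prof:
    for _ in range(3):
        model(x).sum().backward()

table = prof.key_averages().table(sort_by="cpu_time_total", row_limit=5)
print(table)
```

The table shows which operators dominate; if most time sits outside the model's ops, the bottleneck is likely in data loading or other host-side code.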

Batch size is 512, and I think that's large enough, but I'll try increasing it.