MNIST autograd profiling

(Dhakshiin) #1


In the code below, I want to collect profiling information for each layer of my model.
Is the placement of the profiler context manager correct?

def train(args, model, device, train_loader, optimizer, epoch):
    model.train()
    for batch_idx, (data, target) in enumerate(train_loader):
        data, target = data.to(device), target.to(device)
        optimizer.zero_grad()  # clear gradients from the previous batch

        # profile the forward and backward pass of this batch
        with torch.autograd.profiler.profile() as prof:
            output = model(data)
            loss = F.nll_loss(output, target)
            loss.backward()
        # results are only complete after the context exits
        print(prof)

        optimizer.step()
        if batch_idx % args.log_interval == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                100. * batch_idx / len(train_loader), loss.item()))

Do I have to set "requires_grad=True" anywhere in the code before running the profiler?
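For reference, here is a minimal self-contained sketch of the same profiling pattern on a tiny stand-in model (the `nn.Sequential` network and random batch are placeholders, not the actual MNIST model). Note that parameters created by `nn.Module` layers already have `requires_grad=True`, so no extra flag is needed on the inputs:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical tiny model standing in for the MNIST network.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
data = torch.randn(8, 784)            # random batch; inputs need no grad
target = torch.randint(0, 10, (8,))

# Profile one forward/backward pass. Module parameters already have
# requires_grad=True, so autograd records the backward ops automatically.
with torch.autograd.profiler.profile() as prof:
    output = model(data)
    loss = F.nll_loss(F.log_softmax(output, dim=1), target)
    loss.backward()

# Aggregate per-op statistics instead of printing every raw event.
print(prof.key_averages().table(sort_by="cpu_time_total", row_limit=5))
```

`prof.key_averages()` groups the recorded events by operator name, which is usually easier to read than printing the raw `prof` object.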