Can I use `torch.utils.tensorboard` to see the compute time and memory of each step in a model? If so, how? Are there other ways to do this?

I want to check a model's per-step compute time and memory usage, and I found that TensorBoard in TensorFlow can do this. But when I read the documentation on the official website (https://pytorch.org/docs/stable/tensorboard.html?highlight=tensorboard), it isn't clear to me. Can I use it to see the network graph's compute time and memory? If so, how?
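For context, the closest thing I've found so far is `torch.autograd.profiler`, which reports per-operator time (and, on PyTorch >= 1.6, memory) but prints a table instead of showing it in TensorBoard. Here is a rough sketch of what I mean (the tiny model and shapes are just placeholders):

```python
import torch
import torch.nn as nn
from torch.autograd import profiler

# Placeholder model and input, just for illustration
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
x = torch.randn(8, 16)

# profile_memory=True needs PyTorch >= 1.6
with profiler.profile(profile_memory=True, record_shapes=True) as prof:
    model(x)

# Per-operator timing/memory summary, sorted by total CPU time
print(prof.key_averages().table(sort_by="cpu_time_total", row_limit=10))
```

Is this the intended way, or is there a way to get this information into TensorBoard itself?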