Clone and detach in v0.4.0

Sorry if this is repetitive, but I still don’t get it. What is wrong with doing clone first and then detach, i.e. .clone().detach()?

If we clone and then detach, we still get a new tensor with its own memory, and we’ve blocked the gradient flow to the earlier graph.

If we do .detach().clone(), then we first create a tensor that shares the same memory but is cut off from the old gradient flow, and then we make a clone of it, so now it has its own memory (and since it’s a copy of the detached tensor, it still has no gradient flow to the earlier part of the graph).

These seem equivalent to me. Are they not? Is there an error in my reasoning?
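To make sure I’m describing the same thing, here is a minimal sketch of the two orderings (x and y are just toy tensors I made up for illustration):

```python
import torch

# Both orderings should give a tensor with its own memory
# and no connection to the original graph.
x = torch.ones(3, requires_grad=True)
y = x * 2  # y is part of the autograd graph

a = y.clone().detach()  # copy first, then cut the copy from the graph
b = y.detach().clone()  # cut from the graph first, then copy

print(a.requires_grad, b.requires_grad)  # False False
print(a.data_ptr() == y.data_ptr())      # False -> separate memory
print(b.data_ptr() == y.data_ptr())      # False -> separate memory

# Writing into either copy does not touch y
a += 1
b += 1
print(y)  # unchanged
```

As far as I can tell, both a and b end up detached with their own storage.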