Achieving something like _foreach_clone?

Hi there! I have been exploring the multi-tensor operators (torch._foreach_*) to implement a custom optimizer, and I am wondering why there isn't something like _foreach_clone to copy a list of tensors. Right now I am achieving this with _foreach_add(..., 0), but that feels more like a hack. Is there something obvious I am missing here? Or should I just go with a list comprehension instead?
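For reference, here is a minimal sketch of the two approaches I mean (the tensor shapes are just placeholders):

```python
import torch

params = [torch.randn(3, 3) for _ in range(4)]

# Current workaround: the multi-tensor add with a scalar of 0
# allocates and returns new output tensors, so it behaves like a clone.
copies = torch._foreach_add(params, 0)

# Plain alternative: an ordinary list comprehension with .clone().
copies_lc = [p.clone() for p in params]

# Both give independent storage with identical values.
assert all(c.data_ptr() != p.data_ptr() for c, p in zip(copies, params))
assert all(torch.equal(c, p) for c, p in zip(copies_lc, params))
```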