Question about gradient calculation in backward() for the actor network in DDPG

Hi,
My question is about gradient calculation and computational graphs in DDPG.

The following code snippet performs the actor-critic update in DDPG.

self.actor_optimizer = torch.optim.Adam(self.actor.parameters(), lr=actor_lr, weight_decay=actor_decay)
self.critic_optimizer = torch.optim.Adam(self.critic.parameters(), lr=critic_lr, weight_decay=critic_decay)

# Sample from the experience replay buffer
state, action, next_state, reward, not_done = replay_buffer.sample(batch_size)

# Compute the target Q-value
# (the computational graph for target_Q is created here)
target_Q = self.critic_target(next_state, self.actor_target(next_state))
# detach() makes a new tensor cut off from that graph
target_Q = reward + (not_done * self.discount * target_Q).detach()

# Get the current Q-value estimate
# (the computational graph through self.critic is created here)
current_Q = self.critic(state, action)

# Compute the critic loss
critic_loss = F.mse_loss(current_Q, target_Q)

# Optimize the critic
self.critic_optimizer.zero_grad()
# gradients w.r.t. the weights of self.critic are computed and the graph is freed
critic_loss.backward()
# the weights of self.critic are updated from those gradients
self.critic_optimizer.step()

# Compute the actor loss
# (the computational graph through self.critic and self.actor is created here)
actor_loss = -self.critic(state, self.actor(state)).mean()

# Optimize the actor
self.actor_optimizer.zero_grad()
actor_loss.backward()
self.actor_optimizer.step()

My first question: does actor_loss.backward() compute gradients w.r.t. the weights of both self.critic and self.actor? If so, the gradients of self.critic computed during critic_loss.backward() are still present, so the critic's gradients appear to accumulate. However, this should not affect the actor update, because self.actor_optimizer only manages self.actor's parameters. Do I understand this correctly?
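To make the question concrete, here is a minimal sketch with hypothetical stand-in networks (a two-input linear "critic" and a one-layer "actor"; all names and shapes are illustrative, not from the DDPG code above). It checks both halves of the question: the second backward() adds gradients into the critic's .grad buffers, but the actor optimizer's step() only touches the actor's parameters.

```python
import torch

torch.manual_seed(0)

# Hypothetical stand-ins for the critic and actor networks.
critic = torch.nn.Linear(2, 1)   # takes concatenated (state, action)
actor = torch.nn.Linear(1, 1)
actor_opt = torch.optim.SGD(actor.parameters(), lr=0.1)

state = torch.randn(4, 1)
action = torch.randn(4, 1)

# First backward pass: the critic loss populates critic.weight.grad.
critic_loss = critic(torch.cat([state, action], dim=1)).pow(2).mean()
critic_loss.backward()
critic_grad_after_critic_loss = critic.weight.grad.clone()

# Second backward pass: the actor loss flows through BOTH networks,
# so its gradients are ADDED to the critic's existing .grad buffers.
actor_loss = -critic(torch.cat([state, actor(state)], dim=1)).mean()
actor_loss.backward()

# The critic's gradients have accumulated (changed from the first pass) ...
assert not torch.allclose(critic.weight.grad, critic_grad_after_critic_loss)
# ... and the actor received gradients as well.
assert actor.weight.grad is not None

# But actor_opt.step() only reads actor.parameters(), so the stale
# critic gradients do not influence the actor update.
actor_opt.step()
```

The accumulated critic gradients are also harmless on the next iteration, since critic_optimizer.zero_grad() clears them before the next critic_loss.backward().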

My second question: the computational graph for target_Q does not appear to be freed. Is that correct? If so, how can I free the computational graph for target_Q without calling backward()?
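For reference, here is a small sketch (again with hypothetical stand-in networks) of the two usual ways this is handled: after detach(), rebinding the target_Q name drops the last Python reference to the graph, so it is garbage-collected without any backward() call; alternatively, torch.no_grad() avoids building the graph in the first place.

```python
import torch

# Hypothetical stand-ins for the target networks.
critic_target = torch.nn.Linear(2, 1)
actor_target = torch.nn.Linear(1, 1)
next_state = torch.randn(4, 1)
reward = torch.randn(4, 1)

# Option A: build the graph, then detach. Rebinding the name target_Q
# drops the last reference to the graph, which Python then frees.
target_Q = critic_target(torch.cat([next_state, actor_target(next_state)], dim=1))
target_Q = reward + 0.99 * target_Q.detach()
assert target_Q.grad_fn is None  # no graph attached to the new tensor

# Option B: never build the graph at all.
with torch.no_grad():
    target_Q2 = reward + 0.99 * critic_target(
        torch.cat([next_state, actor_target(next_state)], dim=1))
assert target_Q2.requires_grad is False
```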