Autograd for a matrix with a single parameter

I want to get the gradient of a loss with respect to a single scalar parameter. For example:

parameter: theta
A = [[cos(theta), -sin(theta)], [sin(theta), cos(theta)]]
x = [[1], [10]]

y = A x
z = ||y||^2
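
As a sanity check (my own algebra, assuming A stays exactly this rotation matrix): rotations preserve the norm, so z should not depend on theta at all and the true gradient should be zero:

z = ||A x||^2 = x^T A^T A x = x^T x = ||x||^2 = 1 + 100 = 101
dz/dtheta = 0

I mainly want autograd to reproduce this.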

I want to know how to write the code to get the gradient of z with respect to theta, where z is the loss function. My attempt:

import torch

theta = torch.tensor(1.0, dtype=torch.float32, requires_grad=True)
A = torch.Tensor([
    [torch.cos(theta), -torch.sin(theta)],
    [torch.sin(theta), torch.cos(theta)]
])

x = torch.tensor([[1], [10]], dtype=torch.float32)
y = torch.matmul(A, x)
z = torch.matmul(y.T, y)  # intended to be ||y||^2 (a 1x1 tensor)

But I found that A.requires_grad is False, and the same is true for z, so no gradient flows back to theta.
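
For reference, here is a sketch of what I think might work, building A with torch.stack so the entries stay connected to theta's computation graph instead of being copied out as plain numbers (I am not sure this is the idiomatic way, so corrections are welcome):

import torch

theta = torch.tensor(1.0, requires_grad=True)

# Build A inside theta's computation graph instead of copying values:
# torch.stack keeps each entry connected to theta, so A.requires_grad is True.
row0 = torch.stack([torch.cos(theta), -torch.sin(theta)])
row1 = torch.stack([torch.sin(theta), torch.cos(theta)])
A = torch.stack([row0, row1])  # shape (2, 2)

x = torch.tensor([[1.0], [10.0]])
y = A @ x            # shape (2, 1)
z = (y ** 2).sum()   # scalar ||y||^2

z.backward()
print(theta.grad)    # should be ~0, since a rotation preserves ||x||

Is this the right approach, or is there a cleaner way?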