torch.nn.functional.interpolate memory issue

I am trying to downsample a .mp4 file for a vision project.

import torch
import torchvision

t = torchvision.io.read_video(r"path.mp4")       # returns (video, audio, info)
video_tensor = t[0]                               # video_tensor.shape ---> [853, 1080, 1920, 3], uint8
video_tensor = video_tensor.permute(0, 3, 1, 2)   # [853, 3, 1080, 1920]
video_tensor = torch.reshape(video_tensor, (video_tensor.shape[0], video_tensor.shape[1], -1))  # flatten H and W: [853, 3, 2073600]
video_tensor = torch.nn.functional.interpolate(video_tensor.float(), scale_factor=0.3)

I get this error:

RuntimeError: [enforce fail at ..\c10\core\CPUAllocator.cpp:72] data. DefaultCPUAllocator: not enough memory: you tried to allocate 22071398400 bytes. Buy new RAM!
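If I am computing it correctly, just the .float() conversion of the whole clip needs about 853 × 3 × 1080 × 1920 × 4 bytes ≈ 21 GB, which seems to be roughly the allocation the error is complaining about.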

Is this a viable way to downscale a video, or are there other functions I should be using?
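
For example, would something like the following chunked version make more sense? This is just a sketch: I am assuming interpolate should be given the 4D [N, C, H, W] tensor so that scale_factor applies to the spatial dimensions, and the 32-frame chunk size and bilinear mode are arbitrary choices on my part.

import torch
import torch.nn.functional as F
import torchvision

frames = torchvision.io.read_video(r"path.mp4")[0]   # [T, H, W, C], uint8
frames = frames.permute(0, 3, 1, 2)                  # [T, C, H, W]

downscaled_chunks = []
for chunk in frames.split(32):                       # process 32 frames at a time (arbitrary chunk size)
    small = F.interpolate(chunk.float(), scale_factor=0.3, mode="bilinear", align_corners=False)
    downscaled_chunks.append(small.to(torch.uint8))  # cast back to uint8 to keep memory down
downscaled = torch.cat(downscaled_chunks)            # [T, C, 324, 576]

The idea being that only one 32-frame chunk has to exist as float32 at a time, rather than the entire 853-frame clip.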

Cheers!