How do I free the memory held by a vision::models model when I'm no longer using it?

After prediction finished, my program was still holding about 8 GB of memory. How do I free it if I'm not going to run another prediction? I tried `detach()`, `delete`, and `c10::cuda::CUDACachingAllocator::emptyCache()`, but none of them seemed to release the memory.

    vision::models::WideResNet50_2 module_wideresnet_50_;
    auto anomaly_features = torch::jit::load("D:\\");
    auto ap = anomaly_features.attr("feature").toTensor().to(torch::kCUDA);
    torch::load(module_wideresnet_50_, "D:\\");
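For context, the release pattern I tried looks roughly like this. It is only a sketch: the explicit inner scope, the dummy input shape, and the `torch::cuda::synchronize()` call are my assumptions, not part of the snippet above; `c10::cuda::CUDACachingAllocator::emptyCache()` is the libtorch call I mentioned.

```cpp
#include <torch/torch.h>
#include <c10/cuda/CUDACachingAllocator.h>

void predict_then_release() {
    // Scope the model and tensors so their destructors run
    // (and drop their GPU allocations back into the cache)
    // before we ask the allocator to empty that cache.
    {
        auto input = torch::rand({1, 3, 224, 224}, torch::kCUDA);
        // ... load the model, run the forward pass, use the output ...
    }  // input and any other tensors created here are released

    // Make sure all queued CUDA work has finished before freeing.
    torch::cuda::synchronize();

    // Return cached, now-unused blocks to the CUDA driver.
    // This only helps if no live Tensor still references them.
    c10::cuda::CUDACachingAllocator::emptyCache();
}
```

My understanding is that `emptyCache()` can only return blocks that no live tensor references, which is why I scoped the tensors first; even so, the process still appears to hold the memory.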