How do I free vision::models' GPU memory when I'm not using it?

After prediction finishes, my program still holds about 8 GB of GPU memory. How do I free it if I won't run another prediction? I tried `detach`, `delete`, and `c10::cuda::CUDACachingAllocator::emptyCache()`, but none of them seemed to release the memory.

    // torchvision C++ models: https://github.com/pytorch/vision
    vision::models::WideResNet50_2 module_wideresnet_50_;

    // Load precomputed features and move them to the GPU
    auto anomaly_features = torch::jit::load("D:\\patchcore_features.pt");
    auto ap = anomaly_features.attr("feature").toTensor().to(torch::kCUDA);

    // Load the model weights, switch to inference mode, move to the GPU
    torch::load(module_wideresnet_50_, "D:\\patchcore_model.pt");
    module_wideresnet_50_->eval();
    module_wideresnet_50_->to(torch::kCUDA);
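For context on what I expected to work: `emptyCache()` only returns blocks that are already *unreferenced* to the driver, so the model and tensors must be destroyed first. Below is a minimal sketch of the pattern I understand should release the memory, assuming the model wrapper behaves like a `shared_ptr` (LibTorch `ModuleHolder`) and that the paths/identifiers are the same placeholders as above:

```cpp
#include <torch/torch.h>
#include <c10/cuda/CUDACachingAllocator.h>

void run_inference_once() {
    torch::NoGradGuard no_grad;  // don't build autograd graphs during inference

    {
        // Model and tensors live only inside this scope.
        vision::models::WideResNet50_2 model;
        torch::load(model, "D:\\patchcore_model.pt");
        model->eval();
        model->to(torch::kCUDA);

        auto features = torch::jit::load("D:\\patchcore_features.pt");
        auto ap = features.attr("feature").toTensor().to(torch::kCUDA);

        // ... run prediction here ...
    }  // model and ap are destroyed here; their CUDA blocks return to the caching allocator

    // Hand the now-unreferenced cached blocks back to the CUDA driver.
    c10::cuda::CUDACachingAllocator::emptyCache();
}
```

If this is the right pattern, my question is why the 8 GB is still resident afterwards in my case; if the member variable `module_wideresnet_50_` keeps the model alive, is resetting it (e.g. assigning a fresh default-constructed holder) before `emptyCache()` the intended way to drop the last reference?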