Hello!
I have recently been studying cuDNN. To be more specific, I’m trying to extract data from a cudnnBackendDescriptor_t variantPack and a cudnnBackendDescriptor_t executionPlan. However, the data I get back looks doubtful to me. Below are my code and the result.
The code
#include <cudnn.h>
#include <iostream>

void printVariantPackPointers(cudnnBackendDescriptor_t variantPack,
                              cudnnBackendDescriptor_t executionPlan) {
    void*   ptrValue        = nullptr;
    int64_t count1          = 0;
    int64_t count2          = 0;
    int64_t requested_count = 10;
    int64_t workspaceSize   = 32;

    // Query the workspace size of the execution plan.
    cudnnStatus_t status = cudnnBackendGetAttribute(
        executionPlan,
        CUDNN_ATTR_EXECUTION_PLAN_WORKSPACE_SIZE,
        CUDNN_TYPE_INT64,
        1,                // requestedElementCount
        &count1,          // elementCount written back by cuDNN
        &workspaceSize);  // output buffer
    if (status == CUDNN_STATUS_SUCCESS) {
        std::cout << "Workspace Size: " << workspaceSize << " bytes" << std::endl;
    } else {
        std::cout << "Failed to get workspace size: " << cudnnGetErrorString(status) << std::endl;
    }

    // Query the device data pointers of the variant pack.
    // Note: ptrValue is still nullptr at this point, so no output buffer is actually provided.
    status = cudnnBackendGetAttribute(
        variantPack,
        CUDNN_ATTR_VARIANT_PACK_DATA_POINTERS,
        CUDNN_TYPE_VOID_PTR,
        requested_count,  // requestedElementCount
        &count2,          // elementCount written back by cuDNN
        ptrValue);        // output buffer
    if (status == CUDNN_STATUS_SUCCESS) {
        std::cout << "Attribute (VOID_PTR): " << ptrValue << std::endl;
    } else {
        std::cout << "Failed to get attribute: " << cudnnGetErrorString(status) << std::endl;
    }
}
The result
Workspace Size: 0 bytes
Failed to get attribute: CUDNN_STATUS_NOT_SUPPORTED
I’m fairly sure the execution plan comes from PyTorch’s cuDNN run_conv_plan function. Is it possible that a convolution plan genuinely requires no workspace?
I’m also curious why the CUDNN_STATUS_NOT_SUPPORTED error happens; I’ve put a sketch of the call I expected to work below. I would appreciate it if someone could help me here.
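For reference, this is roughly the call I expected to work, passing an actual array of void* as the output buffer instead of a null pointer. This is just a sketch based on my reading of the cudnnBackendGetAttribute() documentation, not something I have verified:

    // Sketch only: query the variant pack data pointers into a real buffer.
    void*   dataPtrs[10] = {};  // output buffer for up to 10 device pointers
    int64_t returned     = 0;
    cudnnStatus_t st = cudnnBackendGetAttribute(
        variantPack,
        CUDNN_ATTR_VARIANT_PACK_DATA_POINTERS,
        CUDNN_TYPE_VOID_PTR,
        10,         // requestedElementCount
        &returned,  // number of pointers actually written
        dataPtrs);  // output buffer (array of void*)
    if (st == CUDNN_STATUS_SUCCESS) {
        for (int64_t i = 0; i < returned; ++i) {
            std::cout << "data pointer[" << i << "]: " << dataPtrs[i] << std::endl;
        }
    }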
Thank you in advance!