I am trying to write a custom CNN layer that applies a softmax to each convolution operation, so that each pixel in the output image is valued between [0, 1] and is the sum of the convolved pixels. An example of a TensorFlow implementation can be seen here.
Ideally, this should be trained with a binary cross-entropy loss. I tried the code below, but it does not train.
Could you check if your code runs on the CPU?
I assume F.binary_cross_entropy might throw an error, as it expects probabilities as the model output, while you are passing the raw output as pred_s_p.
Try to use F.binary_cross_entropy_with_logits instead.
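For example (placeholder shapes, just to show the difference between the two criteria):

```python
import torch
import torch.nn.functional as F

# Placeholder shapes just to illustrate the difference between the two losses.
logits = torch.randn(4, 1, 10, 10)                    # raw, unbounded model output
target = torch.randint(0, 2, (4, 1, 10, 10)).float()

# F.binary_cross_entropy expects probabilities in [0, 1], so the raw output
# has to be squashed (e.g. with a sigmoid) first.
loss_probs = F.binary_cross_entropy(torch.sigmoid(logits), target)

# F.binary_cross_entropy_with_logits applies the sigmoid internally and is
# numerically more stable, so it takes the raw output directly.
loss_logits = F.binary_cross_entropy_with_logits(logits, target)
```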
The model and data are on the GPU. F.binary_cross_entropy_with_logits works, but the loss does not change. Since I am applying a softmax, the output values should already be probabilities, which is why I thought I should use F.binary_cross_entropy.
You are applying the softmax to the weights, not to the output.
Depending on the distribution of your input, you will not get probabilities as the output, which would raise an error such as:
RuntimeError: Assertion `x >= 0. && x <= 1.' failed. input value should be between 0~1, but got -1.429985
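Here is a small toy example of what I mean (not your model, just a sketch):

```python
import torch
import torch.nn.functional as F

# Toy sketch of the issue: softmax normalizes the *weights* to sum to 1, but
# the *output* is still an unbounded weighted sum of the input values, so it
# can leave [0, 1] depending on the input distribution.
weight = torch.randn(1, 1, 3, 3)
weight = F.softmax(weight.view(1, -1), dim=1).view(1, 1, 3, 3)  # weights now sum to 1

x = torch.randn(1, 1, 10, 10) * 5            # arbitrary input distribution
out = F.conv2d(x, weight)
print(out.min().item(), out.max().item())    # values can easily fall outside [0, 1]
```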
Doesn’t this apply the softmax over all the pixels in the output image? I need to apply it to each conv operation. Let’s say we have a kernel of size [3, 3] and an image of size [10, 10].
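To make sure we are talking about the same thing, here is a rough sketch of what I mean, using F.unfold (my own illustration with placeholder shapes, not the linked TF code):

```python
import torch
import torch.nn.functional as F

# Rough sketch: use F.unfold to pull out every 3x3 window, apply the softmax
# inside each window instead of over the whole image, and then do the
# convolution's multiply-and-sum.
x = torch.randn(1, 1, 10, 10)                 # [N, C, H, W]
weight = torch.randn(1, 1, 3, 3)              # a single 3x3 kernel

patches = F.unfold(x, kernel_size=3)          # [1, 9, 64]: 9 values per window, 64 windows
patches = F.softmax(patches, dim=1)           # softmax over the 9 values of each window

w = weight.view(1, -1, 1)                     # [1, 9, 1]
out = (patches * w).sum(dim=1)                # [1, 64]: weighted sum per window
out = out.view(1, 1, 8, 8)                    # back to the spatial layout (no padding)
```

With this layout the softmax is computed inside every 3x3 window independently, not once over all pixels of the output image.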