Issue with weights and biases in fully connected linear layers vs. CNNs

Hi,
In the past I was working with fully connected linear layers, and there I noticed that each batch had different weights and biases. Now I am working with a CNN, and the weights and biases are the same for all batches. Can you explain why this is happening?

I don’t understand this claim, as “batch” usually refers to the input batch and does not carry any weights or biases. Could you explain in more detail what exactly you are referring to?
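To illustrate the point: a layer's weights and biases belong to the layer, not to any batch. Here is a minimal sketch (assuming PyTorch is installed; the variable names are just for illustration) showing that forwarding two different batches through the same `nn.Linear` layer leaves its parameters untouched, and that they only change when an optimizer step is applied:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
layer = nn.Linear(4, 2)  # one set of weights/biases, independent of any batch

# Two different input batches
batch_a = torch.randn(8, 4)
batch_b = torch.randn(8, 4)

w_before = layer.weight.clone()
_ = layer(batch_a)
_ = layer(batch_b)
# Forward passes alone never modify the parameters
same_after_forward = torch.equal(layer.weight, w_before)
print(same_after_forward)  # True

# Only a gradient step (backward + optimizer.step) updates them
opt = torch.optim.SGD(layer.parameters(), lr=0.1)
loss = layer(batch_a).sum()
loss.backward()
opt.step()
same_after_step = torch.equal(layer.weight, w_before)
print(same_after_step)  # False
```

So if the weights appear to differ "per batch", that is most likely because an optimizer step ran between batches during training, not because each batch owns its own parameters; this holds for linear layers and conv layers alike.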