Freezing weights in some layers, for select elements of a batch?

Does anyone know how to freeze the weights of a deep layer (to prevent backpropagation) for only some elements of a batch? I want all elements to still backpropagate through the final layer, and some but not all to backprop beyond it.
I know how to selectively freeze some elements of a batch (with a mask), as well as how to freeze a deep layer (`deep_layer.requires_grad_(False)`, or setting `requires_grad = False` on its parameters), but I can't think of a way to do both simultaneously. Thanks a lot!
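One technique that might combine the two (a sketch, not necessarily the only way): instead of freezing weights, blend the deep layer's output with a detached copy of itself using a per-sample mask. Gradients for the masked-out samples stop at the blend point, so they still update the final layer but never reach the deep layer. The names `deep`, `final`, and `keep` below are made up for illustration:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

deep = nn.Linear(4, 4)   # hypothetical "deep" layer to protect
final = nn.Linear(4, 1)  # final layer, trained on every sample

x = torch.randn(3, 4)
# per-sample mask: 1.0 = backprop past this point, 0.0 = stop here
keep = torch.tensor([[1.0], [0.0], [1.0]])

h = deep(x)
# For kept samples, gradients flow through `h` as usual; for the rest,
# the detached copy carries the same forward value but no gradient,
# so `deep`'s weights only see gradients from the kept samples.
h = keep * h + (1.0 - keep) * h.detach()

loss = final(h).sum()
loss.backward()
```

The forward pass is numerically unchanged (the blend is an identity on values), only the gradient flow differs; `final.weight.grad` accumulates contributions from all three samples, while `deep.weight.grad` accumulates only from samples 0 and 2.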