Function to set elements below learned cutoff to zero

Is there a function/layer that can take in an input and set elements below some learned parameter to 0, leaving those above that learned parameter the same?

What do you mean by this?
Are you looking for torch.clamp?

Not exactly; clamp takes values lower than the minimum and raises them to the minimum. I am looking for the opposite in some sense: taking values lower than some threshold (but above 0) and setting them to 0, while leaving values above the threshold unchanged. Essentially I want a ReLU shifted right by some parameter, where that parameter is learned and tunable.
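Note that `torch.nn.Threshold` does something close to this (it replaces values below a threshold with a fixed value), but its threshold is a plain hyperparameter, not a learnable parameter. One catch with making the cutoff learnable is that the hard gate `x * (x > t)` has zero gradient with respect to `t` almost everywhere, so the threshold would never update. A common workaround is a sigmoid gate as a differentiable surrogate for the hard step. Here is a minimal sketch of a custom module along those lines; the module name and the `sharpness` parameter are my own invention, not a built-in API:

```python
import torch
import torch.nn as nn


class LearnedThreshold(nn.Module):
    """Zero out elements below a learnable threshold; keep the rest unchanged.

    A hard cutoff x * (x > t) gives the threshold t zero gradient almost
    everywhere, so during training we gate with a sigmoid instead.
    `sharpness` controls how closely the gate approximates a hard step.
    """

    def __init__(self, init_threshold=0.0, sharpness=50.0):
        super().__init__()
        # The threshold is a learnable scalar parameter.
        self.threshold = nn.Parameter(torch.tensor(float(init_threshold)))
        self.sharpness = sharpness

    def forward(self, x):
        # Soft gate: ~0 for x well below the threshold, ~1 well above it.
        gate = torch.sigmoid(self.sharpness * (x - self.threshold))
        return x * gate
```

At inference time you could swap in the hard version, `x * (x > module.threshold)`, once the threshold has been trained. If you want the hard cutoff during training as well, a straight-through estimator (hard gate in the forward pass, sigmoid gradient in the backward pass) is another option.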