Hello,
I am having some trouble dealing with masks. What I do is the following:
- I start with a (very) large set of data points.
- I mask them and take only the relevant ones.
- I make some calculations.
- I mask the results and take only the relevant ones.
…
Basically, I don’t know at the very beginning which data points are relevant and which are not, so to avoid useless calculations I perform consecutive masking operations. Each mask is expressed relative to the points that survived the previous one. At the end of the run, I want to merge all the masks into a single mask over the original data.
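To make the setup concrete, here is a toy illustration (made-up data and masks, just for the sake of the example) of what I mean by "relative" masks and by merging them:

```python
import torch

data = torch.arange(6)            # pretend these are the data points
first_level_mask = data % 2 == 0  # stage 1 keeps 0, 2, 4
stage1 = data[first_level_mask]
second_level_mask = stage1 > 0    # stage 2 is relative to stage-1 survivors: keeps 2, 4

# The merged mask over the original data should be True exactly at 2 and 4.
# (Writing into a fresh tensor, so there is no overlap between index and target.)
merged = torch.zeros_like(first_level_mask)
merged[first_level_mask] = second_level_mask
```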
I was working with PyTorch 1.7 until a couple of hours ago, and I used to do something like this:
import torch

fullMask = torch.ones(shapeTensor, dtype=torch.bool)
fullMask[fullMask] = first_level_mask
fullMask[fullMask] = second_level_mask
…
With PyTorch 1.9 this no longer works: I get the error “some elements of the input tensor and the written-to tensor refer to a single memory location. Please clone() the tensor before performing the operation.” I would definitely like to avoid spawning .clone() everywhere, since I am dealing with very large tensors.
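The only clone-free alternative I have come up with so far is to carry integer indices through the stages instead of assigning one boolean mask into another in place, and only build the merged mask at the end. A sketch with toy sizes and hypothetical masks (the names are just for illustration):

```python
import torch

n = 8
data = torch.arange(n, dtype=torch.float32)

first_level_mask = data > 2                       # stage 1 keeps indices 3..7
second_level_mask = data[first_level_mask] < 6    # relative to survivors: keeps 3, 4, 5

# Track the surviving flat indices; no tensor is both read from and written to.
idx = torch.arange(n)
idx = idx[first_level_mask]   # survivors after stage 1
idx = idx[second_level_mask]  # survivors after stage 2

# Merge into a single boolean mask only at the very end.
fullMask = torch.zeros(n, dtype=torch.bool)
fullMask[idx] = True
```

I am not sure this is idiomatic, though, and the extra index tensor has one entry per surviving point.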
I was wondering if someone could help me find the cleanest way to perform this operation.
Thank you in advance for the help.
Gabriele