Hi everyone!
I’m writing a program to add a watermark to a very simple FFNN model. To do this I have to manually change almost every weight in the model; the process is: take the value, convert it to its binary representation, flip the least significant bit (LSB), and convert it back to a float. A rough sketch of my helper functions is below.
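In case it helps, this is more or less what my helpers do (a simplified sketch; my real float_to_bin and operation may differ slightly, but they work on the 64-bit IEEE 754 pattern, which matches the 64-character strings in my log):

import struct

def float_to_bin(value):
    # pack the Python float into its 64-bit IEEE 754 pattern, then format as a bit string
    (bits,) = struct.unpack(">Q", struct.pack(">d", value))
    return format(bits, "064b")

def bin_to_float(bitstring):
    # inverse: turn a 64-character bit string back into a float
    return struct.unpack(">d", struct.pack(">Q", int(bitstring, 2)))[0]

def operation(value, bit):
    # set the least significant bit of the 64-bit representation to `bit` (0 or 1)
    (bits,) = struct.unpack(">Q", struct.pack(">d", value))
    return struct.unpack(">d", struct.pack(">Q", (bits & ~1) | bit))[0]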
Now, I managed to get everything working, but I’m running into a problem that I can’t solve, or even understand.
I take the value, modify it, check that the modification succeeded, and write it into the specific weight. The problem is that when I then verify the stored value, it has reverted to the initial weight instead of the modified one. Here is the log output to make this clearer:
param item -1.2060359716415405
rounded param -1.2060359716415405
changed param -1.2060359716415407
-----
bin param 1011111111110011010010111110110001100000000000000000000000000000
bin changed 1011111111110011010010111110110001100000000000000000000000000001
-----
param final -1.2060359716415405
-----
and the code that produces it:
print("param " + str(param[2,9]))
print("param item " + str(param[2,9].item()))        # .item() converts the tensor element to a Python float
work_value = round(param[2,9].item(), 16)
print("rounded param " + str(work_value))
lsb_changed = operation(work_value, 1)               # flip the LSB of the binary representation
print("changed param " + str(lsb_changed))
print("-----")
print("bin param " + str(float_to_bin(work_value)))
print("bin changed " + str(float_to_bin(lsb_changed)))
print("-----")
param[2,9] = lsb_changed                             # write the modified value back into the tensor
print("param final " + str(param[2,9].item()))       # but this prints the original value again
print("-----")
After digging into it a bit, I think I have more or less understood that it’s a precision problem in the representation of the value: when I write the new value back, PyTorch changes it in some way that I don’t understand. But this is just a guess.
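For what it’s worth, I can reproduce the behaviour with a tiny standalone example (assuming the weights are the default torch.float32, which would also explain the 29 trailing zero bits in my "bin param" line, since a float32 widened to 64 bits has exactly 29 zero mantissa bits at the end):

import torch

t = torch.tensor([-1.2060359716415405])  # default dtype is torch.float32
v = t[0].item()                          # .item() widens the float32 to a 64-bit Python float
changed = -1.2060359716415407            # v with the LSB of the 64-bit pattern flipped
t[0] = changed                           # written back, it is rounded to the nearest float32...
print(t[0].item())                       # ...which prints -1.2060359716415405 again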
Please help me, I’m almost going crazy.