Adnan1588
(Adnan Siraj Rakin)
November 16, 2017, 6:24pm
#1
I want to create a binary linear layer where the weights are binary during the forward pass, but the real-valued weights are retained for the backward pass. Something like this layer in (Lua) Torch:
--require 'randomkit'
local BinaryLinear, parent = torch.class('BinaryLinear', 'nn.Linear')
function BinaryLinear:__init(inputSize, outputSize, stcWeights)
   local delayedReset = self.reset
   self.reset = function() end
   parent.__init(self, inputSize, outputSize)
   self.reset = delayedReset
   self.weight = torch.Tensor(outputSize, inputSize)
   self.weightB = torch.Tensor(outputSize, inputSize)
   self.weightOrg = torch.Tensor(outputSize, inputSize)
   self.maskStc = torch.Tensor(outputSize, inputSize)
   self.randmat = torch.Tensor(outputSize, inputSize)
   self.bias = torch.Tensor(outputSize)
   self.gradWeight = torch.Tensor(outputSize, inputSize)
   self.gradBias = torch.Tensor(outputSize)
   self.stcWeights = stcWeights
   self:reset()
(code truncated)
How do I use the binary weights only during the forward pass and the real-valued weights during the backward pass?
smth
November 17, 2017, 5:18pm
#2
The right and clean solution is to create a custom autograd Function for this.
See this page for guidance on writing custom autograd Functions: http://pytorch.org/docs/master/notes/extending.html
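For the binary-forward / real-backward behaviour asked about above, the usual pattern is a straight-through estimator: binarize the weight inside a custom Function whose backward simply passes the gradient through, so the gradient accumulates on the real-valued weight tensor. Below is a minimal sketch, not taken from the linked Torch code; the names BinarizeWeight and BinaryLinear are placeholders, and it assumes a plain sign() binarization:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeWeight(torch.autograd.Function):
    """Binarize in forward, straight-through gradient in backward."""

    @staticmethod
    def forward(ctx, weight):
        # The forward pass sees only the binarized ({-1, 0, +1} via sign) weights.
        return torch.sign(weight)

    @staticmethod
    def backward(ctx, grad_output):
        # Pass the gradient through unchanged so it flows to the
        # real-valued weight tensor.
        return grad_output


class BinaryLinear(nn.Linear):
    def forward(self, input):
        # self.weight keeps its real values; only this forward product
        # uses the binarized copy.
        binary_weight = BinarizeWeight.apply(self.weight)
        return F.linear(input, binary_weight, self.bias)
```

Used like an ordinary layer:

```python
layer = BinaryLinear(784, 256)
out = layer(torch.randn(32, 784))
out.sum().backward()
# layer.weight still holds the real values; layer.weight.grad is the
# straight-through gradient, so the optimizer updates the real weights.
```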