I'm having a problem with my RBM code. Here is the problematic part:
def v_to_h(self, v):
    h_bias = (self.h_bias.clone()).expand(10)
    v = v.clone().expand(10)
    w = self.W.clone().squeeze()
    print(h_bias.size(), v.size(), w.size())
    print(h_bias.dim(), v.dim(), w.dim())
    p_h = F.sigmoid(
        F.linear(v, w, bias=h_bias)
    )
    sample_h = self.sample_from_p(p_h)
    return p_h, sample_h
It raises this error:
Traceback (most recent call last):
  File "/Users/bahk_insung/Documents/Github/ecg-dbn/model.py", line 67, in <module>
    v, v1 = rbm(sample_data)
  File "/Users/bahk_insung/miniforge3/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "/Users/bahk_insung/Documents/Github/ecg-dbn/RBM.py", line 55, in forward
    pre_h1, h1 = self.v_to_h(v)
  File "/Users/bahk_insung/Documents/Github/ecg-dbn/RBM.py", line 37, in v_to_h
    F.linear(v, w, bias=h_bias)
  File "/Users/bahk_insung/miniforge3/lib/python3.9/site-packages/torch/nn/functional.py", line 1850, in linear
    return torch._C._nn.linear(input, weight, bias)
RuntimeError: output with shape [] doesn't match the broadcast shape [10]
weight, v (the data from the visible layer), and bias all have the same size and dimension: size 10, 1-D. I've seriously been trying to figure this out for 3 days and still can't find what's wrong.
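For what it's worth, the shapes themselves may explain the error: F.linear computes input @ weight.T + bias, and when both input and weight are 1-D the matrix product degenerates into a dot product, which is a 0-dim scalar that the size-10 bias can't be broadcast onto. A minimal sketch reproducing this (with placeholder torch.ones tensors standing in for the real v, w, and h_bias, which are assumptions here):

```python
import torch
import torch.nn.functional as F

v = torch.ones(10)       # 1-D, shape [10], like v after expand(10)
w = torch.ones(10)       # 1-D, shape [10], like W after squeeze()
h_bias = torch.ones(10)  # 1-D, shape [10]

# With a 1-D weight, input @ weight.T is a dot product: a 0-dim scalar.
out = v @ w
print(out.shape)  # torch.Size([])

# F.linear then tries to add the shape-[10] bias onto that scalar
# output, which raises the same broadcast error as in the traceback:
# F.linear(v, w, bias=h_bias)  # RuntimeError: output with shape [] ...

# Keeping the weight 2-D as [out_features, in_features] avoids it:
w2 = torch.ones(10, 10)
out2 = F.linear(v, w2, bias=h_bias)
print(out2.shape)  # torch.Size([10])
```

If this is what's happening, the squeeze() on self.W is the likely culprit: it collapses a weight of shape [1, 10] or [10, 1] down to 1-D.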