I am trying to use the RFF kernel from the gpytorch library (https://gpytorch.ai).
The source code can be found below:
https://github.com/cornellius-gp/gpytorch/blob/master/gpytorch/kernels/rff_kernel.py
This kernel is basically an approximation to the RBF kernel via random Fourier features.
The forward function looks like this:
if last_dim_is_batch:
    x1 = x1.transpose(-1, -2).unsqueeze(-1)
    x2 = x2.transpose(-1, -2).unsqueeze(-1)
num_dims = x1.size(-1)
if not hasattr(self, "randn_weights"):
    self._init_weights(num_dims, self.num_samples)
x1_eq_x2 = torch.equal(x1, x2)
z1 = self._featurize(x1, normalize=False)
if not x1_eq_x2:
    z2 = self._featurize(x2, normalize=False)
else:
    z2 = z1
D = float(self.num_samples)
if diag:
    return (z1 * z2).sum(-1) / D
if x1_eq_x2:
    # Exploit low-rank structure, if there are fewer features than data points
    if z1.size(-1) < z2.size(-2):
        return LowRankRootLinearOperator(z1 / math.sqrt(D))
    else:
        return RootLinearOperator(z1 / math.sqrt(D))
else:
    return MatmulLinearOperator(z1 / D, z2.transpose(-1, -2))
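For intuition about why this works, here is a minimal, self-contained sketch (not the gpytorch implementation; the lengthscale is fixed at 1 and the weight matrix `W` is a stand-in for `randn_weights`) showing that random Fourier features z(x) = [cos(xW), sin(xW)] satisfy z1 @ z2.T / D ≈ RBF kernel:

```python
import torch

torch.manual_seed(0)

n, d, D = 5, 3, 2000  # D random frequency samples; approximation tightens as D grows
x1 = torch.randn(n, d)
x2 = torch.randn(n, d)

# Frequencies drawn from N(0, I) correspond to a unit-lengthscale RBF kernel
W = torch.randn(d, D)
z1 = torch.cat([torch.cos(x1 @ W), torch.sin(x1 @ W)], dim=-1)
z2 = torch.cat([torch.cos(x2 @ W), torch.sin(x2 @ W)], dim=-1)

# cos(a)cos(b) + sin(a)sin(b) = cos(a - b), so this is a Monte Carlo
# estimate of E[cos(w . (x1 - x2))] = exp(-||x1 - x2||^2 / 2)
approx = z1 @ z2.T / D
exact = torch.exp(-torch.cdist(x1, x2) ** 2 / 2)
print((approx - exact).abs().max())  # small, shrinking as D grows
```

This also shows why the forward above rescales by sqrt(D): the Gram matrix of z1 / sqrt(D) is exactly z1 @ z1.T / D.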
I only want access to z1, so I tried a simple modification of this file to return z1, like:
def forward(self, x1: Tensor, x2: Tensor, diag: bool = False, last_dim_is_batch: bool = False, **kwargs) -> Tensor:
    if last_dim_is_batch:
        x1 = x1.transpose(-1, -2).unsqueeze(-1)
        x2 = x2.transpose(-1, -2).unsqueeze(-1)
    num_dims = x1.size(-1)
    if not hasattr(self, "randn_weights"):
        self._init_weights(num_dims, self.num_samples)
    x1_eq_x2 = torch.equal(x1, x2)
    z1 = self._featurize(x1, normalize=False)
    return z1
This gives an error because the downstream gpytorch machinery expects forward to return an n-by-n covariance (effectively torch.matmul(z1, z1.T), as a LinearOperator), NOT the feature matrix z1 itself. Can someone please help?