Hi, I’m new to Python, PyTorch and deep learning, so please bear with me.

I’m following Nando de Freitas’ Oxford YouTube lectures, and one of the exercises asks us to construct a polynomial regression. I found an example here: Polynomial Regression

Now I’m trying to modify it to my needs, but I’m having issues. I think the problem is that make_features(x) produces a tensor x of size (10, 4, 2), while the tensor y_train has size (10). I need to align them, i.e. flatten x so that each sample is a single row, but I don’t know how to do it. I tried transforming it with numpy.reshape() but couldn’t get it to work. It’d be great if someone could help me out. Thanks! Code below:

```
from __future__ import print_function
from itertools import count
import torch
import torch.autograd
import torch.nn.functional as F
from torch.autograd import Variable
train_data = torch.Tensor([
    [40, 6, 4],
    [44, 10, 4],
    [46, 12, 5],
    [48, 14, 7],
    [52, 16, 9],
    [58, 18, 12],
    [60, 22, 14],
    [68, 24, 20],
    [74, 26, 21],
    [80, 32, 24]])

test_data = torch.Tensor([
    [6, 4],
    [10, 5],
    [4, 8]])

x_train = train_data[:, 1:3]
y_train = train_data[:, 0]

POLY_DEGREE = 4
input_size = 2
output_size = 1

def make_features(x):
    """Builds features i.e. a matrix with columns [x, x^2, x^3, x^4]."""
    x = x.unsqueeze(1)
    return torch.cat([x ** i for i in range(1, POLY_DEGREE + 1)], 1)

def poly_desc(W, b):
    """Creates a string description of a polynomial."""
    result = 'y = '
    for i, w in enumerate(W):
        result += '{:+.2f} x^{} '.format(w, len(W) - i)
    result += '{:+.2f}'.format(b[0])
    return result

def get_batch():
    """Builds a batch i.e. (x, f(x)) pair."""
    x = make_features(x_train)
    return Variable(x), Variable(y_train)

# Define model
fc = torch.nn.Linear(input_size, output_size)

for batch_idx in range(1000):
    # Get data
    batch_x, batch_y = get_batch()

    # Reset gradients
    fc.zero_grad()

    # Forward pass
    output = F.smooth_l1_loss(fc(batch_x), batch_y)
    loss = output.data[0]

    # Backward pass
    output.backward()

    # Apply gradients
    for param in fc.parameters():
        param.data.add_(-0.1 * param.grad.data)

    # Stop criterion
    if loss < 1e-3:
        break

print('Loss: {:.6f} after {} batches'.format(loss, batch_idx))
print('==> Learned function:\t' + poly_desc(fc.weight.data.view(-1), fc.bias.data))
# print('==> Actual function:\t' + poly_desc(W_target.view(-1), b_target))
```
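For reference, here's a standalone snippet reproducing just the shape problem, plus my guess at the fix (flattening with .view() so each sample becomes one row of 8 features, which would mean the Linear layer needs in_features=8 instead of 2 — not sure if that's the right approach):

```python
import torch

POLY_DEGREE = 4

# Same shape as x_train above: 10 samples, 2 input features
x = torch.rand(10, 2)

# What make_features() currently produces:
# unsqueeze(1) gives (10, 1, 2); concatenating the 4 powers
# along dim 1 gives (10, 4, 2)
feats = torch.cat([x.unsqueeze(1) ** i
                   for i in range(1, POLY_DEGREE + 1)], 1)
print(feats.size())  # torch.Size([10, 4, 2])

# My guess: flatten each sample's 4x2 block into a single row,
# so the features line up with y_train of size (10)
flat = feats.view(feats.size(0), -1)
print(flat.size())  # torch.Size([10, 8])
```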