Multiple Continuous Outputs from NN

I need to create a NN with 8 continuous inputs (predictors) and 5 continuous outputs. The code below works great with one output. If I change the line "y = dataset[:,8]" to "y = dataset[:,8:10]" and resize the train and test tensors with "y = torch.tensor(y, dtype=torch.float32).reshape(-1, 2)" (so that there are two columns for outputs instead of one), then I get errors, as expected. I've tried a zillion things, such as changing "y_pred = model(Xbatch)" to "y_pred1, y_pred2 = model(Xbatch)". Is there a best practice for this?


import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim
from sklearn.model_selection import train_test_split


# load the dataset, split into input (X) and output (y) variables
dataset = np.loadtxt('data.csv', delimiter=',')
X = dataset[:,0:8]
y = dataset[:,8]


X, x_test, y, y_test = train_test_split(X, y, test_size=0.2)
#x_test, x_val, y_test, y_val = train_test_split(x_test, y_test, test_size=0.5)
 
# train tensors
X = torch.tensor(X, dtype=torch.float32)
y = torch.tensor(y, dtype=torch.float32).reshape(-1, 1)
 
# test tensors
x_test = torch.tensor(x_test, dtype=torch.float32)
y_test = torch.tensor(y_test, dtype=torch.float32).reshape(-1, 1)


# define the model
model = nn.Sequential(
    nn.Linear(8, 22),
    nn.ReLU(),
    nn.Linear(22, 22),
    nn.ReLU(),
    nn.Linear(22, 8),
    nn.ReLU(),
    nn.Linear(8, 1)#,
   # nn.Sigmoid()
)
print(model)
 
# train the model
loss_fn   = nn.L1Loss()
optimizer = optim.Adam(model.parameters(), lr=0.001)
 
n_epochs = 200
batch_size = 100
 
for epoch in range(n_epochs):
    for i in range(0, len(X), batch_size):
        Xbatch = X[i:i+batch_size]
        y_pred = model(Xbatch)
        ybatch = y[i:i+batch_size]
        loss = loss_fn(y_pred, ybatch)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

Hi Dirk!

Replace your final Linear(8, 1) layer with Linear(8, 5). That will produce
5 continuous outputs for each sample you input.
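
For example, the pieces that change might look like this (a minimal sketch, assuming your five target columns are dataset[:,8:13]):

# targets: 5 continuous columns instead of 1 (assumed here to be columns 8-12)
y = dataset[:,8:13]

# after train_test_split, reshape to 5 columns instead of 1
y = torch.tensor(y, dtype=torch.float32).reshape(-1, 5)
y_test = torch.tensor(y_test, dtype=torch.float32).reshape(-1, 5)

# same model, but the final layer now maps to 5 outputs
model = nn.Sequential(
    nn.Linear(8, 22),
    nn.ReLU(),
    nn.Linear(22, 22),
    nn.ReLU(),
    nn.Linear(22, 8),
    nn.ReLU(),
    nn.Linear(8, 5)
)

Your training loop and L1Loss work unchanged: y_pred = model(Xbatch) now has shape (batch_size, 5), ybatch has the same shape, and the loss averages over all elements, so there is no need to unpack the output into y_pred1, y_pred2, and so on.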

Best.

K. Frank


Well that was easy. Thank you very much!