PyTorch fully connected feed-forward network for a regression problem - returns the same result for every input

Time: 2019-04-05 12:20:47

Tags: deep-learning regression pytorch

I built a neural network model in PyTorch to solve a simple regression problem (w1x1 + w2x2 + w3x3 = y). I generated 2000 training records, where x1, x2, x3 are random values and the weights are fixed at W1 = 4, W2 = 6, W3 = 2. I also created a test dataset of 20 records containing only the x1, x2, x3 values, for which I expected to get the predicted y values, but the model returns the same value for all 20 input rows. I don't know where the problem is. The code snippet is below.
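For reference, the 2000-row training set described above can be generated along the following lines. This is a minimal sketch, assuming uniform random inputs and no noise; the DataFrame name df matches the snippet below, but the exact value ranges and generation code are not shown in the post.

import numpy as np
import pandas as pd

# Assumed generation of the training data: random x1, x2, x3 combined
# with the fixed weights w1=4, w2=6, w3=2.
np.random.seed(0)
n = 2000
x = np.random.uniform(0, 10, size=(n, 3))
y = 4 * x[:, 0] + 6 * x[:, 1] + 2 * x[:, 2]

df = pd.DataFrame({'x1': x[:, 0], 'x2': x[:, 1], 'x3': x[:, 2], 'y': y})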

import torch

inputs = df[['x1', 'x2', 'x3']]
target = df['y']

inputs = torch.tensor(inputs.values).float()
target = torch.tensor(target.values).float()

test_data = torch.tensor(test_data.values).float()

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):

  def __init__(self):
    super(Net,self).__init__()

    hidden1 = 10
    hidden2 = 15

    self.fc1 = nn.Linear(3,hidden1)
    self.fc2 = nn.Linear(hidden1,hidden2)
    self.fc3 = nn.Linear(hidden2,1)


  def forward(self,x):
    x = F.relu(self.fc1(x))
    x = F.relu(self.fc2(x))
    x = self.fc3(x)
    return x

#instantiate the model

model = Net()
print(model)

criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(),lr=0.01)

model.train()

#epochs
epochs = 500


for x in range(epochs):
  #initialize the training loss to 0
  train_loss = 0
  #clear out gradients
  optimizer.zero_grad() 

  #calculate the output
  output = model(inputs)

  #calculate loss
  loss = criterion(output,target)

  #backpropagate
  loss.backward() 

  #update parameters
  optimizer.step()

  if ((x%5)==0):
    print('Training Loss after epoch {:2d} is {:2.6f}'.format(x, loss.item()))

#set the model in evaluation mode
model.eval()

#Test the model on unseen data

test_output = model(test_data)

print(test_output)
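One thing worth double-checking in a setup like this (an observation about the snippet, not a confirmed diagnosis): model(inputs) has shape (N, 1) while df['y'] yields a target of shape (N,), so nn.MSELoss silently broadcasts the two to (N, N) and the loss no longer measures the per-row error; recent PyTorch versions emit a UserWarning about exactly this. A quick shape check, and one possible adjustment, might look like:

# The output/target shapes going into the loss; (2000, 1) vs (2000,)
# triggers broadcasting inside MSELoss.
print(output.shape, target.shape)

# Possible fix: give the target an explicit second dimension so it
# matches the network output before computing the loss.
target = target.unsqueeze(1)          # shape (2000, 1)
loss = criterion(model(inputs), target)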

0 Answers:

No answers yet.