My linear regression algorithm is not working

Asked: 2019-10-18 07:26:31

Tags: machine-learning linear-regression

I wrote a linear regression algorithm, but the weights and the bias do not learn the correct values.

The training data is generated from y = x1 + x2, so w1, w2 and b should come out as 1, 1 and 0 respectively.
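Just to sanity-check those expected values, here is a quick closed-form least-squares check on the same noise-free data (an illustrative numpy sketch, separate from the training script below; note that because x2 is identical to x1 the two feature columns are collinear, so np.linalg.lstsq returns the minimum-norm solution, which is exactly w1 = w2 = 1, b = 0):

import numpy as np

N = 10
x1 = np.arange(N)
x2 = x1
y = x1 + x2                                  # noise-free targets

# design matrix [x1, x2, 1]; the last coefficient plays the role of the bias b
A = np.column_stack((x1, x2, np.ones(N)))
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)                                  # approximately [1. 1. 0.]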

However, none of them ends up at the correct value after training, and I can't see what is wrong with my code.

Thanks in advance :)

Here is my code:


import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import axes3d

import torch
import torch.nn as nn

#%%
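# generate the training data and scatter-plot it in 3D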
N = 10
x1 = np.arange(N)
x2 = x1
y = x1 + x2 + 2

fig = plt.figure()
ax = fig.add_subplot(111,
                     projection = '3d')
plt.plot(x1, x2, y, 'bo')

x1 = x1.reshape(-1,1)
x2 = x2.reshape(-1,1)
x = np.hstack((x1, x2))

x_data = torch.tensor(x, dtype = torch.float)
y_data = torch.tensor(y, dtype = torch.float)

x_data.cuda()
y_data.cuda()
#%%
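# linear model: a single nn.Linear layer mapping 2 inputs to 1 output (pred = w1*x1 + w2*x2 + b)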
class LR(nn.Module):
    def __init__(self, input_size, output_size):
        super().__init__()
        self.linear = nn.Linear(input_size, output_size)

    def forward(self, x):
        pred = self.linear(x)
        return pred

model = LR(2,1)

epochs = 100000
lr = 0.00001
check_freq = 1000


criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr = lr)


x1_axis = np.linspace(-4, 4, 100)
x2_axis = x1_axis

losses = []
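# full-batch gradient descent; every check_freq epochs, print the loss and draw the current fit in the 3D figure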
for i in range(epochs):
    pred = model(x_data)
    loss = criterion(pred, y_data)
    losses.append(loss.item())

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()


    if i%check_freq == 0:
        print("epoch: ", i, "loss:", loss.item())
        [w, b] = model.parameters()

        w1 = w[0][0].item()
        w2 = w[0][1].item()

        b = b[0].item()

        z = b + w1*x1_axis + w2*x2_axis
        plt.plot(x1_axis, x2_axis, z, 'r')
        plt.xlabel('x')
        plt.ylabel('y') 



for param in model.parameters():
    print(param)

plt.figure()
plt.plot(range(epochs), losses)
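For completeness, this is how I read off the learned parameters and compare the model's predictions with the targets after training (a small sketch that reuses model, x_data and y_data from the script above):

with torch.no_grad():
    w, b = model.parameters()                # nn.Linear(2, 1): w has shape (1, 2), b has shape (1,)
    print("w1 =", w[0][0].item(), "w2 =", w[0][1].item(), "b =", b[0].item())
    print("predictions:", model(x_data).squeeze().tolist())
    print("targets:    ", y_data.tolist())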

0 Answers

No answers yet.