I don't know what's wrong with my linear regression code

Asked: 2016-12-15 07:17:53

Tags: matlab linear-regression gradient-descent

I tried the normal equation and its result is correct. However, when I use gradient descent, the fitted line comes out wrong. I consulted online resources but couldn't find the mistake. I don't think there is anything unusual in the following code.

clear;
clc;
m = 100; % generate 100 points
noise = randn(m,1); % 100 samples of standard normal noise
x = rand(m, 1) * 10; % generate 100 x's ranging from 0 to 10
y = 10 + 2 * x + noise; 
plot (x, y, '.');
hold on;


X = [ones(m, 1) x];
theta = [0; 0];
plot (x, X * theta, 'y');
hold on;

% Method 1 gradient descent
alpha = 0.02; % too large an alpha makes the iterates diverge
num_iters = 5;
[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)

% Method 2 normal equation
% theta = (pinv(X' * X )) * X' * y

plot (x, X * theta, 'r');



function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
    m = length(y); 
    J_history = zeros(num_iters, 1);
    for iter = 1:num_iters
        theta = theta - alpha * (1/m) * (X' * (X * theta - y));

        % plot (X(:, 2), X * theta, 'g');
        % hold on;

        J_history(iter) = costFunction(X, y, theta);
    end
end

function J = costFunction( X, y, theta )
    m = length(y);  
    predictions = X * theta; % prediction on all m examples 
    sqrErrors = (predictions - y).^2; % Squared errors
    J = 1/(2*m) * sum(sqrErrors); 
end

1 Answer:

Answer 0 (score: 1)

Your code is correct. The problem is that the number of iterations is too small. Take num_iters = 5000; and you will see theta converge to the correct value ([10; 2]).
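The claim is easy to check outside MATLAB as well. Below is a small NumPy port of the same experiment (my own sketch, not part of the original answer; variable names mirror the question's script): with num_iters = 5000 instead of 5, gradient descent lands on essentially the same theta as the normal equation.

```python
import numpy as np

# Recreate the experiment from the question: y = 10 + 2*x + noise.
rng = np.random.default_rng(0)
m = 100
x = rng.uniform(0, 10, size=(m, 1))
y = 10 + 2 * x + rng.standard_normal((m, 1))
X = np.hstack([np.ones((m, 1)), x])  # design matrix with intercept column

# Gradient descent: same update rule as gradientDescent.m,
# but with 5000 iterations instead of 5.
theta = np.zeros((2, 1))
alpha = 0.02
for _ in range(5000):
    theta -= alpha / m * (X.T @ (X @ theta - y))

# Normal equation ("Method 2" in the question) for comparison.
theta_ne = np.linalg.pinv(X.T @ X) @ X.T @ y

print(theta.ravel())     # close to [10, 2]
print(theta_ne.ravel())  # nearly identical to the gradient descent result
```

With only 5 iterations the slowest-decaying error component has barely shrunk, which is why the plotted line looked wrong; 5000 iterations is more than enough for this step size to close the gap to the normal-equation solution.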