Multiple linear regression with gradient descent

Posted: 2019-05-23 04:36:56

Tags: python machine-learning regression gradient-descent

Hi,

I'm new to machine learning and Python, and I want to predict prices on the Kaggle House Sales in King County dataset with my own gradient descent implementation.

I split the data 70% (15,000 rows) for training and 30% (6k rows) for testing, and I selected 5 of the 19 features. But there is a performance problem: the algorithm takes a very long time (more than 11 hours), uses 100% of the memory, and never finishes.

Here is my gradient descent class:
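A 70/30 split like the one described can be sketched as follows. This is a minimal sketch, not the poster's actual preprocessing; the column names (`sqft_living`, `bedrooms`, etc.) are assumptions standing in for whichever 5 of the 19 features were chosen, and the synthetic frame stands in for the real CSV.

```python
import numpy as np
import pandas as pd

# Placeholder for the real kc_house_data.csv; column names are assumptions.
df = pd.DataFrame(np.random.rand(100, 6),
                  columns=['sqft_living', 'bedrooms', 'bathrooms',
                           'floors', 'grade', 'price'])

# Shuffle once, then take the first 70% for training, the rest for testing.
shuffled = df.sample(frac=1, random_state=42).reset_index(drop=True)
split = int(0.7 * len(shuffled))
train, test = shuffled.iloc[:split], shuffled.iloc[split:]

X_train = train.drop(columns='price').values
Y_train = train[['price']].values   # double brackets keep the (m, 1) shape
X_test  = test.drop(columns='price').values
Y_test  = test[['price']].values
```

Keeping `Y` two-dimensional here matters later: subtracting a 1-D target from a `(m, 1)` prediction silently broadcasts to an `(m, m)` matrix.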

import numpy as np
import pandas as pd
from sklearn.metrics import r2_score

class GradientDescent:

    X_train = []
    Y_train = []
    X_test  = []
    Y_test  = []
    lr = 0
    max_iter = 0
    theta = 0

    def __init__(self, X_train,Y_train,X_test,Y_test, lr=0.01, max_iter=100):
        self.X_train = X_train
        self.Y_train = Y_train
        self.X_test  = X_test
        self.Y_test  = Y_test
        self.lr = lr
        self.max_iter = max_iter
        self.theta = np.random.randn(X_train.shape[1], 1)
        print(self.theta)

    def costFunction(self,theta,X,y):
        "1/2m * E(h0-y)**2"
        m = len(y)
        y_pred = X.dot(theta)
        cost = (1/2*m) * np.sum(np.square(y_pred-y))

        return cost


    def estimate(self):
        m = len(self.Y_train)

        mse_hist = np.zeros(self.max_iter)

        #theta_hist = np.zeros(max_iter)
        i = 0
        while i < self.max_iter or mse_hist[i] > 0.01:
            y_pred = np.dot(self.X_train,self.theta)

            error = y_pred-self.Y_train
            self.theta = self.theta - (1/m)*self.lr*(self.X_train.T.dot((error)))
            mse_hist[i] = self.costFunction(self.theta,self.X_train, self.Y_train)

            #print(mse_hist[i])
            i+=1            
        return (self.theta, mse_hist)


    def test(self):
        res = pd.DataFrame()
        for i,row in self.X_test.iterrows():
            price_pred = np.dot(row.values,self.theta)
            res = row
            res['price_actual'] = self.Y_test[i]
            res['price_predict'] = price_pred

        res['r2_score'] = r2_score(res['price_actual'].values, res['price_predict'])
        res.to_csv('output.csv')
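As a side note on `test()`: `res = row` replaces the result each iteration, so only the last row survives, and `r2_score` is then computed from a single pair of values. A vectorized sketch (assuming `X_test` is a DataFrame and `Y_test` is aligned with it; the R² formula is written out inline so the sketch is self-contained):

```python
import numpy as np
import pandas as pd

def evaluate(X_test, Y_test, theta):
    # Predict every row at once; iterrows() is slow, and the original loop
    # overwrote `res` each pass so only the final row was kept.
    y_pred = X_test.values @ theta                  # shape (m, 1)
    res = X_test.copy()
    res['price_actual'] = Y_test.values
    res['price_predict'] = y_pred.ravel()
    # R^2 = 1 - SS_res / SS_tot, the same quantity sklearn's r2_score returns
    ss_res = np.sum((res['price_actual'] - res['price_predict']) ** 2)
    ss_tot = np.sum((res['price_actual'] - res['price_actual'].mean()) ** 2)
    res.to_csv('output.csv', index=False)
    return 1 - ss_res / ss_tot
```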

Any suggestions to improve it?

1 answer:

Answer 0 (score: 1):

I haven't tested it, but overall the code looks fine. The only bug I found is in the loop's exit condition: with `while i < self.max_iter or mse_hist[i] > 0.01`, the `or` means the loop keeps running after `i` reaches `max_iter` (and then indexes past `mse_hist`), so it may never exit cleanly.
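Beyond the exit condition, two details in the posted class are worth double-checking (a sketch, untested against the actual dataset): `(1/2*m)` in `costFunction` multiplies by m/2 instead of dividing by 2m, and if `Y_train` is 1-D while `y_pred` has shape `(m, 1)`, the subtraction broadcasts to an `(m, m)` matrix, which by itself could explain the memory and run-time blow-up. A corrected sketch of the update loop:

```python
import numpy as np

def cost(theta, X, y):
    # J(theta) = 1/(2m) * sum((X @ theta - y)^2); note 1/(2*m), not (1/2*m)
    m = len(y)
    return (1 / (2 * m)) * np.sum(np.square(X @ theta - y))

def estimate(X, y, lr=0.01, max_iter=100, tol=0.01):
    m, n = X.shape
    y = y.reshape(-1, 1)          # guard against (m,) broadcasting to (m, m)
    theta = np.random.randn(n, 1)
    mse_hist = np.zeros(max_iter)
    for i in range(max_iter):     # bounded loop instead of `while ... or ...`
        error = X @ theta - y                 # shape (m, 1)
        theta -= (lr / m) * (X.T @ error)     # batch gradient step
        mse_hist[i] = cost(theta, X, y)
        if mse_hist[i] < tol:                 # optional early stop
            mse_hist = mse_hist[:i + 1]
            break
    return theta, mse_hist
```

With the broadcasting fixed, 15,000 rows by 5 features should run in well under a second per iteration.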
