Negative log likelihood in theano (cox regression)

Date: 2015-09-01 21:53:56

Tags: python machine-learning theano cox-regression

I'm trying to implement Cox regression in Theano.

I'm using the logistic regression tutorial (http://deeplearning.net/tutorial/logreg.html) as a framework, replacing the logistic regression log-likelihood (LL) with the Cox partial log-likelihood (https://en.wikipedia.org/wiki/Proportional_hazards_model#The_partial_likelihood).
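For reference, the Cox partial log-likelihood from that article can be written (using the question's names, where theta is the linear predictor, ytime the observed times, and ystatus the event indicators) as:

```latex
\ell(\beta) = \sum_{i:\, \text{ystatus}_i = 1}
    \left[ \theta_i
           - \log \sum_{j:\, \text{ytime}_j \ge \text{ytime}_i} e^{\theta_j}
    \right]
```

The inner sum runs over the risk set: all subjects still under observation at time ytime[i].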

Here's what I have so far:

class CoxRegression(object):
    def __init__(self, x, n_in):
        self.W = theano.shared(
            value=numpy.zeros(n_in, dtype=theano.config.floatX),
            name='W', borrow=True
        )
        self.b = theano.shared(numpy.cast['float64'](0), borrow=True)
        self.theta = T.dot(x, self.W) + self.b  # linear risk scores
        self.exp_theta = T.exp(self.theta)
        self.params = [self.W, self.b]
        self.x = x

    def negative_log_likelihood(self, ytime, ystatus):
        # Contribution of observation i (only events, ystatus[i] == 1, contribute).
        # The risk set at ytime[i] is everyone with ytime >= ytime[i], hence T.ge.
        # i is the index I still need to sum over -- see the question below.
        LL_i = T.switch(
            T.eq(ystatus[i], 1),
            self.theta[i] - T.log(T.sum(self.exp_theta * T.ge(ytime, ytime[i]))),
            0
        )

Basically, I need to sum over LL_i (where i runs from 0 to ytime.shape[0] - 1). But I'm not sure how to do this. Should I use the scan function?

1 Answer:

Answer 0 (score: 1)

Figured it out. The trick is not to use the scan function, but to convert the double summation into pure matrix operations.
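The answer doesn't show the code, but here is a NumPy sketch of what that matrix trick might look like: the inner sum over each risk set becomes a matrix-vector product with a comparison matrix, so no explicit loop (and no scan) is needed. The function name and signature are illustrative, not the answerer's actual code.

```python
import numpy as np

def cox_neg_log_likelihood(theta, ytime, ystatus):
    """Cox partial negative log-likelihood via matrix operations (no loop).

    theta   : (n,) linear risk scores, i.e. x @ W + b
    ytime   : (n,) observed times
    ystatus : (n,) event indicators (1 = event, 0 = censored)
    """
    exp_theta = np.exp(theta)
    # risk[i, j] = 1 if subject j is still at risk at time ytime[i]
    risk = (ytime[np.newaxis, :] >= ytime[:, np.newaxis]).astype(float)
    # One risk-set sum per observation, computed as a single matrix product
    log_risk = np.log(risk @ exp_theta)
    # Only events (ystatus == 1) contribute to the partial likelihood
    return -np.sum((theta - log_risk) * ystatus)
```

The same expressions translate directly to Theano by swapping `np` for `T` (e.g. `T.ge` for the comparison and `T.dot` for the matrix product), keeping everything differentiable for gradient descent on W and b.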