TensorFlow RNN loss not decreasing

Time: 2017-10-14 13:35:04

Tags: python optimization tensorflow

I am new to TensorFlow. I am trying to train an RNN (for phoneme labelling). When I run the code, the loss does not decrease at all. (My batch size is 1, because every sequence has a different length.) I would like to know what I am doing wrong. Can I optimize the model by feeding in the data with a for loop like this? My assumption is that every time I call sess.run on the optimizer, the variables are updated from their previous values. Am I wrong about that?
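To make my assumption concrete, here is a hypothetical toy sketch (not my phoneme model; the names are made up) of what I believe happens: variables persist inside a session, so every sess.run of a training op should apply one gradient step on top of the previous values.

import tensorflow as tf

w = tf.Variable(5.0)
toy_loss = tf.square(w)   # minimized at w == 0
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(toy_loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(3):
        sess.run(train_op)        # one gradient step per call
        print(step, sess.run(w))  # w should shrink: 4.0, 3.2, 2.56

Here is my actual training code: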

import tensorflow as tf
import numpy as np

with tf.variable_scope("foo", reuse=True):
    x = tf.placeholder(tf.float32, [1, None, 69])  # one variable-length sequence of 69-dim frames
    y = tf.placeholder(tf.int32, [None])           # one label per frame
    cell = tf.nn.rnn_cell.LSTMCell(48)
    outputs, state = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)
    # the raw LSTM outputs are reshaped and used directly as the 48-class logits
    loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=y, logits=tf.reshape(outputs, [-1, 48])))
    optimizer = tf.train.AdamOptimizer(learning_rate=0.1).minimize(loss)
    saver = tf.train.Saver()

    with tf.Session() as sess:
        init = tf.global_variables_initializer()
        sess.run(init)

        # training: one sequence (batch size 1) per sess.run call
        for epoch in range(1):
            for index, ID in enumerate(list(X.keys())):
                x_temp = np.array(X[ID])[np.newaxis, :, :]
                y_temp = np.array(Y_int[ID])
                _, _loss = sess.run([optimizer, loss], feed_dict={x: x_temp, y: y_temp})
                if index % 10 == 0:
                    print("epoch " + str(epoch) + " samples " + str(index) + " loss: " + str(_loss))

        # inference: save the model and collect the raw outputs for every sequence
        prediction = []
        save_path = saver.save(sess, "/tmp/model.ckpt")
        for index, ID in enumerate(list(X.keys())):
            if index % 10 == 0:
                print("epoch " + str(epoch) + " samples " + str(index))
            x_temp = np.array(X[ID])[np.newaxis, :, :]
            y_temp = np.array(Y_int[ID])
            out = sess.run(outputs, feed_dict={x: x_temp, y: y_temp})
            prediction.append(out)

0 Answers:

There are no answers yet.