TensorFlow training slows down over time

Date: 2017-12-26 05:41:38

Tags: tensorflow deep-learning

My TensorFlow training seems to slow down as it runs. I have read that this happens when operations keep getting added to the graph as training proceeds, but I cannot find where that would happen in my code. Here is my code:

with tf.Session() as sess:
    tf.global_variables_initializer().run()
    tf.local_variables_initializer().run()
    loader.restore(sess, checkpoint_file)
    summaries_loger_train.add_graph(sess.graph)

    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(coord=coord)
    # Freeze the graph so any op created after this point raises an error.
    tf.get_default_graph().finalize()
    for i in range(10000000):
        # Pull a batch from the input queue, then feed it back through placeholders.
        imgs_r, labels_r = sess.run([imgs, labels])
        _, loss_r, cross_entropy_loss_r, center_loss_r, acc_r, lr_r, s_r, gs_r = sess.run(
            [opt, loss, cross_entropy_mean, center_loss_without_decay,
             accuracy, lr, summaries, global_step],
            feed_dict={imgs_p: imgs_r, labels_p: labels_r})
        print(gs_r, loss_r, cross_entropy_loss_r, center_loss_r, acc_r, lr_r)
        summaries_loger_train.add_summary(s_r, gs_r)
        if gs_r % 100 == 1:
            # Evaluate on a test batch every 100 steps.
            imgs_r, labels_r = sess.run([imgs_test, labels_test])
            cross_entropy_loss_r, acc_r, s_r = sess.run(
                [cross_entropy_mean, accuracy, summaries],
                feed_dict={imgs_p: imgs_r, labels_p: labels_r})
            summaries_loger_test.add_summary(s_r, gs_r)
            print('test', cross_entropy_loss_r, acc_r)
        if gs_r % 1000 == 0:
            saver.save(sess, 'models2/model.ckpt', global_step=gs_r)

    coord.request_stop()
    coord.join(threads)
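For context, a minimal sketch (not the poster's model) of the graph-growth pattern that usually causes this slowdown, using the TF1-style API via `tensorflow.compat.v1` (an assumption; the question predates TF2). The names `x`, `mean`, and `op_counts` are illustrative only:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

x = tf.placeholder(tf.float32, shape=[None])

op_counts = []
with tf.Session() as sess:
    for i in range(3):
        # BAD: tf.reduce_mean builds brand-new ops every iteration, so the
        # graph keeps growing and each sess.run has more graph to handle.
        mean = tf.reduce_mean(x)
        sess.run(mean, feed_dict={x: [1.0, 2.0, 3.0]})
        op_counts.append(len(tf.get_default_graph().get_operations()))

print(op_counts)  # strictly increasing: the graph is growing each step

# The finalize() call in the question's code guards against exactly this:
# once the graph is finalized, creating a new op raises a RuntimeError.
tf.get_default_graph().finalize()
try:
    tf.reduce_mean(x)
except RuntimeError as e:
    print('caught:', e)
```

Since the question's loop calls `tf.get_default_graph().finalize()` before training and still runs, the graph itself is probably not growing there; per-step overhead such as writing summaries on every iteration is another common suspect.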

0 Answers:

No answers yet