Backquery a feed-forward NN

Date: 2019-01-22 13:08:43

Tags: tensorflow

I want to run a backquery through my simple feed-forward network. It learns to recognize MNIST handwritten digits. Now I want to query it backwards:

For example, I feed in a label vector with a high value at the position for 7 and compute backwards through the network to get a 28x28 image of the digit 7. But it doesn't work the way I expect: the image should look roughly like a 7, yet what I get looks more like random pixels. I can't see what I'm doing wrong.

import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

# MNIST training data: images flattened to 784-dim vectors, labels one-hot encoded
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_data = x_train.reshape(-1, 784).astype(np.float32) / 255.0
y_data = np.eye(10, dtype=np.float32)[y_train]

# Inputs / labels
X_Input = tf.placeholder(dtype=tf.float32, shape=[None, 784], name="X_Input")
Y_Learn = tf.placeholder(dtype=tf.float32, shape=[None, 10], name="Y_Learn")

# Input to hidden layer: 784 x 784 weight matrix (no bias)
W_val_hidden1 = tf.truncated_normal(shape=[784, 784], stddev=0.1)
W_hidden1 = tf.Variable(W_val_hidden1)

# Hidden layer to output: 784 x 10 weight matrix (no bias)
W_val_out = tf.truncated_normal(shape=[784, 10], stddev=0.1)
W_out = tf.Variable(W_val_out)

# Forward pass (the hidden layer is linear; only the output layer has a sigmoid)
y_hidden1_out = tf.matmul(X_Input, W_hidden1)
y_out = tf.nn.sigmoid(tf.matmul(y_hidden1_out, W_out))

# Backquery here: feed the label vector backwards through the weights
# (column-vector convention; both results have shape [784, N])
backwards0 = tf.matmul(W_out, tf.transpose(tf.math.log_sigmoid(Y_Learn)))
backwards = tf.matmul(tf.transpose(W_hidden1), backwards0)

# Train
loss = tf.reduce_mean(tf.square(Y_Learn - y_out))
train_step = tf.train.AdamOptimizer(0.0001).minimize(loss)

print("Training...")
with tf.Session() as s:
    s.run(tf.global_variables_initializer())

    for n in range(100):
        s.run(train_step, feed_dict={X_Input: x_data, Y_Learn: y_data})
        print("Step {}".format(n))

    # Target label vector: high value at digit 7, low everywhere else
    bw_in = np.array([[0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.9, 0.1, 0.1]], dtype=float)

    # Run only the backquery ops (they depend on Y_Learn alone)
    bw = s.run(backwards, feed_dict={Y_Learn: bw_in})

    # Reshape the 784 values into a 28x28 image and show it
    out = np.reshape(bw, (28, 28))
    # out = np.clip(out, 0.0, 1.0)
    plt.imshow(out)
    plt.show()
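
For reference, this is roughly the backquery scheme I have in mind, sketched in plain NumPy rather than TensorFlow: invert the output sigmoid with the logit, go back through the transposed weight matrix, rescale the values into (0.01, 0.99) so the next logit stays finite, and repeat for the hidden layer. The helper names and the rescaling range are my own choices, and the sketch assumes a sigmoid on every layer, which the graph above does not actually have.

import numpy as np

def logit(y):
    # Inverse of the sigmoid: recovers the pre-activation signal
    return np.log(y / (1.0 - y))

def rescale(x, lo=0.01, hi=0.99):
    # Squeeze the values back into (lo, hi) so the next logit is finite
    x = x - x.min()
    x = x / x.max()
    return x * (hi - lo) + lo

def backquery(w_hidden, w_out, target):
    # target: [10] label vector, w_hidden: [784, 784], w_out: [784, 10]
    hidden = rescale(logit(target) @ w_out.T)    # back through the output layer: [10] -> [784]
    image = rescale(logit(hidden) @ w_hidden.T)  # back through the hidden layer: [784] -> [784]
    return image.reshape(28, 28)

# Usage with the trained weights pulled out of the session:
# w_h, w_o = s.run([W_hidden1, W_out])
# target = np.full(10, 0.01)
# target[7] = 0.99
# plt.imshow(backquery(w_h, w_o, target)); plt.show()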

0 Answers:

No answers yet