The simplest RNN with the simplest application?

Date: 2016-11-24 04:32:09

Tags: machine-learning neural-network deep-learning recurrent-neural-network

I want an application to verify my own RNN implementation.

I am writing the simplest possible RNN myself, without TensorFlow.

So I need a simple RNN application to make sure the implementation is correct. The simpler, the better.

For example, I can use MNIST to verify my own CNN.

Thanks.

1 Answer:

Answer 0 (score: 0)

Take a look at this simple RNN model (an LSTM built with TensorFlow):

import tensorflow as tf  # written against the old TF 0.x API used in this tutorial

# time_step_size is assumed to be defined globally (28 for MNIST, one row per step)
def model(X, W, B, lstm_size):
    # X, input shape: (batch_size, time_step_size, input_vec_size)
    XT = tf.transpose(X, [1, 0, 2])  # permute time_step_size and batch_size
    # XT shape: (time_step_size, batch_size, input_vec_size)
    XR = tf.reshape(XT, [-1, lstm_size])  # each row is the input for one LSTM step (lstm_size == input_vec_size)
    # XR shape: (time_step_size * batch_size, input_vec_size)
    X_split = tf.split(0, time_step_size, XR)  # split into time_step_size (28) arrays
    # Each array shape: (batch_size, input_vec_size)

    # Make an LSTM cell with lstm_size hidden units (same as the input vector size here)
    lstm = tf.nn.rnn_cell.BasicLSTMCell(lstm_size, forget_bias=1.0, state_is_tuple=True)

    # Unroll the LSTM: time_step_size (28) outputs, each of shape (batch_size, lstm_size)
    outputs, _states = tf.nn.rnn(lstm, X_split, dtype=tf.float32)

    # Linear activation on the last output
    return tf.matmul(outputs[-1], W) + B, lstm.state_size  # state size to initialize the state
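
As a rough illustration (not part of the original answer), the sketch below shows one way this model function might be wired up for MNIST, treating each 28x28 image as a sequence of 28 rows. It builds on the snippet above and uses the same old TF 0.x API; the placeholder shapes, weight initialization, and optimizer settings are assumptions, not something stated in the answer.

# A usage sketch (assumed setup, not from the answer): feed MNIST images
# to model() as sequences of 28 rows of 28 pixels each.
input_vec_size = lstm_size = 28   # one image row per time step
time_step_size = 28               # 28 rows per image
num_classes = 10

X = tf.placeholder(tf.float32, [None, time_step_size, input_vec_size])
Y = tf.placeholder(tf.float32, [None, num_classes])

# Output-layer parameters mapping the last LSTM output to class scores
W = tf.Variable(tf.random_normal([lstm_size, num_classes], stddev=0.01))
B = tf.Variable(tf.random_normal([num_classes], stddev=0.01))

logits, state_size = model(X, W, B, lstm_size)

cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits, Y))
train_op = tf.train.RMSPropOptimizer(0.001, 0.9).minimize(cost)
predict_op = tf.argmax(logits, 1)  # predicted digit for each image in the batch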

You can find more examples at https://github.com/nlintz/TensorFlow-Tutorials/.