Getting NaN loss in TensorFlow when using softmax or sigmoid on the output layer

Time: 2017-03-22 10:26:06

Tags: python tensorflow keras

I have a problem when fitting an LSTM network in Keras, using TensorFlow as the backend:

from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout

def build_recurrent(input_dim, output_dim):
    model = Sequential()
    model.add(LSTM(200, input_dim=input_dim, activation='tanh'))
    #model.add(Dense)
    model.add(Dropout(0.5))
    model.add(Dense(output_dim, activation='softmax'))
    return model
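
The compile/fit call is not shown above; judging from the metrics printed in the logs below, it was presumably something along these lines (a sketch only: the 'mae' loss, the RMSprop optimizer, and the X_train/y_train names are assumptions, not taken from the question):

from keras.optimizers import RMSprop

# X_train is assumed to be 3D: (samples, timesteps, features); y_train 2D: (samples, output_dim)
model = build_recurrent(input_dim=X_train.shape[2], output_dim=y_train.shape[1])
model.compile(loss='mae',
              optimizer=RMSprop(),
              metrics=['mean_squared_error', 'mean_absolute_error'])
model.fit(X_train, y_train, nb_epoch=1500, verbose=2)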

When I use softmax on the output layer (to output proportions), I get this:

Epoch 1/1500
0s - loss: nan - mean_squared_error: nan - mean_absolute_error: nan
Epoch 2/1500
0s - loss: nan - mean_squared_error: nan - mean_absolute_error: nan
Epoch 3/1500
0s - loss: nan - mean_squared_error: nan - mean_absolute_error: nan
Epoch 4/1500
0s - loss: nan - mean_squared_error: nan - mean_absolute_error: nan
Epoch 5/1500
0s - loss: nan - mean_squared_error: nan - mean_absolute_error: nan
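
A NaN loss like this usually means something non-finite entered the computation, either from the input data or from the targets fed to the softmax output. A quick sanity check along these lines can rule that out (assuming numpy arrays named X_train and y_train, which are hypothetical names here):

import numpy as np

# Any NaN or inf in the inputs or targets will propagate into the loss.
print(np.isnan(X_train).any(), np.isinf(X_train).any())
print(np.isnan(y_train).any(), np.isinf(y_train).any())

# With a softmax output, each target row should be non-negative and sum to 1.
print((y_train < 0).any(), np.allclose(y_train.sum(axis=1), 1.0))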

whereas, when using other activation functions such as tanh:

Epoch 1/1500
0s - loss: 0.9173 - mean_squared_error: 0.9595 - mean_absolute_error: 0.9173
Epoch 2/1500
0s - loss: 1.0652 - mean_squared_error: 1.1457 - mean_absolute_error: 1.0652
Epoch 3/1500
0s - loss: 1.0652 - mean_squared_error: 1.1457 - mean_absolute_error: 1.0652
Epoch 4/1500
0s - loss: 1.0652 - mean_squared_error: 1.1457 - mean_absolute_error: 1.0652
Epoch 5/1500
0s - loss: 1.0652 - mean_squared_error: 1.1457 - mean_absolute_error: 1.0652

What kind of problem could this be? Should I change the activation function of the hidden layer?

Another strange thing is that this only happens with the LSTM model; when I use a simple feed-forward network I don't get any errors.
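
Since the NaNs only show up in the recurrent model, exploding gradients inside the LSTM are one plausible culprit. A common mitigation is gradient clipping on the optimizer (Keras optimizers accept clipnorm/clipvalue); a minimal sketch, still assuming the 'mae' loss from above:

from keras.optimizers import RMSprop

# Clip the global gradient norm so a single large recurrent gradient cannot blow up the weights.
model.compile(loss='mae',
              optimizer=RMSprop(lr=0.001, clipnorm=1.0),
              metrics=['mean_squared_error', 'mean_absolute_error'])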

0 Answers
