Tags: python scikit-learn
In the following code:

mlp = MLPClassifier(hidden_layer_sizes=(450, 200, 100), activation='relu', alpha=0.01, max_iter=200, solver='adam', random_state=1, early_stopping=True, verbose=True, shuffle=True)

Is the ReLU activation applied to all hidden layers in the network? Or do only the first two hidden layers use ReLU, with softmax on the last one?
ReLU is applied to all three hidden layers (450, 200, and 100 units). The `activation` parameter in scikit-learn's `MLPClassifier` controls only the hidden layers. The output layer is separate: its activation is chosen automatically based on the target, softmax for multi-class classification and logistic for binary classification, and it cannot be set through the `activation` parameter.
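A quick way to verify this is to fit the model and inspect the `out_activation_` attribute, which scikit-learn sets after fitting. The sketch below uses the same architecture as the question on the iris dataset (3 classes), with `verbose` turned off to keep the output quiet:

```python
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# Same architecture as the question: three hidden layers, all using ReLU.
mlp = MLPClassifier(hidden_layer_sizes=(450, 200, 100), activation='relu',
                    alpha=0.01, max_iter=200, solver='adam', random_state=1,
                    early_stopping=True, verbose=False, shuffle=True)
mlp.fit(X, y)

# The output-layer activation is inferred from the target, not from the
# `activation` parameter: softmax here because iris has 3 classes.
print(mlp.out_activation_)  # softmax
```

On a binary target, `out_activation_` would be `'logistic'` instead.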