Applying ReLU to an input without using a Lambda layer?

Asked: 2017-07-16 22:35:10

Tags: python lambda keras keras-layer

I am currently having trouble loading my model because it contains a Lambda layer.

These are my neural network layers that use the Lambda layer.

#
#   Python script - Keras RCNN model.
#
import numpy as np
from keras.models import Model
from keras.layers import Input, Dense, Dropout, Flatten, Activation
from keras.layers import Conv2D, MaxPooling2D, add
from keras.layers.normalization import BatchNormalization
from keras.layers.core import Lambda
from keras import backend as K


#   RCL:
#   BatchNorm(Relu(conv(L-1) + conv(L)))
#

def make_RCNN(input, number_of_rcl, num_of_filter, filtersize, alpha, pool):
    feed_forward = Conv2D(filters=num_of_filter, kernel_size=1, name='init')(input)

    for x in range(number_of_rcl):  # xrange is Python 2 only; range works in both
        feed_forward = RCL(feed_forward, num_of_filter, filtersize, alpha, pool)

    return feed_forward

def RCL(feed_forward_input, num_of_filter, filtersize, alpha, pool):
    conv = Conv2D(filters=num_of_filter, kernel_size=filtersize, padding='same')
    recurrent_input = conv(feed_forward_input)
    merged = add([feed_forward_input, recurrent_input])
    conv_relu = Lambda(lambda x: K.relu(x, alpha=alpha))(merged)
    conv_relu_batchnorm = BatchNormalization()(conv_relu)
    if pool:
        return MaxPooling2D()(conv_relu_batchnorm)
    else:
        return conv_relu_batchnorm

input = Input(shape=(30, 30, 3))
output = make_RCNN(input, number_of_rcl=3, num_of_filter=3, filtersize=3, alpha=0.2, pool=True)

# Keras 2 uses the keyword arguments inputs/outputs (input/output is Keras 1 style)
model = Model(inputs=input, outputs=output)
model.compile(optimizer='rmsprop', loss='binary_crossentropy')
model.summary()

How can I remove the Lambda layer without changing the functionality?
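One observation (not from the question itself, just a sketch of a possible direction): `K.relu(x, alpha=alpha)` with a nonzero `alpha` computes a leaky ReLU, and Keras ships a built-in `LeakyReLU` advanced-activation layer (`from keras.layers.advanced_activations import LeakyReLU`) that serializes without the pickling issues of a Lambda. A minimal NumPy sketch of the math both the Lambda and `LeakyReLU(alpha=alpha)` compute:

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    """Same function as K.relu(x, alpha=alpha): identity for positive
    inputs, scale negative inputs by alpha."""
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # negatives scaled by alpha=0.2, positives unchanged
```

Under that assumption, the Lambda line could be written as `conv_relu = LeakyReLU(alpha=alpha)(merged)`, leaving the rest of the network untouched.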

0 Answers:

There are no answers yet.