I have defined some layers, and I want to set this network's weights and biases manually. To do that, I wrote the following code in MATLAB:
%% Defining Layers
layers = [imageInputLayer([28 28 1],'Normalization','none');
convolution2dLayer(5,20);
reluLayer();
maxPooling2dLayer(2,'Stride',2);
fullyConnectedLayer(10);
softmaxLayer();];
%% Set Manually Weights and Biases
layers(2,1).Weights = Conv1_Weights;
layers(2,1).Bias = Conv1_Biases;
layers(5,1).Weights = FC1_Weights;
layers(5,1).Bias = FC1_Biases;
net = SeriesNetwork(layers);
%%
X = imread(____);  % a 28x28 grayscale image
X_Single = single(X);
X_Single_Normalize = (X_Single-min(X_Single(:)))/(max(X_Single(:))-min(X_Single(:)));
F1 = activations(net,X_Single_Normalize,2,'OutputAs','channels');
F2 = activations(net,X_Single_Normalize,3,'OutputAs','channels');
F3 = activations(net,X_Single_Normalize,4,'OutputAs','channels');
F4 = activations(net,X_Single_Normalize,5,'OutputAs','channels');
However, I get this error:
Error using nnet.internal.cnn.layer.FullyConnected/forwardPropagateSize (line 101)
An input size for the layer must be defined in order to call forwardPropagateSize.
Error in SeriesNetwork>iDetermineLayerOutputSize (line 952)
inputSize = layers{i}.forwardPropagateSize(inputSize);
Error in SeriesNetwork/activations (line 775)
outputSize = iDetermineLayerOutputSize(this.PrivateNetwork.Layers, layerID, inputSize);
Error in untitled1 (line 21)
F4 = activations(net,X_Single_Normalize,5,'OutputAs','channels');
Can anyone help me solve this problem?
Answer (score: 1)
The problem is that, as of R2017b, SeriesNetwork in the Neural Network Toolbox does not currently "finalize" the network to make it ready for inference.
You can reduce the repro steps to:
>> layers = [imageInputLayer([28 28 1],'Normalization','none');
convolution2dLayer(5,20);
reluLayer();
maxPooling2dLayer(2,'Stride',2);
fullyConnectedLayer(10);
softmaxLayer();];
>>
>> net = SeriesNetwork(layers);
>>
>> A = rand(28,28);
>> activations(net,A,5);
As a workaround, you can use trainNetwork to train for at least one iteration. The resulting network should then be ready for training/inference. You can set the learning-rate factors of the learnable parameters on each layer to zero so that training does not actually make any updates to your weight initialization:
function example
layers = [imageInputLayer([28 28 1],'Normalization','none'), ...
    convolution2dLayer(5,20), ...
    reluLayer(), ...
    maxPooling2dLayer(2,'Stride',2), ...
    fullyConnectedLayer(10), ...
    softmaxLayer(), ...
    classificationLayer];
A = rand(28,28);
net = constructInitializedNetwork(layers);
activations(net,A,5);

function net = constructInitializedNetwork(layers)
% Run one dummy training iteration so the network gets finalized.
% InitialLearnRate of eps keeps the update negligible; the learn-rate
% factors below are zeroed anyway, so the weights are left untouched.
X = rand(28,28);
Y = categorical(1,1:10);
layers = freezeLayers(layers);
options = trainingOptions('sgdm','MaxEpochs',1,'InitialLearnRate',eps);
net = trainNetwork(X,Y,layers,options);

function layers = freezeLayers(layers)
% Set per-layer learning-rate factors to zero on every layer that has them.
for idx = 1:length(layers)
    if isprop(layers(idx),'WeightLearnRateFactor')
        layers(idx).WeightLearnRateFactor = 0;
    end
    if isprop(layers(idx),'BiasLearnRateFactor')
        layers(idx).BiasLearnRateFactor = 0;
    end
end
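To tie this back to the original question, the same workaround can be combined with the manual weight assignment. The sketch below assumes `Conv1_Weights`, `Conv1_Biases`, `FC1_Weights`, and `FC1_Biases` are arrays of the right size already in your workspace; it assigns them before the frozen one-iteration training pass, so the finalized network keeps exactly those values:

```matlab
% Assign the manual weights before finalizing (assumed workspace variables).
layers(2).Weights = Conv1_Weights;   % 5x5x1x20
layers(2).Bias = Conv1_Biases;       % 1x1x20
layers(5).Weights = FC1_Weights;     % 10x2880
layers(5).Bias = FC1_Biases;         % 10x1

% Freeze and run the dummy training pass as in the example above.
net = constructInitializedNetwork(layers);

% activations should now work on a normalized single-precision image.
F1 = activations(net,X_Single_Normalize,2,'OutputAs','channels');
```

Because the learn-rate factors are zero, the single SGDM iteration cannot modify the assigned weights, so `net` carries your initialization unchanged.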