"Expected identifier" parse error in a Caffe prototxt?

Date: 2018-02-05 13:42:29

Tags: python caffe

I am trying to use Caffe by running this simple piece of code in Python:
import caffe
net = caffe.Net("myfile.prototxt", caffe.TEST)

I get the message below, so I assume there is an error in my .prototxt:
[libprotobuf ERROR google/protobuf/text_format.cc:274] Error parsing text-format caffe.NetParameter: 26:22: Expected identifier.
WARNING: Logging before InitGoogleLogging() is written to STDERR
F0205 14:29:24.097086  1120 upgrade_proto.cpp:88] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: caffeModel.prototxt

But I can't understand this error. The file is just an architecture of depthwise separable convolutions with batch-normalization layers in between. Maybe I should have some input layer?

name: "UNIPINET"
#  transform_param {
#    scale: 0.017
#    mirror: false
#    crop_size: 224
#    mean_value: [103.94,116.78,123.68]
#  }
input: "data"
input_dim: 1
input_dim: 1
input_dim: 63
input_dim: 13
layer {
  name: "conv1/dw"
  type: "Convolution"
  bottom: "data"
  top: "conv1/dw"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  convolution_param {
    num_output: 1
    bias_term: true
    pad: 0
    kernel_size: 15, 3
    group: 1
    #engine: CAFFE
    stride: 1
    weight_filler {
      type: "msra"
    }
  }
}
layer {
  name: "conv1/sep"
  type: "Convolution"
  bottom: "conv1/dw"
  top: "conv1/sep"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  convolution_param {
    num_output: 16
    bias_term: false
    pad: 0
    kernel_size: 1
    stride: 1
    weight_filler {
      type: "msra"
    }
  }
}
layer {
  name: "conv1/sep/bn"
  type: "BatchNorm"
  bottom: "conv1/sep"
  top: "conv1/sep"
  param {
    lr_mult: 0
    decay_mult: 0
  }
  param {
    lr_mult: 0
    decay_mult: 0
  }
  param {
    lr_mult: 0
    decay_mult: 0
  }
}
layer {
  name: "relu1/sep"
  type: "ReLU"
  bottom: "conv1/sep"
  top: "conv1/sep"
}

layer {
  name: "avg_pool"
  type: "Pooling"
  bottom: "conv1/sep"
  top: "pool6"
  pooling_param {
    pool: AVE
    global_pooling: true
  }
}
layer {
  name: "fc"
  type: "InnerProduct"
  bottom: "avg_pool"
  top: "fc"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 12
    weight_filler {
      type: "msra"
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "output"
  type: "Softmax"
  bottom: "fc"
  top: "output"
}

1 answer:

Answer 0: (score: 0)

Solved it: kernel_size implies that both spatial dimensions are equal, so the comma-separated form `kernel_size: 15, 3` is not valid text-format protobuf (that comma is exactly where the parser reports "26:22: Expected identifier"). If I want different sizes for H and W, I should use kernel_h and kernel_w instead.
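For reference, here is a sketch of the corrected `convolution_param` block for `conv1/dw`, assuming the intended kernel is 15 high by 3 wide (swap the two values if the opposite was meant):

```
  convolution_param {
    num_output: 1
    bias_term: true
    pad: 0
    # Rectangular kernels use the per-dimension fields
    # instead of the square-only kernel_size:
    kernel_h: 15
    kernel_w: 3
    group: 1
    stride: 1
    weight_filler {
      type: "msra"
    }
  }
```

The same pattern applies to padding and stride: `pad_h`/`pad_w` and `stride_h`/`stride_w` are the per-dimension counterparts when a rectangular setting is needed.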