Neural network training time grows exponentially

Asked: 2017-05-11 19:36:37

Tags: r neural-network

I have a training dataset of 60,000 entries, each with 28 features. After training a neural network on the full dataset crashed R, I switched to using only a subset of the training data and noticed that training time grows exponentially with the number of rows used.

With 100 rows, training takes 0.5 seconds. With 400 rows, it is 10.5 seconds. With 1,000 rows, it is 170 seconds.
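With only three measurements this is merely suggestive, but a quick sketch (using just the numbers above) can distinguish the two hypotheses: a power law `t ~ n^k` is a straight line of log(t) against log(n), while a true exponential `t ~ exp(c·n)` is a straight line of log(t) against n.

```r
n <- c(100, 400, 1000)  # rows used for training
t <- c(0.5, 10.5, 170)  # measured training time in seconds

# Power-law hypothesis: t ~ n^k  =>  log(t) is linear in log(n)
power_fit <- lm(log(t) ~ log(n))
# Exponential hypothesis: t ~ exp(c*n)  =>  log(t) is linear in n
exp_fit <- lm(log(t) ~ n)

coef(power_fit)[2]            # slope k ~ 2.5, i.e. roughly O(n^2.5)
summary(power_fit)$r.squared  # very close to 1
summary(exp_fit)$r.squared    # a worse fit than the power law
```

Under this check the timings look polynomial (roughly cubic-ish), not exponential, though three data points cannot settle the question.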

My (newbie) questions are: is this exponential growth? What is the intuition behind it? And are there any tips for speeding up training (currently I am using 10x5x10 hidden layers)?
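One piece of intuition (a sketch, not part of the original question): with a fixed architecture the parameter count W is constant, a backpropagation pass costs O(W) per row, so one full-batch training step costs O(n·W). If the number of steps needed to reach `neuralnet`'s convergence `threshold` also grows with n, total time grows faster than linearly even though nothing is inherently exponential. For the 28-input, 10-output network with `hidden=c(10,5,10)` described here, W is small:

```r
# Parameter count of the 28 -> 10 -> 5 -> 10 -> 10 network
# (28 inputs and 10 one-hot outputs, as in the question)
layers  <- c(28, 10, 5, 10, 10)
weights <- sum(head(layers, -1) * tail(layers, -1))  # connection weights
biases  <- sum(tail(layers, -1))                     # one bias per non-input unit
total   <- weights + biases
total  # 515 trainable parameters
```

So the per-step cost is modest; the growth in total time is dominated by how many steps full-batch training needs as n increases.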

Update: added code and sample data

library(neuralnet)
library(readr)

training_data <- read_csv("traindata.csv",col_names=TRUE, skip=8, col_types = cols(
  .default = col_double(),
  digit = col_character()
))

# Convert the "digit" column to 10 individual binary (one-hot) columns
training_data <- cbind(training_data[, 1:28], nnet::class.ind(training_data$digit))

# DEBUG use first n rows only
training_data <- training_data[1:1000,]

feats <- names(training_data)
f <- paste(feats[1:28],collapse=' + ')
f <- paste('D0 + D1 + D2 + D3 + D4 + D5 + D6 + D7 + D8 + D9 ~',f)
f <- as.formula(f)

nn <- neuralnet(f,training_data,hidden=c(10,5,10),linear.output=FALSE)  
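To measure the scaling reproducibly without the original `traindata.csv`, here is a hedged sketch of a timing harness on synthetic stand-in data (the column names `C1..C28`/`D0..D9` mirror the question's data; `stepmax` is capped so the loop finishes even when the net does not converge on random inputs):

```r
library(neuralnet)
set.seed(42)

# Synthetic stand-in for the real data: 28 numeric inputs C1..C28
# and 10 one-hot outputs D0..D9.
make_data <- function(n) {
  X <- as.data.frame(matrix(runif(n * 28), ncol = 28))
  names(X) <- paste0("C", 1:28)
  Y <- as.data.frame(diag(10)[sample(1:10, n, replace = TRUE), ])
  names(Y) <- paste0("D", 0:9)
  cbind(X, Y)
}

f <- as.formula(paste(
  paste0("D", 0:9, collapse = " + "), "~",
  paste0("C", 1:28, collapse = " + ")
))

# Time training at increasing subset sizes.
times <- sapply(c(100, 400), function(n) {
  d <- make_data(n)
  system.time(
    suppressWarnings(
      neuralnet(f, d, hidden = c(10, 5, 10),
                linear.output = FALSE, stepmax = 2000)
    )
  )["elapsed"]
})
print(times)
```

Plotting `log(times)` against `log(n)` for a few sizes shows whether the growth is a power law, as in the measurements above.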

The training data looks like this:

head(training_data)
  C1 C2 C3 C4            C5            C6           C7            C8           C9          C10
1  0  0  0  0 0.00000000000 0.18417366947 0.4061624650 0.40812324930 0.3319327731 0.1767507003
2  0  0  0  0 0.09411764706 0.17913165266 0.2203081232 0.27507002801 0.3334733894 0.2981792717
3  0  0  0  0 0.00000000000 0.04733893557 0.0675070028 0.09719887955 0.1156862745 0.1326330532
4  0  0  0  0 0.00000000000 0.09733893557 0.1268907563 0.13221288515 0.1127450980 0.1238095238
5  0  0  0  0 0.00000000000 0.00000000000 0.0000000000 0.18515406162 0.3008403361 0.2282913165
6  0  0  0  0 0.00000000000 0.03739495798 0.1812324930 0.26610644258 0.2815126050 0.2732492997
            C11           C12           C13          C14          C15          C16          C17
1 0.07170868347 0.08179271709 0.07338935574 0.1078431373 0.1359943978 0.1280112045 0.1121848739
2 0.24425770308 0.23123249300 0.20938375350 0.1854341737 0.1883753501 0.1875350140 0.1659663866
3 0.13473389356 0.15042016807 0.16554621849 0.1892156863 0.3408963585 0.4490196078 0.2428571429
4 0.11904761905 0.11554621849 0.12142857143 0.1130252101 0.1102240896 0.1330532213 0.1320728291
5 0.22955182073 0.22338935574 0.21344537815 0.2236694678 0.2756302521 0.3422969188 0.2172268908
6 0.24411764706 0.12577030812 0.10098039216 0.1575630252 0.2481792717 0.3191876751 0.2941176471
            C18           C19           C20           C21           C22           C23           C24
1 0.11414565826 0.15042016807 0.22507002801 0.23109243697 0.22338935574 0.23865546218 0.25952380952
2 0.17450980392 0.15686274510 0.18641456583 0.27955182073 0.33151260504 0.26134453782 0.15238095238
3 0.07843137255 0.06722689076 0.06722689076 0.07254901961 0.07268907563 0.08067226891 0.08081232493
4 0.13207282913 0.13221288515 0.11820728291 0.13319327731 0.12941176471 0.11036414566 0.11036414566
5 0.09509803922 0.08361344538 0.07366946779 0.08361344538 0.08851540616 0.08039215686 0.07366946779
6 0.38067226891 0.38627450980 0.29201680672 0.28459383754 0.21694677871 0.05588235294 0.00000000000
            C25           C26          C27 C28 D0 D1 D2 D3 D4 D5 D6 D7 D8 D9
1 0.19467787115 0.00000000000 0.0000000000   0  0  0  0  0  0  1  0  0  0  0
2 0.00000000000 0.00000000000 0.0000000000   0  1  0  0  0  0  0  0  0  0  0
3 0.07044817927 0.00000000000 0.0000000000   0  0  0  0  0  1  0  0  0  0  0
4 0.09663865546 0.00000000000 0.0000000000   0  0  1  0  0  0  0  0  0  0  0
5 0.08109243697 0.08837535014 0.0637254902   0  0  0  0  0  0  0  0  0  0  1
6 0.00000000000 0.00000000000 0.0000000000   0  0  0  1  0  0  0  0  0  0  0

0 Answers
