Keras: Handling a Large Image Dataset

Date: 2018-11-10 16:33:22

Tags: keras large-data-volumes imagedata

I am trying to fit a model on a large image dataset. I have 14 GB of RAM, and the dataset is 40 GB. I tried using fit_generator, but it ends up not releasing the batches it has already loaded, so memory keeps growing after each batch is used.

Any tips or resources for handling this would be much appreciated.

Thanks.

The generator code is:

import numpy as np
import pandas as pd
from keras.utils import Sequence


class Data_Generator(Sequence):

    def __init__(self, image_filenames, labels, batch_size):
        self.image_filenames, self.labels = image_filenames, labels
        self.batch_size = batch_size

    def __len__(self):
        # Number of batches per epoch
        return int(np.ceil(len(self.image_filenames) / float(self.batch_size)))

    def __format_labels__(self, gd_truth):
        # Split the ground-truth DataFrame into one array per output column
        y = []
        for col in gd_truth.columns:
            y.append(gd_truth[col].values)
        return y

    def __getitem__(self, idx):
        # Load only the filenames and labels belonging to batch `idx`
        batch_x = self.image_filenames[idx * self.batch_size:(idx + 1) * self.batch_size]
        batch_y = self.labels[idx * self.batch_size:(idx + 1) * self.batch_size]
        gd_truth = pd.DataFrame(data=batch_y, columns=self.labels.columns)
        return np.array([read_image(file_name) for file_name in batch_x]), self.__format_labels__(gd_truth)
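
read_image is a small helper that loads a single image from disk; it is not shown above. For context, a minimal sketch of what it does, assuming fixed-size 224x224 RGB input normalized to [0, 1] (the exact size and normalization are placeholders, not the real values from my code):

import numpy as np
from PIL import Image

IMG_SIZE = (224, 224)  # placeholder target size

def read_image(file_name):
    # Load one image, resize it, and return a float32 array so that
    # only the current batch of images is ever held in memory.
    img = Image.open(file_name).convert('RGB').resize(IMG_SIZE)
    return np.asarray(img, dtype=np.float32) / 255.0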

Then I created two generators, one for the training images and one for the validation images:

my_training_batch_generator = Data_Generator(training_filenames, trainTargets, batch_size)
my_validation_batch_generator = Data_Generator(validation_filenames, valTargets, batch_size)

The fit_generator call is as follows:

num_epochs = 10
model.fit_generator(generator=my_training_batch_generator,
                    steps_per_epoch=(num_training_samples // batch_size),
                    epochs=num_epochs,
                    verbose=1,
                    validation_data=my_validation_batch_generator,
                    validation_steps=(num_validation_samples // batch_size),
                    max_queue_size=16)
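
As far as I understand, max_queue_size controls how many batches Keras keeps pre-loaded, so the call above can hold up to 16 batches in RAM at once. A variant of the same call with a smaller queue and background worker threads, just to illustrate the knobs involved (I have not verified that these values fix the problem):

model.fit_generator(generator=my_training_batch_generator,
                    steps_per_epoch=(num_training_samples // batch_size),
                    epochs=num_epochs,
                    verbose=1,
                    validation_data=my_validation_batch_generator,
                    validation_steps=(num_validation_samples // batch_size),
                    max_queue_size=4,           # fewer batches held in RAM
                    workers=2,                  # load batches in background threads
                    use_multiprocessing=False)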
