I am trying to work through a topic using the Encog library for Java, and I am getting the following error:
org.encog.EncogError: Must have at least one output for feedforward.
at org.encog.ml.factory.method.FeedforwardFactory.create(FeedforwardFactory.java:70)
at org.encog.plugin.system.SystemMethodsPlugin.createMethod(SystemMethodsPlugin.java:133)
at org.encog.ml.factory.MLMethodFactory.create(MLMethodFactory.java:104)
at org.encog.ml.model.EncogModel.createMethod(EncogModel.java:366)
at org.encog.ml.model.EncogModel.fitFold(EncogModel.java:190)
at org.encog.ml.model.EncogModel.crossvalidate(EncogModel.java:285)
at org.encog.examples.guide.classification.Classification.run(roadClassification.java:126)
at org.encog.examples.guide.classification.Classification.main(roadClassification.java:188)
My file is as follows:
import java.io.File;
import java.util.Arrays;

import org.encog.Encog;
import org.encog.ConsoleStatusReportable;
import org.encog.ml.MLRegression;
import org.encog.ml.data.MLData;
import org.encog.ml.data.versatile.NormalizationHelper;
import org.encog.ml.data.versatile.VersatileMLDataSet;
import org.encog.ml.data.versatile.columns.ColumnDefinition;
import org.encog.ml.data.versatile.columns.ColumnType;
import org.encog.ml.data.versatile.sources.CSVDataSource;
import org.encog.ml.data.versatile.sources.VersatileDataSource;
import org.encog.ml.factory.MLMethodFactory;
import org.encog.ml.model.EncogModel;
import org.encog.util.csv.CSVFormat;
import org.encog.util.csv.ReadCSV;
import org.encog.util.simple.EncogUtility;

public class Classification {

    private String tempPath;

    public void run(String[] args) {
        try {
            // Read the data that we will attempt to model.
            System.out.println("Read File");
            File roadFile = new File("E:/xyz.csv");

            // Define the format of the data file.
            // This area will change, depending on the columns and
            // format of the file that you are trying to model.
            System.out.println("Create source");
            VersatileDataSource source = new CSVDataSource(roadFile, false,
                    CSVFormat.DECIMAL_POINT);
            VersatileMLDataSet data = new VersatileMLDataSet(source);
            data.defineSourceColumn("abc", 0, ColumnType.continuous);
            data.defineSourceColumn("def", 1, ColumnType.continuous);
            data.defineSourceColumn("ghi", 2, ColumnType.continuous);
            data.defineSourceColumn("jkl", 3, ColumnType.continuous);
            data.defineSourceColumn("mno", 4, ColumnType.continuous);

            System.out.println("Create output column");
            // Define the column that we are trying to predict.
            ColumnDefinition outputColumn = data.defineSourceColumn("Prediction", 5,
                    ColumnType.nominal);
            System.out.println("Map the prediction column to the output of the model, and all other columns to the input.");
            data.defineOutput(outputColumn);

            // Analyze the data, determine the min/max/mean/sd of every column.
            System.out.println("Start Analysis");
            data.analyze();
            System.out.println("End Analysis");
            data.defineSingleOutputOthersInput(outputColumn);

            // Create a feedforward neural network as the model type: MLMethodFactory.TYPE_FEEDFORWARD.
            // You could also use other model types, such as:
            // MLMethodFactory.TYPE_SVM: Support Vector Machine (SVM)
            // MLMethodFactory.TYPE_RBFNETWORK: RBF Neural Network
            // MLMethodFactory.TYPE_NEAT: NEAT Neural Network
            // MLMethodFactory.TYPE_PNN: Probabilistic Neural Network
            System.out.println("Create TYPE_FEEDFORWARD neural network as the model type");
            EncogModel model = new EncogModel(data);
            model.selectMethod(data, MLMethodFactory.TYPE_FEEDFORWARD);

            // Send any output to the console.
            model.setReport(new ConsoleStatusReportable());

            // Now normalize the data. Encog will automatically determine the correct normalization
            // type based on the model you chose in the last step.
            System.out.println("Now normalize the data");
            data.normalize();

            // Hold back some data for a final validation.
            // Shuffle the data into a random ordering.
            // Use a seed of 1001 so that we always use the same holdback and will get more consistent results.
            model.holdBackValidation(0.03, true, 1001);

            // Choose whatever is the default training type for this model.
            model.selectTrainingType(data);

            // Use a 3-fold cross-validated train. Return the best method found.
            MLRegression bestMethod = (MLRegression) model.crossvalidate(3, true);

            // Display the training and validation errors.
            System.out.println("Training error: "
                    + EncogUtility.calculateRegressionError(bestMethod, model.getTrainingDataset()));
            System.out.println("Validation error: "
                    + EncogUtility.calculateRegressionError(bestMethod, model.getValidationDataset()));

            // Display our normalization parameters.
            NormalizationHelper helper = data.getNormHelper();
            System.out.println(helper.toString());

            // Display the final model.
            System.out.println("Final model: " + bestMethod);

            // Loop over the entire, original, dataset and feed it through the model.
            // This also shows how you would process new data, that was not part of your
            // training set. You do not need to retrain, simply use the NormalizationHelper
            // class. After you train, you can save the NormalizationHelper to later
            // normalize and denormalize your data.
            System.out.println("Loop over entire dataset.");
            ReadCSV csv = new ReadCSV(roadFile, false, CSVFormat.DECIMAL_POINT);
            String[] line = new String[5];
            MLData input = helper.allocateInputVector();
            int threshold = 50, count = 0;
            float error = 0, accuracy = 1;
            StringBuilder result = new StringBuilder();
            while (csv.next()) {
                if (count >= threshold) {
                    break;
                }
                count++;
                line[0] = csv.get(0);
                line[1] = csv.get(1);
                line[2] = csv.get(2);
                line[3] = csv.get(3);
                line[4] = csv.get(4);
                String correct = csv.get(5);
                helper.normalizeInputVector(line, input.getData(), false);
                MLData output = bestMethod.compute(input);
                String chosen = helper.denormalizeOutputVectorToString(output)[0];
                result.append("\n" + Arrays.toString(line));
                result.append(" -> predicted: ");
                result.append(chosen);
                error = Float.valueOf(correct) - Float.valueOf(chosen);
                result.append("(error: " + error);
                result.append(")");
                accuracy *= error / 100 * count;
            }
            result.append("\naccuracy in (%) :" + accuracy);
            System.out.println(result.toString());

            // Delete the data file and shut down.
            // roadFile.delete();
            Encog.getInstance().shutdown();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    public static void main(String[] args) {
        Classification prg = new Classification();
        prg.run(args);
    }
}
Here you can see that I have written both of these statements:

data.defineOutput(outputColumn);
data.defineSingleOutputOthersInput(outputColumn);

Even when I added the line

data.getNormHelper().getOutputColumns().add(outputColumn);

right after VersatileMLDataSet data = new VersatileMLDataSet(source);, it still throws this error.
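For reference, the call order in the Encog quickstart classification example that I adapted this from only uses defineSingleOutputOthersInput(outputColumn), called after analyze(), and never calls defineOutput(outputColumn) at all. A minimal sketch of that ordering (the column names and indices here are mine, not from the quickstart; this fragment needs the Encog jar and the surrounding setup to compile):

```java
// Ordering from the Encog quickstart classification example:
// 1. define all source columns, 2. analyze, 3. map output/inputs.
VersatileMLDataSet data = new VersatileMLDataSet(source);
data.defineSourceColumn("abc", 0, ColumnType.continuous);
// ... remaining input columns ...
ColumnDefinition outputColumn = data.defineSourceColumn("Prediction", 5,
        ColumnType.nominal);
data.analyze();                                   // analyze first
data.defineSingleOutputOthersInput(outputColumn); // then map output + inputs
```

My code follows the same analyze-then-map order, except for the extra defineOutput(outputColumn) call before analyze(), so I am not sure whether that extra call is the problem or something else is.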