Problem description
In my code I want to benchmark the training of a neural network (the model), but when I run it I get the error from the title. The guide says: "Sometimes you will want to initialize some variables that your benchmark code needs, but which you do not want to be part of the code your benchmark measures. Such variables are called state variables." I added the annotation above my main method as the guide describes, but it did not change the result. What am I doing wrong?
public class IrisClassifier {
private static Logger log = LoggerFactory.getLogger(IrisClassifier.class);
public static void main(String[] args) throws Exception {
//First: get the dataset using the record reader. CSVRecordReader handles loading/parsing
int numLinesToSkip = 0;
char delimiter = ',';
RecordReader recordReader = new CSVRecordReader(numLinesToSkip,delimiter);
recordReader.initialize(new FileSplit(new File(DownloaderUtility.IRISDATA.Download(),"iris.txt")));
//Second: the RecordReaderDataSetIterator handles conversion to DataSet objects,ready for use in neural network
int labelIndex = 4; //5 values in each row of the iris.txt CSV: 4 input features followed by an integer label (class) index. Labels are the 5th value (index 4) in each row
int numClasses = 3; //3 classes (types of iris flowers) in the iris data set. Classes have integer values 0,1 or 2
int batchSize = 150; //Iris data set: 150 examples total. We are loading all of them into one DataSet (not recommended for large data sets)
DataSetIterator iterator = new RecordReaderDataSetIterator(recordReader,batchSize,labelIndex,numClasses);
DataSet allData = iterator.next();
allData.shuffle();
SplitTestAndTrain testAndTrain = allData.splitTestAndTrain(0.65); //Use 65% of data for training
DataSet trainingData = testAndTrain.getTrain();
DataSet testData = testAndTrain.getTest();
//We need to normalize our data. We'll use NormalizerStandardize (which gives us mean 0, unit variance):
DataNormalization normalizer = new NormalizerStandardize();
normalizer.fit(trainingData); //Collect the statistics (mean/stdev) from the training data. This does not modify the input data
normalizer.transform(trainingData); //Apply normalization to the training data
normalizer.transform(testData); //Apply normalization to the test data. This is using statistics calculated from the *training* set
final int numInputs = 4;
int outputNum = 3;
long seed = 6;
log.info("Build model....");
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
.seed(seed)
.activation(Activation.TANH)
.weightInit(WeightInit.XAVIER)
.updater(new Sgd(0.1))
.l2(1e-4)
.list()
.layer(new DenseLayer.Builder().nIn(numInputs).nOut(3)
.build())
.layer(new DenseLayer.Builder().nIn(3).nOut(3)
.build())
.layer( new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
.activation(Activation.SOFTMAX) //Override the global TANH activation with softmax for this layer
.nIn(3).nOut(outputNum).build())
.build();
//run the model
MultiLayerNetwork model = new MultiLayerNetwork(conf);
model.init();
//record score once every 100 iterations
model.setListeners(new ScoreIterationListener(100), new PerformanceListener(100)); //a second setListeners call would replace the first, so register both listeners in one call
benchmarkingTraining(model,trainingData);
//evaluate the model on the test set
Evaluation eval = new Evaluation(3);
INDArray output = model.output(testData.getFeatures());
eval.eval(testData.getLabels(),output);
log.info(eval.stats());
}
public static class BenchmarkRunner {
public static void main(String[] args) throws Exception {
org.openjdk.jmh.Main.main(args);
}
}
@Fork(value = 1,warmups = 2)
@Benchmark
@BenchmarkMode(Mode.AverageTime)
public static void benchmarkingTraining(MultiLayerNetwork model,DataSet trainingData) {
for (int i = 0; i < 1000; i++) {
model.fit(trainingData);
}
}
}
Solution
No confirmed solution to this problem has been posted yet.
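Judging from the guide text quoted in the question, the error is most likely JMH rejecting the `@Benchmark` method because it takes arbitrary parameters (`MultiLayerNetwork`, `DataSet`). JMH can only inject parameters that are `@State` objects, and calling `benchmarkingTraining(...)` directly from `main` bypasses JMH entirely (the annotations only take effect when the benchmark is launched through the JMH runner against generated benchmark classes). Below is a sketch of the usual pattern, not a verified fix; the class and method names are illustrative, and the setup body stands in for the preparation code currently in `main()`:

```java
import org.openjdk.jmh.annotations.*;
import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.options.OptionsBuilder;

public class TrainingBenchmark {

    // State variables live in an @State class; JMH instantiates it and
    // runs @Setup outside the measured region.
    @State(Scope.Benchmark)
    public static class TrainingState {
        MultiLayerNetwork model;
        DataSet trainingData;

        @Setup(Level.Trial)
        public void setUp() throws Exception {
            // Build the network and load/normalize the iris data here,
            // i.e. the preparation code currently in main().
        }
    }

    @Benchmark
    @BenchmarkMode(Mode.AverageTime)
    @Fork(value = 1, warmups = 2)
    public void trainingStep(TrainingState state) {
        // Only the code inside the @Benchmark method is measured.
        state.model.fit(state.trainingData);
    }

    public static void main(String[] args) throws Exception {
        // Launch through the JMH runner instead of calling the method directly.
        new Runner(new OptionsBuilder()
                .include(TrainingBenchmark.class.getSimpleName())
                .build()).run();
    }
}
```

Note that this also drops the hand-written `for (int i = 0; i < 1000; i++)` loop: JMH itself decides how many invocations to run per measurement iteration.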
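As a sanity check that does not depend on JMH at all, training time can also be estimated with a minimal hand-rolled harness that warms up before timing. This is a rough sketch (class and method names are made up here, and it lacks JMH's forking and statistical rigor), but it is enough to confirm that the model trains and to get a ballpark number:

```java
// Minimal hand-rolled benchmark harness: warm up first, then measure.
public class SimpleBenchmark {

    /**
     * Runs warmupIters untimed iterations (so the JIT can compile hot paths),
     * then returns the average wall-clock time per iteration in nanoseconds
     * over measuredIters timed iterations.
     */
    public static double averageNanos(Runnable task, int warmupIters, int measuredIters) {
        for (int i = 0; i < warmupIters; i++) {
            task.run(); // warmup: not timed
        }
        long start = System.nanoTime();
        for (int i = 0; i < measuredIters; i++) {
            task.run(); // measured region
        }
        return (System.nanoTime() - start) / (double) measuredIters;
    }

    public static void main(String[] args) {
        // Example workload; in the question's code the task would be
        // () -> model.fit(trainingData).
        double avg = averageNanos(() -> {
            double acc = 0;
            for (int i = 0; i < 10_000; i++) {
                acc += Math.sqrt(i);
            }
        }, 100, 100);
        System.out.println("avg ns/iter: " + avg);
    }
}
```

For publishable numbers JMH is still the right tool; this harness ignores dead-code elimination and run-to-run variance, which JMH is specifically designed to handle.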