Java DeepLearning4j NN for a prediction function not converging

I am trying to do a simple prediction in DL4j (which will later be used for a large dataset with n features), but no matter what I do my network just doesn't want to learn and behaves very strangely. Of course I went through all the tutorials and followed the steps shown in the dl4j repo, but somehow it doesn't work for me.

For the dummy feature data I use:

    double[val][x] features; where val = linspace(-10, 10)...; x = Math.sqrt(Math.abs(val)) * val

and my y is:

    double[y] labels; where y = Math.sin(val) / val
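Concretely, the data generation looks roughly like this (a self-contained sketch; the `linspace` helper, the sample count, and the guard for the 0/0 point of `sin(val)/val` are my own additions, not DL4j code):

```java
// Dummy 1-D regression data: feature x = sqrt(|val|) * val, label y = sin(val) / val
public class DummyData {
    // Evenly spaced points in [min, max] (my own linspace helper)
    static double[] linspace(double min, double max, int n) {
        double[] out = new double[n];
        for (int i = 0; i < n; i++) out[i] = min + i * (max - min) / (n - 1);
        return out;
    }

    static double feature(double val) { return Math.sqrt(Math.abs(val)) * val; }

    // sin(val)/val, with the removable singularity at 0 patched to its limit 1
    static double label(double val) { return val == 0.0 ? 1.0 : Math.sin(val) / val; }

    public static void main(String[] args) {
        double[] val = linspace(-10, 10, 1000);
        double[][] x = new double[val.length][1];  // one feature column
        double[][] y = new double[val.length][1];  // one label column
        for (int i = 0; i < val.length; i++) {
            x[i][0] = feature(val[i]);
            y[i][0] = label(val[i]);
        }
        System.out.println("first feature = " + x[0][0] + ", first label = " + y[0][0]);
    }
}
```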

    DataSetIterator dataset_train_iter = getTrainingData(x_features, y_outputs_train, batchSize, rnd);
    DataSetIterator dataset_test_iter = getTrainingData(x_features_test, y_outputs_test, batchSize, rnd);

    // Normalize data, including labels (fitLabel=true)
    NormalizerMinMaxScaler normalizer = new NormalizerMinMaxScaler(0, 1);
    normalizer.fitLabel(false);
    normalizer.fit(dataset_train_iter);
    normalizer.fit(dataset_test_iter);

    // Use the .transform function only if you are working with a small dataset and no iterator
    normalizer.transform(dataset_train_iter.next());
    normalizer.transform(dataset_test_iter.next());

    dataset_train_iter.setPreProcessor(normalizer);
    dataset_test_iter.setPreProcessor(normalizer);

    //DataSet setNormal = dataset.next();
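One thing I am now unsure about is the normalization setup itself: the usual rule is to fit the scaler on the training data only and then reuse the same statistics for both sets. As a library-free sketch of the min-max logic (my own class, not the DL4j API):

```java
// Library-free sketch of min-max scaling to [0, 1]: fit on the TRAINING data only,
// then reuse the same min/max when transforming both train and test data.
public class MinMaxSketch {
    private double min, max;

    void fit(double[] train) {
        min = Double.POSITIVE_INFINITY;
        max = Double.NEGATIVE_INFINITY;
        for (double v : train) {
            min = Math.min(min, v);
            max = Math.max(max, v);
        }
    }

    double[] transform(double[] data) {
        double[] out = new double[data.length];
        for (int i = 0; i < data.length; i++) out[i] = (data[i] - min) / (max - min);
        return out;
    }
}
```

If that rule applies to `NormalizerMinMaxScaler` too, then calling `fit` on the test iterator as well (as in my code above) re-estimates the statistics from the test set, and the extra manual `transform(...next())` on top of `setPreProcessor` could normalize some batches twice.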

// Create the network

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(seed)
                .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                .weightInit(WeightInit.XAVIER)
                //.miniBatch(true)
                //.l2(1e-4)
                //.activation(Activation.TANH)
                .updater(new Nesterovs(0.1,0.3))
                .list()
                .layer(new DenseLayer.Builder().nIn(numInputs).nOut(20).activation(Activation.TANH)
                        .build())
                .layer(new DenseLayer.Builder().nIn(20).nOut(10).activation(Activation.TANH)
                        .build())
                .layer( new DenseLayer.Builder().nIn(10).nOut(6).activation(Activation.TANH)
                        .build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                        .activation(Activation.IDENTITY)
                        .nIn(6).nOut(1).build())
                .build();

// Train and fit the network

    final MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();
    net.setListeners(new ScoreIterationListener(100));
    //Train the network on the full data set, and evaluate it periodically
    final INDArray[] networkPredictions = new INDArray[nEpochs / plotFrequency];
    for (int i = 0; i < nEpochs; i++) {
        // fit() already performs backpropagation; see the release notes:
        // https://deeplearning4j.konduit.ai/release-notes/1.0.0-beta3
        net.fit(dataset_train_iter);
        dataset_train_iter.reset();
        if ((i + 1) % plotFrequency == 0) networkPredictions[i / plotFrequency] = net.output(x_features, false);
    }
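As a side note on the snapshot indexing in that loop: `(i + 1) % plotFrequency == 0` together with the slot `i / plotFrequency` fills `networkPredictions` exactly once per slot. A quick standalone check (with made-up values for `nEpochs` and `plotFrequency`):

```java
public class SnapshotIndex {
    // Which epoch lands in each snapshot slot, for the indexing used in the training loop
    static int[] snapshotEpochs(int nEpochs, int plotFrequency) {
        int[] slot = new int[nEpochs / plotFrequency];
        for (int i = 0; i < nEpochs; i++)
            if ((i + 1) % plotFrequency == 0) slot[i / plotFrequency] = i + 1;
        return slot;
    }

    public static void main(String[] args) {
        // Epochs 100, 200, ..., 500 land in slots 0..4
        System.out.println(java.util.Arrays.toString(snapshotEpochs(500, 100)));
    }
}
```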

// Evaluate and plot

    dataset_test_iter.reset();
    dataset_train_iter.reset();

    INDArray predicted = net.output(dataset_test_iter, false);
    System.out.println("PREDICTED ARRAY                " + predicted);
    INDArray output_train = net.output(dataset_train_iter, false);

    //Revert data back to original values for plotting
    // normalizer.revertLabels(predicted);
    normalizer.revertLabels(output_train);
    normalizer.revertLabels(predicted);

    PlotUtil.plot(om, y_outputs_train, networkPredictions);

My output looks really weird (see the image below), even when I use miniBatch (120, 100 samples/batch), change the number of epochs, or add hidden nodes and hidden layers (I tried adding 1000 nodes and 5 layers). The network either outputs very random values or one constant y. I just can't figure out what is going wrong here. Why does the network not even get close to the training function?

Another question: what exactly does iter.reset() do? Does it return the iterator's pointer back to the 0th batch in the DataSetIterator?
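For context, here is my mental model of `reset()` as a library-free sketch (my own class, not the actual DL4j implementation):

```java
import java.util.List;

// Mental model of DataSetIterator.reset(): rewind the cursor so that the
// next call to next() returns the first batch again (sketch, not the DL4j class).
public class BatchCursor {
    private final List<String> batches;
    private int cursor = 0;

    public BatchCursor(List<String> batches) { this.batches = batches; }

    public boolean hasNext() { return cursor < batches.size(); }
    public String next()     { return batches.get(cursor++); }
    public void reset()      { cursor = 0; }  // back to batch 0
}
```

If that model is right, then after `reset()` the iterator starts over from its first batch, which is why an epoch loop can reuse the same iterator.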

[image: plot of the network's predictions vs. the target function]


0 Answers