deeplearning4j / deeplearning4j-examples

Deeplearning4j Examples (DL4J, DL4J Spark, DataVec)

Home Page: http://deeplearning4j.konduit.ai

License: Other

Java 98.55% Shell 0.87% Kotlin 0.58%
javafx dl4j deeplearning4j intellij python zeppelin-notebook deeplearning artificial-intelligence

deeplearning4j-examples's Introduction


The Eclipse Deeplearning4J (DL4J) ecosystem is a set of projects intended to support all the needs of a JVM-based deep learning application: from loading and preprocessing raw data, wherever it lives and whatever format it is in, to building and tuning a wide variety of simple and complex deep learning networks.

Because Deeplearning4J runs on the JVM, you can use it from a wide variety of JVM-based languages besides Java, such as Scala, Kotlin, Clojure and many more.

The DL4J stack comprises:

  • DL4J: High level API to build MultiLayerNetworks and ComputationGraphs with a variety of layers, including custom ones. Supports importing Keras models from h5, including tf.keras models (as of 1.0.0-beta7) and also supports distributed training on Apache Spark
  • ND4J: General-purpose linear algebra library with over 500 mathematical, linear algebra and deep learning operations. ND4J is based on the highly optimized C++ codebase LibND4J, which provides CPU (AVX2/512) and GPU (CUDA) support and acceleration by libraries such as OpenBLAS, OneDNN (MKL-DNN), cuDNN, cuBLAS, etc. (A short ND4J sketch follows this list.)
  • SameDiff: Part of the ND4J library, SameDiff is our automatic differentiation / deep learning framework. SameDiff uses a graph-based (define-then-run) approach, similar to TensorFlow graph mode. Eager execution (as in TensorFlow 2.x eager mode or PyTorch) is planned. SameDiff supports importing TensorFlow frozen model format .pb (protobuf) models; import for ONNX, TensorFlow SavedModel and Keras models is planned. Deeplearning4j also has full SameDiff support for easily writing custom layers and loss functions.
  • DataVec: ETL for machine learning data in a wide variety of formats and files (HDFS, Spark, Images, Video, Audio, CSV, Excel etc)
  • LibND4J: C++ library that underpins everything. For more information on how the JVM accesses native arrays and operations, refer to JavaCPP
  • Python4J: Bundled cpython execution for the JVM
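As a quick taste of the ND4J API described above, here is a minimal sketch (illustrative only, not an official example; the class name Nd4jTaste is made up, and it assumes the nd4j-native-platform backend shown later in this README is on the classpath):

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class Nd4jTaste {
    public static void main(String[] args) {
        INDArray a = Nd4j.rand(2, 3);      // 2x3 matrix of uniform random values
        INDArray b = Nd4j.rand(3, 2);      // 3x2 matrix
        INDArray c = a.mmul(b);            // matrix multiplication -> 2x2 result
        System.out.println(c.addi(1.0));   // in-place scalar add, then print
    }
}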

All projects in the DL4J ecosystem support Windows, Linux and macOS. Hardware support includes CUDA GPUs (10.0, 10.1, 10.2 except OSX), x86 CPU (x86_64, avx2, avx512), ARM CPU (arm, arm64, armhf) and PowerPC (ppc64le).

Community Support

For support with the project, please head over to https://community.konduit.ai/

Using Eclipse Deeplearning4J in your project

Deeplearning4J has quite a few dependencies. For this reason we only support usage with a build tool.

<dependencies>
  <dependency>
      <groupId>org.deeplearning4j</groupId>
      <artifactId>deeplearning4j-core</artifactId>
      <version>1.0.0-M2.1</version>
  </dependency>
  <dependency>
      <groupId>org.nd4j</groupId>
      <artifactId>nd4j-native-platform</artifactId>
      <version>1.0.0-M2.1</version>
  </dependency>
</dependencies>

Add these dependencies to your pom.xml file to use Deeplearning4J with the CPU backend. A full standalone project example is available in the example repository, if you want to start a new Maven project from scratch.
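For orientation, here is a minimal sketch of what a first network might look like once the dependencies above are in place (illustrative only; the class name FirstNetwork, the layer sizes and the Adam updater are arbitrary choices, not an official example):

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class FirstNetwork {
    public static void main(String[] args) {
        // Two-layer classifier: 4 input features, 16 hidden units, 3 output classes
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(123)
            .updater(new Adam(1e-3))
            .list()
            .layer(new DenseLayer.Builder().nIn(4).nOut(16)
                .activation(Activation.RELU).build())
            .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .nIn(16).nOut(3)
                .activation(Activation.SOFTMAX).build())
            .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        System.out.println(net.summary()); // layer and parameter overview
        // net.fit(trainIter);             // train against a DataSetIterator of your data
    }
}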

Code samples

Because DL4J is a multi-faceted project with several modules in the monorepo, we recommend looking at the examples for a taste of how the different modules are used. Below we link to examples for each module.

  1. ND4J: https://github.com/deeplearning4j/deeplearning4j-examples/tree/master/nd4j-ndarray-examples
  2. DL4J: https://github.com/deeplearning4j/deeplearning4j-examples/tree/master/dl4j-examples
  3. SameDiff: https://github.com/deeplearning4j/deeplearning4j-examples/tree/master/samediff-examples
  4. DataVec: https://github.com/deeplearning4j/deeplearning4j-examples/tree/master/data-pipeline-examples
  5. Python4J: https://deeplearning4j.konduit.ai/python4j/tutorials/quickstart

For users looking to run models from other frameworks, see:

  1. ONNX: https://github.com/deeplearning4j/deeplearning4j-examples/tree/master/onnx-import-examples
  2. TensorFlow/Keras: https://github.com/deeplearning4j/deeplearning4j-examples/tree/master/tensorflow-keras-import-examples

Documentation, Guides and Tutorials

You can find the official documentation for Deeplearning4J and the other libraries of its ecosystem at http://deeplearning4j.konduit.ai/.

Want some examples?

We have a separate repository with various examples available: https://github.com/eclipse/deeplearning4j-examples

Building from source

We recommend using the official pre-compiled releases (see above). But if you want to build from source, first take a look at the prerequisites here: https://deeplearning4j.konduit.ai/multi-project/how-to-guides/build-from-source. Various instructions for CPU and GPU builds can be found there. Please go to our forums for further help.

Running tests

To run the tests, please see the platform-tests module. This module only runs on JDK 11 (mostly due to Spark, and bugs in older Scala versions with JDK 17).

platform-tests allows you to run DL4J against different backends. There are a few properties you can specify on the command line (see the example command after this list):

  1. backend.artifactId: this defaults to nd4j-native and will run tests on CPU; you can specify other backends such as nd4j-cuda-11.6
  2. dl4j.version: You can change the dl4j version that the tests run against. This defaults to 1.0.0-SNAPSHOT.
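For example, a hypothetical invocation might look like the following (assuming the module is selected with Maven's -pl flag; adjust module selection and versions to your setup):

mvn test -pl platform-tests -Dbackend.artifactId=nd4j-cuda-11.6 -Ddl4j.version=1.0.0-SNAPSHOT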

More parameters can be found here:

<platform.classifier>${javacpp.platform}</platform.classifier>

Running the project in IntelliJ IDEA:

  1. On JDK 9 or later, ensure you follow https://stackoverflow.com/questions/45370178/exporting-a-package-from-system-module-is-not-allowed-with-release
  2. Ignore all nd4j-shade submodules. Right-click on each folder and click: Maven -> Ignore project

License

Apache License 2.0

Commercial Support

Deeplearning4J is actively developed by the team at Konduit K.K.

If you need any commercial support, feel free to reach out to us at [email protected]

deeplearning4j-examples's People

Contributors

agibsonccc, alexdblack, bam4d, dariuszzbyrad, donaldalan, eraly, k-tamura, lavajaw, loxal, maxpumperla, mstrap, onebeartoe, robaltena, saudet, shamsulazeem, skymindops, treo, yptheangel

deeplearning4j-examples's Issues

Windows quick start docs

I was suggested in Gitter to post this as an issue.

The getting started page says that for an easier install we should follow this link: "Windows instructions are available here". However, those instructions are long, and are for compiling ND4J with Visual Studio.

Much later I realized that the quick solution was two lines above, with the Dropbox link for all the DLLs. I think it would be helpful to clean up the quick part of the instructions a little so that others don't spend as much time on this as I (and others in the past) did :)

P.S.: In the quickstart section, the link to "Windows instructions are available here" is broken.

Thanks

Modified CSV example, unclear why the model is not trained properly

Based upon the CSVExample, here's a slightly modified version. The net uses only one hidden layer, with 8 nodes. I'm trying to train a network to recognize how high or low the output of a function f(x,y) = x + y^2 should be. First a file myfunc.csv is generated that contains 'num' lines of 3 integers per line. First two are x and y (the features) and the third is the label. The label is one of the classes in {0, 1, 2, 3}. The range of x and y is bound by 0 <= x,y < 0.1 * num
Let K be 0.1*num + 0.01*num^2; then
x,y are associated with class 0 iff 0 <= x + y^2 < K/4
x,y are associated with class 1 iff K/4 <= x + y^2 < K/2
x,y are associated with class 2 iff K/2 <= x + y^2 < 3K/4
x,y are associated with class 3 iff 3K/4 <= x + y^2 < K
After training the net (after 4000 iterations the error doesn't drop any further), all the model predicts is that every input belongs to class '0', which clearly can't be the case. I'm also confused about why model.predict(Nd4j.create(new float[]{2,3})) returns an array of length 2 - why not 4, the number of possible classes?

Thanks!

Mark Smeets.

package org.deeplearning4j.examples.csv;

import org.canova.api.records.reader.RecordReader;
import org.canova.api.records.reader.impl.CSVRecordReader;
import org.canova.api.split.FileSplit;
import org.deeplearning4j.datasets.canova.RecordReaderDataSetIterator;
import org.deeplearning4j.eval.Evaluation;
import org.deeplearning4j.nn.conf.GradientNormalization;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.weights.WeightInit;
import org.deeplearning4j.optimize.api.IterationListener;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.dataset.SplitTestAndTrain;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.PrintWriter;
import java.util.Arrays;

public class CSVModExample
{

  private static Logger log = LoggerFactory.getLogger(CSVModExample.class);

  public static void main(String[] args) throws Exception
  {

    int num_samples = 1000;
    String fileName = "myfunc.csv";
    if (!new File(fileName).exists())
    {
      log.info("Generating new file with data: " + fileName);
      generate(fileName, num_samples);
    }

    RecordReader recordReader = new CSVRecordReader(1, ",");
    recordReader.initialize(new FileSplit(new File(fileName)));

    // take large batchsize, so we don't have to iterate.
    int batchSize = num_samples;
    DataSetIterator iterator = new RecordReaderDataSetIterator(recordReader,batchSize, 2, 4);
    //get the dataset using the record reader. The datasetiterator handles vectorization
    DataSet next = iterator.next();
    // Customizing params
    Nd4j.MAX_SLICES_TO_PRINT = 10;
    Nd4j.MAX_ELEMENTS_PER_SLICE = 10;

    final int numInputs = 2;
    final int numHiddenNodes = 8;
    int outputNum = 4;
    int iterations = 400;
    long seed = 6;
    int listenerFreq = 10;


    log.info("Build model....");
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
      .seed(seed)
      .iterations(iterations)
      .useDropConnect(true)
      .learningRate(1e-1)
      .l1(0.3).regularization(true).l2(1e-3)
      .gradientNormalization(GradientNormalization.RenormalizeL2PerLayer)
      .list(2)
      .layer(0, new DenseLayer.Builder().nIn(numInputs).nOut(numHiddenNodes)
        .activation("relu")
        .weightInit(WeightInit.XAVIER)
        .build())
      .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
        .weightInit(WeightInit.XAVIER)
        .activation("softmax")
        .nIn(numHiddenNodes).nOut(outputNum).build())
      .backprop(true).pretrain(false)
      .build();

    //run the model
    MultiLayerNetwork model = new MultiLayerNetwork(conf);
    model.init();
    model.setListeners(Arrays.asList((IterationListener) new ScoreIterationListener(listenerFreq)));

    next.normalizeZeroMeanZeroUnitVariance();
    next.shuffle();
    //split test and train, don't bother with a validation set for now
    SplitTestAndTrain testAndTrain = next.splitTestAndTrain(0.8);

    DataSet trainSet = testAndTrain.getTrain();
    log.info("Training set has " + trainSet.numExamples() + " records ");
    model.fit(trainSet);

    //evaluate the model
    // print some stats on the test set
    DataSet testSet = testAndTrain.getTest();
    log.info("Test set has " + testSet.numExamples() + " records ");
    // show that at all classes occur in the test set at least once.
    for (int i = 0; i < 15; i++)
    {
      log.info(testSet.get(i).getFeatures() + " -> " + testSet.get(i).getLabels() );
    }

    Evaluation eval = new Evaluation(4);
    DataSet test = testAndTrain.getTest();
    INDArray output = model.output(test.getFeatureMatrix());
    eval.eval(test.getLabels(), output);
    // strange.. every possible input is always mapped to class '0'.
    log.info(eval.stats());

    // let's try to predict in what class x+y^2 would be for x = 90, y = 90 ( should give class '3' )
    int[] prediction = model.predict(Nd4j.create(new float[]{90, 90}));
    log.info("length of output: " + prediction.length); // why is the length 2, and not 4?
    for (int i = 0; i < prediction.length; i++)
    {
      log.info("" + i + " -> " + prediction[i]);
    }
  }

  /**
   *
   * @param fileName_
   * @param num_
   * @throws FileNotFoundException
   * generate a file with three integers per line: x, y, f.
   * x and y are features, f is one of the classes in {0, 1, 2, 3}
   * the range of x and y is bound by 0 <= x,y < 0.1 * num
   * Let K be 0.1*num + 0.01*num^2 then
   * x,y are associated with class 0 iff  0 <= x + y^2 < K/4
   * x,y are associated with class 1 iff  K/4 <= x + y^2 < K/2
   * x,y are associated with class 2 iff  K/2 <= x + y^2 < 3K/4
   * x,y are associated with class 3 iff  3K/4 <= x + y^2 < K
   *
   * In other words, we generate some random numbers, apply some function on them and associate them with classes
   * depending on how high or low the output of that function is.
   */
  private static void generate(String fileName_, int num_) throws FileNotFoundException
  {
    java.util.Random r = new java.util.Random(123);
    PrintWriter pw = new PrintWriter(fileName_);
    int bound = (int) (0.1 * num_);
    int K = bound + bound*bound;
//        pw.println("#x,y,f with K = " + K);
    for (int i = 0; i < num_; i++)
    {
      int x = r.nextInt(bound);
      int y = r.nextInt(bound);
      int f = x + y*y;
      int cls = 0;
      if (f < K / 4)
        cls = 0;
      else if (f < (K / 2))
        cls = 1;
      else if (f < (K * 3/4))
        cls = 2;
      else
        cls = 3;
      pw.println(x + "," + y + "," + cls);
    }
    pw.flush();
    pw.close();
  }

}

Not running code

cd C:\Users\Abhilasha\Documents\NetBeansProjects\dl4j-0.4-examples-master; "JAVA_HOME=C:\Program Files\Java\jdk1.8.0" cmd /c """C:\Program Files\NetBeans 8.0.1\java\maven\bin\mvn.bat" -Dexec.args="-classpath %classpath org.deeplearning4j.examples.rnn.GravesLSTMCharModellingExample" -Dexec.executable="C:\Program Files\Java\jdk1.8.0\bin\java.exe" -Dexec.classpathScope=runtime -Dmaven.ext.class.path="C:\Program Files\NetBeans 8.0.1\java\maven-nblib\netbeans-eventspy.jar" org.codehaus.mojo:exec-maven-plugin:1.2.1:exec""
Running NetBeans Compile On Save execution. Phase execution is skipped and output directories of dependency projects (with Compile on Save turned on) will be used instead of their jar artifacts.
Scanning for projects...

Some problems were encountered while building the effective model for org.deeplearning4j:deeplearning4j-examples:jar:0.4-rc0-SNAPSHOT
'build.plugins.plugin.version' for org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 146, column 21

It is highly recommended to fix these problems because they threaten the stability of your build.

For this reason, future Maven versions might no longer support building such malformed projects.


Building DeepLearning4j Examples 0.4-rc0-SNAPSHOT

--- exec-maven-plugin:1.2.1:exec (default-cli) @ deeplearning4j-examples ---
File downloaded to C:\Users\ABHILA~1\AppData\Local\Temp\Shakespeare.txt
Loaded and converted file: 5459809 valid characters of 5465100 total characters (5291 removed)
SLF4J: The requested version 1.6 by your slf4j binding is not compatible with [1.5.5, 1.5.6, 1.5.7, 1.5.8]
SLF4J: See http://www.slf4j.org/codes.html#version_mismatch for further details.
Exception in thread "main" java.lang.ExceptionInInitializerError: unable to load from [netlib-native_system-win-x86_64.dll]
at com.github.fommil.jni.JniLoader.load(JniLoader.java:81)
at com.github.fommil.netlib.NativeSystemBLAS.<clinit>(NativeSystemBLAS.java:42)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:259)
at org.nd4j.linalg.cpu.BlasWrapper.<clinit>(BlasWrapper.java:41)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:259)
at org.nd4j.linalg.factory.Nd4j.initWithBackend(Nd4j.java:4532)
at org.nd4j.linalg.factory.Nd4j.initContext(Nd4j.java:4490)
at org.nd4j.linalg.factory.Nd4j.<clinit>(Nd4j.java:136)
at org.deeplearning4j.nn.conf.NeuralNetConfiguration$Builder.seed(NeuralNetConfiguration.java:415)

at org.deeplearning4j.examples.rnn.GravesLSTMCharModellingExample.main(GravesLSTMCharModellingExample.java:73)

BUILD FAILURE

Total time: 1:01.600s
Finished at: Tue Oct 27 21:24:13 IST 2015

Final Memory: 12M/165M

Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec (default-cli) on project deeplearning4j-examples: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]

To see the full stack trace of the errors, re-run Maven with the -e switch.
Re-run Maven using the -X switch to enable full debug logging.

For more information about the errors and possible solutions, please read the following articles:
[Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

please help me to run this code

MLPBackpropIrisExample's Seed Variable Largely Affects Prediction Result

I'm experimenting with the MLPBackpropIrisExample and noticed that varying the seed variable has a big impact in the prediction result. The default seed is 6 and this gives varying prediction results for different rows (ex: higher probability for class 0 in some cases, higher probability for class 1 or class 2 in other cases). However, when I change the default seed to another value, say 61 or 10, all examples/rows were classified to belong to a single class. The prediction for each row has the following probability for the Iris dataset: [ 0.33, 0.33, 0.33]. I am wondering why varying the seed largely affects the prediction result. Isn't it mainly used for making prediction results reproducible? What are the parameters that are normally modified to improve the prediction result?

Saving, reloading, and predicting a regression model producing static scalars

I modified the regression example to output the mean and std. I pulled the logic out of the normalize method.

        DataSetIterator iter = new RecordReaderDataSetIterator(reader,null,450000,-1,1,true);
        DataSet next = iter.next();

        INDArray columnMeans = next.getFeatures().mean(0);
        INDArray columnStds = next.getFeatureMatrix().std(0);

        next.setFeatures(next.getFeatures().subiRowVector(columnMeans));
        columnStds.addi(Nd4j.scalar(Nd4j.EPS_THRESHOLD));
        next.setFeatures(next.getFeatures().diviRowVector(columnStds));

        SplitTestAndTrain testAndTrain = next.splitTestAndTrain(0.8);

I then output and measure the error at the end of RegressionExample:

output = model.output(testAndTrain.getTest().getFeatureMatrix());
for (int i = 0; i < output.vectorsAlongDimension(output.rank()-1); i++) {
    Double actual = testAndTrain.getTest().getLabels().getRow(i).getDouble(0);
    Double predicted = output.getRow(i).getDouble(0);
    log.info("actual " + actual + " vs predicted " + predicted);
    avgError = avgError + (Math.abs(actual - predicted) - avgError) / (tests + 1);
    tests++;
}

At this point the test set produces reasonable predictions.

I output the settings:

OutputStream fos = Files.newOutputStream(Paths.get(System.getProperty("user.home") + "/coefficients_" +
        filename + ".bin"));
DataOutputStream dos = new DataOutputStream(fos);
Nd4j.write(model.params(), dos);
dos.flush();
dos.close();

fos = Files.newOutputStream(Paths.get(System.getProperty("user.home") + "/means_" +
        filename + ".bin"));
dos = new DataOutputStream(fos);
Nd4j.write(columnMeans, dos);
dos.flush();
dos.close();

Then I later reload it:

confFromJson = MultiLayerConfiguration.fromJson(FileUtils.readFileToString(new File(System.getProperty("user.home") + "/conf.json")));
DataInputStream dis = new DataInputStream(new FileInputStream(System.getProperty("user.home") + "/coefficients.bin"));
INDArray newParams = Nd4j.read(dis);
dis.close();
savedNetwork = new MultiLayerNetwork(confFromJson);
savedNetwork.init();
savedNetwork.setParameters(newParams);

dis = new DataInputStream(new FileInputStream(System.getProperty("user.home") + "/means.bin"));
trainMeans = Nd4j.read(dis);
dis.close();

I produce an INDArray/DataSet with some of the training data:

INDArray pointInTime = Nd4j.zeros(949);
...
pointInTime.putScalar(i,dataset.getValue(i).toDouble());
...
next = new DataSet(pointInTime, resultsArray);

I transform the data:

next.setFeatures(next.getFeatures().subiRowVector(trainMeans));
next.setFeatures(next.getFeatures().diviRowVector(trainStd));

Then try to make the prediction:

INDArray test = next.getFeatureMatrix();
INDArray output = savedNetwork.output(test);
Double predicted = output.getRow(0).getDouble(0);

At this point I get the same scalar value no matter what data I feed in (with the exception of sometimes getting NaN).

I'm wondering if maybe it's only considering the bias? 1 x weight = same scalar value every time?

Can anyone spot somewhere I messed up?
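For reference (not part of the original report): newer DL4J versions can bundle the configuration, parameters and updater state into a single file via ModelSerializer, which avoids hand-rolling the coefficients/means round trip above. A minimal sketch, assuming a trained MultiLayerNetwork named model:

import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.util.ModelSerializer;
import java.io.File;

// Save configuration + parameters (+ updater state) to one file
File modelFile = new File(System.getProperty("user.home"), "regression-model.zip");
ModelSerializer.writeModel(model, modelFile, true);   // true = also save the updater state

// ...later, restore it and predict as before
MultiLayerNetwork restored = ModelSerializer.restoreMultiLayerNetwork(modelFile);

Normalization statistics can likewise be persisted with the normalizer serialization utilities instead of writing the means/stds out by hand.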

DBNMnistFullExample Never Terminates

As written, org.deeplearning4j.examples.deepbelief.DBNMnistFullExample never terminates under deeplearning4j v0.4-rc3.6. Specifically, after running it for some time, I found the steady state was to continuously repeat:

15:49:08.110 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Model score after step = 0.0
15:49:08.110 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - tmpStep: NaN
15:49:08.220 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Model score after step = 0.0
15:49:08.220 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - tmpStep: NaN
15:49:08.333 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Model score after step = 0.0
15:49:08.333 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - tmpStep: NaN
15:49:08.333 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Exited line search after maxIterations termination condition; score did not improve (bestScore=0.0, scoreAtStart=0.0). Resetting parameters
15:49:08.429 [main] DEBUG o.d.optimize.solvers.BaseOptimizer - Step size returned by line search is 0.0.
15:49:08.629 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - slope = NaN
15:49:08.758 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Model score after step = 0.0
15:49:08.758 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - tmpStep: NaN
15:49:08.863 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Model score after step = 0.0
15:49:08.863 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - tmpStep: NaN
15:49:08.974 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Model score after step = 0.0
15:49:08.974 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - tmpStep: NaN
15:49:09.093 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Model score after step = 0.0
15:49:09.093 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - tmpStep: NaN
15:49:09.208 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Model score after step = 0.0
15:49:09.208 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - tmpStep: NaN
15:49:09.208 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - Exited line search after maxIterations termination condition; score did not improve (bestScore=0.0, scoreAtStart=0.0). Resetting parameters
15:49:09.301 [main] DEBUG o.d.optimize.solvers.BaseOptimizer - Step size returned by line search is 0.0.

VideoClassificationExample throws exception

I got the exception after the video data is generated and training has been running for about 10 seconds.

my env:

java version "1.8.0_73"
Java(TM) SE Runtime Environment (build 1.8.0_73-b02)
Java HotSpot(TM) 64-Bit Server VM (build 25.73-b02, mixed mode)

OS:
Linux gao-VirtualBox 4.2.0-34-generic #39~14.04.1-Ubuntu SMP Fri Mar 11 11:38:02 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux

stack trace:

Starting data generation...
Data generation complete
Mar 20, 2016 6:42:48 PM com.github.fommil.jni.JniLoader liberalLoad
INFO: successfully loaded /tmp/jniloader8405270326952543581netlib-native_system-linux-x86_64.so
Number of parameters in network: 56844
Layer 0 nParams = 9030
Layer 1 nParams = 0
Layer 2 nParams = 2710
Layer 3 nParams = 24550
Layer 4 nParams = 20350
Layer 5 nParams = 204
Starting training...
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to load image
at org.canova.codec.reader.CodecRecordReader.sequenceRecord(CodecRecordReader.java:101)
at org.deeplearning4j.datasets.canova.SequenceRecordReaderDataSetIterator.nextMultipleSequenceReaders(SequenceRecordReaderDataSetIterator.java:190)
at org.deeplearning4j.datasets.canova.SequenceRecordReaderDataSetIterator.next(SequenceRecordReaderDataSetIterator.java:127)
at org.deeplearning4j.datasets.canova.SequenceRecordReaderDataSetIterator.next(SequenceRecordReaderDataSetIterator.java:109)
at org.deeplearning4j.datasets.canova.SequenceRecordReaderDataSetIterator.next(SequenceRecordReaderDataSetIterator.java:23)
at org.deeplearning4j.datasets.iterator.AsyncDataSetIterator$IteratorRunnable.run(AsyncDataSetIterator.java:195)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Unable to load image
at org.canova.image.loader.ImageLoader.toRaveledTensor(ImageLoader.java:159)
at org.canova.codec.reader.CodecRecordReader.sequenceRecord(CodecRecordReader.java:95)
... 6 more
Caused by: java.lang.IndexOutOfBoundsException: 12675
at java.nio.ByteBufferAsIntBufferB.get(ByteBufferAsIntBufferB.java:115)
at org.nd4j.linalg.api.buffer.BaseDataBuffer.getDouble(BaseDataBuffer.java:975)
at org.nd4j.linalg.api.shape.Shape.getDouble(Shape.java:164)
at org.nd4j.linalg.api.ndarray.BaseNDArray.getDouble(BaseNDArray.java:1398)
at org.nd4j.linalg.api.ndarray.BaseNDArray.assign(BaseNDArray.java:1031)
at org.nd4j.linalg.api.ndarray.BaseNDArray.reshape(BaseNDArray.java:3206)
at org.nd4j.linalg.api.ndarray.BaseNDArray.reshape(BaseNDArray.java:3246)
at org.nd4j.linalg.api.ndarray.BaseNDArray.reshape(BaseNDArray.java:3475)
at org.nd4j.linalg.api.ndarray.BaseNDArray.ravel(BaseNDArray.java:3448)
at org.canova.image.loader.ImageLoader.toRaveledTensor(ImageLoader.java:157)
... 7 more

CNNLFWExample has no result ?

When I run CNNLFWExample, I get the result below:

Warning: class Anna_Faris was never predicted by the model. This class was excluded from the average precision
Warning: class Anna_Faris has never appeared as a true label. This class was excluded from the average recall
Warning: class Carmen_Electra was never predicted by the model. This class was excluded from the average precision
Warning: class Carmen_Electra has never appeared as a true label. This class was excluded from the average recall
Warning: class Phil_Donahue was never predicted by the model. This class was excluded from the average precision
Warning: class Phil_Donahue has never appeared as a true label. This class was excluded from the average recall
Warning: class Grant_Rossenmeyer has never appeared as a true label. This class was excluded from the average recall
Warning: class 1672 was never predicted by the model. This class was excluded from the average precision
Warning: class 1672 has never appeared as a true label. This class was excluded from the average recall
Warning: class 1675 was never predicted by the model. This class was excluded from the average precision
Warning: class 1675 has never appeared as a true label. This class was excluded from the average recall
Warning: class 1674 was never predicted by the model. This class was excluded from the average precision
Warning: class 1674 has never appeared as a true label. This class was excluded from the average recall
==========================Scores========================================
 Accuracy:  0
 Precision: 0
 Recall:    0
 F1 Score:  0
========================================================================
o.d.e.c.CNNLFWExample - ****************Example finished********************

Somebody said the pom config should be:

<properties>
    <nd4j.version>0.4-rc3.4</nd4j.version>
    <dl4j.version>0.4-rc3.4</dl4j.version>
    <canova.version>0.0.0.11</canova.version>
    <jackson.version>2.5.1</jackson.version>
</properties>

So I edited the pom
from

<properties>
    <nd4j.version>0.4-rc3.8</nd4j.version>
    <dl4j.version>0.4-rc3.8</dl4j.version>
    <canova.version>0.0.0.14</canova.version>
    <jackson.version>2.5.1</jackson.version>
</properties>

to

<properties>
    <nd4j.version>0.4-rc3.4</nd4j.version>
    <dl4j.version>0.4-rc3.4</dl4j.version>
    <canova.version>0.0.0.11</canova.version>
    <jackson.version>2.5.1</jackson.version>
</properties>

Then I hit many compile problems, which means the current dl4j-0.4-examples does not match 0.4-rc3.4.

Which branch of dl4j-0.4-examples should I clone to match 0.4-rc3.4?

Or is there another way to solve this?

Can anybody help?

For better results

git diff
diff --git a/src/main/java/org/deeplearning4j/examples/convolution/CNNIrisExample.java b/src/main/java/org/deeplearning4j/examples/convolution/CNNIrisExample.java
index 25ba171..38b896e 100644
--- a/src/main/java/org/deeplearning4j/examples/convolution/CNNIrisExample.java
+++ b/src/main/java/org/deeplearning4j/examples/convolution/CNNIrisExample.java
@@ -40,9 +40,9 @@ public class CNNIrisExample {
         int nChannels = 1;
         int outputNum = 3;
         int numSamples = 150;
-        int batchSize = 110;
-        int iterations = 10;
-        int splitTrainNum = 100;
+        int batchSize = 60;
+        int iterations = 100;
+        int splitTrainNum = 11;
         int seed = 123;
         int listenerFreq = 1;

@@ -69,12 +69,12 @@ public class CNNIrisExample {
                 .list(2)
                 .layer(0, new ConvolutionLayer.Builder(new int[]{1, 1})
                         .nIn(nChannels)
-                        .nOut(6).dropOut(0.5)
+                        .nOut(19).dropOut(0.5)
                         .activation("relu")
                         .weightInit(WeightInit.XAVIER)
                         .build())
                 .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
-                        .nIn(6)
+                        .nIn(19)
                         .nOut(outputNum)
                         .weightInit(WeightInit.XAVIER)
                         .activation("softmax")

parallel training example

It would be great if there were an example demonstrating how to parallelize training. It would be particularly interesting for the use case of multiple cores on the same machine, since most machines nowadays have anywhere between 4 and 8 cores.

Is there some deeplearning4j infrastructure to do parallel training of one batch, i.e. to accumulate the training deltas and merge them after the parallelized batch execution has finished?
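For what it's worth (not part of the original request): later DL4J releases ship a ParallelWrapper that does roughly this, training several copies of the model on one machine and periodically averaging their parameters. A minimal sketch, assuming an already-initialised MultiLayerNetwork net, a DataSetIterator trainIter, and the parallel-wrapper module on the classpath:

import org.deeplearning4j.parallelism.ParallelWrapper;

// Each worker trains on its own batches; parameters are averaged every
// 'averagingFrequency' iterations and written back into 'net'.
ParallelWrapper wrapper = new ParallelWrapper.Builder<>(net)
        .workers(4)                 // e.g. one worker per group of cores
        .prefetchBuffer(8)          // batches to pre-load per worker
        .averagingFrequency(3)
        .build();

wrapper.fit(trainIter);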

Does not build under Intellij

IntelliJ cannot work with the dependencies listed in this project: one of the Maven dependencies, com.github.jai-imageio, has invalid metadata.

Screenshot is attached

screen shot 2016-01-26 at 1 15 08 pm

retraining model problem

I have the problem that I can't retrain a saved model:
train an MLP with Train.scala
retrain the MLP with Retrain.scala

but Retrain.scala seems to do no training (only a single iteration is reported), as shown in the attached screenshot ("trouble").

and here's Train.scala (just changed its name to Train.txt for uploading)
Train.txt

Retrain.scala (just changed its name to Retrain.txt for uploading)
Retrain.txt

I use sbt to compile them, and the following jars are used in my case:
deeplearning4j-core-0.4-rc0.jar
nd4j-api-0.4-rc0.jar
nd4j-bytebuddy-0.4-rc0.jar
nd4j-x86-0.4-rc0.jar

thx for any help!

Version conflict with Guava dependency

When you use HistogramIterationListener, you'll get the following error. This is caused by deeplearning4j/deeplearning4j#756.

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/collect/FluentIterable
    at com.fasterxml.jackson.datatype.guava.GuavaTypeModifier.modifyType(GuavaTypeModifier.java:45)
    at com.fasterxml.jackson.databind.type.TypeFactory._constructType(TypeFactory.java:413)
    at com.fasterxml.jackson.databind.type.TypeFactory.constructType(TypeFactory.java:358)
    at com.fasterxml.jackson.databind.cfg.MapperConfig.constructType(MapperConfig.java:268)
    at com.fasterxml.jackson.databind.cfg.MapperConfig.introspectClassAnnotations(MapperConfig.java:298)
    at com.fasterxml.jackson.databind.deser.BasicDeserializerFactory.findTypeDeserializer(BasicDeserializerFactory.java:1238)
    at com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:445)
    at com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3666)
    at com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:3529)
    at com.fasterxml.jackson.databind.ObjectMapper.readTree(ObjectMapper.java:1978)
    at io.dropwizard.configuration.ConfigurationFactory.build(ConfigurationFactory.java:80)
    at io.dropwizard.cli.ConfiguredCommand.parseConfiguration(ConfiguredCommand.java:114)
    at io.dropwizard.cli.ConfiguredCommand.run(ConfiguredCommand.java:63)
    at io.dropwizard.cli.Cli.run(Cli.java:70)
    at io.dropwizard.Application.run(Application.java:73)
    at org.deeplearning4j.ui.UiServer.createServer(UiServer.java:174)
    at org.deeplearning4j.ui.UiServer.getInstance(UiServer.java:72)
    at org.deeplearning4j.ui.weights.HistogramIterationListener.<init>(HistogramIterationListener.java:50)
    at org.deeplearning4j.ui.weights.HistogramIterationListener.<init>(HistogramIterationListener.java:45)
    at org.deeplearning4j.examples.mlp.MLPBackpropIrisExample.main(MLPBackpropIrisExample.java:78)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: com.google.common.collect.FluentIterable
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 25 more

The current workaround is https://gist.github.com/anonymous/9399ac4610c6702b5e44 by Adam. Simply replace the whole POM content with the one in the gist.

Heap is breaking : Large Saved Model requires a full load in-memory [WordVectorSerializer.loadFullModel]

I was testing for higher accuracy on approx. 2 billion words and was playing around with a larger number of degrees of freedom (close to 900): vector size = 900, window size = 5, minWordFrequency = 3.

The model file understandably grows to in excess of approximately 4 GB.

Now, saving and loading this with WordVectorSerializer.loadFullModel requires a (Java/Scala) heap allocation of about 6 GB...

Is there a way to parallelize this, or make use of Spark/RDDs?

Windows Installation Guide From Scratch (by a DL4J User)

I first posted this on the gitter forum and then was told to create an issue so here it is:

Installation guide for deeplearning4j, up to getting the examples running, on Windows 7 and above

1.) Install Java (https://www.java.com/de/download/)
2.) Create an environment variable JAVA_HOME with the content "C:\Program Files\Java\jdk1.8.0_51"
3.) Install NetBeans 8.1 (currently beta) (https://netbeans.org/downloads/)
4.) Download Maven 3.3.3 (http://ftp.fau.de/apache/maven/maven-3/3.3.3/binaries/apache-maven-3.3.3-bin.zip)
5.) Extract Maven and add its bin location to the Path
6.) Install TortoiseGit (https://tortoisegit.org/download/)
7.) Git clone the following repositories to the same directory
7.1.) https://github.com/deeplearning4j/nd4j.git
7.2.) https://github.com/deeplearning4j/deeplearning4j.git
7.3.) https://github.com/deeplearning4j/Canova.git
7.4.) https://github.com/deeplearning4j/dl4j-0.4-examples.git
8.) Install Visual Studio 2013 (if not freely available, download Visual Studio Community: https://www.visualstudio.com/en-us/products/visual-studio-community-vs.aspx)
9.) Add the Visual Studio bin directory to the Path (ex.: C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\bin) (you can take the 64-bit folder as well)
10.) To check whether the Visual Studio directory was added correctly, type cl into the command line and check that it is found.
11.) Open a command line tool and navigate to the directory to which the repositories were downloaded
12.) Copy each of the following lines separately into the cmd window and hit Enter after each line

cd nd4j
vcvars32.bat (vcvars64.bat if you chose to put the 64-bit folder into your path)
mvn clean install -DskipTests -Dmaven.javadoc.skip=true
cd ../Canova
mvn clean install -DskipTests -Dmaven.javadoc.skip=true
cd ../deeplearning4j
mvn clean install -DskipTests -Dmaven.javadoc.skip=true
cd ../dl4j-0.4-examples
mvn clean install -DskipTests -Dmaven.javadoc.skip=true

13.) Open NetBeans and import deeplearning4jExamples
14.) In NetBeans go to Tools -> Options -> Java -> Maven and change the Maven Home to the path of your Maven download from step 4
15.) Congratulations, you can now run the examples

Running the examples on the GPU

1.) Install CUDA as described on this site: http://docs.nvidia.com/cuda/cuda-getting-started-guide-for-microsoft-windows/index.html#axzz3k6nvc1PO
2.) Copy nvcc, the Nvidia compiler (C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v7.0\bin), into your classpath (src\main\resources)
3.) Copy the following lines into the dependency part of the pom.xml file of your project.

<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-jcublas-7.0</artifactId>
    <version>${nd4j.version}</version>
</dependency>

4.) adapt the shown code to your GPU as shown on http://nd4j.org/dependencies.html
5.) Recompile the module and run it

"raw_sentences.txt" is missing in Word2VecRawTextExample.

-(~/.ghq/github.com/deeplearning4j/dl4j-0.4-examples)-
 [master|…13] $ mvn exec:java -Dexec.mainClass="org.deeplearning4j.examples.word2vec.Word2VecRawTextExample"
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.deeplearning4j:deeplearning4j-examples:jar:0.0.3.3.3-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 119, column 15
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building DeepLearning4j Examples 0.0.3.3.3-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- exec-maven-plugin:1.4.0:java (default-cli) @ deeplearning4j-examples ---
[WARNING]
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:293)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.FileNotFoundException: class path resource [raw_sentences.txt] cannot be resolved to URL because it does not exist
    at org.springframework.core.io.ClassPathResource.getURL(ClassPathResource.java:177)
    at org.springframework.core.io.AbstractFileResolvingResource.getFile(AbstractFileResolvingResource.java:48)
    at org.deeplearning4j.examples.word2vec.Word2VecRawTextExample.main(Word2VecRawTextExample.java:29)
    ... 6 more
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.708 s
[INFO] Finished at: 2015-08-27T00:32:10+09:00
[INFO] Final Memory: 22M/437M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:java (default-cli) on project deeplearning4j-examples: An exception occured while executing the Java class. null: InvocationTargetException: class path resource [raw_sentences.txt] cannot be resolved to URL because it does not exist -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

-(~/.ghq/github.com/deeplearning4j/dl4j-0.4-examples)-
 [master|…15] $ find . -name "raw_sentences.txt"
<no output>

A confusion about regularization and .l2

In the MNISTAnomalyExample (screenshot attached), .l2(...) is used without .regularization(true):
I thought l2 should go along with regularization(true), because the default value of regularization is false, like in the LenetMnistExample (screenshot attached).

I wonder whether there is a bug. And what is the function of l1 or l2? What do they regularize?
Error in line 155: net.fit(trainData);

getDataSetIterator returns a SequenceRecordReader, but SequenceRecordReaderDataSetIterator doesn't support reset

Log...

Starting training...
Exception in thread "main" java.lang.UnsupportedOperationException: Reset not supported for this iterator
at org.deeplearning4j.datasets.canova.SequenceRecordReaderDataSetIterator.reset(SequenceRecordReaderDataSetIterator.java:135)
at org.deeplearning4j.datasets.iterator.AsyncDataSetIterator.reset(AsyncDataSetIterator.java:88)
at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.fit(MultiLayerNetwork.java:1134)
at org.deeplearning4j.examples.video.VideoClassificationExample.main(VideoClassificationExample.java:155)

CNNLFWExample has no result ?

I run CNNLFWExample. The result I get is as follows; why?

==========================Scores========================================
Accuracy: 0
Precision: �
Recall: �

F1 Score: NaN

o.d.e.c.CNNLFWExample - ****************Example finished********************

The detailed log is as follows:

o.d.e.c.CNNLFWExample - Load data....
o.d.e.c.CNNLFWExample - Build model....
o.d.e.c.CNNLFWExample - Train model....
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
Sep 29, 2015 1:55:32 PM com.github.fommil.jni.JniLoader liberalLoad
INFO: successfully loaded /tmp/jniloader6660844442673883924netlib-native_system-linux-x86_64.so
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 8.76129150390625
o.d.o.l.ScoreIterationListener - Score at iteration 2 is 7.074580078125
o.d.o.l.ScoreIterationListener - Score at iteration 4 is 5.767039794921875
o.d.o.l.ScoreIterationListener - Score at iteration 6 is 5.7297515869140625
o.d.o.l.ScoreIterationListener - Score at iteration 8 is 5.140956420898437
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 12.93232666015625
o.d.o.l.ScoreIterationListener - Score at iteration 2 is 7.90280517578125
o.d.o.l.ScoreIterationListener - Score at iteration 4 is 5.09256103515625
o.d.o.l.ScoreIterationListener - Score at iteration 6 is 5.122208251953125
o.d.o.l.ScoreIterationListener - Score at iteration 8 is 4.9057833862304685
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 11.4191357421875
o.d.o.l.ScoreIterationListener - Score at iteration 2 is 7.864027709960937
o.d.o.l.ScoreIterationListener - Score at iteration 4 is 6.697731323242188
o.d.o.l.ScoreIterationListener - Score at iteration 6 is 5.208286743164063
o.d.o.l.ScoreIterationListener - Score at iteration 8 is 4.599414367675781
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 16.5861083984375
o.d.o.l.ScoreIterationListener - Score at iteration 2 is 8.505087890625
o.d.o.l.ScoreIterationListener - Score at iteration 4 is 6.944073486328125
o.d.o.l.ScoreIterationListener - Score at iteration 6 is 5.156038818359375
o.d.o.l.ScoreIterationListener - Score at iteration 8 is 5.83273193359375
o.d.e.c.CNNLFWExample - Evaluate model....
o.d.e.c.CNNLFWExample -

Actual Class 138 was predicted with Predicted 747 with count 2 times

Actual Class 138 was predicted with Predicted 754 with count 2 times

Actual Class 138 was predicted with Predicted 813 with count 70 times

Actual Class 138 was predicted with Predicted 835 with count 2 times

Actual Class 139 was predicted with Predicted 813 with count 2 times

Actual Class 140 was predicted with Predicted 813 with count 2 times

Actual Class 141 was predicted with Predicted 813 with count 2 times

Actual Class 142 was predicted with Predicted 813 with count 2 times

Actual Class 143 was predicted with Predicted 813 with count 2 times

Actual Class 144 was predicted with Predicted 813 with count 2 times

Actual Class 144 was predicted with Predicted 835 with count 2 times

Actual Class 145 was predicted with Predicted 813 with count 8 times

Actual Class 146 was predicted with Predicted 740 with count 2 times

Actual Class 146 was predicted with Predicted 813 with count 2 times

Actual Class 147 was predicted with Predicted 813 with count 8 times

Actual Class 148 was predicted with Predicted 813 with count 4 times

Actual Class 149 was predicted with Predicted 835 with count 2 times

Actual Class 150 was predicted with Predicted 835 with count 2 times

Actual Class 151 was predicted with Predicted 813 with count 6 times

Actual Class 152 was predicted with Predicted 813 with count 2 times

Actual Class 153 was predicted with Predicted 813 with count 2 times

Actual Class 154 was predicted with Predicted 813 with count 2 times

Actual Class 155 was predicted with Predicted 813 with count 2 times

Actual Class 156 was predicted with Predicted 813 with count 14 times

Actual Class 156 was predicted with Predicted 862 with count 6 times

Actual Class 157 was predicted with Predicted 813 with count 2 times

Actual Class 158 was predicted with Predicted 813 with count 8 times

Actual Class 159 was predicted with Predicted 813 with count 2 times

Actual Class 160 was predicted with Predicted 813 with count 2 times

Actual Class 161 was predicted with Predicted 813 with count 4 times

Actual Class 162 was predicted with Predicted 813 with count 6 times

Actual Class 163 was predicted with Predicted 862 with count 2 times

Actual Class 164 was predicted with Predicted 835 with count 2 times

Actual Class 164 was predicted with Predicted 862 with count 6 times

Actual Class 165 was predicted with Predicted 813 with count 2 times

Actual Class 166 was predicted with Predicted 754 with count 2 times

Actual Class 166 was predicted with Predicted 813 with count 8 times

Actual Class 167 was predicted with Predicted 813 with count 2 times

Actual Class 379 was predicted with Predicted 813 with count 1 times

Actual Class 380 was predicted with Predicted 813 with count 1 times

Actual Class 381 was predicted with Predicted 813 with count 1 times

Actual Class 382 was predicted with Predicted 720 with count 1 times

Actual Class 382 was predicted with Predicted 813 with count 9 times

Actual Class 383 was predicted with Predicted 720 with count 1 times

Actual Class 384 was predicted with Predicted 813 with count 2 times

Actual Class 385 was predicted with Predicted 813 with count 1 times

Actual Class 386 was predicted with Predicted 813 with count 1 times

Actual Class 387 was predicted with Predicted 813 with count 2 times

Actual Class 388 was predicted with Predicted 862 with count 1 times

Actual Class 389 was predicted with Predicted 813 with count 1 times

Actual Class 390 was predicted with Predicted 813 with count 1 times

Actual Class 391 was predicted with Predicted 813 with count 1 times

Actual Class 392 was predicted with Predicted 813 with count 1 times

Actual Class 393 was predicted with Predicted 813 with count 1 times

Actual Class 394 was predicted with Predicted 862 with count 1 times

Actual Class 395 was predicted with Predicted 813 with count 1 times

Actual Class 396 was predicted with Predicted 835 with count 1 times

Actual Class 397 was predicted with Predicted 813 with count 3 times

Actual Class 398 was predicted with Predicted 835 with count 1 times

Actual Class 399 was predicted with Predicted 813 with count 1 times

Actual Class 400 was predicted with Predicted 813 with count 4 times

Actual Class 401 was predicted with Predicted 813 with count 1 times

Actual Class 402 was predicted with Predicted 754 with count 1 times

Actual Class 403 was predicted with Predicted 813 with count 1 times

Actual Class 404 was predicted with Predicted 813 with count 1 times

Actual Class 405 was predicted with Predicted 862 with count 1 times

Actual Class 406 was predicted with Predicted 835 with count 1 times

Actual Class 407 was predicted with Predicted 813 with count 1 times

Actual Class 408 was predicted with Predicted 813 with count 1 times

Actual Class 409 was predicted with Predicted 813 with count 5 times

Actual Class 409 was predicted with Predicted 862 with count 1 times

Actual Class 410 was predicted with Predicted 813 with count 1 times

Actual Class 411 was predicted with Predicted 813 with count 1 times

Actual Class 412 was predicted with Predicted 835 with count 1 times

Actual Class 413 was predicted with Predicted 813 with count 1 times

Actual Class 414 was predicted with Predicted 813 with count 1 times

Actual Class 415 was predicted with Predicted 813 with count 1 times

Actual Class 415 was predicted with Predicted 862 with count 1 times

Actual Class 416 was predicted with Predicted 813 with count 1 times

Actual Class 417 was predicted with Predicted 813 with count 1 times

Actual Class 418 was predicted with Predicted 813 with count 1 times

Actual Class 419 was predicted with Predicted 813 with count 1 times

Actual Class 420 was predicted with Predicted 813 with count 4 times

Actual Class 420 was predicted with Predicted 814 with count 1 times

Actual Class 420 was predicted with Predicted 862 with count 2 times

Actual Class 421 was predicted with Predicted 813 with count 1 times

Actual Class 422 was predicted with Predicted 813 with count 1 times

Actual Class 423 was predicted with Predicted 740 with count 1 times

Actual Class 424 was predicted with Predicted 754 with count 1 times

Actual Class 424 was predicted with Predicted 813 with count 2 times

Actual Class 424 was predicted with Predicted 862 with count 1 times

Actual Class 425 was predicted with Predicted 813 with count 1 times

Actual Class 426 was predicted with Predicted 740 with count 1 times

Actual Class 426 was predicted with Predicted 813 with count 1 times

Actual Class 427 was predicted with Predicted 862 with count 1 times

Actual Class 428 was predicted with Predicted 835 with count 1 times

Actual Class 429 was predicted with Predicted 862 with count 1 times

Actual Class 430 was predicted with Predicted 813 with count 1 times

Actual Class 431 was predicted with Predicted 813 with count 1 times

Actual Class 432 was predicted with Predicted 862 with count 1 times

Actual Class 433 was predicted with Predicted 862 with count 1 times

Actual Class 434 was predicted with Predicted 813 with count 1 times

Actual Class 435 was predicted with Predicted 813 with count 1 times

Actual Class 436 was predicted with Predicted 813 with count 1 times

Actual Class 437 was predicted with Predicted 813 with count 1 times

Actual Class 438 was predicted with Predicted 813 with count 1 times

Actual Class 438 was predicted with Predicted 862 with count 1 times

Actual Class 439 was predicted with Predicted 813 with count 1 times

Actual Class 440 was predicted with Predicted 813 with count 1 times

Actual Class 441 was predicted with Predicted 813 with count 3 times

Actual Class 441 was predicted with Predicted 835 with count 1 times

Actual Class 442 was predicted with Predicted 813 with count 1 times

Actual Class 653 was predicted with Predicted 720 with count 1 times

Actual Class 653 was predicted with Predicted 728 with count 2 times

Actual Class 653 was predicted with Predicted 740 with count 1 times

Actual Class 653 was predicted with Predicted 813 with count 16 times

Actual Class 653 was predicted with Predicted 835 with count 1 times

Actual Class 653 was predicted with Predicted 862 with count 2 times

Actual Class 654 was predicted with Predicted 835 with count 1 times

Actual Class 655 was predicted with Predicted 813 with count 1 times

Actual Class 656 was predicted with Predicted 740 with count 1 times

Actual Class 656 was predicted with Predicted 813 with count 3 times

Actual Class 657 was predicted with Predicted 813 with count 1 times

Actual Class 658 was predicted with Predicted 813 with count 1 times

Actual Class 659 was predicted with Predicted 813 with count 1 times

Actual Class 660 was predicted with Predicted 813 with count 1 times

Actual Class 661 was predicted with Predicted 862 with count 1 times

Actual Class 662 was predicted with Predicted 813 with count 1 times

Actual Class 663 was predicted with Predicted 835 with count 2 times

Actual Class 664 was predicted with Predicted 754 with count 1 times

Actual Class 664 was predicted with Predicted 813 with count 3 times

Actual Class 664 was predicted with Predicted 835 with count 1 times

Actual Class 664 was predicted with Predicted 862 with count 3 times

Actual Class 665 was predicted with Predicted 835 with count 1 times

Actual Class 666 was predicted with Predicted 813 with count 1 times

Actual Class 667 was predicted with Predicted 813 with count 1 times

Actual Class 668 was predicted with Predicted 813 with count 3 times

Actual Class 668 was predicted with Predicted 835 with count 1 times

Actual Class 669 was predicted with Predicted 835 with count 1 times

Actual Class 670 was predicted with Predicted 813 with count 1 times

Actual Class 671 was predicted with Predicted 813 with count 1 times

Actual Class 672 was predicted with Predicted 813 with count 1 times

Actual Class 673 was predicted with Predicted 813 with count 1 times

Actual Class 674 was predicted with Predicted 813 with count 1 times

Actual Class 675 was predicted with Predicted 835 with count 1 times

Actual Class 676 was predicted with Predicted 813 with count 11 times

Actual Class 677 was predicted with Predicted 813 with count 2 times

Actual Class 678 was predicted with Predicted 813 with count 1 times

Actual Class 679 was predicted with Predicted 862 with count 1 times

Actual Class 680 was predicted with Predicted 754 with count 1 times

Actual Class 680 was predicted with Predicted 813 with count 1 times

Actual Class 680 was predicted with Predicted 862 with count 1 times

Actual Class 681 was predicted with Predicted 835 with count 1 times

Actual Class 682 was predicted with Predicted 754 with count 1 times

Actual Class 683 was predicted with Predicted 813 with count 1 times

Actual Class 684 was predicted with Predicted 862 with count 1 times

Actual Class 685 was predicted with Predicted 754 with count 1 times

Actual Class 685 was predicted with Predicted 813 with count 1 times

Actual Class 686 was predicted with Predicted 813 with count 1 times

Actual Class 687 was predicted with Predicted 813 with count 4 times

Actual Class 688 was predicted with Predicted 813 with count 1 times

Actual Class 689 was predicted with Predicted 740 with count 1 times

Actual Class 689 was predicted with Predicted 813 with count 1 times

Actual Class 690 was predicted with Predicted 813 with count 2 times

Actual Class 691 was predicted with Predicted 813 with count 2 times

Actual Class 692 was predicted with Predicted 862 with count 1 times

Actual Class 693 was predicted with Predicted 740 with count 1 times

Actual Class 694 was predicted with Predicted 813 with count 1 times

Actual Class 695 was predicted with Predicted 813 with count 1 times

Actual Class 696 was predicted with Predicted 740 with count 1 times

Actual Class 913 was predicted with Predicted 813 with count 2 times

Actual Class 914 was predicted with Predicted 720 with count 1 times

Actual Class 914 was predicted with Predicted 813 with count 3 times

Actual Class 915 was predicted with Predicted 813 with count 1 times

Actual Class 916 was predicted with Predicted 813 with count 1 times

Actual Class 917 was predicted with Predicted 813 with count 1 times

Actual Class 918 was predicted with Predicted 813 with count 3 times

Actual Class 919 was predicted with Predicted 813 with count 2 times

Actual Class 920 was predicted with Predicted 813 with count 1 times

Actual Class 921 was predicted with Predicted 813 with count 1 times

Actual Class 922 was predicted with Predicted 813 with count 1 times

Actual Class 923 was predicted with Predicted 813 with count 1 times

Actual Class 924 was predicted with Predicted 813 with count 1 times

Actual Class 925 was predicted with Predicted 813 with count 1 times

Actual Class 926 was predicted with Predicted 813 with count 3 times

Actual Class 927 was predicted with Predicted 813 with count 2 times

Actual Class 928 was predicted with Predicted 813 with count 1 times

Actual Class 929 was predicted with Predicted 862 with count 1 times

Actual Class 930 was predicted with Predicted 813 with count 14 times

Actual Class 930 was predicted with Predicted 835 with count 2 times

Actual Class 930 was predicted with Predicted 862 with count 2 times

Actual Class 931 was predicted with Predicted 813 with count 1 times

Actual Class 932 was predicted with Predicted 813 with count 1 times

Actual Class 933 was predicted with Predicted 813 with count 1 times

Actual Class 934 was predicted with Predicted 813 with count 1 times

Actual Class 935 was predicted with Predicted 813 with count 1 times

Actual Class 936 was predicted with Predicted 813 with count 1 times

Actual Class 937 was predicted with Predicted 813 with count 2 times

Actual Class 938 was predicted with Predicted 813 with count 1 times

Actual Class 939 was predicted with Predicted 813 with count 1 times

Actual Class 940 was predicted with Predicted 813 with count 1 times

Actual Class 941 was predicted with Predicted 862 with count 1 times

Actual Class 942 was predicted with Predicted 813 with count 3 times

Actual Class 943 was predicted with Predicted 813 with count 1 times

Actual Class 944 was predicted with Predicted 740 with count 1 times

Actual Class 944 was predicted with Predicted 835 with count 1 times

Actual Class 945 was predicted with Predicted 813 with count 1 times

Actual Class 946 was predicted with Predicted 813 with count 1 times

Actual Class 947 was predicted with Predicted 813 with count 1 times

Actual Class 947 was predicted with Predicted 862 with count 1 times

Actual Class 948 was predicted with Predicted 813 with count 1 times

Actual Class 949 was predicted with Predicted 813 with count 1 times

Actual Class 950 was predicted with Predicted 813 with count 1 times

Actual Class 951 was predicted with Predicted 813 with count 1 times

Actual Class 951 was predicted with Predicted 862 with count 1 times

Actual Class 952 was predicted with Predicted 740 with count 1 times

Actual Class 952 was predicted with Predicted 813 with count 7 times

Actual Class 953 was predicted with Predicted 862 with count 1 times

Actual Class 954 was predicted with Predicted 862 with count 1 times

Actual Class 955 was predicted with Predicted 813 with count 1 times

Actual Class 955 was predicted with Predicted 862 with count 1 times

Actual Class 956 was predicted with Predicted 754 with count 1 times

Actual Class 957 was predicted with Predicted 754 with count 2 times

Actual Class 957 was predicted with Predicted 813 with count 7 times

Actual Class 957 was predicted with Predicted 835 with count 1 times

Actual Class 957 was predicted with Predicted 862 with count 4 times

Actual Class 958 was predicted with Predicted 813 with count 1 times

==========================Scores========================================
Accuracy: 0
Precision: �
Recall: �
F1 Score: NaN

o.d.e.c.CNNLFWExample - ****************Example finished********************

Process finished with exit code 0

FileNotFoundException on VideoClassificationExample

Got this error on the master branch.

commit 5a95825f3a4f3b1d70007e4e86142c88afa0aed3
Merge: 7ff97f0 60d1b43
Author: Melanie Warrick <[email protected]>
Date:   Thu Nov 12 10:33:17 2015 -0800

    Merge pull request #39 from JackSullivan/master

    Fix non-termination on DBNMnistFullExample
Starting data generation...
Exception in thread "main" java.io.FileNotFoundException: /tmpDL4JVideoShapesExample/shapes_0.mp4 (No such file or directory)
    at java.io.FileOutputStream.open(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:171)
    at org.jcodec.common.NIOUtils.writableFileChannel(NIOUtils.java:277)
    at org.jcodec.api.SequenceEncoder.<init>(SequenceEncoder.java:36)
    at org.deeplearning4j.examples.video.VideoGenerator.generateVideo(VideoGenerator.java:68)
    at org.deeplearning4j.examples.video.VideoGenerator.generateVideoData(VideoGenerator.java:163)
    at org.deeplearning4j.examples.video.VideoClassificationExample.generateData(VideoClassificationExample.java:168)
    at org.deeplearning4j.examples.video.VideoClassificationExample.main(VideoClassificationExample.java:63)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

A slash is missing after tempDir.

        String dataDirectory = tempDir + "DL4JVideoShapesExample/";   //Location to store generated data set
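
One possible fix (just a sketch; it assumes tempDir comes from System.getProperty("java.io.tmpdir"), which does not always end with a separator, and it needs java.io.File imported) is to add the separator explicitly:

        String dataDirectory = tempDir + File.separator + "DL4JVideoShapesExample/";   //Location to store generated data set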

How can I define multiple labels in a training set to use with Deeplearning4j

I am new to ML and I have started using the Deeplearning4j library, and I literally got lost in the source code. How can I read a training set with multiple labels rather than just one? For example, I want to teach an LSTM to classify texts into 4 classes. How can I read a training dataset for that?
Thanks
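
For reference, a minimal sketch of reading a 4-class training set with DataVec (this is not from the example repo; the file name, the CSV layout with 300 feature columns followed by a label column holding values 0-3, and the batch size are all assumptions):

import java.io.File;

import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

public class MultiClassCsvIterator {
    public static void main(String[] args) throws Exception {
        // Each CSV row: 300 feature columns, then the class label (0..3) in the last column
        RecordReader rr = new CSVRecordReader();
        rr.initialize(new FileSplit(new File("train.csv")));   // hypothetical file

        int batchSize  = 50;
        int labelIndex = 300;   // column index of the label (assumption about the data layout)
        int numClasses = 4;     // four text classes, one-hot encoded by the iterator

        DataSetIterator trainIter =
                new RecordReaderDataSetIterator(rr, batchSize, labelIndex, numClasses);
        // trainIter can now be passed to MultiLayerNetwork.fit(...)
    }
}

For sequence data fed to an LSTM, the analogous class is SequenceRecordReaderDataSetIterator, which takes separate feature and label sequence readers.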

DeepAutoEncoderExample throws "java.lang.IllegalStateException", which is actually 'index out of bounds'.

When running DeepAutoEncoderExample, it throws "out of bounds" after a few executions.

Any suggestions?

o.d.e.d.DeepAutoEncoderExample - Load data....
o.d.e.d.DeepAutoEncoderExample - Build model....
o.d.e.d.DeepAutoEncoderExample - Train model....
o.d.n.m.MultiLayerNetwork - Training on layer 1 with 1000 examples
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
Sep 02, 2015 11:52:51 AM com.github.fommil.netlib.BLAS
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
Sep 02, 2015 11:52:52 AM com.github.fommil.jni.JniLoader liberalLoad
INFO: successfully loaded C:\Users\dongsun.kim\AppData\Local\Temp\jniloader7936692758102469322netlib-native_ref-win-x86_64.dll
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 436.196
o.d.n.m.MultiLayerNetwork - Training on layer 2 with 1000 examples
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 560.22
o.d.n.m.MultiLayerNetwork - Training on layer 3 with 1000 examples
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 280.255
o.d.n.m.MultiLayerNetwork - Training on layer 4 with 1000 examples
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 140.876
o.d.n.m.MultiLayerNetwork - Training on layer 5 with 1000 examples
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 55.717
o.d.n.m.MultiLayerNetwork - Training on layer 6 with 1000 examples
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 17.182
o.d.n.m.MultiLayerNetwork - Training on layer 7 with 1000 examples
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 57.741
o.d.n.m.MultiLayerNetwork - Training on layer 8 with 1000 examples
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 143.117
o.d.n.m.MultiLayerNetwork - Training on layer 9 with 1000 examples
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 287.66
o.d.n.m.MultiLayerNetwork - Finetune phase
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 390.84234375
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
Exception in thread "main" java.lang.IllegalStateException: Index out of bounds 2878514
at org.nd4j.linalg.api.buffer.BaseDataBuffer.getDouble(BaseDataBuffer.java:545)
at org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner.exec(DefaultOpExecutioner.java:84)
at org.nd4j.linalg.api.ndarray.BaseNDArray.subi(BaseNDArray.java:2618)
at org.nd4j.linalg.api.ndarray.BaseNDArray.subi(BaseNDArray.java:2598)
at org.deeplearning4j.optimize.stepfunctions.NegativeGradientStepFunction.step(NegativeGradientStepFunction.java:37)
at org.deeplearning4j.optimize.solvers.StochasticGradientDescent.optimize(StochasticGradientDescent.java:59)
at org.deeplearning4j.optimize.Solver.optimize(Solver.java:52)
at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.fit(MultiLayerNetwork.java:1233)
at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.fit(MultiLayerNetwork.java:1262)
at org.deeplearning4j.examples.deepbelief.DeepAutoEncoderExample.main(DeepAutoEncoderExample.java:67)

two questions about RNN

1ใ€I know the structure of GravesLSTM.Can you tell me that what's the activation means,forget gate activation or sth.?
image
2ใ€I want to use the original RNN,and I find BaseRecurrentLayer.However the Eclipse tell me:Cannot instantiate the type BaseRecurrentLayer.Builder. How to use BaseRecurrentLayer?
image
Thanks a lot!
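
On the second question, BaseRecurrentLayer is an abstract configuration class, so its Builder cannot be instantiated directly; a concrete recurrent layer has to be configured instead. A minimal sketch against the 0.4-era string-activation API (the sizes are placeholders; as far as I understand, the layer-level activation applies to the LSTM block input/output while the gates use sigmoids internally, and recent releases also offer a plain SimpleRnn layer):

import org.deeplearning4j.nn.conf.layers.GravesLSTM;

// BaseRecurrentLayer is abstract; build a concrete recurrent layer instead
GravesLSTM recurrentLayer = new GravesLSTM.Builder()
        .nIn(200)              // placeholder: input size per time step
        .nOut(256)             // placeholder: number of recurrent units
        .activation("tanh")    // 0.4-era API takes the activation as a String
        .build();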

OutOfMemoryError when fetching nearest words using pretrained google news dataset

Tried giving it 12 GB of memory; that probably worked, because it used to crash much earlier.

Here's the code:

import java.io.File;
import java.util.Collection;

import org.deeplearning4j.models.embeddings.loader.WordVectorSerializer;
import org.deeplearning4j.models.embeddings.wordvectors.WordVectors;
import org.nd4j.linalg.factory.Nd4j;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class HelloWord2Vec {

    private static final Logger log = LoggerFactory.getLogger(HelloWord2Vec.class);

    private WordVectors wordVectors;

    public static void main(String... args) throws Exception {
        new HelloWord2Vec().run();
    }

    private void run() throws Exception {
        File f = new File("GoogleNews-vectors-negative300.bin");
        System.out.println("loading vectors... " + f.getAbsolutePath());
        wordVectors = WordVectorSerializer.loadGoogleModel(f, true);
        System.out.println("vectors loaded...");

        printIsTo("hat", "head", "glove");
        printIsTo("man", "woman", "king");
        printIsTo("man", "woman", "uncle");
        printIsTo("cat", "dog", "man");
        printIsTo("putin", "russia", "obama");

        System.out.println("vectors loaded...");
        log.debug("wordVectors:{}", wordVectors);
    }

    private void printIsTo(String a, String b, String c) {
        // "a is to b as c is to ?"  ->  nearest word to c - (a - b)
        double[] av = wordVectors.getWordVector(a);
        double[] bv = wordVectors.getWordVector(b);
        double[] cv = wordVectors.getWordVector(c);

        double[] ret = new double[av.length];
        for (int i = 0; i < ret.length; i++) {
            ret[i] = cv[i] - (av[i] - bv[i]);
        }

        Collection<String> col = wordVectors.wordsNearest(Nd4j.create(ret), 1);

        System.out.println(a + " is to " + b + " as " + c + " is to ...");
        col.forEach(System.out::println);
    }
}

And here's my output:

/usr/lib/jvm/java-8-oracle/bin/java -Xmx12000M -Didea.launcher.port=7544 -Didea.launcher.bin.path=/home/jp/Desktop/tmp/eclipse-installer/idea-IC-143.382.35/bin -Dfile.encoding=UTF-8 -classpath /usr/lib/jvm/java-8-oracle/jre/lib/charsets.jar:/usr/lib/jvm/java-8-oracle/jre/lib/deploy.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/cldrdata.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/dnsns.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/jaccess.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/jfxrt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/localedata.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/nashorn.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/sunec.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/sunjce_provider.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/sunpkcs11.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/zipfs.jar:/usr/lib/jvm/java-8-oracle/jre/lib/javaws.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jce.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jfr.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jfxswt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jsse.jar:/usr/lib/jvm/java-8-oracle/jre/lib/management-agent.jar:/usr/lib/jvm/java-8-oracle/jre/lib/plugin.jar:/usr/lib/jvm/java-8-oracle/jre/lib/resources.jar:/usr/lib/jvm/java-8-oracle/jre/lib/rt.jar:/home/jp/projects/deeplearning/dl4j-0.4-examples/target/classes:/home/jp/.m2/repository/org/deeplearning4j/deeplearning4j-nlp/0.4-rc3.8/deeplearning4j-nlp-0.4-rc3.8.jar:/home/jp/.m2/repository/org/apache/lucene/lucene-analyzers-common/5.3.1/lucene-analyzers-common-5.3.1.jar:/home/jp/.m2/repository/org/apache/lucene/lucene-core/5.3.1/lucene-core-5.3.1.jar:/home/jp/.m2/repository/org/apache/lucene/lucene-queryparser/5.3.1/lucene-queryparser-5.3.1.jar:/home/jp/.m2/repository/org/apache/lucene/lucene-queries/5.3.1/lucene-queries-5.3.1.jar:/home/jp/.m2/repository/org/apache/lucene/lucene-sandbox/5.3.1/lucene-sandbox-5.3.1.jar:/home/jp/.m2/repository/org/deeplearning4j/deeplearning4j-scaleout-akka/0.4-rc3.8/deeplearning4j-scaleout-akka-0.4-rc3.8.jar:/home/jp/.m2/repository/javax/ws/rs/javax.ws.rs-api/2.0.1/javax.ws.rs-api-2.0.1.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-core/0.8.0/dropwizard-core-0.8.0.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-util/0.8.0/dropwizard-util-0.8.0.jar:/home/jp/.m2/repository/com/google/code/findbugs/jsr305/3.0.0/jsr305-3.0.0.jar:/home/jp/.m2/repository/joda-time/joda-time/2.7/joda-time-2.7.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-jackson/0.8.0/dropwizard-jackson-0.8.0.jar:/home/jp/.m2/repository/com/fasterxml/jackson/datatype/jackson-datatype-jdk7/2.5.1/jackson-datatype-jdk7-2.5.1.jar:/home/jp/.m2/repository/com/fasterxml/jackson/datatype/jackson-datatype-guava/2.5.1/jackson-datatype-guava-2.5.1.jar:/home/jp/.m2/repository/com/fasterxml/jackson/module/jackson-module-afterburner/2.5.1/jackson-module-afterburner-2.5.1.jar:/home/jp/.m2/repository/com/fasterxml/jackson/datatype/jackson-datatype-joda/2.5.1/jackson-datatype-joda-2.5.1.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-validation/0.8.0/dropwizard-validation-0.8.0.jar:/home/jp/.m2/repository/org/hibernate/hibernate-validator/5.1.3.Final/hibernate-validator-5.1.3.Final.jar:/home/jp/.m2/repository/javax/validation/validation-api/1.1.0.Final/validation-api-1.1.0.Final.jar:/home/jp/.m2/repository/org/jboss/logging/jboss-logging/3.1.3.GA/jboss-logging-3.1.3.GA.jar:/home/jp/.m2/repository/com/fasterxml/classmate/1.0.0/classmate-1.0.0.jar:/home/jp/.m2/repository/org/glassfish/javax.el/3.0.0/javax.el-3.0.0.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-configuration/0.8.0/dropwizard
-configuration-0.8.0.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-logging/0.8.0/dropwizard-logging-0.8.0.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-logback/3.1.0/metrics-logback-3.1.0.jar:/home/jp/.m2/repository/org/slf4j/jul-to-slf4j/1.7.10/jul-to-slf4j-1.7.10.jar:/home/jp/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.10/jcl-over-slf4j-1.7.10.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-util/9.2.9.v20150224/jetty-util-9.2.9.v20150224.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-metrics/0.8.0/dropwizard-metrics-0.8.0.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-jersey/0.8.0/dropwizard-jersey-0.8.0.jar:/home/jp/.m2/repository/org/glassfish/jersey/core/jersey-server/2.16/jersey-server-2.16.jar:/home/jp/.m2/repository/org/glassfish/jersey/media/jersey-media-jaxb/2.16/jersey-media-jaxb-2.16.jar:/home/jp/.m2/repository/org/glassfish/jersey/ext/jersey-metainf-services/2.16/jersey-metainf-services-2.16.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-jersey2/3.1.0/metrics-jersey2-3.1.0.jar:/home/jp/.m2/repository/com/fasterxml/jackson/jaxrs/jackson-jaxrs-json-provider/2.5.1/jackson-jaxrs-json-provider-2.5.1.jar:/home/jp/.m2/repository/com/fasterxml/jackson/jaxrs/jackson-jaxrs-base/2.5.1/jackson-jaxrs-base-2.5.1.jar:/home/jp/.m2/repository/com/fasterxml/jackson/module/jackson-module-jaxb-annotations/2.5.1/jackson-module-jaxb-annotations-2.5.1.jar:/home/jp/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet/2.16/jersey-container-servlet-2.16.jar:/home/jp/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet-core/2.16/jersey-container-servlet-core-2.16.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-server/9.2.9.v20150224/jetty-server-9.2.9.v20150224.jar:/home/jp/.m2/repository/javax/servlet/javax.servlet-api/3.1.0/javax.servlet-api-3.1.0.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-io/9.2.9.v20150224/jetty-io-9.2.9.v20150224.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-webapp/9.2.9.v20150224/jetty-webapp-9.2.9.v20150224.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-xml/9.2.9.v20150224/jetty-xml-9.2.9.v20150224.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-continuation/9.2.9.v20150224/jetty-continuation-9.2.9.v20150224.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-jetty/0.8.0/dropwizard-jetty-0.8.0.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-jetty9/3.1.0/metrics-jetty9-3.1.0.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-servlet/9.2.9.v20150224/jetty-servlet-9.2.9.v20150224.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-security/9.2.9.v20150224/jetty-security-9.2.9.v20150224.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-servlets/9.2.9.v20150224/jetty-servlets-9.2.9.v20150224.jar:/home/jp/.m2/repository/org/eclipse/jetty/jetty-http/9.2.9.v20150224/jetty-http-9.2.9.v20150224.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-lifecycle/0.8.0/dropwizard-lifecycle-0.8.0.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-core/3.1.0/metrics-core-3.1.0.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-jvm/3.1.0/metrics-jvm-3.1.0.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-servlets/3.1.0/metrics-servlets-3.1.0.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-json/3.1.0/metrics-json-3.1.0.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-healthchecks/3.1.0/metrics-healthchecks-3.1.0.jar:/home/jp/.m2/repository/net/sourceforge/argparse4j/argparse4j/0.4.4/argparse4j-0.4.4.jar:/home/jp/.m2/repository/org/eclips
e/jetty/toolchain/setuid/jetty-setuid-java/1.0.2/jetty-setuid-java-1.0.2.jar:/home/jp/.m2/repository/org/spark-project/akka/akka-remote_2.10/2.3.4-spark/akka-remote_2.10-2.3.4-spark.jar:/home/jp/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar:/home/jp/.m2/repository/io/netty/netty/3.8.0.Final/netty-3.8.0.Final.jar:/home/jp/.m2/repository/org/spark-project/protobuf/protobuf-java/2.5.0-spark/protobuf-java-2.5.0-spark.jar:/home/jp/.m2/repository/org/uncommons/maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar:/home/jp/.m2/repository/org/spark-project/akka/akka-actor_2.10/2.3.4-spark/akka-actor_2.10-2.3.4-spark.jar:/home/jp/.m2/repository/com/typesafe/config/1.2.1/config-1.2.1.jar:/home/jp/.m2/repository/com/typesafe/akka/akka-cluster_2.10/2.3.4/akka-cluster_2.10-2.3.4.jar:/home/jp/.m2/repository/com/typesafe/akka/akka-contrib_2.10/2.3.4/akka-contrib_2.10-2.3.4.jar:/home/jp/.m2/repository/com/typesafe/akka/akka-persistence-experimental_2.10/2.3.4/akka-persistence-experimental_2.10-2.3.4.jar:/home/jp/.m2/repository/org/iq80/leveldb/leveldb/0.5/leveldb-0.5.jar:/home/jp/.m2/repository/org/iq80/leveldb/leveldb-api/0.5/leveldb-api-0.5.jar:/home/jp/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.7/leveldbjni-all-1.7.jar:/home/jp/.m2/repository/org/fusesource/leveldbjni/leveldbjni/1.7/leveldbjni-1.7.jar:/home/jp/.m2/repository/org/fusesource/hawtjni/hawtjni-runtime/1.8/hawtjni-runtime-1.8.jar:/home/jp/.m2/repository/org/fusesource/leveldbjni/leveldbjni-osx/1.5/leveldbjni-osx-1.5.jar:/home/jp/.m2/repository/org/fusesource/leveldbjni/leveldbjni-linux32/1.5/leveldbjni-linux32-1.5.jar:/home/jp/.m2/repository/org/fusesource/leveldbjni/leveldbjni-linux64/1.5/leveldbjni-linux64-1.5.jar:/home/jp/.m2/repository/org/fusesource/leveldbjni/leveldbjni-win32/1.5/leveldbjni-win32-1.5.jar:/home/jp/.m2/repository/org/fusesource/leveldbjni/leveldbjni-win64/1.5/leveldbjni-win64-1.5.jar:/home/jp/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/jp/.m2/repository/org/spark-project/akka/akka-slf4j_2.10/2.3.4-spark/akka-slf4j_2.10-2.3.4-spark.jar:/home/jp/.m2/repository/args4j/args4j/2.0.29/args4j-2.0.29.jar:/home/jp/.m2/repository/org/fusesource/sigar/1.6.4/sigar-1.6.4-native.jar:/home/jp/.m2/repository/org/deeplearning4j/deeplearning4j-scaleout-zookeeper/0.4-rc3.8/deeplearning4j-scaleout-zookeeper-0.4-rc3.8.jar:/home/jp/.m2/repository/org/deeplearning4j/deeplearning4j-scaleout-api/0.4-rc3.8/deeplearning4j-scaleout-api-0.4-rc3.8.jar:/home/jp/.m2/repository/org/apache/zookeeper/zookeeper/3.5.1-alpha/zookeeper-3.5.1-alpha.jar:/home/jp/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/jp/.m2/repository/net/java/dev/javacc/javacc/5.0/javacc-5.0.jar:/home/jp/.m2/repository/org/apache/curator/curator-framework/2.4.0/curator-framework-2.4.0.jar:/home/jp/.m2/repository/org/apache/curator/curator-client/2.4.0/curator-client-2.4.0.jar:/home/jp/.m2/repository/org/apache/curator/curator-test/2.4.0/curator-test-2.4.0.jar:/home/jp/.m2/repository/org/javassist/javassist/3.15.0-GA/javassist-3.15.0-GA.jar:/home/jp/.m2/repository/org/apache/commons/commons-math/2.2/commons-math-2.2.jar:/home/jp/.m2/repository/org/apache/curator/curator-recipes/2.4.0/curator-recipes-2.4.0.jar:/home/jp/.m2/repository/com/hazelcast/hazelcast-all/3.4.2/hazelcast-all-3.4.2.jar:/home/jp/.m2/repository/net/sourceforge/findbugs/annotations/1.3.2/annotations-1.3.2.jar:/home/jp/.m2/repository/com/eclipsesource/minimal-json/minimal-json/0.9.1/minimal-json-0.9.1.jar:/home/j
p/.m2/repository/it/unimi/dsi/dsiutils/2.2.2/dsiutils-2.2.2.jar:/home/jp/.m2/repository/it/unimi/dsi/fastutil/6.5.15/fastutil-6.5.15.jar:/home/jp/.m2/repository/com/martiansoftware/jsap/2.1/jsap-2.1.jar:/home/jp/.m2/repository/commons-configuration/commons-configuration/1.8/commons-configuration-1.8.jar:/home/jp/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/home/jp/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar:/home/jp/.m2/repository/commons-collections/commons-collections/20040616/commons-collections-20040616.jar:/home/jp/.m2/repository/org/cleartk/cleartk-snowball/2.0.0/cleartk-snowball-2.0.0.jar:/home/jp/.m2/repository/org/apache/lucene/lucene-snowball/3.0.3/lucene-snowball-3.0.3.jar:/home/jp/.m2/repository/org/cleartk/cleartk-util/2.0.0/cleartk-util-2.0.0.jar:/home/jp/.m2/repository/org/apache/uima/uimaj-core/2.5.0/uimaj-core-2.5.0.jar:/home/jp/.m2/repository/org/apache/uima/uimafit-core/2.0.0/uimafit-core-2.0.0.jar:/home/jp/.m2/repository/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar:/home/jp/.m2/repository/org/springframework/spring-core/3.1.2.RELEASE/spring-core-3.1.2.RELEASE.jar:/home/jp/.m2/repository/org/springframework/spring-asm/3.1.2.RELEASE/spring-asm-3.1.2.RELEASE.jar:/home/jp/.m2/repository/org/springframework/spring-context/3.1.2.RELEASE/spring-context-3.1.2.RELEASE.jar:/home/jp/.m2/repository/org/springframework/spring-aop/3.1.2.RELEASE/spring-aop-3.1.2.RELEASE.jar:/home/jp/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/home/jp/.m2/repository/org/springframework/spring-expression/3.1.2.RELEASE/spring-expression-3.1.2.RELEASE.jar:/home/jp/.m2/repository/org/springframework/spring-beans/3.1.2.RELEASE/spring-beans-3.1.2.RELEASE.jar:/home/jp/.m2/repository/org/cleartk/cleartk-type-system/2.0.0/cleartk-type-system-2.0.0.jar:/home/jp/.m2/repository/org/cleartk/cleartk-opennlp-tools/2.0.0/cleartk-opennlp-tools-2.0.0.jar:/home/jp/.m2/repository/org/apache/opennlp/opennlp-maxent/3.0.3/opennlp-maxent-3.0.3.jar:/home/jp/.m2/repository/org/apache/opennlp/opennlp-tools/1.5.3/opennlp-tools-1.5.3.jar:/home/jp/.m2/repository/net/sf/jwordnet/jwnl/1.3.3/jwnl-1.3.3.jar:/home/jp/.m2/repository/org/apache/opennlp/opennlp-uima/1.5.3/opennlp-uima-1.5.3.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-assets/0.8.0/dropwizard-assets-0.8.0.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-servlets/0.8.0/dropwizard-servlets-0.8.0.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-annotation/3.1.0/metrics-annotation-3.1.0.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-views-mustache/0.8.0/dropwizard-views-mustache-0.8.0.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-views/0.8.0/dropwizard-views-0.8.0.jar:/home/jp/.m2/repository/com/github/spullara/mustache/java/compiler/0.8.17/compiler-0.8.17.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-views-freemarker/0.8.0/dropwizard-views-freemarker-0.8.0.jar:/home/jp/.m2/repository/org/freemarker/freemarker/2.3.21/freemarker-2.3.21.jar:/home/jp/.m2/repository/org/deeplearning4j/deeplearning4j-core/0.4-rc3.8/deeplearning4j-core-0.4-rc3.8.jar:/home/jp/.m2/repository/io/netty/netty-buffer/4.0.28.Final/netty-buffer-4.0.28.Final.jar:/home/jp/.m2/repository/io/netty/netty-common/4.0.28.Final/netty-common-4.0.28.Final.jar:/home/jp/.m2/repository/org/nd4j/canova-api/0.0.0.14/canova-api-0.0.0.14.jar:/home/jp/.m2/repository/org/slf4j/slf4j-api/1.7.12/slf4j-api-1.7.12.jar:/home/jp/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-
1.1.2.jar:/home/jp/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar:/home/jp/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/home/jp/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/home/jp/.m2/repository/org/apache/commons/commons-compress/1.8/commons-compress-1.8.jar:/home/jp/.m2/repository/org/tukaani/xz/1.5/xz-1.5.jar:/home/jp/.m2/repository/org/nd4j/nd4j-api/0.4-rc3.8/nd4j-api-0.4-rc3.8.jar:/home/jp/.m2/repository/junit/junit/4.11/junit-4.11.jar:/home/jp/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar:/home/jp/.m2/repository/org/nd4j/nd4j-common/0.4-rc3.8/nd4j-common-0.4-rc3.8.jar:/home/jp/.m2/repository/org/reflections/reflections/0.9.10/reflections-0.9.10.jar:/home/jp/.m2/repository/com/google/code/findbugs/annotations/2.0.1/annotations-2.0.1.jar:/home/jp/.m2/repository/org/apache/commons/commons-lang3/3.3.1/commons-lang3-3.3.1.jar:/home/jp/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.5.1/jackson-core-2.5.1.jar:/home/jp/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.5.1/jackson-databind-2.5.1.jar:/home/jp/.m2/repository/org/json/json/20131018/json-20131018.jar:/home/jp/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.5.1/jackson-annotations-2.5.1.jar:/home/jp/.m2/repository/org/projectlombok/lombok/1.16.4/lombok-1.16.4.jar:/home/jp/.m2/repository/org/deeplearning4j/deeplearning4j-ui/0.4-rc3.8/deeplearning4j-ui-0.4-rc3.8.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-client/0.8.0/dropwizard-client-0.8.0.jar:/home/jp/.m2/repository/org/glassfish/jersey/core/jersey-client/2.16/jersey-client-2.16.jar:/home/jp/.m2/repository/org/glassfish/hk2/hk2-api/2.4.0-b09/hk2-api-2.4.0-b09.jar:/home/jp/.m2/repository/org/glassfish/hk2/hk2-utils/2.4.0-b09/hk2-utils-2.4.0-b09.jar:/home/jp/.m2/repository/org/glassfish/hk2/external/aopalliance-repackaged/2.4.0-b09/aopalliance-repackaged-2.4.0-b09.jar:/home/jp/.m2/repository/org/glassfish/hk2/external/javax.inject/2.4.0-b09/javax.inject-2.4.0-b09.jar:/home/jp/.m2/repository/org/glassfish/hk2/hk2-locator/2.4.0-b09/hk2-locator-2.4.0-b09.jar:/home/jp/.m2/repository/io/dropwizard/metrics/metrics-httpclient/3.1.0/metrics-httpclient-3.1.0.jar:/home/jp/.m2/repository/org/apache/httpcomponents/httpclient/4.3.5/httpclient-4.3.5.jar:/home/jp/.m2/repository/org/apache/httpcomponents/httpcore/4.3.2/httpcore-4.3.2.jar:/home/jp/.m2/repository/commons-codec/commons-codec/1.6/commons-codec-1.6.jar:/home/jp/.m2/repository/org/glassfish/jersey/connectors/jersey-apache-connector/2.16/jersey-apache-connector-2.16.jar:/home/jp/.m2/repository/io/dropwizard/dropwizard-forms/0.8.0/dropwizard-forms-0.8.0.jar:/home/jp/.m2/repository/org/glassfish/jersey/media/jersey-media-multipart/2.17/jersey-media-multipart-2.17.jar:/home/jp/.m2/repository/org/glassfish/jersey/core/jersey-common/2.17/jersey-common-2.17.jar:/home/jp/.m2/repository/javax/annotation/javax.annotation-api/1.2/javax.annotation-api-1.2.jar:/home/jp/.m2/repository/org/glassfish/jersey/bundles/repackaged/jersey-guava/2.17/jersey-guava-2.17.jar:/home/jp/.m2/repository/org/glassfish/hk2/osgi-resource-locator/1.0.1/osgi-resource-locator-1.0.1.jar:/home/jp/.m2/repository/org/jvnet/mimepull/mimepull/1.9.3/mimepull-1.9.3.jar:/home/jp/.m2/repository/org/nd4j/nd4j-jackson/0.4-rc3.8/nd4j-jackson-0.4-rc3.8.jar:/home/jp/.m2/repository/com/google/guava/guava/19.0/guava-19.0.jar:/home/jp/.m2/repository/org/nd4j/nd4j-x86/0.4-rc3.8/nd4j-x86-0.4-rc3.8.jar:/home/jp/.m2/repository/net/sourceforge/f2j/arpack_
combined_all/0.1/arpack_combined_all-0.1.jar:/home/jp/.m2/repository/com/github/fommil/netlib/core/1.1.2/core-1.1.2.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_ref-osx-x86_64/1.1/netlib-native_ref-osx-x86_64-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/native_ref-java/1.1/native_ref-java-1.1.jar:/home/jp/.m2/repository/com/github/fommil/jniloader/1.1/jniloader-1.1.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_ref-linux-x86_64/1.1/netlib-native_ref-linux-x86_64-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_ref-linux-i686/1.1/netlib-native_ref-linux-i686-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_ref-win-x86_64/1.1/netlib-native_ref-win-x86_64-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_ref-win-i686/1.1/netlib-native_ref-win-i686-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_ref-linux-armhf/1.1/netlib-native_ref-linux-armhf-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_system-osx-x86_64/1.1/netlib-native_system-osx-x86_64-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/native_system-java/1.1/native_system-java-1.1.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_system-linux-x86_64/1.1/netlib-native_system-linux-x86_64-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_system-linux-i686/1.1/netlib-native_system-linux-i686-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_system-linux-armhf/1.1/netlib-native_system-linux-armhf-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_system-win-x86_64/1.1/netlib-native_system-win-x86_64-1.1-natives.jar:/home/jp/.m2/repository/com/github/fommil/netlib/netlib-native_system-win-i686/1.1/netlib-native_system-win-i686-1.1-natives.jar:/home/jp/.m2/repository/org/jblas/jblas/1.2.4/jblas-1.2.4.jar:/home/jp/.m2/repository/org/bytedeco/javacpp/0.11/javacpp-0.11.jar:/home/jp/.m2/repository/org/nd4j/canova-nd4j-image/0.0.0.14/canova-nd4j-image-0.0.0.14.jar:/home/jp/.m2/repository/org/nd4j/canova-nd4j-common/0.0.0.14/canova-nd4j-common-0.0.0.14.jar:/home/jp/.m2/repository/org/nd4j/canova-data-image/0.0.0.14/canova-data-image-0.0.0.14.jar:/home/jp/.m2/repository/com/github/jai-imageio/jai-imageio-core/1.3.0/jai-imageio-core-1.3.0.jar:/home/jp/.m2/repository/com/twelvemonkeys/imageio/imageio-jpeg/3.1.1/imageio-jpeg-3.1.1.jar:/home/jp/.m2/repository/com/twelvemonkeys/imageio/imageio-core/3.1.1/imageio-core-3.1.1.jar:/home/jp/.m2/repository/com/twelvemonkeys/imageio/imageio-metadata/3.1.1/imageio-metadata-3.1.1.jar:/home/jp/.m2/repository/com/twelvemonkeys/common/common-lang/3.1.1/common-lang-3.1.1.jar:/home/jp/.m2/repository/com/twelvemonkeys/common/common-io/3.1.1/common-io-3.1.1.jar:/home/jp/.m2/repository/com/twelvemonkeys/common/common-image/3.1.1/common-image-3.1.1.jar:/home/jp/.m2/repository/com/twelvemonkeys/imageio/imageio-tiff/3.1.1/imageio-tiff-3.1.1.jar:/home/jp/.m2/repository/com/twelvemonkeys/imageio/imageio-psd/3.1.1/imageio-psd-3.1.1.jar:/home/jp/.m2/repository/com/twelvemonkeys/imageio/imageio-bmp/3.1.1/imageio-bmp-3.1.1.jar:/home/jp/.m2/repository/org/nd4j/canova-nd4j-codec/0.0.0.14/canova-nd4j-codec-0.0.0.14.jar:/home/jp/.m2/repository/org/jcodec/jcodec/0.1.5/jcodec-0.1.5.jar:/home/jp/.m2/repository/com/fasterxml/jackson/dataformat/jackson-dataformat-yaml/2.5.1/jackson-dataformat-yaml-2.5.1.jar:/home/
jp/.m2/repository/org/yaml/snakeyaml/1.12/snakeyaml-1.12.jar:/home/jp/Desktop/tmp/eclipse-installer/idea-IC-143.382.35/lib/idea_rt.jar com.intellij.rt.execution.application.AppMain testicle.HelloWord2Vec
loading vectors... /home/jp/projects/deeplearning/dl4j-0.4-examples/GoogleNews-vectors-negative300.bin
Feb 04, 2016 1:21:29 PM com.github.fommil.jni.JniLoader liberalLoad
INFO: successfully loaded /tmp/jniloader2678824758436261965netlib-native_system-linux-x86_64.so
vectors loaded...
hat is to head as c is to ...
head
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at org.nd4j.linalg.api.buffer.BaseDataBuffer.<init>(BaseDataBuffer.java:268)
    at org.nd4j.linalg.api.buffer.FloatBuffer.<init>(FloatBuffer.java:41)
    at org.nd4j.linalg.api.buffer.factory.DefaultDataBufferFactory.createFloat(DefaultDataBufferFactory.java:59)
    at org.nd4j.linalg.factory.Nd4j.createBuffer(Nd4j.java:1010)
    at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:217)
    at org.nd4j.linalg.cpu.NDArray.<init>(NDArray.java:112)
    at org.nd4j.linalg.cpu.CpuNDArrayFactory.create(CpuNDArrayFactory.java:234)
    at org.nd4j.linalg.factory.Nd4j.create(Nd4j.java:3825)
    at org.nd4j.linalg.api.shape.Shape.toOffsetZeroCopyHelper(Shape.java:162)
    at org.nd4j.linalg.api.shape.Shape.toOffsetZeroCopy(Shape.java:118)
    at org.nd4j.linalg.api.ndarray.BaseNDArray.dup(BaseNDArray.java:1368)
    at org.nd4j.linalg.api.ndarray.BaseNDArray.mulRowVector(BaseNDArray.java:2240)
    at org.deeplearning4j.models.embeddings.wordvectors.WordVectorsImpl.wordsNearest(WordVectorsImpl.java:208)
    at testicle.HelloWord2Vec.printIsTo(HelloWord2Vec.java:74)
    at testicle.HelloWord2Vec.run(HelloWord2Vec.java:53)
    at testicle.HelloWord2Vec.main(HelloWord2Vec.java:29)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

Process finished with exit code 1
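
As an aside, the same analogy lookup can be written with the positive/negative form of wordsNearest, which avoids building the result vector by hand (a sketch against the same WordVectors API; memory behaviour may or may not differ):

    // requires java.util.Arrays
    private void printIsTo(String a, String b, String c) {
        // "a is to b as c is to ?"  ->  nearest to (b - a + c)
        Collection<String> col = wordVectors.wordsNearest(
                Arrays.asList(b, c),   // positive terms
                Arrays.asList(a),      // negative terms
                1);
        System.out.println(a + " is to " + b + " as " + c + " is to ...");
        col.forEach(System.out::println);
    }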

One problem: when I do training with net.fit(iterRegression)

I built my net based on the example GravesLSTMCharModellingExample.java.
An exception appears.
(screenshot)
I debugged it and found that net.fit(iterRegression)
jumps into MultiLayerNetwork.class (iter.reset();)
(screenshot)
then on into SequenceRecordReaderDataSetIterator.class (labelsReader.reset();)
(screenshot)
and then hits a NullPointerException.
(screenshot)
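
The NullPointerException at labelsReader.reset() suggests the iterator ended up without a (non-null) labels reader. A minimal sketch of building the iterator for regression with both a features reader and a labels reader (file names are hypothetical, and the class names are the current DataVec ones; the 0.4-era Canova equivalents are analogous):

// imports: org.datavec.api.records.reader.SequenceRecordReader,
//          org.datavec.api.records.reader.impl.csv.CSVSequenceRecordReader,
//          org.datavec.api.split.NumberedFileInputSplit,
//          org.deeplearning4j.datasets.datavec.SequenceRecordReaderDataSetIterator

// one CSV file per sequence, features and labels in separate files
SequenceRecordReader featureReader = new CSVSequenceRecordReader(0, ",");
SequenceRecordReader labelReader   = new CSVSequenceRecordReader(0, ",");
featureReader.initialize(new NumberedFileInputSplit("features_%d.csv", 0, 99));
labelReader.initialize(new NumberedFileInputSplit("labels_%d.csv", 0, 99));

int miniBatchSize = 10;
int numPossibleLabels = -1;    // ignored when regression == true
boolean regression = true;
DataSetIterator iterRegression = new SequenceRecordReaderDataSetIterator(
        featureReader, labelReader, miniBatchSize, numPossibleLabels, regression);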

CNNMNistExample: how to reproduce results as listed in the readme.md?

Hey there,

Sorry, this is probably not exactly an issue, but I'm not sure where else to ask this question.

So I'm trying to reproduce the results listed in the README (as of 2016-Feb-16). I increased the number of samples and the batch size, as I expected that to be relevant, but it doesn't affect the results much (in terms of accuracy / F1).

I was working with CNNMnistExample.java from dl4j-0.4-examples. The lines I've changed are:

int numSamples = 10000;
int batchSize = 7500;

Console Output:

o.d.e.c.CNNMnistExample - Load data....

o.d.e.c.CNNMnistExample - Build model....
Feb 17, 2016 2:14:35 PM com.github.fommil.jni.JniLoader liberalLoad
INFO: successfully loaded /tmp/jniloader3434925949374956132netlib-native_system-linux-x86_64.so
o.d.e.c.CNNMnistExample - Train model....
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 3.7686845703125
o.d.o.l.ScoreIterationListener - Score at iteration 2 is 3.7683304036458334
o.d.o.l.ScoreIterationListener - Score at iteration 4 is 3.7679765625
o.d.o.l.ScoreIterationListener - Score at iteration 6 is 3.7676220703125
o.d.o.l.ScoreIterationListener - Score at iteration 8 is 3.7672682291666666
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 3.7514986979166665
o.d.o.l.ScoreIterationListener - Score at iteration 2 is 3.7511383463541668
o.d.o.l.ScoreIterationListener - Score at iteration 4 is 3.7507783203125
o.d.o.l.ScoreIterationListener - Score at iteration 6 is 3.7504186197916667
o.d.o.l.ScoreIterationListener - Score at iteration 8 is 3.75005859375
o.d.e.c.CNNMnistExample - Evaluate weights....
o.d.e.c.CNNMnistExample - Evaluate model....
o.d.e.c.CNNMnistExample - 
Examples labeled as 0 classified by model as 0: 124 times
Examples labeled as 0 classified by model as 5: 54 times
Examples labeled as 0 classified by model as 6: 178 times
Examples labeled as 0 classified by model as 7: 90 times
Examples labeled as 1 classified by model as 0: 2 times
Examples labeled as 1 classified by model as 5: 470 times
Examples labeled as 1 classified by model as 6: 42 times
Examples labeled as 1 classified by model as 8: 3 times
Examples labeled as 2 classified by model as 0: 28 times
Examples labeled as 2 classified by model as 5: 133 times
Examples labeled as 2 classified by model as 6: 235 times
Examples labeled as 2 classified by model as 7: 41 times
Examples labeled as 2 classified by model as 9: 2 times
Examples labeled as 3 classified by model as 0: 21 times
Examples labeled as 3 classified by model as 5: 226 times
Examples labeled as 3 classified by model as 6: 65 times
Examples labeled as 3 classified by model as 7: 146 times
Examples labeled as 4 classified by model as 0: 166 times
Examples labeled as 4 classified by model as 5: 160 times
Examples labeled as 4 classified by model as 6: 78 times
Examples labeled as 4 classified by model as 7: 5 times
Examples labeled as 4 classified by model as 9: 1 times
Examples labeled as 5 classified by model as 0: 105 times
Examples labeled as 5 classified by model as 5: 204 times
Examples labeled as 5 classified by model as 6: 53 times
Examples labeled as 5 classified by model as 7: 71 times
Examples labeled as 5 classified by model as 8: 1 times
Examples labeled as 6 classified by model as 0: 108 times
Examples labeled as 6 classified by model as 5: 261 times
Examples labeled as 6 classified by model as 6: 34 times
Examples labeled as 6 classified by model as 7: 30 times
Examples labeled as 7 classified by model as 0: 137 times
Examples labeled as 7 classified by model as 5: 292 times
Examples labeled as 7 classified by model as 6: 30 times
Examples labeled as 7 classified by model as 7: 10 times
Examples labeled as 7 classified by model as 8: 2 times
Examples labeled as 8 classified by model as 0: 49 times
Examples labeled as 8 classified by model as 5: 305 times
Examples labeled as 8 classified by model as 6: 64 times
Examples labeled as 8 classified by model as 7: 29 times
Examples labeled as 9 classified by model as 0: 140 times
Examples labeled as 9 classified by model as 5: 225 times
Examples labeled as 9 classified by model as 6: 69 times
Examples labeled as 9 classified by model as 7: 9 times
Examples labeled as 9 classified by model as 8: 2 times

Warning: class 1 was never predicted by the model. This class was excluded from the average precision
Warning: class 2 was never predicted by the model. This class was excluded from the average precision
Warning: class 3 was never predicted by the model. This class was excluded from the average precision
Warning: class 4 was never predicted by the model. This class was excluded from the average precision

==========================Scores========================================
 Accuracy:  0.0827
 Precision: 0.0486
 Recall:    0.0848
 F1 Score:  0.0618
========================================================================

System:
-linux: 3.16.0-60-generic
-java: 1.8.0_40-b26

Sorry, I guess this is rather a trivial issue. Thanks in advance.
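
For context, a rough check on the changed settings above (an observation about the numbers only, not a diagnosis):

    numSamples = 10000, batchSize = 7500  ->  at most 2 mini-batches per epoch

which would mean very few parameter updates, consistent with the slowly decreasing scores in the log and an accuracy around 0.08 (roughly random guessing over 10 classes).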

Different output (prob of next char) for the same initial input char-sequence in Graves LSTM RNN example

Hello,

I'm opening an issue as discussed on Gitter. The issue is about the computation of the RNN output for the initialization input "word", which results in different probabilities for the next character (the output INDArray is shown below).

I am using the current version of nd4j and dl4j. My machine runs on Win7 and the default backend is used. I use jdk1.7.0_51, and my OpenBLAS message is:
INFO: successfully loaded C:\Users\XXX\AppData\Local\Temp\jniloader6920817434504940605netlib-native_system-win-x86_64.dll

(screenshot: char_output2)

Thanks in advance!
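
One thing that can cause this (it may or may not be the cause here) is that MultiLayerNetwork keeps the stored recurrent state between rnnTimeStep calls, so running the same initialization sequence twice without clearing that state yields different next-character distributions. A small sketch (net and initializationInput stand for the objects in the example):

    // clear the stored RNN state before re-running the same initialization sequence
    net.rnnClearPreviousState();
    INDArray output = net.rnnTimeStep(initializationInput);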

A new RNN structure

I want to code a new RNN structure that is similar to LSTM and has a gate mechanism. I want to build it on top of your work. Can you tell me which .java files contain the core structure of LSTM?

com.fasterxml.jackson.databind.JsonMappingException

I am getting the following error:

java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:297)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: com.fasterxml.jackson.databind.JsonMappingException: (was java.lang.NullPointerException) (through reference chain: org.deeplearning4j.nn.conf.ComputationGraphConfiguration["defaultConfiguration"]->org.deeplearning4j.nn.conf.NeuralNetConfiguration["extraArgs"])
    at org.deeplearning4j.nn.conf.ComputationGraphConfiguration.toJson(ComputationGraphConfiguration.java:119)
    at org.deeplearning4j.util.ModelSerializer.writeModel(ModelSerializer.java:49)
    at org.deeplearning4j.util.ModelSerializer.writeModel(ModelSerializer.java:37)
    at com.blockspring.Queries.Features.DeepArchitectures.RNNs.Seq2Seq.Translator.main(Translator.java:155)
    ... 6 more
Caused by: com.fasterxml.jackson.databind.JsonMappingException: (was java.lang.NullPointerException) (through reference chain: org.deeplearning4j.nn.conf.ComputationGraphConfiguration["defaultConfiguration"]->org.deeplearning4j.nn.conf.NeuralNetConfiguration["extraArgs"])
    at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:210)
    at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:177)
    at com.fasterxml.jackson.databind.ser.std.StdSerializer.wrapAndThrow(StdSerializer.java:190)
    at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:671)
    at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:156)
    at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:575)
    at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:663)
    at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:156)
    at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:129)
    at com.fasterxml.jackson.databind.ObjectMapper._configAndWriteValue(ObjectMapper.java:3385)
    at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:2779)
    at org.deeplearning4j.nn.conf.ComputationGraphConfiguration.toJson(ComputationGraphConfiguration.java:117)
    ... 9 more
Caused by: java.lang.NullPointerException
    at org.deeplearning4j.nn.conf.NeuralNetConfiguration.getExtraArgs(NeuralNetConfiguration.java:323)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:536)
    at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:663)

This is the config:

ComputationGraphConfiguration configuration = new NeuralNetConfiguration.Builder()
              .regularization(true).l2(0.0001)
              .weightInit(WeightInit.XAVIER)
              .learningRate(0.01)
              .updater(Updater.RMSPROP)
              .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT).iterations(1)
              .graphBuilder()
              //InputVertex.The number of strings provided define the number of inputs
              //the order of the input also defines the order of the corresponding 
              //INDArrays in the fit methods (or the DataSet/MultiDataSet objects).
              .addInputs("inEn", "inFr")
              // specify types of input to the network,
              .setInputTypes(InputType.recurrent(VOCAB_SIZE+1), InputType.recurrent(VOCAB_SIZE+1))
              //LayerVertex: add layer called embeddingEn with inputs ....
              .addLayer("embeddingEn", new EmbeddingLayer.Builder().nIn(VOCAB_SIZE+1).nOut(128).activation("identity").build(),"inEn")
              .addLayer("encoder", new GravesLSTM.Builder().nIn(128).nOut(256).activation("softsign").build(),"embeddingEn")
              .addVertex("lastTimeStep", new LastTimeStepVertex("inEn"),"encoder")
              .addVertex("duplicateTimeStep", new DuplicateToTimeSeriesVertex("inFr"), "lastTimeStep")
              .addLayer("embeddingFr", new EmbeddingLayer.Builder().nIn(VOCAB_SIZE+1).nOut(128).activation("identity").build(),"inFr")
              .addVertex("embeddingFrSeq", new PreprocessorVertex(new FeedForwardToRnnPreProcessor()), "embeddingFr")
              .addLayer("decoder", new GravesLSTM.Builder().nIn(128 + 256).nOut(256).activation("softsign").build(), "embeddingFrSeq", "duplicateTimeStep")
              .addLayer("output", new RnnOutputLayer.Builder().nIn(256).nOut(VOCAB_SIZE + 1).activation("softmax").build(), "decoder")
              .setOutputs("output")
              .pretrain(false).backprop(true)
              .build();

      ComputationGraph net = new ComputationGraph(configuration);
      net.init();
      net.fit(train);
      ModelSerializer.writeModel(net, "test", true);

And my pom is as follows:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>xxxx</groupId>
  <artifactId>xxxx</artifactId>
  <packaging>jar</packaging>
  <version>1.0-SNAPSHOT</version>
   <repositories>
      <repository>
       <id>openblas</id>
       <url>file://${blasRoot}/openblas/0.2.15</url>
      </repository>

      <repository>
        <id>Sonatype-public</id>
        <name>SnakeYAML repository</name>
        <url>http://oss.sonatype.org/content/groups/public/</url>
      </repository>

      <repository>
            <id>snapshots-repo</id>
            <url>https://oss.sonatype.org/content/repositories/snapshots</url>
            <releases><enabled>false</enabled></releases>
            <snapshots><enabled>true</enabled></snapshots>
       </repository>
   </repositories>

  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <encoding>UTF-8</encoding>

    <juice-version>4.0</juice-version>
    <commonsLang.version>2.6</commonsLang.version>
    <java.version>1.8</java.version>
    <scala.version>2.11.7</scala.version>
    <scala.compat.version>2.11</scala.compat.version>
    <Stanford-CORE-NLP>3.6.0</Stanford-CORE-NLP>
    <Stanford-PARSER>3.6.0</Stanford-PARSER>
    <joda-version>2.9.2</joda-version>  
    <slf4j.version>1.7.16</slf4j.version>
    <jersey.client.version>2.16</jersey.client.version>
    <jackson.version>2.5.1</jackson.version>
    <snakeYAML.version>1.17-SNAPSHOT</snakeYAML.version>
    <googleAPIs.version>v3-rev124-1.21.0</googleAPIs.version>
    <googleGson.version>2.6.1</googleGson.version>
    <guava.version>18.0</guava.version>
    <orgjson.version>20160212</orgjson.version>
    <junit.version>4.12</junit.version>   
    <luceneDotProductVersion>1.0-SNAPSHOT</luceneDotProductVersion>

    <!-- https://oss.sonatype.org/content/repositories/snapshots/org/nd4j/nd4j-api/ -->
    <nd4j.version>0.4-rc3.9-SNAPSHOT</nd4j.version>

    <!-- https://oss.sonatype.org/content/repositories/snapshots/org/deeplearning4j/deeplearning4j-core/ -->
    <dl4j.version>0.4-rc3.9-SNAPSHOT</dl4j.version>
    <dl4j.examples.version>0.0.3.1</dl4j.examples.version>
    <canova.version>0.0.0.15-SNAPSHOT</canova.version>  
<!-- 
    <nd4j.version>0.4-rc3.8</nd4j.version>
    <dl4j.version>0.4-rc3.8</dl4j.version>
    <dl4j.examples.version>0.4-rc0-SNAPSHOT</dl4j.examples.version>
    <canova.version>0.0.0.14</canova.version>   -->
  </properties>

  <dependencies>
   <!-- guava -->   
     <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>${guava.version}</version>
    </dependency>
   <!-- dl4j -->   
   <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-core</artifactId>
        <version>${dl4j.version}</version>
   </dependency>
   <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-nlp</artifactId>
        <version>${dl4j.version}</version>
   </dependency>
   <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-examples</artifactId>
        <version>${dl4j.examples.version}</version>
   </dependency> 
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-ui</artifactId>
        <version>${dl4j.version}</version>
    </dependency>
    <dependency>
       <groupId>org.nd4j</groupId>
       <artifactId>nd4j-native</artifactId>
       <version>${nd4j.version}</version>
   </dependency>
   <dependency>
       <artifactId>canova-api</artifactId>
       <groupId>org.nd4j</groupId>
       <version>${canova.version}</version>
   </dependency>
    <!-- Lucene DOT product logic -->   
    <dependency>
        <groupId>com.blockspring.peyman</groupId>
        <artifactId>DotProduct</artifactId>
        <version>${luceneDotProductVersion}</version>
    </dependency>
    <!-- org.json -->   
    <dependency>
        <groupId>org.json</groupId>
        <artifactId>json</artifactId>
        <version>${orgjson.version}</version>
    </dependency>
    <!-- google GUIC -->   
    <dependency>
        <groupId>com.google.inject</groupId>
        <artifactId>guice</artifactId>
        <version>${juice-version}</version>
    </dependency>
    <!-- Stanford NLP -->
    <dependency>
        <groupId>edu.stanford.nlp</groupId>
        <artifactId>stanford-parser</artifactId>
        <version>${Stanford-PARSER}</version>
    </dependency>
    <dependency>
        <groupId>edu.stanford.nlp</groupId>
        <artifactId>stanford-parser</artifactId>
        <version>${Stanford-PARSER}</version>
        <classifier>models</classifier>
    </dependency>   
    <dependency>
        <groupId>edu.stanford.nlp</groupId>
        <artifactId>stanford-corenlp</artifactId>
        <version>${Stanford-CORE-NLP}</version>
    </dependency>   
    <dependency>
        <groupId>edu.stanford.nlp</groupId>
        <artifactId>stanford-corenlp</artifactId>
        <version>${Stanford-CORE-NLP}</version>
        <classifier>models</classifier>
    </dependency>   
    <!-- JODA -->
    <dependency>
      <groupId>joda-time</groupId>
      <artifactId>joda-time</artifactId>
      <version>${joda-version}</version>
    </dependency> 
    <!-- Jersey client -->   
    <dependency>
        <groupId>org.glassfish.jersey.core</groupId>
        <artifactId>jersey-client</artifactId>
        <version>${jersey.client.version}</version>
    </dependency>
    <!-- Commons lang -->
    <dependency>
        <groupId>commons-lang</groupId>
        <artifactId>commons-lang</artifactId>
        <version>${commonsLang.version}</version>
    </dependency>
    <!-- Scala -->
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>    
    <!-- SnakeYAML parser -->
    <dependency>
        <groupId>org.yaml</groupId>
        <artifactId>snakeyaml</artifactId>
        <version>${snakeYAML.version}</version>
    </dependency>
    <!-- jackson -->
    <dependency>
       <groupId>com.fasterxml.jackson.core</groupId>
       <artifactId>jackson-databind</artifactId>
       <version>${jackson.version}</version>
       <scope>compile</scope>
    </dependency>   
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-core</artifactId>
        <version>${jackson.version}</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-annotations</artifactId>
        <version>${jackson.version}</version>
    </dependency>
    <!-- Google APIs -->
    <dependency>
      <groupId>com.google.apis</groupId>
      <artifactId>google-api-services-analytics</artifactId>
      <version>${googleAPIs.version}</version>
    </dependency>
    <!-- gson-->
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>${googleGson.version}</version>
    </dependency> 
    <!-- Test -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>${junit.version}</version>
      <scope>test</scope>
    </dependency> 
    <dependency>
      <groupId>org.specs2</groupId>
      <artifactId>specs2-core_${scala.compat.version}</artifactId>
      <version>2.4.16</version>
      <scope>test</scope>
    </dependency>   
    <dependency>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest_${scala.compat.version}</artifactId>
      <version>2.2.4</version>
      <scope>test</scope>
    </dependency>

</dependencies>

 <build>

    <resources>
            <resource>
                <directory>${rootPath}/lib</directory>
                <filtering>false</filtering>
                <includes>
                    <include>*.bin</include>
                </includes>
            </resource>

           <resource>
                <directory>${rootPath}/config</directory>
                <filtering>false</filtering>
                <includes>
                    <include>*.properties</include>
                </includes>
          </resource>
           <resource>
                <directory>${rootPath}/data/stopwords</directory>
                <filtering>false</filtering>
                <includes>
                    <include>*.txt</include>
                </includes>
          </resource> 

          <resource>
                <directory>${rootPath}/src/main/java/com/blockspring/Queries/Features/DeepArchitectures</directory>
                <filtering>false</filtering>
                <includes>
                    <include>**/**/**/*.txt</include>
                </includes>
          </resource>                
          <resource>
                <directory>${rootPath}/src/main/java/com/blockspring/Queries/Features/DeepArchitectures</directory>
                <filtering>false</filtering>
                <includes>
                    <include>**/**/*.txt</include>
                </includes>
          </resource>  

          <resource>
                <directory>${rootPath}/data/synonyms/GA</directory>
                <filtering>false</filtering>
                <includes>
                    <include>*.syns</include>
                </includes>
          </resource>
         </resources>   

       <pluginManagement>
            <plugins>
              <!-- get sources and javadocs -->           
              <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-eclipse-plugin</artifactId>
                <version>2.9</version>
                <configuration>
                  <downloadSources>true</downloadSources>
                  <downloadJavadocs>true</downloadJavadocs>
                </configuration>
              </plugin>

                <plugin>
                    <groupId>net.alchim31.maven</groupId>
                    <artifactId>scala-maven-plugin</artifactId>
                    <version>3.2.1</version>
                </plugin>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <version>2.0.2</version>
                </plugin>
            </plugins>
        </pluginManagement>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <executions>
                    <execution>
                        <id>scala-compile-first</id>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>add-source</goal>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>scala-test-compile</id>
                        <phase>process-test-resources</phase>
                        <goals>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

OutOfMemoryError while loading the pretrained Google News vectors

I ran Word2VecSentimentRNN.java in IntelliJ IDEA Community Edition 15.0.2 x64 under Windows 10 Professional. My Java version is 1.8.0_65. I set the Maven home directory to my local apache-maven-3.3.9.
My idea64.exe.vmoptions file looks like this:
-Xms128m
-Xmx8096m
-XX:MaxPermSize=350m
-XX:ReservedCodeCacheSize=240m
-XX:+UseConcMarkSweepGC
-XX:SoftRefLRUPolicyMSPerMB=50
-ea
-Dsun.io.useCanonCaches=false
-Djava.net.preferIPv4Stack=true
-XX:+HeapDumpOnOutOfMemoryError
-XX:-OmitStackTraceInFastThrow

I also added -Xmx8096m to the VM options of the Maven importer and runner, and to the Gradle VM options in the IDEA settings.
I set WORD_VECTORS_PATH to my local GoogleNews-vectors-negative300.bin.gz.

Then I hit Run and got this error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at org.nd4j.linalg.api.buffer.BaseDataBuffer.<init>(BaseDataBuffer.java:268)
at org.nd4j.linalg.api.buffer.FloatBuffer.<init>(FloatBuffer.java:41)
at org.nd4j.linalg.api.buffer.factory.DefaultDataBufferFactory.createFloat(DefaultDataBufferFactory.java:59)
at org.nd4j.linalg.factory.Nd4j.createBuffer(Nd4j.java:1010)
at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:217)
at org.nd4j.linalg.cpu.NDArray.<init>(NDArray.java:112)
at org.nd4j.linalg.cpu.CpuNDArrayFactory.create(CpuNDArrayFactory.java:234)
at org.nd4j.linalg.factory.Nd4j.create(Nd4j.java:3825)
at org.nd4j.linalg.factory.Nd4j.create(Nd4j.java:3794)
at org.nd4j.linalg.factory.Nd4j.create(Nd4j.java:3068)
at org.deeplearning4j.models.embeddings.loader.WordVectorSerializer.readBinaryModel(WordVectorSerializer.java:182)
at org.deeplearning4j.models.embeddings.loader.WordVectorSerializer.loadGoogleModel(WordVectorSerializer.java:97)
at org.deeplearning4j.examples.word2vec.sentiment.Word2VecSentimentRNN.main(Word2VecSentimentRNN.java:89)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

Process finished with exit code 1

I added a breakpoint at BaseDataBuffer.java:268 and hit Debug. At the first interruption it shows me:
Method threw 'java.lang.NullPointerException' exception. Cannot evaluate org.nd4j.linalg.api.buffer.FloatBuffer.toString()
(screenshot attached)

Then I hit F9; the length of floatData changed from 240000 to 160600, then 800, then 200, then 400, then 401400, then 402, then 401802, then 401802, then 900000000, and then the OutOfMemoryError occurred.
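
One thing worth checking (this is an assumption on my part, not a confirmed diagnosis): idea64.exe.vmoptions only raises the heap of the IntelliJ IDEA process itself. The example runs in a separate JVM, and its heap comes from the VM options field of the Run/Debug configuration, so that is where -Xmx8096m needs to go. A minimal sketch to see which limit the example JVM actually received (the HeapCheck class name is made up for illustration):

// Prints the maximum heap available to the JVM that runs it.
// Paste the body at the top of Word2VecSentimentRNN.main() or run it standalone;
// loading GoogleNews-vectors-negative300.bin.gz needs several GB of heap.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.printf("Max heap available to this JVM: %.2f GB%n",
                maxBytes / (1024.0 * 1024.0 * 1024.0));
    }
}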

TSNEStandardExample results in NaN

I've tried with 0.4-rc3.7 and 0.4-rc3.4 for both nd4j-x86 and nd4j-jblas.

o.d.e.t.TSNEStandardExample - Load & Vectorize data....
o.d.e.t.TSNEStandardExample - Build model....
o.d.e.t.TSNEStandardExample - Store TSNE Coordinates for Plotting....
o.d.plot.Tsne - Calculating probabilities of data similarities..
o.d.plot.Tsne - Mean value of sigma 0.00
o.d.plot.Tsne - Cost at iteration 0 was NaN
o.d.plot.Tsne - Cost at iteration 1 was NaN
o.d.plot.Tsne - Cost at iteration 2 was NaN
o.d.plot.Tsne - Cost at iteration 3 was NaN
o.d.plot.Tsne - Cost at iteration 4 was NaN
o.d.plot.Tsne - Cost at iteration 5 was NaN
o.d.plot.Tsne - Cost at iteration 6 was NaN

Confusion Matrix in CNNExample does not contain any information

Hi all,

I've modified CNNIrisExample.java in the package org.deeplearning4j.examples.convolution and tried to get the confusion matrix.

However, a call to

System.out.println(eval.getConfusionMatrix().toCSV());

shows only zeros and contains no information.

My minimal example:

package org.deeplearning4j.examples.convolution;

import java.io.File;
import java.io.IOException;
import org.deeplearning4j.datasets.iterator.DataSetIterator;
import org.deeplearning4j.eval.Evaluation;
import org.deeplearning4j.nn.api.OptimizationAlgorithm;
import org.deeplearning4j.nn.conf.GradientNormalization;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.setup.ConvolutionLayerSetup;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.params.DefaultParamInitializer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.deeplearning4j.optimize.api.IterationListener;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.dataset.SplitTestAndTrain;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.util.Arrays;
import java.util.Random;
import org.canova.api.conf.Configuration;
import org.canova.api.records.reader.RecordReader;
import org.canova.api.records.reader.impl.CSVRecordReader;
import org.canova.api.records.reader.impl.LibSvmRecordReader;
import org.canova.api.split.FileSplit;
import org.deeplearning4j.datasets.canova.RecordReaderDataSetIterator;
import org.deeplearning4j.nn.api.Layer;
import org.nd4j.linalg.indexing.INDArrayIndex;
import org.springframework.core.io.ClassPathResource;

/**
 * @author sonali
 */
public class CNNMy2Example {

    private static Logger log = LoggerFactory.getLogger(CNNMy2Example.class);

    public static void main(String[] args) throws IOException, InterruptedException {

        final int numRows = 2;
        final int numColumns = 2;
        int nChannels = 1;
        int iterations = 10;
        int seed = 123;
        int listenerFreq = 1;

        /**
         * Set a neural network configuration with multiple layers
         */
        log.info("Load data....");

        // 1. Get TRAINING data.
        RecordReader recordReaderTrain = new CSVRecordReader(0, ",");
        recordReaderTrain.initialize(new FileSplit(new File("myCSVinputTRAIN.txt")));
        DataSetIterator iteratorTrain = new RecordReaderDataSetIterator(recordReaderTrain, 12, 4, 3);
        DataSet irisTrain = iteratorTrain.next();
        irisTrain.normalizeZeroMeanZeroUnitVariance();
        System.out.println("Loaded " + irisTrain.labelCounts() + " training set.");

        // 2. Get TEST data.
        RecordReader recordReaderTest = new CSVRecordReader(0, ",");
        recordReaderTest.initialize(new FileSplit(new File("myCSVinputTEST.txt")));
        DataSetIterator iteratorTest = new RecordReaderDataSetIterator(recordReaderTest, 6, 4, 3);
        DataSet irisTest = iteratorTest.next();
        irisTest.normalizeZeroMeanZeroUnitVariance();
        System.out.println("Loaded " + irisTest.labelCounts() + " test set.");

        MultiLayerConfiguration.Builder builder = new NeuralNetConfiguration.Builder()
                .seed(seed)
                .iterations(iterations)
                .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                .list(2)
                .layer(0, new ConvolutionLayer.Builder(new int[]{1, 1})
                        .nIn(nChannels)
                        .nOut(1000) // # nodes in hidden layer.
                        .activation("relu")
                        .weightInit(WeightInit.RELU)
                        .build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nOut(3) // # output classes.
                        .weightInit(WeightInit.XAVIER)
                        .activation("softmax")
                        .build())
                .backprop(true).pretrain(false);

        new ConvolutionLayerSetup(builder, numRows, numColumns, nChannels);

        MultiLayerConfiguration conf = builder.build();

        log.info("Build model....");
        MultiLayerNetwork model = new MultiLayerNetwork(conf);
        model.init();
        model.setListeners(Arrays.asList((IterationListener) new ScoreIterationListener(listenerFreq)));

        log.info("Train model....");
        System.out.println("Training on " + irisTrain.labelCounts());
        model.fit(irisTrain);

        log.info("Evaluate weights....");
        for (org.deeplearning4j.nn.api.Layer layer : model.getLayers()) {
            INDArray w = layer.getParam(DefaultParamInitializer.WEIGHT_KEY);
            //log.info("Weights: " + w);
        }

        Evaluation eval = new Evaluation(3);
        log.info("Evaluate model....");
        System.out.println("Testing on " + irisTest.labelCounts());

        INDArray output = model.output(irisTest.getFeatureMatrix());

        int[] predictedLabels = model.predict(irisTest.getFeatureMatrix());
        //for(int predictedLabel : predictedLabels) System.out.println(predictedLabel);

        // CONFUSION MATRIX CONTAINS ONLY ZERO VALUES !!!
        System.out.println(eval.getConfusionMatrix().toCSV());
        // SAME HERE !
        System.out.println(eval.getConfusionMatrix());


        eval.eval(irisTest.getLabels(), output);
        log.info(eval.stats());

        log.info("\n");
        log.info("****************Example finished********************");
    }
}

myCSVinputTEST.txt
myCSVinputTRAIN.txt
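
For what it's worth, in the snippet above the confusion matrix is printed before eval.eval(...) is called, so at that point the Evaluation object has not seen any predictions and the matrix is still all zeros. A minimal reordering sketch, using only the objects already defined in the example above:

// Populate the Evaluation first; eval() is what fills the confusion matrix.
Evaluation eval = new Evaluation(3);
INDArray output = model.output(irisTest.getFeatureMatrix());
eval.eval(irisTest.getLabels(), output);

// Only now does the confusion matrix contain the predicted-vs-actual counts.
System.out.println(eval.getConfusionMatrix().toCSV());
System.out.println(eval.getConfusionMatrix());
log.info(eval.stats());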

undefined symbol: cblas_sgemm

When I try to run the GravesLSTMCharModellingExample I get:
/usr/java/jdk1.8.0_60/bin/java: symbol lookup error: /tmp/jniloader8051030970684864458netlib-native_system-linux-x86_64.so: undefined symbol: cblas_sgemm

I am using Fedora (in Parallels on a MacBook Air, to complicate things). I have installed all the BLAS-related packages (both 32- and 64-bit versions).

LenetMnistExample ArrayIndexOutOfBoundsException

I'm getting this exception on line 113 of LenetMnistExample.java (model.fit(trainInput);)

"C:\Program Files\Java\jdk1.8.0_45\bin\java" -Didea.launcher.port=7532 "-Didea.launcher.bin.path=C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 14.1.4\bin" -Dfile.encoding=windows-1252 -classpath "C:\Program Files\Java\jdk1.8.0_45\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\resources.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\rt.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_45\jre\lib\ext\zipfs.jar;C:\Users\Vlad\Documents\DL4J_examples\dl4j-0.4-examples\target\classes;C:\Users\Vlad\.m2\repository\org\deeplearning4j\deeplearning4j-nlp\0.4-rc3.6\deeplearning4j-nlp-0.4-rc3.6.jar;C:\Users\Vlad\.m2\repository\org\apache\lucene\lucene-analyzers-common\5.3.1\lucene-analyzers-common-5.3.1.jar;C:\Users\Vlad\.m2\repository\org\apache\lucene\lucene-core\5.3.1\lucene-core-5.3.1.jar;C:\Users\Vlad\.m2\repository\org\apache\lucene\lucene-queryparser\5.3.1\lucene-queryparser-5.3.1.jar;C:\Users\Vlad\.m2\repository\org\apache\lucene\lucene-queries\5.3.1\lucene-queries-5.3.1.jar;C:\Users\Vlad\.m2\repository\org\apache\lucene\lucene-sandbox\5.3.1\lucene-sandbox-5.3.1.jar;C:\Users\Vlad\.m2\repository\org\deeplearning4j\deeplearning4j-scaleout-akka\0.4-rc3.6\deeplearning4j-scaleout-akka-0.4-rc3.6.jar;C:\Users\Vlad\.m2\repository\javax\ws\rs\javax.ws.rs-api\2.0.1\javax.ws.rs-api-2.0.1.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-core\0.8.0\dropwizard-core-0.8.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-util\0.8.0\dropwizard-util-0.8.0.jar;C:\Users\Vlad\.m2\repository\joda-time\joda-time\2.7\joda-time-2.7.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-jackson\0.8.0\dropwizard-jackson-0.8.0.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\datatype\jackson-datatype-jdk7\2.5.1\jackson-datatype-jdk7-2.5.1.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\datatype\jackson-datatype-guava\2.5.1\jackson-datatype-guava-2.5.1.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\module\jackson-module-afterburner\2.5.1\jackson-module-afterburner-2.5.1.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\datatype\jackson-datatype-joda\2.5.1\jackson-datatype-joda-2.5.1.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-validation\0.8.0\dropwizard-validation-0.8.0.jar;C:\Users\Vlad\.m2\repository\org\hibernate\hibernate-validator\5.1.3.Final\hibernate-validator-5.1.3.Final.jar;C:\Users\Vlad\.m2\repository\javax\validation\validation-api\1.1.0.Final\validation-api-1.1.0.Final.ja
r;C:\Users\Vlad\.m2\repository\org\jboss\logging\jboss-logging\3.1.3.GA\jboss-logging-3.1.3.GA.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\classmate\1.0.0\classmate-1.0.0.jar;C:\Users\Vlad\.m2\repository\org\glassfish\javax.el\3.0.0\javax.el-3.0.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-configuration\0.8.0\dropwizard-configuration-0.8.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-logging\0.8.0\dropwizard-logging-0.8.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\metrics\metrics-logback\3.1.0\metrics-logback-3.1.0.jar;C:\Users\Vlad\.m2\repository\org\slf4j\jul-to-slf4j\1.7.10\jul-to-slf4j-1.7.10.jar;C:\Users\Vlad\.m2\repository\org\slf4j\jcl-over-slf4j\1.7.10\jcl-over-slf4j-1.7.10.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\jetty-util\9.2.9.v20150224\jetty-util-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-metrics\0.8.0\dropwizard-metrics-0.8.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-jersey\0.8.0\dropwizard-jersey-0.8.0.jar;C:\Users\Vlad\.m2\repository\org\glassfish\jersey\core\jersey-server\2.16\jersey-server-2.16.jar;C:\Users\Vlad\.m2\repository\org\glassfish\jersey\core\jersey-common\2.16\jersey-common-2.16.jar;C:\Users\Vlad\.m2\repository\org\glassfish\jersey\bundles\repackaged\jersey-guava\2.16\jersey-guava-2.16.jar;C:\Users\Vlad\.m2\repository\org\glassfish\hk2\osgi-resource-locator\1.0.1\osgi-resource-locator-1.0.1.jar;C:\Users\Vlad\.m2\repository\org\glassfish\jersey\core\jersey-client\2.16\jersey-client-2.16.jar;C:\Users\Vlad\.m2\repository\org\glassfish\jersey\media\jersey-media-jaxb\2.16\jersey-media-jaxb-2.16.jar;C:\Users\Vlad\.m2\repository\javax\annotation\javax.annotation-api\1.2\javax.annotation-api-1.2.jar;C:\Users\Vlad\.m2\repository\org\glassfish\hk2\hk2-api\2.4.0-b09\hk2-api-2.4.0-b09.jar;C:\Users\Vlad\.m2\repository\org\glassfish\hk2\hk2-utils\2.4.0-b09\hk2-utils-2.4.0-b09.jar;C:\Users\Vlad\.m2\repository\org\glassfish\hk2\external\aopalliance-repackaged\2.4.0-b09\aopalliance-repackaged-2.4.0-b09.jar;C:\Users\Vlad\.m2\repository\org\glassfish\hk2\external\javax.inject\2.4.0-b09\javax.inject-2.4.0-b09.jar;C:\Users\Vlad\.m2\repository\org\glassfish\hk2\hk2-locator\2.4.0-b09\hk2-locator-2.4.0-b09.jar;C:\Users\Vlad\.m2\repository\org\glassfish\jersey\ext\jersey-metainf-services\2.16\jersey-metainf-services-2.16.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\metrics\metrics-jersey2\3.1.0\metrics-jersey2-3.1.0.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\jaxrs\jackson-jaxrs-json-provider\2.5.1\jackson-jaxrs-json-provider-2.5.1.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\jaxrs\jackson-jaxrs-base\2.5.1\jackson-jaxrs-base-2.5.1.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\module\jackson-module-jaxb-annotations\2.5.1\jackson-module-jaxb-annotations-2.5.1.jar;C:\Users\Vlad\.m2\repository\org\glassfish\jersey\containers\jersey-container-servlet\2.16\jersey-container-servlet-2.16.jar;C:\Users\Vlad\.m2\repository\org\glassfish\jersey\containers\jersey-container-servlet-core\2.16\jersey-container-servlet-core-2.16.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\jetty-server\9.2.9.v20150224\jetty-server-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\javax\servlet\javax.servlet-api\3.1.0\javax.servlet-api-3.1.0.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\jetty-io\9.2.9.v20150224\jetty-io-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\jetty-webapp\9.2.9.v20150224\jetty-webapp-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\org\eclipse\je
tty\jetty-xml\9.2.9.v20150224\jetty-xml-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\jetty-continuation\9.2.9.v20150224\jetty-continuation-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-jetty\0.8.0\dropwizard-jetty-0.8.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\metrics\metrics-jetty9\3.1.0\metrics-jetty9-3.1.0.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\jetty-servlet\9.2.9.v20150224\jetty-servlet-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\jetty-security\9.2.9.v20150224\jetty-security-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\jetty-servlets\9.2.9.v20150224\jetty-servlets-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\jetty-http\9.2.9.v20150224\jetty-http-9.2.9.v20150224.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-lifecycle\0.8.0\dropwizard-lifecycle-0.8.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\metrics\metrics-core\3.1.0\metrics-core-3.1.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\metrics\metrics-jvm\3.1.0\metrics-jvm-3.1.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\metrics\metrics-servlets\3.1.0\metrics-servlets-3.1.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\metrics\metrics-json\3.1.0\metrics-json-3.1.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\metrics\metrics-healthchecks\3.1.0\metrics-healthchecks-3.1.0.jar;C:\Users\Vlad\.m2\repository\net\sourceforge\argparse4j\argparse4j\0.4.4\argparse4j-0.4.4.jar;C:\Users\Vlad\.m2\repository\org\eclipse\jetty\toolchain\setuid\jetty-setuid-java\1.0.2\jetty-setuid-java-1.0.2.jar;C:\Users\Vlad\.m2\repository\org\spark-project\akka\akka-remote_2.10\2.3.4-spark\akka-remote_2.10-2.3.4-spark.jar;C:\Users\Vlad\.m2\repository\org\scala-lang\scala-library\2.10.4\scala-library-2.10.4.jar;C:\Users\Vlad\.m2\repository\io\netty\netty\3.8.0.Final\netty-3.8.0.Final.jar;C:\Users\Vlad\.m2\repository\org\spark-project\protobuf\protobuf-java\2.5.0-spark\protobuf-java-2.5.0-spark.jar;C:\Users\Vlad\.m2\repository\org\uncommons\maths\uncommons-maths\1.2.2a\uncommons-maths-1.2.2a.jar;C:\Users\Vlad\.m2\repository\org\spark-project\akka\akka-actor_2.10\2.3.4-spark\akka-actor_2.10-2.3.4-spark.jar;C:\Users\Vlad\.m2\repository\com\typesafe\config\1.2.1\config-1.2.1.jar;C:\Users\Vlad\.m2\repository\com\typesafe\akka\akka-cluster_2.10\2.3.4\akka-cluster_2.10-2.3.4.jar;C:\Users\Vlad\.m2\repository\com\typesafe\akka\akka-contrib_2.10\2.3.4\akka-contrib_2.10-2.3.4.jar;C:\Users\Vlad\.m2\repository\com\typesafe\akka\akka-persistence-experimental_2.10\2.3.4\akka-persistence-experimental_2.10-2.3.4.jar;C:\Users\Vlad\.m2\repository\org\iq80\leveldb\leveldb\0.5\leveldb-0.5.jar;C:\Users\Vlad\.m2\repository\org\iq80\leveldb\leveldb-api\0.5\leveldb-api-0.5.jar;C:\Users\Vlad\.m2\repository\org\fusesource\leveldbjni\leveldbjni-all\1.7\leveldbjni-all-1.7.jar;C:\Users\Vlad\.m2\repository\org\fusesource\leveldbjni\leveldbjni\1.7\leveldbjni-1.7.jar;C:\Users\Vlad\.m2\repository\org\fusesource\hawtjni\hawtjni-runtime\1.8\hawtjni-runtime-1.8.jar;C:\Users\Vlad\.m2\repository\org\fusesource\leveldbjni\leveldbjni-osx\1.5\leveldbjni-osx-1.5.jar;C:\Users\Vlad\.m2\repository\org\fusesource\leveldbjni\leveldbjni-linux32\1.5\leveldbjni-linux32-1.5.jar;C:\Users\Vlad\.m2\repository\org\fusesource\leveldbjni\leveldbjni-linux64\1.5\leveldbjni-linux64-1.5.jar;C:\Users\Vlad\.m2\repository\org\fusesource\leveldbjni\leveldbjni-win32\1.5\leveldbjni-win32-1.5.jar;C:\Users\Vlad\.m2\repository\org\fusesource\leveldbjni\leveldbjni-win64\1.5\leveldbjni-win
64-1.5.jar;C:\Users\Vlad\.m2\repository\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;C:\Users\Vlad\.m2\repository\org\spark-project\akka\akka-slf4j_2.10\2.3.4-spark\akka-slf4j_2.10-2.3.4-spark.jar;C:\Users\Vlad\.m2\repository\args4j\args4j\2.0.29\args4j-2.0.29.jar;C:\Users\Vlad\.m2\repository\org\fusesource\sigar\1.6.4\sigar-1.6.4-native.jar;C:\Users\Vlad\.m2\repository\org\deeplearning4j\deeplearning4j-scaleout-zookeeper\0.4-rc3.6\deeplearning4j-scaleout-zookeeper-0.4-rc3.6.jar;C:\Users\Vlad\.m2\repository\org\deeplearning4j\deeplearning4j-scaleout-api\0.4-rc3.6\deeplearning4j-scaleout-api-0.4-rc3.6.jar;C:\Users\Vlad\.m2\repository\org\apache\zookeeper\zookeeper\3.5.1-alpha\zookeeper-3.5.1-alpha.jar;C:\Users\Vlad\.m2\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;C:\Users\Vlad\.m2\repository\net\java\dev\javacc\javacc\5.0\javacc-5.0.jar;C:\Users\Vlad\.m2\repository\org\apache\curator\curator-framework\2.4.0\curator-framework-2.4.0.jar;C:\Users\Vlad\.m2\repository\org\apache\curator\curator-client\2.4.0\curator-client-2.4.0.jar;C:\Users\Vlad\.m2\repository\org\apache\curator\curator-test\2.4.0\curator-test-2.4.0.jar;C:\Users\Vlad\.m2\repository\org\apache\commons\commons-math\2.2\commons-math-2.2.jar;C:\Users\Vlad\.m2\repository\org\apache\curator\curator-recipes\2.4.0\curator-recipes-2.4.0.jar;C:\Users\Vlad\.m2\repository\com\hazelcast\hazelcast-all\3.4.2\hazelcast-all-3.4.2.jar;C:\Users\Vlad\.m2\repository\net\sourceforge\findbugs\annotations\1.3.2\annotations-1.3.2.jar;C:\Users\Vlad\.m2\repository\com\eclipsesource\minimal-json\minimal-json\0.9.1\minimal-json-0.9.1.jar;C:\Users\Vlad\.m2\repository\it\unimi\dsi\dsiutils\2.2.2\dsiutils-2.2.2.jar;C:\Users\Vlad\.m2\repository\it\unimi\dsi\fastutil\6.5.15\fastutil-6.5.15.jar;C:\Users\Vlad\.m2\repository\com\martiansoftware\jsap\2.1\jsap-2.1.jar;C:\Users\Vlad\.m2\repository\commons-configuration\commons-configuration\1.8\commons-configuration-1.8.jar;C:\Users\Vlad\.m2\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;C:\Users\Vlad\.m2\repository\commons-collections\commons-collections\20040616\commons-collections-20040616.jar;C:\Users\Vlad\.m2\repository\org\cleartk\cleartk-snowball\2.0.0\cleartk-snowball-2.0.0.jar;C:\Users\Vlad\.m2\repository\org\apache\lucene\lucene-snowball\3.0.3\lucene-snowball-3.0.3.jar;C:\Users\Vlad\.m2\repository\org\cleartk\cleartk-util\2.0.0\cleartk-util-2.0.0.jar;C:\Users\Vlad\.m2\repository\org\apache\uima\uimaj-core\2.5.0\uimaj-core-2.5.0.jar;C:\Users\Vlad\.m2\repository\org\apache\uima\uimafit-core\2.0.0\uimafit-core-2.0.0.jar;C:\Users\Vlad\.m2\repository\commons-logging\commons-logging-api\1.1\commons-logging-api-1.1.jar;C:\Users\Vlad\.m2\repository\org\springframework\spring-context\3.1.2.RELEASE\spring-context-3.1.2.RELEASE.jar;C:\Users\Vlad\.m2\repository\org\springframework\spring-aop\3.1.2.RELEASE\spring-aop-3.1.2.RELEASE.jar;C:\Users\Vlad\.m2\repository\aopalliance\aopalliance\1.0\aopalliance-1.0.jar;C:\Users\Vlad\.m2\repository\org\springframework\spring-expression\3.1.2.RELEASE\spring-expression-3.1.2.RELEASE.jar;C:\Users\Vlad\.m2\repository\org\springframework\spring-asm\3.1.2.RELEASE\spring-asm-3.1.2.RELEASE.jar;C:\Users\Vlad\.m2\repository\org\springframework\spring-beans\3.1.2.RELEASE\spring-beans-3.1.2.RELEASE.jar;C:\Users\Vlad\.m2\repository\org\cleartk\cleartk-type-system\2.0.0\cleartk-type-system-2.0.0.jar;C:\Users\Vlad\.m2\repository\org\cleartk\cleartk-opennlp-tools\2.0.0\cleartk-opennlp-tools-2.0.0.jar;C:\Users\Vlad\.m2\repository\org\apache\opennlp\op
ennlp-maxent\3.0.3\opennlp-maxent-3.0.3.jar;C:\Users\Vlad\.m2\repository\org\apache\opennlp\opennlp-tools\1.5.3\opennlp-tools-1.5.3.jar;C:\Users\Vlad\.m2\repository\net\sf\jwordnet\jwnl\1.3.3\jwnl-1.3.3.jar;C:\Users\Vlad\.m2\repository\org\apache\opennlp\opennlp-uima\1.5.3\opennlp-uima-1.5.3.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-assets\0.8.0\dropwizard-assets-0.8.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-servlets\0.8.0\dropwizard-servlets-0.8.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\metrics\metrics-annotation\3.1.0\metrics-annotation-3.1.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-views-mustache\0.8.0\dropwizard-views-mustache-0.8.0.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-views\0.8.0\dropwizard-views-0.8.0.jar;C:\Users\Vlad\.m2\repository\com\github\spullara\mustache\java\compiler\0.8.17\compiler-0.8.17.jar;C:\Users\Vlad\.m2\repository\io\dropwizard\dropwizard-views-freemarker\0.8.0\dropwizard-views-freemarker-0.8.0.jar;C:\Users\Vlad\.m2\repository\org\freemarker\freemarker\2.3.21\freemarker-2.3.21.jar;C:\Users\Vlad\.m2\repository\org\deeplearning4j\deeplearning4j-core\0.4-rc3.6\deeplearning4j-core-0.4-rc3.6.jar;C:\Users\Vlad\.m2\repository\io\netty\netty-buffer\4.0.28.Final\netty-buffer-4.0.28.Final.jar;C:\Users\Vlad\.m2\repository\io\netty\netty-common\4.0.28.Final\netty-common-4.0.28.Final.jar;C:\Users\Vlad\.m2\repository\org\nd4j\canova-api\0.0.0.12\canova-api-0.0.0.12.jar;C:\Users\Vlad\.m2\repository\org\slf4j\slf4j-api\1.7.12\slf4j-api-1.7.12.jar;C:\Users\Vlad\.m2\repository\ch\qos\logback\logback-classic\1.1.2\logback-classic-1.1.2.jar;C:\Users\Vlad\.m2\repository\ch\qos\logback\logback-core\1.1.2\logback-core-1.1.2.jar;C:\Users\Vlad\.m2\repository\org\apache\commons\commons-math3\3.4.1\commons-math3-3.4.1.jar;C:\Users\Vlad\.m2\repository\org\springframework\spring-core\3.2.5.RELEASE\spring-core-3.2.5.RELEASE.jar;C:\Users\Vlad\.m2\repository\commons-logging\commons-logging\1.1.1\commons-logging-1.1.1.jar;C:\Users\Vlad\.m2\repository\commons-io\commons-io\2.4\commons-io-2.4.jar;C:\Users\Vlad\.m2\repository\au\com\bytecode\opencsv\2.4\opencsv-2.4.jar;C:\Users\Vlad\.m2\repository\org\apache\commons\commons-compress\1.8\commons-compress-1.8.jar;C:\Users\Vlad\.m2\repository\org\tukaani\xz\1.5\xz-1.5.jar;C:\Users\Vlad\.m2\repository\com\google\guava\guava\11.0\guava-11.0.jar;C:\Users\Vlad\.m2\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar;C:\Users\Vlad\.m2\repository\org\nd4j\nd4j-api\0.4-rc3.6\nd4j-api-0.4-rc3.6.jar;C:\Users\Vlad\.m2\repository\net\bytebuddy\byte-buddy\0.6.14\byte-buddy-0.6.14.jar;C:\Users\Vlad\.m2\repository\junit\junit\4.11\junit-4.11.jar;C:\Users\Vlad\.m2\repository\org\hamcrest\hamcrest-core\1.3\hamcrest-core-1.3.jar;C:\Users\Vlad\.m2\repository\org\reflections\reflections\0.9.10\reflections-0.9.10.jar;C:\Users\Vlad\.m2\repository\org\javassist\javassist\3.19.0-GA\javassist-3.19.0-GA.jar;C:\Users\Vlad\.m2\repository\com\google\code\findbugs\annotations\2.0.1\annotations-2.0.1.jar;C:\Users\Vlad\.m2\repository\org\nd4j\nd4j-bytebuddy\0.4-rc3.6\nd4j-bytebuddy-0.4-rc3.6.jar;C:\Users\Vlad\.m2\repository\org\apache\commons\commons-lang3\3.3.1\commons-lang3-3.3.1.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\core\jackson-core\2.5.1\jackson-core-2.5.1.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.5.1\jackson-databind-2.5.1.jar;C:\Users\Vlad\.m2\repository\org\json\json\20131018\json-20131018.jar;C:\Users\Vlad\.m2\repository\com\fasterxml
\jackson\core\jackson-annotations\2.5.1\jackson-annotations-2.5.1.jar;C:\Users\Vlad\.m2\repository\org\projectlombok\lombok\1.16.4\lombok-1.16.4.jar;C:\Users\Vlad\.m2\repository\org\nd4j\nd4j-x86\0.4-rc3.6\nd4j-x86-0.4-rc3.6.jar;C:\Users\Vlad\.m2\repository\net\sourceforge\f2j\arpack_combined_all\0.1\arpack_combined_all-0.1.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\core\1.1.2\core-1.1.2.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_ref-osx-x86_64\1.1\netlib-native_ref-osx-x86_64-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\native_ref-java\1.1\native_ref-java-1.1.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\jniloader\1.1\jniloader-1.1.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_ref-linux-x86_64\1.1\netlib-native_ref-linux-x86_64-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_ref-linux-i686\1.1\netlib-native_ref-linux-i686-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_ref-win-x86_64\1.1\netlib-native_ref-win-x86_64-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_ref-win-i686\1.1\netlib-native_ref-win-i686-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_ref-linux-armhf\1.1\netlib-native_ref-linux-armhf-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_system-osx-x86_64\1.1\netlib-native_system-osx-x86_64-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\native_system-java\1.1\native_system-java-1.1.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_system-linux-x86_64\1.1\netlib-native_system-linux-x86_64-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_system-linux-i686\1.1\netlib-native_system-linux-i686-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_system-linux-armhf\1.1\netlib-native_system-linux-armhf-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_system-win-x86_64\1.1\netlib-native_system-win-x86_64-1.1-natives.jar;C:\Users\Vlad\.m2\repository\com\github\fommil\netlib\netlib-native_system-win-i686\1.1\netlib-native_system-win-i686-1.1-natives.jar;C:\Users\Vlad\.m2\repository\org\bytedeco\javacpp\0.11\javacpp-0.11.jar;C:\Users\Vlad\.m2\repository\org\jblas\jblas\1.2.4\jblas-1.2.4.jar;C:\Users\Vlad\.m2\repository\org\nd4j\canova-nd4j-image\0.0.0.12\canova-nd4j-image-0.0.0.12.jar;C:\Users\Vlad\.m2\repository\org\nd4j\canova-nd4j-common\0.0.0.12\canova-nd4j-common-0.0.0.12.jar;C:\Users\Vlad\.m2\repository\org\nd4j\canova-data-image\0.0.0.12\canova-data-image-0.0.0.12.jar;C:\Users\Vlad\.m2\repository\com\github\jai-imageio\jai-imageio-core\1.3.0\jai-imageio-core-1.3.0.jar;C:\Users\Vlad\.m2\repository\com\twelvemonkeys\imageio\imageio-jpeg\3.1.1\imageio-jpeg-3.1.1.jar;C:\Users\Vlad\.m2\repository\com\twelvemonkeys\imageio\imageio-core\3.1.1\imageio-core-3.1.1.jar;C:\Users\Vlad\.m2\repository\com\twelvemonkeys\imageio\imageio-metadata\3.1.1\imageio-metadata-3.1.1.jar;C:\Users\Vlad\.m2\repository\com\twelvemonkeys\common\common-lang\3.1.1\common-lang-3.1.1.jar;C:\Users\Vlad\.m2\repository\com\twelvemonkeys\common\common-io\3.1.1\common-io-3.1.1.jar;C:\Users\Vlad\.m2\repository\com\twelvemonkeys\common\common-image\3.1.1\common-image-3.1.1.jar;C:\Users\Vlad\.m2\repository\com\twelvemonkeys\imageio\imageio-tiff\3.1.1\imageio-tiff-3.1.1.jar;C:\Users\Vlad\.m2\repository\com\
twelvemonkeys\imageio\imageio-psd\3.1.1\imageio-psd-3.1.1.jar;C:\Users\Vlad\.m2\repository\com\twelvemonkeys\imageio\imageio-bmp\3.1.1\imageio-bmp-3.1.1.jar;C:\Users\Vlad\.m2\repository\org\nd4j\canova-nd4j-codec\0.0.0.12\canova-nd4j-codec-0.0.0.12.jar;C:\Users\Vlad\.m2\repository\org\jcodec\jcodec\0.1.5\jcodec-0.1.5.jar;C:\Users\Vlad\.m2\repository\com\fasterxml\jackson\dataformat\jackson-dataformat-yaml\2.5.1\jackson-dataformat-yaml-2.5.1.jar;C:\Users\Vlad\.m2\repository\org\yaml\snakeyaml\1.12\snakeyaml-1.12.jar;C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 14.1.4\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain org.deeplearning4j.examples.convolution.LenetMnistExample
Nov 15, 2015 10:38:48 PM com.github.fommil.jni.JniLoader liberalLoad
INFO: successfully loaded C:\Users\Vlad\AppData\Local\Temp\jniloader3180380544208850346netlib-native_system-win-x86_64.dll
o.d.e.c.CNNMnistExample - Load data....
o.d.e.c.CNNMnistExample - Build model....
o.d.e.c.CNNMnistExample - Train model....
o.d.o.s.BaseOptimizer - Objective function automatically set to minimize. Set stepFunction in neural net configuration to change default settings.
Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.blockUntilComplete(CPUIm2ColTask.java:715)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.invokeBlocking(CPUIm2ColTask.java:702)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.invokeBlocking(CPUIm2ColTask.java:19)
    at org.nd4j.linalg.convolution.Convolution.im2col(Convolution.java:137)
    at org.nd4j.linalg.convolution.Convolution.im2col(Convolution.java:100)
    at org.deeplearning4j.nn.layers.convolution.subsampling.SubsamplingLayer.activate(SubsamplingLayer.java:147)
    at org.deeplearning4j.nn.layers.BaseLayer.activate(BaseLayer.java:347)
    at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.activationFromPrevLayer(MultiLayerNetwork.java:488)
    at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.feedForward(MultiLayerNetwork.java:570)
    at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.feedForward(MultiLayerNetwork.java:583)
    at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.computeGradientAndScore(MultiLayerNetwork.java:1602)
    at org.deeplearning4j.optimize.solvers.BaseOptimizer.gradientAndScore(BaseOptimizer.java:124)
    at org.deeplearning4j.optimize.solvers.StochasticGradientDescent.optimize(StochasticGradientDescent.java:56)
    at org.deeplearning4j.optimize.Solver.optimize(Solver.java:52)
    at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.fit(MultiLayerNetwork.java:1374)
    at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.fit(MultiLayerNetwork.java:1404)
    at org.deeplearning4j.examples.convolution.LenetMnistExample.main(LenetMnistExample.java:113)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException
    at java.util.concurrent.ForkJoinTask.get(ForkJoinTask.java:1006)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.blockUntilComplete(CPUIm2ColTask.java:713)
    ... 21 more
Caused by: java.lang.ArrayIndexOutOfBoundsException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:598)
    at java.util.concurrent.ForkJoinTask.get(ForkJoinTask.java:1005)
    ... 22 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 20000
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.doHeapDouble(CPUIm2ColTask.java:438)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.execute(CPUIm2ColTask.java:222)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.splitOrExecute(CPUIm2ColTask.java:207)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:100)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:19)
    at java.util.concurrent.RecursiveTask.exec(RecursiveTask.java:94)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.tryRemoveAndExec(ForkJoinPool.java:1107)
    at java.util.concurrent.ForkJoinPool.awaitJoin(ForkJoinPool.java:2043)
    at java.util.concurrent.ForkJoinTask.doJoin(ForkJoinTask.java:390)
    at java.util.concurrent.ForkJoinTask.join(ForkJoinTask.java:719)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.splitOrExecute(CPUIm2ColTask.java:202)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:100)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:19)
    at java.util.concurrent.RecursiveTask.exec(RecursiveTask.java:94)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.tryRemoveAndExec(ForkJoinPool.java:1107)
    at java.util.concurrent.ForkJoinPool.awaitJoin(ForkJoinPool.java:2043)
    at java.util.concurrent.ForkJoinTask.doJoin(ForkJoinTask.java:390)
    at java.util.concurrent.ForkJoinTask.join(ForkJoinTask.java:719)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.splitOrExecute(CPUIm2ColTask.java:202)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:100)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:19)
    at java.util.concurrent.RecursiveTask.exec(RecursiveTask.java:94)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.tryRemoveAndExec(ForkJoinPool.java:1107)
    at java.util.concurrent.ForkJoinPool.awaitJoin(ForkJoinPool.java:2043)
    at java.util.concurrent.ForkJoinTask.doJoin(ForkJoinTask.java:390)
    at java.util.concurrent.ForkJoinTask.join(ForkJoinTask.java:719)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.splitOrExecute(CPUIm2ColTask.java:202)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:100)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:19)
    at java.util.concurrent.RecursiveTask.exec(RecursiveTask.java:94)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.tryRemoveAndExec(ForkJoinPool.java:1107)
    at java.util.concurrent.ForkJoinPool.awaitJoin(ForkJoinPool.java:2043)
    at java.util.concurrent.ForkJoinTask.doJoin(ForkJoinTask.java:390)
    at java.util.concurrent.ForkJoinTask.join(ForkJoinTask.java:719)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.splitOrExecute(CPUIm2ColTask.java:202)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:100)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:19)
    at java.util.concurrent.RecursiveTask.exec(RecursiveTask.java:94)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.tryRemoveAndExec(ForkJoinPool.java:1107)
    at java.util.concurrent.ForkJoinPool.awaitJoin(ForkJoinPool.java:2043)
    at java.util.concurrent.ForkJoinTask.doJoin(ForkJoinTask.java:390)
    at java.util.concurrent.ForkJoinTask.join(ForkJoinTask.java:719)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.splitOrExecute(CPUIm2ColTask.java:202)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:100)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:19)
    at java.util.concurrent.RecursiveTask.exec(RecursiveTask.java:94)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.tryRemoveAndExec(ForkJoinPool.java:1107)
    at java.util.concurrent.ForkJoinPool.awaitJoin(ForkJoinPool.java:2043)
    at java.util.concurrent.ForkJoinTask.doJoin(ForkJoinTask.java:390)
    at java.util.concurrent.ForkJoinTask.join(ForkJoinTask.java:719)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.splitOrExecute(CPUIm2ColTask.java:202)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:100)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:19)
    at java.util.concurrent.RecursiveTask.exec(RecursiveTask.java:94)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.tryRemoveAndExec(ForkJoinPool.java:1107)
    at java.util.concurrent.ForkJoinPool.awaitJoin(ForkJoinPool.java:2043)
    at java.util.concurrent.ForkJoinTask.doJoin(ForkJoinTask.java:390)
    at java.util.concurrent.ForkJoinTask.join(ForkJoinTask.java:719)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.splitOrExecute(CPUIm2ColTask.java:202)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:100)
    at org.nd4j.linalg.api.parallel.tasks.cpu.misc.CPUIm2ColTask.compute(CPUIm2ColTask.java:19)
    at java.util.concurrent.RecursiveTask.exec(RecursiveTask.java:94)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1689)
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

Process finished with exit code 1

Ran on Windows 10 with Java 1.8.0_45 and Maven 3.3.3; same result on Ubuntu 15.

CNNLFWExample issue

Hi all,

I have pulled the latest version of dl4j-examples.
When I try to run CNNLFWExample, I get the following evaluation output:

Accuracy: 0
Precision: 0
Recall: 0
F1 Score: 0

Kind regards
Ivanhoe

Tf-idf Implementation Example

Hi, it would be really great if you could provide an example of a tf-idf implementation using deeplearning4j. I studied the word2vec example to get the gist, but a tf-idf example would help a lot.

Thanks!
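
In the meantime, here is a tiny plain-Java sketch of the tf-idf arithmetic itself. It deliberately uses no deeplearning4j or DataVec classes (the TfIdfSketch class and the toy corpus are made up for illustration); it only shows the term-frequency times inverse-document-frequency weighting that a real DL4J example would wrap around its vectorizers.

import java.util.*;

// Computes tf-idf weights for a toy in-memory corpus.
public class TfIdfSketch {
    public static void main(String[] args) {
        List<List<String>> docs = Arrays.asList(
                Arrays.asList("deep", "learning", "on", "the", "jvm"),
                Arrays.asList("deep", "neural", "networks"),
                Arrays.asList("word", "vectors", "on", "the", "jvm"));

        // Document frequency: in how many documents does each term appear?
        Map<String, Integer> df = new HashMap<>();
        for (List<String> doc : docs) {
            for (String term : new HashSet<>(doc)) {
                df.merge(term, 1, Integer::sum);
            }
        }

        int n = docs.size();
        for (int i = 0; i < n; i++) {
            List<String> doc = docs.get(i);
            // Term frequency within this document.
            Map<String, Long> tf = new HashMap<>();
            for (String term : doc) {
                tf.merge(term, 1L, Long::sum);
            }

            System.out.println("Document " + i + ":");
            for (Map.Entry<String, Long> e : tf.entrySet()) {
                double tfidf = (e.getValue() / (double) doc.size())
                        * Math.log((double) n / df.get(e.getKey()));
                System.out.printf("  %-10s %.4f%n", e.getKey(), tfidf);
            }
        }
    }
}

Terms that occur in more of the documents (like "the" or "jvm" here) end up with lower weights than terms unique to a single document, which is the behaviour a tf-idf vectorizer example would need to demonstrate.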

CUDA_ERROR_INVALID_VALUE when running Test example

When running the Test.java example with nd4j-jcublas-7.5 as the backend, the following error occurs:
Exception in thread "main" java.lang.RuntimeException: Could not execute kernel
at org.nd4j.linalg.jcublas.ops.executioner.JCudaExecutioner.invoke(JCudaExecutioner.java:354)
at org.nd4j.linalg.jcublas.ops.executioner.JCudaExecutioner.exec(JCudaExecutioner.java:73)
at org.nd4j.linalg.api.shape.Shape.toOffsetZeroCopyHelper(Shape.java:176)
at org.nd4j.linalg.api.shape.Shape.toOffsetZeroCopy(Shape.java:131)
at org.nd4j.linalg.api.ndarray.BaseNDArray.dup(BaseNDArray.java:1420)
at org.nd4j.linalg.api.ndarray.BaseNDArray.sub(BaseNDArray.java:3035)
at org.nd4j.linalg.api.ops.impl.accum.Bias.exec(Bias.java:157)
at org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner.exec(DefaultOpExecutioner.java:57)
at org.nd4j.linalg.jcublas.ops.executioner.JCudaExecutioner.exec(JCudaExecutioner.java:69)
at org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner.execAndReturn(DefaultOpExecutioner.java:194)
at org.nd4j.linalg.api.ops.impl.accum.Variance.exec(Variance.java:179)
at org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner.exec(DefaultOpExecutioner.java:57)
at org.nd4j.linalg.jcublas.ops.executioner.JCudaExecutioner.exec(JCudaExecutioner.java:69)
at org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner.execAndReturn(DefaultOpExecutioner.java:194)
at org.nd4j.linalg.api.ops.impl.accum.StandardDeviation.exec(StandardDeviation.java:89)
at org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner.exec(DefaultOpExecutioner.java:245)
at org.nd4j.linalg.api.ndarray.BaseNDArray.std(BaseNDArray.java:3400)
at org.nd4j.linalg.dataset.DataSet.normalizeZeroMeanZeroUnitVariance(DataSet.java:348)
at org.deeplearning4j.examples.test.Test.main(Test.java:46)
Caused by: jcuda.CudaException: CUDA_ERROR_INVALID_VALUE
at jcuda.driver.JCudaDriver.checkResult(JCudaDriver.java:326)
at jcuda.driver.JCudaDriver.cuLaunchKernel(JCudaDriver.java:14530)
at jcuda.utils.KernelLauncher.call(KernelLauncher.java:1020)
at org.nd4j.linalg.jcublas.kernel.KernelFunctions.invoke(KernelFunctions.java:119)
at org.nd4j.linalg.jcublas.ops.executioner.JCudaExecutioner.invokeFunction(JCudaExecutioner.java:206)
at org.nd4j.linalg.jcublas.ops.executioner.JCudaExecutioner.invoke(JCudaExecutioner.java:352)
... 18 more

OS: Windows 7
IDE: Eclipse Mars
GPU: NVIDIA Quadro K1100M
ND4J Version: 0.4-rc3.6
DL4J Version: 0.4-rc3.6
Cuda Version: 7.5

Convolution net

Hi, could you tell me what it means when I get

12:42:40.599 [main] DEBUG o.d.o.solvers.BackTrackLineSearch - slope = -9.995819091796875

and how I can prevent this, please?

Thanks a lot.

Example about collaborative filtering

I saw this paper doing CF using an RBM. Could you add an example?

Restricted Boltzmann Machines for Collaborative Filtering, Ruslan Salakhutdinov, Andriy Mnih, and Geoffrey Hinton, ICML 2007.

TimeSeries Example

Hey,
I know you are pretty busy right now, but an RNN example with time series would be awesome if you find the time!

Thanks
