neuroph / neuroph

Java Neural Network Framework Neuroph

Home Page: http://neuroph.sourceforge.net

Java 96.91% HTML 2.65% GLSL 0.01% Roff 0.43%

neuroph's Introduction

Neuroph - Java Neural Network Platform

To make the Neuroph project easier to use, navigate, and maintain, this repo has been split into two repos containing the Neuroph Framework and the Neuroph Studio GUI. These two repos are active and should be used. This old repo will remain available for some time, but should not be used for new development and will not include the latest additions.

https://github.com/neuroph/NeurophFramework

https://github.com/neuroph/NeurophStudio

neuroph's People

Contributors

andrejicmilos93, carrillo, charlesgriffiths, hrza, jecajeca, jtksource, kdekooter, lukicsasa, megicivovic, mithquissir, mladensavic94, mroxso, neuroph, putnich, salle18, stuparm, todorbalabanov, vujicictijana


neuroph's Issues

Perceptron#createNetwork: Settings for input layer incomplete/wrong

Currently, Perceptron#createNetwork uses instances of org.neuroph.core.Neuron (the default) for the input layer.

Because input-layer neurons have no input connections, the WeightedSum input function in Neuron#calculate always produces 0.

Initialisation of the input layer in Perceptron#createNetwork should instead look like this:
// init neuron settings for input layer
...
inputNeuronProperties.setProperty("neuronType", InputNeuron.class);
...
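
A fuller sketch of what I mean (untested; NeuronProperties and LayerFactory are the org.neuroph.util helpers, and inputNeuronsCount is a placeholder for the requested layer size):

    // hypothetical sketch of the suggested initialisation, not the actual source
    NeuronProperties inputNeuronProperties = new NeuronProperties();
    inputNeuronProperties.setProperty("neuronType", InputNeuron.class);
    // the input layer would then be built from these properties, e.g.
    Layer inputLayer = LayerFactory.createLayer(inputNeuronsCount, inputNeuronProperties);
    this.addLayer(inputLayer);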

In version 2.9, getVersion() in org.neuroph.util.Neuroph returns 2.8

In version 2.9, getVersion() in org.neuroph.util.Neuroph returns 2.8. The serialVersionUID is also outdated, leading to this error:

java.io.InvalidClassException: org.neuroph.core.NeuralNetwork; local class incompatible: stream classdesc serialVersionUID = 7, local class serialVersionUID = 6
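
A quick way to reproduce (assuming the usual singleton accessor on org.neuroph.util.Neuroph):

    // prints "2.8" even on a 2.9 build
    System.out.println(org.neuroph.util.Neuroph.getInstance().getVersion());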

Constructing large NN takes a long time

I am not sure why constructing a relatively large NN takes so much time.

To construct a small NN, for example

MultiLayerPerceptron neuralNet = new MultiLayerPerceptron(10000, 20);

It takes about 7 seconds to finish this single line of code.

However, if we simply expand the number of input neurons, i.e.

MultiLayerPerceptron neuralNet = new MultiLayerPerceptron(100000, 20);

It takes more than 15 minutes to complete the construction of this MLP. (My computer has a quad-core i5 Skylake processor.)

In the example above, there are about 20 × 10^5 = 2 × 10^6 connections. Intuitively, that seems like an easy, quick task for the computer, so I am not sure why the construction takes so long.
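
For what it's worth, a trivial harness like this is enough to reproduce the behaviour (a sketch; only the constructor call is Neuroph-specific):

    // minimal timing harness: construction only, no training
    long start = System.nanoTime();
    MultiLayerPerceptron net = new MultiLayerPerceptron(100000, 20);
    long elapsedMs = (System.nanoTime() - start) / 1_000_000L;
    System.out.println("construction took " + elapsedMs + " ms");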

@neuroph Any comments or help would be greatly appreciated!

Unable to build 2.94 -- tests fail

When attempting to build the pom.xml in neuroph-2.9 using:
mvn clean package

I am getting quite a few test failures. Are the tests out of date and should I use -DskipTests, or am I not using the correct Maven commands?

The last example was:

T E S T S

Running org.neuroph.contrib.AppTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.019 sec
Running org.neuroph.contrib.ConvolutionLayerTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.019 sec
Running org.neuroph.contrib.PoolingLayerTest
Tests run: 3, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.01 sec <<< FAILURE!
testPoolingTwoLayersWithOneFeatureMap(org.neuroph.contrib.PoolingLayerTest) Time elapsed: 0.01 sec <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<1>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:743)
at org.junit.Assert.assertEquals(Assert.java:118)
at org.junit.Assert.assertEquals(Assert.java:555)
at org.junit.Assert.assertEquals(Assert.java:542)
at org.neuroph.contrib.PoolingLayerTest.testPoolingTwoLayersWithOneFeatureMap(PoolingLayerTest.java:114)

LMSTest fails on a new build. Does anyone know how long this has been broken?

Tests run: 3, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.009 sec <<< FAILURE!
testUpdateNeuronWeightsInOnlineMode(org.neuroph.nnet.learning.LMSTest) Time elapsed: 0.006 sec <<< FAILURE!
junit.framework.AssertionFailedError: expected:<0.09707076026863733> but was:<0.047070760268637324>
at junit.framework.Assert.fail(Assert.java:50)
at junit.framework.Assert.failNotEquals(Assert.java:287)
at junit.framework.Assert.assertEquals(Assert.java:67)
at junit.framework.Assert.assertEquals(Assert.java:74)
at org.neuroph.nnet.learning.LMSTest.testUpdateNeuronWeightsInOnlineMode(LMSTest.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at junit.framework.TestCase.runTest(TestCase.java:168)
at junit.framework.TestCase.runBare(TestCase.java:134)
at junit.framework.TestResult$1.protect(TestResult.java:110)
at junit.framework.TestResult.runProtected(TestResult.java:128)
at junit.framework.TestResult.run(TestResult.java:113)
at junit.framework.TestCase.run(TestCase.java:124)
at junit.framework.TestSuite.runTest(TestSuite.java:243)
at junit.framework.TestSuite.run(TestSuite.java:238)
at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:236)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:134)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:113)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:103)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:74)

learn net in second thread

Hi folks,

I just might not have found it, but is there a way to offload the learning process to a new thread?
I read something about learnInNewThread(), but to be honest I'm not quite sure how it works.
I'd like to train multiple nets at once, and also poll their current error while they are learning.
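
Something like this is what I have in mind (untested sketch; trainingSet is an existing DataSet, and I'm assuming learn(DataSet) blocks until done and that SupervisedLearning#getTotalNetworkError can be read from another thread):

    // train in a background thread and poll the error from the main thread
    final MultiLayerPerceptron net = new MultiLayerPerceptron(2, 3, 1);
    Thread trainer = new Thread(new Runnable() {
        public void run() {
            net.learn(trainingSet); // blocks until learning stops
        }
    });
    trainer.start();

    SupervisedLearning rule = (SupervisedLearning) net.getLearningRule();
    while (trainer.isAlive()) {
        System.out.println("current error: " + rule.getTotalNetworkError());
        try { Thread.sleep(1000); } catch (InterruptedException e) { break; }
    }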

thanks and regards

C# Port?

The Neuroph home page mentions that a port of it is available in C#.
Can you add a link to where we can find the source?

Unusual repository layout

Naming the repository "master" instead of e.g. "neuroph" leads to a directory named "master" when cloning. This is unusual, because one won't find the project by its name.

Having a sub-project directory "neuroph-2.9" is unusual too, because it means renaming the directory with each release. Instead of putting version numbers in directory names, tags or branches would handle versioning much better. For simply marking releases, tags are ideal. If a bugfix release for a particular version is ever needed, one could create a branch based on the release tag.

Build of master branch fails

[ERROR] 'dependencies.dependency.version' for android:android:jar is missing. @ org.neuroph:neuroph-imgrec:[unknown-version], /Users/kees/workspaces/iwoz/neuroph/neuroph-2.9/ImageRec/pom.xml, line 21, column 19

This dependency in the ImageRec module is causing the trouble:

<dependency>
  <groupId>android</groupId>
  <artifactId>android</artifactId>
</dependency>

Removing this causes compilation errors in ImageAndroid.java.

Changing the dependency to the following makes the project compile again:

<dependency>
    <groupId>com.google.android</groupId>
    <artifactId>android</artifactId>
    <version>1.5_r4</version>
</dependency>

Neuroph on maven central?

Neuroph does not seem to be available on Maven Central. Why not? It would be much easier to use in projects.

Frequent hanging

Even with a small dataset (2000 entries with 3 input parameters and 2 output parameters) and a very minimal NN (3 input, 5 hidden, 2 output neurons), the training hangs on Ubuntu as well as on Windows.

NetBeans errors keep cropping up, like 'GC overhead limit exceeded' or 'too much data on clipboard'.

I understand this may have more to do with the NetBeans platform than with Neuroph itself.

Any fix ?

Problem using ImageRecognitionSample.java

I have trained a network using RGBImageRecognitionTrainingSample. When I try to use the network with ImageRecognitionSample the code fails here with a NullPointerException:

ImageRecognitionPlugin imageRecognitionPlugin =
                (ImageRecognitionPlugin) neuralNetwork.getPlugin(ImageRecognitionPlugin.class);

imageRecognitionPlugin is null, although in my debugger I can see that the plugins collection in the network is filled.

Removing elements from NeurophArrayList causes NPEs

If you create a layer, add some neurons to it (or initialize it with a set number of neurons, as I do) and then remove some of them, the layer's getNeurons() array leaves trailing nulls. This is because the internal array isn't shrunk to the size of the list; the unused tail simply stays in place. The problem arises when anything calls Layer.getNeurons() and naively assumes that all elements are non-null.

Here are the proper implementation and the flawed one, as they appear side by side in the code:

    public Object[] toArray() {
        return Arrays.copyOf(elementData, size);
    }
    
    public final E[] asArray() {
        return elementData;
    }

Note that the proper implementation copies only size elements, whereas the flawed one hands you the entire backing array.
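
The fix could be as simple as mirroring toArray() (untested sketch; returning a copy means callers can no longer mutate the backing array through it, and an unchecked cast may be needed depending on how elementData is declared):

    public final E[] asArray() {
        // copy only the live elements, exactly like toArray() above
        return Arrays.copyOf(elementData, size);
    }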

Android development - java.awt classes

Greetings,

I'm trying to create an Android application that uses a neural network for image recognition.

I've encountered many problems with various Neuroph versions while trying to implement it in my application.

Problems mostly revolve around java.awt classes, because they are not available in the Android framework.
In the neuroph-core-2.92.jar file I've seen some commented-out lines with methods intended for the Android framework, and I would like to know when this is going to be fully implemented and how one could help to implement it faster.

If it's not going to be implemented soon, what can I do in the meantime to make the core classes use the Android framework's classes?

The Android app example on your page works well out of the box, but for some reason, I cannot replicate the procedure of opening a neural network from resources.

Thank you for your work, cheers
Marko

Problems with Jzy3dWrapper

1.) Broken references to wrapped jars.
2.) After fixing them, the build fails:
The module /home/.../NeurophStudio/build/cluster/modules/org-jzy3d.jar has no public packages and so cannot be compiled against
BUILD FAILED (total time: 22 seconds)

Remove Connection functions disable any Neural Network functionality

"learn" or "calculate" throws this error after using any remove-connection function:
java.lang.NullPointerException
at org.neuroph.core.input.WeightedSum.getOutput(WeightedSum.java:34)
at org.neuroph.core.Neuron.calculate(Neuron.java:145)
at org.neuroph.core.Layer.calculate(Layer.java:259)
at org.neuroph.core.NeuralNetwork.calculate(NeuralNetwork.java:313)

Better repository structure

Please separate the IDE and library sources into two repositories.

This would make it easier to download the library source.

Learning networks in online fashion

We want to make use of your library in one of our systems.
The issue we came across is that our architecture assumes the learners are trained in an online fashion (not batch), i.e. the classifier sees an instance, learns from it, and moves on to the next one. (That said, we can still iterate over the data multiple times.)

I was checking your example for the multi-layer perceptron, which accepts data as a batch:

// create multi layer perceptron
MultiLayerPerceptron myMlPerceptron = new MultiLayerPerceptron(TransferFunctionType.TANH, 2, 3, 1);
// learn the training set
myMlPerceptron.learn(trainingSet);

However, I know that somewhere internally you learn the data instance by instance. Can anyone help me find it?
Essentially I am looking for something like this:

// create multi layer perceptron
MultiLayerPerceptron myMlPerceptron = new MultiLayerPerceptron(TransferFunctionType.TANH, 2, 3, 1);

// made-up code:
for (DataSetRow someInstance : trainingSet.getRows()) {
    // learn the training set one instance at a time
    myMlPerceptron.learnOneInstance(someInstance); // hypothetical method
}
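
From browsing the source, something along these lines might already be possible (untested; assumes IterativeLearning#doOneLearningIteration is public and runs a single epoch over the data it is given):

    // feed single-row datasets to approximate per-instance updates
    BackPropagation rule = (BackPropagation) myMlPerceptron.getLearningRule();
    for (DataSetRow row : trainingSet.getRows()) {
        DataSet single = new DataSet(trainingSet.getInputSize(), trainingSet.getOutputSize());
        single.addRow(row);
        rule.doOneLearningIteration(single); // one epoch == one row here
    }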

NeurophException is of type RuntimeException

Because the Javadoc does not list NeuralNetwork.createFromFile() as throwing any exceptions, and the exceptions that are thrown are RuntimeExceptions, it is difficult to know when to catch them. For example, one might assume that calling NeuralNetwork.createFromFile() on a nonexistent file will not produce an exception, since the compiler does not require any to be caught. This makes debugging difficult when trying to catch all instances of NeurophException, since they are never documented or enforced by the compiler.
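
In the meantime, callers have to know to catch the unchecked exception themselves (sketch; the file name is just an example):

    try {
        NeuralNetwork net = NeuralNetwork.createFromFile("myNetwork.nnet");
    } catch (NeurophException e) {
        // unchecked, so the compiler never forces this catch block
        System.err.println("could not load network: " + e.getMessage());
    }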

DecimalScaleNormalizer output problem

I'm using the DecimalScaleNormalizer class to normalize my dataset; the inputs are normalized correctly, but the outputs are not.
Debugging the normalize method, I see that the class finds the correct maxOut value, but in the method findScaleVectors():

this.scaleFactorOut = new double[this.maxOut.length];

for (i = 0; i < this.scaleFactorOut.length; ++i) {
    this.scaleFactorOut[i] = 1.0D;
}

for (i = 0; i < this.maxOut.length; ++i) {
    while (this.maxIn[i] > 1.0D) {
        this.maxOut[i] /= 10.0D;
        this.scaleFactorOut[i] *= 10.0D;
    }
}

As a result, scaleFactorOut stays at 1 instead of 10,000 as expected.
The problem is the while condition: it tests the maxIn vector instead of the maxOut vector.
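
The corrected loop simply tests maxOut instead (the one-word fix):

    for (i = 0; i < this.maxOut.length; ++i) {
        while (this.maxOut[i] > 1.0D) { // was: this.maxIn[i]
            this.maxOut[i] /= 10.0D;
            this.scaleFactorOut[i] *= 10.0D;
        }
    }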

org.neuroph.nnet.learning.PerceptronLearning updates threshold wrongly

The current implementation does not converge when training the logical OR function with perceptron learning. Below is the reason I found.

The reason for updating the threshold is to avoid having a bias neuron, according to http://stackoverflow.com/questions/6554792/whats-the-point-of-the-threshold-in-a-perceptron. The decision rule goes from

x1·w1 + x2·w2 = threshold

to

x1·w1 + x2·w2 - 1·threshold = 0

where the bias is -threshold. So the threshold update should be:
threshold = threshold + this.learningRate * neuronError;

See https://github.com/neuroph/neuroph/blob/master/neuroph-2.9/Core/src/main/java/org/neuroph/nnet/learning/PerceptronLearning.java#L63
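
In context, the update might then look roughly like this (untested sketch; the ThresholdNeuron accessor names are assumptions):

    // inside PerceptronLearning#updateNeuronWeights(Neuron neuron) -- sketch
    ThresholdNeuron thresholdNeuron = (ThresholdNeuron) neuron;
    double neuronError = neuron.getError();
    double thresh = thresholdNeuron.getThresh();
    thresholdNeuron.setThresh(thresh + this.learningRate * neuronError);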

Missing dependencies?

Some dependencies (e.g. android:android) cannot be resolved via Maven Central, and there is no documentation on where to get them. This makes it very difficult to build Neuroph via Maven.

Furthermore, at least some tests fail when executed via Maven, e.g. GaussianTest.

Master branch should be compiled with jdk1.8

The NeuralNetworkEventListener interface in the org.neuroph.core.events package uses the @FunctionalInterface annotation, which was introduced in JDK 1.8, so the project has to be compiled with JDK 1.8.

But the POM file specifies:

<maven.compiler.source>1.7</maven.compiler.source>
<maven.compiler.target>1.7</maven.compiler.target>

This should be resolved. I suggest not using @FunctionalInterface, to stay compatible with JDK 1.7.

Neuroph 2.93b Core - org/neuroph/nnet/comp/Kernel.java interchanged Dimensions

The dimensions for the weights array are interchanged in the initWeights(double min, double max) method:

public void initWeights(double min, double max) {
    weights = new Weight[height][width];

    for (int i = 0; i < height; i++) {
        for (int j = 0; j < width; j++) {
            Weight weight = new Weight();
            weight.randomize(min, max);
            weights[i][j] = weight;
        }
    }
}

so only square-shaped kernels work; otherwise an ArrayIndexOutOfBoundsException is thrown.
A fixed version could be:

public void initWeights(double min, double max) {
    weights = new Weight[width][height];

    for (int i = 0; i < height; i++) {
        for (int j = 0; j < width; j++) {
            Weight weight = new Weight();
            weight.randomize(min, max);
            weights[j][i] = weight;
        }
    }
}

NeurophArrayList: is it deprecated?

import org.neuroph.util.NeurophArrayList; fails. I am not sure whether NeurophArrayList is deprecated or a new addition, but it is breaking everything in the core module.

Logback files should not be included in libraries

When adding Neuroph as a Maven dependency, the Logback dependencies and the logback.xml config file come along with it. This can cause, as happened to me, Neuroph's logback.xml to override my own logback.xml.

For libraries, you should only include slf4j as a dependency.

Nice library, btw!

NeuralNetwork#setOutputNeurons

The method NeuralNetwork#setOutputNeurons doesn't set the neurons - it adds new ones. That is a difference ;)

P. S.
If you just want to add, outputNeurons.addAll would do in this case.
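
A setter that actually replaces the output neurons might look like this (untested sketch; the internal outputNeurons list and the signature are assumptions):

    public void setOutputNeurons(List<Neuron> outputNeurons) {
        this.outputNeurons.clear();            // drop the current output neurons first
        this.outputNeurons.addAll(outputNeurons);
    }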

Tanh derivative tests fail

When I clone the repo and run mvn package, the tanh getDerivative tests fail.

Failed tests: testGetOutput0: expected:<0.0249947929> but was:<0.042888565254513115>
testGetDerivative0: expected:<0.940014848> but was:<0.8233849509308319>
testGetOutput1: expected:<0.0499583> but was:<0.08572357559022631>
testGetDerivative1: expected:<0.786447> but was:<0.37123532456703945>
testGetOutput2: expected:<0.07485969> but was:<0.12845174325067968>
testGetDerivative2: expected:<0.596585> but was:<-0.18777757241307147>
testGetOutput3: expected:<0.09966799> but was:<0.17102031197696171>
testGetDerivative3: expected:<0.419974> but was:<-0.707776976114507>
testGetOutput4: expected:<0.124353> but was:<0.2133773157398819>
testGetDerivative4: expected:<0.280414> but was:<-1.1186837273905166>
testGetOutput5: expected:<0.148885> but was:<0.2554718291942514>
testGetDerivative5: expected:<0.1807066> but was:<-1.4122559381650563>
testGetOutput6: expected:<0.17323515> but was:<0.2972542073284932>
testGetDerivative6: expected:<0.113812> but was:<-1.609214399230635>
testGetOutput7: expected:<0.1973753> but was:<0.3386763119739128>
testGetDerivative7: expected:<0.07065> but was:<-1.7362946813477618>
testGetOutput8: expected:<0.22127846> but was:<0.3796917230669401>
testGetDerivative8: expected:<0.043464> but was:<-1.8163384925297978>

Tests run: 272, Failures: 18, Errors: 0, Skipped: 0

In possibly related news, when I skip the tests (and have jury-rigged a few things to compile), the TestTimeSeries training never converges. I'm not sure whether the cause is errors in other parts of the code, errors in my changes, or errors in Tanh.java. According to some equations plugged into Wolfram Alpha, though, I think the current implementation of getDerivative is only accurate for amplitude = 1 and slope = 2.
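
For reference, the general derivative of f(x) = amplitude · tanh(slope · x) is f'(x) = amplitude · slope · (1 - tanh²(slope · x)), so a corrected getDerivative might look like this (untested sketch; the field names are assumptions):

    public double getDerivative(double net) {
        double t = Math.tanh(this.slope * net); // assumed fields: slope, amplitude
        return this.amplitude * this.slope * (1.0 - t * t);
    }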

Hope to move the website from http://neuroph.sourceforge.net/ to GitHub, because we can't open that website in China

I started following your project in 2012, and I could visit your website on SourceForge at that time. But five years later, when I started to research AI with Google's TensorFlow, your site came back to my mind; it is still a good choice in some ways.
But I can't visit http://neuroph.sourceforge.net/ (the Great Firewall blocks outside sites).
I would appreciate it if you hosted the website on GitHub.
