harthur / brain
Simple feed-forward neural network in JavaScript
License: MIT License
Hi,
I am trying to use brain with RSSI readings and sensor coordinates as inputs, and the position (normalized into a single number) as output. Can you suggest how to model this data for brain?
inputs: [{x1, y1, z1, distance1}, {x2, y2, z2, distance2}, {x3, y3, z3, distance3}]
output: {x*100 + y*10 + z} as the position.
I get a constant error while training, even after up to 20000 iterations.
Regards
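One possible data model is sketched below. This is a guess at a workable encoding, not brain's API: MAX_COORD, MAX_DIST, and the three-element output vector are all assumptions, chosen because brain's sigmoid units expect inputs and outputs in [0, 1].

```javascript
// Flatten the three {x, y, z, distance} readings into one input array and
// normalize every value into [0, 1]. MAX_COORD and MAX_DIST are assumed
// bounds for the deployment area.
var MAX_COORD = 10;   // assumed maximum coordinate value
var MAX_DIST = 50;    // assumed maximum RSSI-derived distance

function encodeInput(readings) {
  var input = [];
  readings.forEach(function (r) {
    input.push(r.x / MAX_COORD, r.y / MAX_COORD, r.z / MAX_COORD,
               r.distance / MAX_DIST);
  });
  return input;
}

// Rather than packing x, y, z into one number (x*100 + y*10 + z), use a
// three-element output vector: one normalized value per coordinate.
function encodeOutput(pos) {
  return [pos.x / MAX_COORD, pos.y / MAX_COORD, pos.z / MAX_COORD];
}

console.log(encodeInput([{x: 5, y: 5, z: 5, distance: 25}])); // [0.5, 0.5, 0.5, 0.5]
console.log(encodeOutput({x: 1, y: 2, z: 3}));                // [0.1, 0.2, 0.3]
```

The packed single-number output is a likely cause of the constant training error: small mistakes in z swamp the gradient signal for x and y, whereas separate outputs let each coordinate be learned independently.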
Here is a table describing training time as a function of training set size, on my computer:
1  -    8 ms
2  -   26 ms
4  -  132 ms
8  -  938 ms
16 - 3106 ms
32 - 9717 ms
It seems that the training time is quadratic in the number of training samples...
Hi Heather, I started to use your library this weekend and got up and running pretty quickly. I got a prototype working very well in Firefox. However, today as I started to test in IE (versions 6/7/8) I found out that the getters and setters do not work there.
E.g.:
Layer.prototype = {
get outputs() { }
}
You can actually try it using your demo page.
While researching this for a bit I found that although FF was first to implement this, the JS standard group has rallied around a different syntax/approach which MS started to implement in IE8.
Would it be possible to have a version using a "dumbed-down" syntax so it can be more cross-browser? I started to adapt the code but must be doing something wrong as I could no longer get the library to work correctly. Could you help?
Thanks in advance.
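For reference, the standardized ES5 accessor syntax the poster refers to could be sketched like this (the Layer body here is illustrative, not brain's actual code). One caveat: IE8 only supports Object.defineProperty on DOM objects, so IE6/7 would still need plain getter methods like getOutputs().

```javascript
// ES5 accessor definition, replacing the Firefox-only object-literal
// `get outputs() {}` form that breaks in old IE.
function Layer() {
  this.nodes = [1, 2, 3]; // hypothetical state, stands in for brain's Layer
}

Object.defineProperty(Layer.prototype, "outputs", {
  get: function () {
    return this.nodes.length; // illustrative getter body
  }
});

var layer = new Layer();
console.log(layer.outputs); // 3
```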
I'd like to know: can I reopen this repository?
After training my network, I tried to get the JSON and load it onto another network. I am hitting this error while doing that.
var net = new brain.NeuralNetwork({learningRate: 0.3});
net.train(game.data.nndata, {log:true, errorThresh:0.02, logPeriod:100, iterations:100});
var json = net.toJSON();
var net1 = new brain.NeuralNetwork();
net1.fromJSON(json); // hitting the error here
Please add usage examples for toJSON and fromJSON if something is wrong with the way it's done above.
The learningRate is only used in training, and so should be an option passed to train() and not to the initialization of the neural network.
Calling net.run() on a net that was trained with array-formatted data (e.g. {input: [0, 1], output: [1]}) should return an array (e.g. [0.987]).
If you train with a stream, run will return an object (e.g. {'0': 0.987}).
I was wondering if brain can be used for making predictions, and if so, can I get an example?
Hi, can you modify the RGB example below for stream training? Thank you.
var net = new brain.NeuralNetwork();
net.train([{input: { r: 0.03, g: 0.7, b: 0.5 }, output: { black: 1 }},
{input: { r: 0.16, g: 0.09, b: 0.2 }, output: { white: 1 }},
{input: { r: 0.5, g: 0.5, b: 1.0 }, output: { white: 1 }}]);
var output = net.run({ r: 1, g: 0.4, b: 0 }); // { white: 0.99, black: 0.002 }
Nice Neural Net!
I have two suggestions for improvement:
brain.fromJSON = function (json) {
return new brain.NeuralNetwork().fromJSON(
typeof json === "string" ? JSON.parse(json) : json
);
}
Let me know if you want a pull request for this.
When training, if you have a string output key of zero ("0"), the results of run are converted to an Array when the net is created using fromJSON. This could happen when creating a portable NeuralNet to convert images of letters and numbers (for example, on a license plate) to a string in JavaScript.
Example:
net.train([
{input:inputX,output:{"X":1}},
{input:input1,output:{"1":1}},
{input:input0,output:{"0":1}}
]);
net.run(inputX).constructor === Object
// net.run(inputX) will return an Object,
// but if cloned it will return an Array:
var net_clone = brain.fromJSON(net.toJSON());
net_clone.run(inputX).constructor === Array;
toFunction is unaffected by this bug.
Fuller example at this gist:
https://gist.github.com/josher19/5218613
Let me know if you would like me to add a unit test to brain/test/unit/json.js, or any other way I can help.
Best Regards,
->> Josh <<-
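A self-contained guess at the root cause (not verified against brain's source): if fromJSON decides between array and object output by checking for an output node keyed "0", then any categorical label that happens to be "0" gets misread as an array index.

```javascript
// Hypothetical version of the suspected check: "does the output layer
// look like an array?" answered by probing for key "0".
function looksLikeArrayOutput(outputLayer) {
  return "0" in outputLayer;
}

console.log(looksLikeArrayOutput({ X: 1, A: 0 })); // false -> stays an object
console.log(looksLikeArrayOutput({ "0": 1 }));     // true  -> wrongly treated as array
```

If that is indeed the check, a more robust test would be whether *every* key is a consecutive integer starting at 0, or better, an explicit flag in the serialized JSON recording which format the net was trained with.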
I need some theoretical background for it, especially papers or books regarding the specific algorithm used in it.
The demo works in none of the browsers I've tested (Firefox, Chrome, Opera). It gives me
Cannot read property 'input' of undefined
and
data[0] is undefined
var bayes = new brain.BayesianClassifier({backend: {type: 'redis'}});
gives the following errors:
TypeError: undefined is not a function
at CALL_NON_FUNCTION (native)
at /Users/apple/node_modules/brain/lib/bayesian/bayesian.js:149:7
at /Users/apple/node_modules/brain/lib/bayesian/bayesian.js:116:9
at Command.callback (/Users/apple/node_modules/brain/lib/bayesian/backends/redis.js:89:7)
at RedisClient.return_reply (/Users/apple/node_modules/redis/index.js:425:29)
at HiredisReplyParser. (/Users/apple/node_modules/redis/index.js:81:14)
at HiredisReplyParser.emit (events.js:64:17)
at HiredisReplyParser.execute (/Users/apple/node_modules/redis/lib/parser/hiredis.js:35:22)
at RedisClient.on_data (/Users/apple/node_modules/redis/index.js:358:27)
at Socket. (/Users/apple/node_modules/redis/index.js:93:14)
@harthur I was looking at your profile and saw this:
Why do you not maintain these projects anymore? If you don't have time, why not look for somebody to keep them alive?
One way to have self-learning AI is to evolve neural networks, which is what I want to use this for.
What I need to be able to do this:
Is that possible with this library, or is that not what it's designed for and I should look elsewhere? Thanks!
There is a relatively complex single-player game called peg solitaire: http://en.wikipedia.org/wiki/Peg_solitaire . I intend to win this game with a learning algorithm. Do you think brain.js is capable of that, or can you suggest how to start with these kinds of algorithms?
I attempted to install brain using npm install, and I got the following error.
Perhaps you should add the directory containing `cairo.pc'
to the PKG_CONFIG_PATH environment variable
No package 'cairo' found
So, I attempted to install cairo, but there also appears to be a dependency upon xcb-shm. I think this may be cairo's dependency. In my case, on a Mac, I attempted to install cairo using homebrew.
At any rate, perhaps the installation instructions could be expanded to include cairo installation, since it's non-obvious? I'll be happy to send a PR if I get it working, but if somebody has already done this successfully, I'd really appreciate some pointers.
Hi, sorry if this has already been answered but I couldn't find it.
My question is what is the algorithm that is used to train the neural network? Is it backpropagation?
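For what it's worth, brain trains a multilayer perceptron with backpropagation (gradient descent on the squared error, with sigmoid activations). As a self-contained sketch of the idea, not brain's actual code, here is one update step for a single sigmoid unit:

```javascript
// One backpropagation step for a single sigmoid unit: compute the output,
// then nudge weights and bias along the error gradient (the delta rule).
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

function trainStep(weights, bias, input, target, learningRate) {
  var sum = bias;
  for (var i = 0; i < input.length; i++) sum += weights[i] * input[i];
  var out = sigmoid(sum);
  // delta = error * derivative of the sigmoid at the current output
  var delta = (target - out) * out * (1 - out);
  for (var j = 0; j < input.length; j++) {
    weights[j] += learningRate * delta * input[j]; // mutated in place
  }
  return bias + learningRate * delta; // updated bias
}

var w = [0.1, -0.2], b = 0.05;
b = trainStep(w, b, [1, 0], 1, 0.3); // output for [1, 0] moves toward 1
```

In the full network the same delta rule is applied layer by layer, with each hidden unit's delta computed from the deltas of the layer above it.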
Hi.
I'm trying to train it to detect the gender of names.
The training data looks like this:
{ input: [ 2, 26, 11, 8, 13, 4 ], output: { h: 0, f: 1 } },
{ input: [ 2, 4, 17, 8, 18, 4 ], output: { h: 0, f: 1 } },
{ input: [ 2, 4, 18, 0, 8, 17, 4 ], output: { h: 1, f: 0 } },
{ input: [ 2, 26, 18, 0, 17 ], output: { h: 1, f: 0 } },
{ input: [ 2, 7, 0, 13, 19, 0, 11 ], output: { h: 0, f: 1 } },
{ input: [ 2, 7, 0, 13, 19, 26 ], output: { h: 0, f: 1 } },
{ input: [ 2, 7, 0, 17, 11, 8, 13, 4 ], output: { h: 0, f: 1 } },
{ input: [ 2, 7, 0, 17, 11, 14, 19 ], output: { h: 1, f: 0 } },
My code : https://gist.github.com/lucaspojo/1a4c7c848f18074ccd195eb8d7828b6a
Do you know what I'm doing wrong?
Thx!
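One likely culprit (an assumption, since the linked gist isn't reproduced here): the letter codes run up to 26, well outside [0, 1], and the inputs have varying lengths, so each position means something different from name to name. A sketch that fixes both, with MAX_LEN and ALPHABET as assumed constants:

```javascript
// Scale letter codes into [0, 1] and pad every name to a fixed length,
// so input index i always refers to the i-th letter of the name.
var MAX_LEN = 10;    // assumed longest name
var ALPHABET = 27;   // assumed code range 0..26

function encodeName(codes) {
  var input = [];
  for (var i = 0; i < MAX_LEN; i++) {
    input.push(i < codes.length ? codes[i] / (ALPHABET - 1) : 0);
  }
  return input;
}

console.log(encodeName([2, 26, 11, 8, 13, 4]).length); // always 10
```

A one-hot encoding per letter position would likely work even better than scaled codes, since the numeric order of letters carries no meaning for gender.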
It would be better to have something like a write-ahead log, similar to databases: set a maximum number of data events before training starts, and if the maximum is not reached within some time frame, go ahead with training anyway. While a training run is in progress, queue up new data but lock out further training. At the end of each training session, the trained values should be copied, and the run function should work on these copies.
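The queue-and-lock idea above could be sketched like this (all names hypothetical; trainFn is a synchronous stand-in for what would really be an asynchronous training call):

```javascript
// Buffer incoming samples, train a batch once maxBatch is reached (or when
// flush() is called by a timer), and refuse to start a second training run
// while one is in flight.
function TrainQueue(maxBatch, trainFn) {
  this.maxBatch = maxBatch;
  this.trainFn = trainFn;   // called with the batch to train on
  this.queue = [];
  this.training = false;    // the lock
}

TrainQueue.prototype.push = function (sample) {
  this.queue.push(sample);
  if (this.queue.length >= this.maxBatch) this.flush();
};

TrainQueue.prototype.flush = function () {
  if (this.training || this.queue.length === 0) return false; // locked or empty
  this.training = true;
  var batch = this.queue.splice(0); // new data queues up meanwhile
  this.trainFn(batch);
  this.training = false;
  return true;
};
```

A timer calling flush() periodically covers the "max not reached in some time frame" case; swapping in the freshly trained weights after trainFn returns covers the copy-on-completion part.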
By removing the null we can do:
socket.on('data', function (buffer) { brainstream.write(buffer); });
and all will work. I may try to create a pull request.
This is a feature request, and I'd be happy to work on it and submit a PR: what do you think about giving the option to start off with initial weights? This can make training much faster if the user already has some pretrained data.
I'm seeing this even with sudo, on Ubuntu 12.10 with npm version 1.1.16.
Any thoughts as to what's going on here?
npm ERR! Error: ENOENT, lstat '/home/malachai/dev/projects/brain/node_modules/mocha/node_modules/jade/node_modules/less/node_modules/ycssmin/tests/files/dataurl-realdata-yuiapp.css.min'
npm ERR! You may report this log at:
npm ERR! http://github.com/isaacs/npm/issues
npm ERR! or email it to:
npm ERR! [email protected]
npm ERR!
npm ERR! System Linux 3.5.0-21-generic
npm ERR! command "node" "/usr/bin/npm" "install" "--dev"
npm ERR! cwd /home/malachai/dev/projects/brain
npm ERR! node -v v0.8.9
npm ERR! npm -v 1.1.16
npm ERR! path /home/malachai/dev/projects/brain/node_modules/mocha/node_modules/jade/node_modules/less/node_modules/ycssmin/tests/files/dataurl-realdata-yuiapp.css.min
npm ERR! fstream_path /home/malachai/dev/projects/brain/node_modules/mocha/node_modules/jade/node_modules/less/node_modules/ycssmin/tests/files/dataurl-realdata-yuiapp.css.min
npm ERR! fstream_type File
npm ERR! fstream_class FileWriter
npm ERR! code ENOENT
npm ERR! message ENOENT, lstat '/home/malachai/dev/projects/brain/node_modules/mocha/node_modules/jade/node_modules/less/node_modules/ycssmin/tests/files/dataurl-realdata-yuiapp.css.min'
npm ERR! errno {}
npm ERR! fstream_stack Writer._finish.er.fstream_finish_call (/usr/lib/nodejs/npm/node_modules/fstream/lib/writer.js:284:26)
npm ERR! fstream_stack Object.oncomplete (fs.js:297:15)
npm ERR! Unresolvable cycle detected
npm ERR! While installing: [email protected]
npm ERR! Found a pathological dependency case that npm cannot solve.
npm ERR! Please report this to the package author.
npm ERR!
npm ERR! System Linux 3.5.0-21-generic
npm ERR! command "node" "/usr/bin/npm" "install" "--dev"
npm ERR! cwd /home/malachai/dev/projects/brain
npm ERR! node -v v0.8.9
npm ERR! npm -v 1.1.16
npm ERR! code ECYCLE
npm ERR! message Unresolvable cycle detected
npm ERR! errno {}
npm ERR!
npm ERR! Additional logging details can be found in:
npm ERR! /home/malachai/dev/projects/brain/npm-debug.log
npm not ok
My npm-debug.log:
info it worked if it ends with ok
verbose cli [ 'node', '/usr/bin/npm', 'install', '--dev' ]
info using [email protected]
info using [email protected]
verbose config file /home/malachai/.npmrc
verbose config file /usr/local/etc/npmrc
verbose config file /usr/lib/nodejs/npm/npmrc
verbose caching /home/malachai/dev/projects/brain/package.json
verbose loadDefaults [email protected]
verbose readDependencies: using package.json deps
verbose where, deps [ '/home/malachai/dev/projects/brain',
verbose where, deps [ 'underscore', 'mocha', 'canvas', 'cradle', 'should', 'async' ] ]
verbose from cache /home/malachai/dev/projects/brain/package.json
info preinstall [email protected]
verbose caching /home/malachai/dev/projects/brain/node_modules/cradle/package.json
verbose caching /home/malachai/dev/projects/brain/node_modules/underscore/package.json
verbose caching /home/malachai/dev/projects/brain/node_modules/async/package.json
verbose caching /home/malachai/dev/projects/brain/node_modules/should/package.json
verbose from cache /home/malachai/dev/projects/brain/package.json
verbose readDependencies: using package.json deps
verbose from cache /home/malachai/dev/projects/brain/node_modules/cradle/package.json
verbose from cache /home/malachai/dev/projects/brain/node_modules/underscore/package.json
verbose from cache /home/malachai/dev/projects/brain/node_modules/async/package.json
verbose from cache /home/malachai/dev/projects/brain/node_modules/should/package.json
verbose already installed in /home/malachai/dev/projects/brain skipping underscore@>=1.3.3
verbose cache add [ 'mocha@>=1.0.0', null ]
Then the mocha test fails like so:
mocha test/cross-validation/* --timeout 10000
module.js:340
throw err;
^
Error: Cannot find module 'canvas'
at Function.Module._resolveFilename (module.js:338:15)
at Function.Module._load (module.js:280:25)
at Module.require (module.js:362:17)
at require (module.js:378:17)
at Object. (/home/malachai/dev/projects/brain/test/cross-validation/ocr.js:2:14)
at Module._compile (module.js:449:26)
at Object.Module._extensions..js (module.js:467:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Module.require (module.js:362:17)
at require (module.js:378:17)
at Mocha.loadFiles (/usr/local/lib/node_modules/mocha/lib/mocha.js:152:27)
at Array.forEach (native)
at Mocha.loadFiles (/usr/local/lib/node_modules/mocha/lib/mocha.js:149:14)
at Mocha.run (/usr/local/lib/node_modules/mocha/lib/mocha.js:305:31)
at Object. (/usr/local/lib/node_modules/mocha/bin/_mocha:327:7)
at Module._compile (module.js:449:26)
at Object.Module._extensions..js (module.js:467:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Module.runMain (module.js:492:10)
at process.startup.processNextTick.process._tickCallback (node.js:244:9)
When I fork this repo, and do "npm install", the folder is filled with node_modules, and since there is no .gitignore, git tries to commit them...
From what I read, there are two options for handling node_modules in git projects:
A. Keep all the node_modules in the git repository, together with my project, or:
B. Don't keep any node_modules in the git repository, and have a ".gitignore" file that contains "node_modules".
However, here, there are no node_modules, but also no .gitignore file...
I am not an expert, so maybe it's my mistake.
I'm having trouble getting brain to build properly. I was trying to get v0.6.1 working for a browser-based game I'm working on, but didn't see it listed on the downloads page. So, I cloned it, checked out version v0.6.1 and tried a build:
git clone git://github.com/harthur/brain.git
git checkout 8ba611b771d6186d2ff63405978b951c7b49cbb1 # 0.6.1
npm install
npm install browserify
jake build
npm install uglify-js@1
jake minify
Which worked (no errors), but wouldn't run in the browser.
So, I tried
git checkout 20114ae3844bbdbe4b1e5abb2fc38f50b940bfa4 # 0.6.0
jake build
jake minify
wget -O brain-0.6.0-official.js https://github.com/downloads/harthur/brain/brain-0.6.0.js
diff brain-0.6.0.js brain-0.6.0-official.js > brain.diff
and compared my generated brain-0.6.0.js to the official brain-0.6.0.js and it was way different and also wouldn't run in the browser.
How do you build brain?
I get an error at line 430 in the file neuralnetwork.js:
for (var i = 0; i < errors.length; i++) {
because errors is undefined: Uncaught TypeError: Cannot read property 'length' of undefined.
This happens with your basic example:
var net = new brain.NeuralNetwork();
net.train([{input: [0, 0], output: [0]},
{input: [0, 1], output: [1]},
{input: [1, 0], output: [1]},
{input: [1, 1], output: [0]}]);
var output = net.run([1, 0]); // [0.987]
This is an enhancement request: I think we should be able to train one neural network multiple times.
Without this feature, we need to keep a record of everything we've used for previous trainings, append our new data, and train again, creating a new network. But with this feature, we can pass around a neural network export, and train again when needed.
I'm trying to use this library for an image recognition assignment. We're getting pretty poor results, around 40% accuracy, compared to the high 90% range for all the other techniques we've tested. That's with all our preprocessing and training over 100s of iterations with a very low error threshold.
I'm wondering what type of neural network this is, and whether there's anything fundamental in its implementation that makes it poorly suited to image recognition? Convolutional NNs are generally great at it, but I was thinking that this library might be a different type.
Does it generally perform well on high dimensional data? We're using all features with non-zero variance from our image set, which is in the order of 100 or so pixels/features. The colour contrast example given only has 3 dimensions, so I was thinking we might be able to get better performance with some aggressive feature extraction/dimensionality reduction.
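One cheap dimensionality-reduction step worth trying before training (a generic sketch, not part of brain): mean-pool the pixel vector so each block of adjacent features collapses into one averaged feature, cutting the input size by the pooling factor.

```javascript
// Average every `factor` adjacent features into one pooled feature.
function meanPool(features, factor) {
  var pooled = [];
  for (var i = 0; i < features.length; i += factor) {
    var sum = 0, n = 0;
    for (var j = i; j < Math.min(i + factor, features.length); j++) {
      sum += features[j];
      n++;
    }
    pooled.push(sum / n);
  }
  return pooled;
}

console.log(meanPool([0, 1, 1, 0, 0.5, 0.5], 2)); // [0.5, 0.5, 0.5]
```

This is a crude stand-in for the convolution-plus-pooling stages that make convolutional nets effective on images; it won't close a 40%-to-90% gap on its own, but it does reduce the burden on a plain feed-forward net.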
Some people run redis with an auth key (password), and currently the brain redis backend does not support AUTH.
I have a simple patch to fix this.
Is anyone interested in, or working on, a Redis backend for this project?
Is there a reason there isn't one already (something I may be overlooking), or any fundamental reasons this wouldn't work?
You have two flows: Train and Run.
I import it into Node-RED and press the inject node to train it. I get this in the debug window:
"TypeError: May not write null values to stream"
If the demo doesn't work, I am a bit worried.
It seems like when we set the iterations in trainStream, it adds 1 additional iteration.
So I tried to run this project; here are the steps I took, not sure if they're correct:
yarn install
When it asks
Couldn't find any versions for "esprima-six" that matches "0.0.3"
? Please choose a version from this list: (Use arrow keys)
Choose 1.0.1
Then, create a file like basic.js
var assert = require("assert"),
brain = require("./lib/brain");
var net = new brain.NeuralNetwork();
net.train([{input: { r: 0.03, g: 0.7, b: 0.5 }, output: { black: 1 }},
{input: { r: 0.16, g: 0.09, b: 0.2 }, output: { white: 1 }},
{input: { r: 0.5, g: 0.5, b: 1.0 }, output: { white: 1 }}]);
var output = net.run({ r: 1, g: 0.4, b: 0 });
Run node basic.js to see the result. Hope it helps.
Hello,
tests are taking forever to complete. Is that normal?
$ node test/runtests.js
The "sys" module is now called "util". It should have a similar interface.
PASS redis
PASS thresholds
PASS basictext
PASS json
PASS layers
PASS hash
PASS json
PASS grow
PASS tofunction
PASS bitwise
PASS errorthresh
running bayes test on data size: 900
average error: 0.21
average train time: 171 ms
average test time: 313 ms
running neuralnet test on data size: 591
average error: 0.02
average train time: 572 ms
average test time: 8 ms
running bayes test on data size: 900
average error: 0.21
average train time: 123 ms
average test time: 227 ms
running bayes test on data size: 900
average error: 0.20
average train time: 126 ms
average test time: 248 ms
... and just hangs here, CPU working at 100%. I let it run for some minutes, but nothing seems to happen.
I can't tell if something is wrong or if it's just my EEE being too slow.
I get this error when trying to use localStorage, at brain/lib/bayesian/backends/localStorage.js:12:3
I trained my network on a set of car data (year of fabrication, mileage, type, and model as input; price as output). When I try to predict the price of another car, the output is NaN. NaN is not even among the values in the training set, so this seems like an issue with the brain module.
My code is in a GitHub Gist, here: https://gist.github.com/alexnix/146fea914501d283c80635087dd87036
In the past I have used Brain.js and some other JavaScript-based machine learning libraries, but unfortunately I found they didn't match my needs. That's why, working on my personal projects, I developed the idea of creating a different approach myself. So, sorry to come here to talk about another project, but I would really like to receive opinions and suggestions from experienced people in this specific field, to help me set useful, shared expectations. By the way, thanks for the great support given to Brain.js so far. The library is called DN2A and is at https://github.com/dn2a/dn2a-javascript
See https://github.com/harthur/brain/blob/master/lib/neuralnetwork.js#L373 — the toFunction code is as below:
toFunction: function() {
var json = this.toJSON();
// return standalone function that mimics run()
return new Function("input",
' var net = ' + JSON.stringify(json) + ';\n\n\
for (var i = 1; i < net.layers.length; i++) {\n\
var layer = net.layers[i];\n\
var output = {};\n\
\n\
for (var id in layer) {\n\
var node = layer[id];\n\
var sum = node.bias;\n\
\n\
for (var iid in node.weights) {\n\
sum += node.weights[iid] * input[iid];\n\
}\n\
output[id] = (1 / (1 + Math.exp(-sum)));\n\
}\n\
input = output;\n\
}\n\
return output;');
}
It looks a little dirty; I suggest using the multiline lib to replace it.
And if you target only the nodejs/iojs platform, we can just use ES6's template string feature, which supports multiline code blocks; see template_strings.
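The suggestion could look roughly like this (a sketch only; it assumes a Node version with template-literal support and omits brain's surrounding prototype, with `json` standing in for the net's toJSON() output):

```javascript
// toFunction rewritten with a template literal: the generated source stays
// readable, with no trailing `\n\` escapes.
function toFunction(json) {
  var src = `
    var net = ${JSON.stringify(json)};
    for (var i = 1; i < net.layers.length; i++) {
      var layer = net.layers[i];
      var output = {};
      for (var id in layer) {
        var node = layer[id];
        var sum = node.bias;
        for (var iid in node.weights) {
          sum += node.weights[iid] * input[iid];
        }
        output[id] = 1 / (1 + Math.exp(-sum));
      }
      input = output;
    }
    return output;`;
  return new Function("input", src);
}
```

The generated function's behavior is unchanged; only the source construction differs.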
Hi, this post is more discussion than issue. It's related to what was said before on issue #13.
Currently brains, when created, take the options hiddenLayers and learningRate:
var net = new NeuralNetwork({
hiddenLayers: [4],
learningRate: 0.6
});
And when trained, take data and the options errorThresh, iterations, log, and logPeriod:
net.train(data, {
errorThresh: 0.004,
iterations: 20000,
log: true,
logPeriod: 10
});
What I'm asking is this: given that one cannot iteratively train these neural networks, why are there two points at which different options are provided to the neural network?
Does it not make sense to declare all options/settings in one place, i.e. on creation of the neural network?
Perhaps something like:
var net = new NeuralNetwork({
hiddenLayers: [4],
learningRate: 0.6,
errorThresh: 0.004,
iterations: 20000,
log: true,
logPeriod: 10
});
net.train(data);
Chris
After training a net on large data inputs (Array(90000)), I found that brain can crash a tab easily. This may not be a problem on node.
It crashes a tab simply by allocating the arrays for the layers. My solution for this problem would be to use typed arrays like Uint8Array instead, to decrease memory usage and increase general performance.
I may implement this on my own and submit a pull request if this would be a welcomed change.
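The typed-array idea might look like this (a sketch with hypothetical names, not brain's actual allocation code; Float64Array rather than Uint8Array, since weights and activations are fractional values):

```javascript
// Allocate a layer's activations and per-node weight vectors as typed
// arrays: fixed-length, zero-filled, and far more compact than plain
// Arrays of boxed numbers.
function allocLayer(size, inputSize) {
  var layer = {
    outputs: new Float64Array(size),
    weights: []
  };
  for (var i = 0; i < size; i++) {
    layer.weights.push(new Float64Array(inputSize));
  }
  return layer;
}

var layer = allocLayer(3, 90000); // 3 nodes over an Array(90000)-sized input
```

Typed arrays also avoid the per-element overhead that makes plain Arrays of 90000 floats so heavy in a browser tab.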
This:
var sizes = _([inputSize, hiddenSizes, outputSize]).flatten();
returns a lodash wrapper instance instead of an array. Please do this instead:
var sizes = _.flatten([inputSize, hiddenSizes, outputSize]);
Otherwise your lib is not usable with lodash.
Line 99 in 6b58dc2
also at line 527.
In neuralnetwork.js, there are a couple of places where the error vectors are not properly normalized: they're being mapped to the interval [0, 1/sqrt(N)] instead of [0, 1]. The result, for very large data sets, is that the error reported during training is noticeably smaller than it should be.
Line 181, in Layer.prototype, should be
return Math.sqrt(sum / this.getSize())
instead of
return Math.sqrt(sum) / this.getSize()
To check this, note that sqrt(sum_of_errors) is at most sqrt(size_of_layer).
Line 73, in train(), should be
error = Math.sqrt(sum / data.length); // root mean squared error
instead of
error = Math.sqrt(sum) / data.length;
To check this, note that sqrt(sum) is at most sqrt(data.length).
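A quick numeric check of the two formulas: with N errors all equal to 1, the root-mean-square error should be exactly 1, but the sqrt(sum)/N form shrinks it to 1/sqrt(N).

```javascript
// Compare the buggy and corrected RMS formulas on a worst-case error vector.
function rmsWrong(errors) {
  var sum = errors.reduce(function (a, e) { return a + e * e; }, 0);
  return Math.sqrt(sum) / errors.length;   // maps into [0, 1/sqrt(N)]
}

function rmsRight(errors) {
  var sum = errors.reduce(function (a, e) { return a + e * e; }, 0);
  return Math.sqrt(sum / errors.length);   // maps into [0, 1]
}

var errors = [1, 1, 1, 1];        // N = 4, every error maximal
console.log(rmsWrong(errors));    // 0.5  (= 1/sqrt(4))
console.log(rmsRight(errors));    // 1
```

The gap grows with N, which is why the bug is most visible on large layers and large data sets.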
It's "contrast"; see:
color constrast demo
If I run my script on its own to recognize a character it works, but if I require it from a Hapi server an error occurs at lines 366 and 367.
I resolved it with:
this.biases[i][j] = layer[node] ? layer[node].bias : 0;
this.weights[i][j] = layer[node] ? _(layer[node].weights).toArray() : 0;
I'm thinking about digging into the code and making the training done with streams and event emitters. That way I can stream the data in from my database and train as the data is flowing in.
Can you think of any problems I might run into? I understand that the data will probably need to be streamed in multiple times to do the training so maybe some caching will be necessary.
My worry is if I'm training a dataset of 20+ million records, I will quickly run out of memory trying to load it all into RAM.
From a quick glance it seems like the only real issue is going to be the formatData function. I'm thinking it will need to be called on each record individually as it's read from the stream. So if the input is going to be a hash, there will need to be an extra iteration over the stream to generate inputLookup and outputLookup.
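That extra lookup-building pass could be sketched as follows (a hypothetical helper, not brain's actual formatData): stream the records once and collect every key seen into a table mapping key to index.

```javascript
// Fold one record's keys into a running key -> index lookup table, so the
// table can be built incrementally as records arrive from the stream.
function buildLookup(keysSeen, record) {
  Object.keys(record).forEach(function (key) {
    if (!(key in keysSeen)) {
      keysSeen[key] = Object.keys(keysSeen).length; // next free index
    }
  });
  return keysSeen;
}

var inputLookup = {};
[{ r: 0.1, g: 0.2 }, { g: 0.3, b: 0.4 }].forEach(function (rec) {
  buildLookup(inputLookup, rec);
});
console.log(inputLookup); // { r: 0, g: 1, b: 2 }
```

With the lookup built in a first pass, a second pass over the stream can convert each hash record to a fixed-length array without holding the whole data set in RAM.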
In neuralnetwork.js, lines 196, 246, 287, 501, 506, and 569: missing semicolons?