Benchmarking tests for js-ipfs, using go-ipfs as a baseline.
$ git clone https://github.com/ipfs/ipfs-performance-profiling.git
$ cd ipfs-performance-profiling
$ npm install
Run all benchmarks on all environments:
$ npm run benchmarks
Run all benchmarks on the go and js-core environments:
$ npm run benchmarks -- --envs=go,js-core
Available environments are:
go
js-core
js-http
Run named benchmark on the js-http environment:
$ npm run benchmarks -- files-add-1MB-file --envs=js-http
You can output a JSON report using the --json option:
$ npm run benchmarks -- files-add-1MB-file --json
You can run the benchmarks and produce an HTML report using:
$ npm run benchmarks:report
You can run the benchmarks and produce an HTML report containing links to profiling data using:
$ npm run benchmarks:report:profile -- [<suite>] --envs=[env1,env2]
A benchmark suite is simply a function that gets two arguments: an IPFS client object and a callback:
module.exports = function (ipfs, callback) {
  ipfs.files.add([{
    path: 'a.txt',
    content: Buffer.from('a') // Buffer.from() replaces the deprecated new Buffer()
  }], callback)
}
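To check a suite's callback contract in isolation before wiring it into the runner, you can call it with a stubbed client. This is a hypothetical smoke test, not part of the repo: the real runner starts actual go/js IPFS daemons, and the stub's `files.add` response shape is assumed.

```javascript
// The suite under test: same shape as the example above.
const suite = function (ipfs, callback) {
  ipfs.files.add([{ path: 'a.txt', content: Buffer.from('a') }], callback)
}

// Hypothetical stub standing in for a real IPFS client.
const stubIpfs = {
  files: {
    add (files, cb) {
      // Pretend the add succeeded and echo back minimal results.
      cb(null, files.map(f => ({ path: f.path })))
    }
  }
}

suite(stubIpfs, (err, res) => {
  if (err) throw err
  console.log('suite completed, added', res.length, 'file(s)')
})
```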
Add the suite in its own directory under src/suites. Also, don't forget to add an entry to src/suites/index.js so that it can be found.
The suite runner uses the stdout channel for the benchmark results. If you want to log to the console, use console.error instead.
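A standalone sketch (not from the repo) of why this matters: in Node, console.log writes to stdout, which the runner parses for results, while console.error writes to stderr, so diagnostics there cannot corrupt the output. The result object shape below is assumed for illustration.

```javascript
// Hypothetical result shape, for illustration only.
const result = { suite: 'files-add-1MB-file', durationMs: 42 }

console.error('debug: run finished')  // stderr — visible in the terminal, ignored by the runner
console.log(JSON.stringify(result))   // stdout — the only thing the runner should parse
```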
Feel free to join in. All welcome. Open an issue!
This repository falls under the IPFS Code of Conduct.