This project is forked from ipfs-inactive/js-ipfs-unixfs-engine.
# IPFS unixFS Engine

> JavaScript implementation of the layout and chunking mechanisms used by IPFS to handle Files

## Table of Contents

- [Install](#install)
- [Usage](#usage)
  - [Importer](#importer)
  - [Exporter](#exporter)
- [Contribute](#contribute)
- [License](#license)

## Install

```sh
> npm install ipfs-unixfs-engine
```

## Usage

### Importer

#### Importer example

Let's create a small directory to import:

```sh
> cd /tmp
> mkdir foo
> echo 'hello' > foo/bar
> echo 'world' > foo/quux
```

And write the importing logic:

```js
const fs = require('fs')
const Importer = require('ipfs-unixfs-engine').Importer

const filesAddStream = new Importer(<dag or ipld-resolver instance>)

// An array to hold the nested file/dir info returned by the importer.
// A root DAG node is received upon completion.
const res = []

// Import paths /tmp/foo/bar and /tmp/foo/quux
const rs = fs.createReadStream('/tmp/foo/bar')
const rs2 = fs.createReadStream('/tmp/foo/quux')
const input = { path: '/tmp/foo/bar', content: rs }
const input2 = { path: '/tmp/foo/quux', content: rs2 }

// Listen for the data event from the importer stream
filesAddStream.on('data', (info) => res.push(info))

// The end event of the stream signals that the importer is done
filesAddStream.on('end', () => console.log('Finished adding files!'))

// Write the file/object tuples to the importer stream
filesAddStream.write(input)
filesAddStream.write(input2)
filesAddStream.end()
```

When run, the stats of each DAG node are output on the `data` event, one per file and directory, ending with the root:

```js
{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 39243,
  path: '/tmp/foo/bar' }

{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 59843,
  path: '/tmp/foo/quux' }

{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 93242,
  path: '/tmp/foo' }

{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
  size: 94234,
  path: '/tmp' }
```
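Each `multihash` above is a raw multihash buffer: the first byte is the hash function code (`0x12` is sha2-256) and the second is the digest length (`0x20` = 32 bytes), followed by the digest itself. A minimal decoder sketch of that prefix layout (this is not the `multihashes` library's API, just an illustration):

```js
// Decode a multihash buffer: <fn code><digest length><digest bytes>.
// 0x12 identifies sha2-256; 0x20 says the digest is 32 bytes long.
function decodeMultihash (buf) {
  const code = buf[0]
  const length = buf[1]
  const digest = buf.slice(2, 2 + length)
  if (digest.length !== length) {
    throw new Error('multihash is truncated')
  }
  return { code, length, digest }
}

// A placeholder sha2-256 multihash (digest bytes are dummy data)
const mh = Buffer.concat([
  Buffer.from([0x12, 0x20]),
  Buffer.alloc(32, 0xaa)
])

const { code, length } = decodeMultihash(mh)
console.log(code.toString(16), length) // prints "12 32"
```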

#### Importer API

```js
const Importer = require('ipfs-unixfs-engine').Importer

const importer = new Importer(dag [, options])
```

The importer object is a duplex pull stream that takes objects of the form:

```js
{
  path: 'a name',
  content: (Buffer or Readable stream)
}
```
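For example, an in-memory file can be supplied with a Buffer as content (the path below is purely illustrative):

```js
// A file/object tuple with in-memory content: `path` is the name the
// node will be stored under, `content` holds the file bytes.
const tuple = {
  path: 'foo/hello.txt',
  content: Buffer.from('hello world\n')
}

console.log(tuple.path, tuple.content.length) // prints "foo/hello.txt 12"
```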

The importer will output file info objects as files are stored in IPFS. When stats on a node are emitted, the node is guaranteed to have been written.

`dag` is an instance of the IPLD Resolver or the js-ipfs `dag` API.

The input's file paths and directory structure will be preserved in the dag-pb created nodes.

`options` is a JavaScript object that may include the following keys:

  • wrap (boolean, defaults to false): if true, a wrapping node will be created
  • shardSplitThreshold (positive integer, defaults to 1000): the number of directory entries above which we decide to use a sharding directory builder (instead of the default flat one)
  • chunker (string, defaults to "fixed"): the chunking strategy. Currently only "fixed" is supported
  • chunkerOptions (object, optional): the options for the chunker. Defaults to an object with the following properties:
    • maxChunkSize (positive integer, defaults to 262144): the maximum chunk size for the fixed chunker.
  • strategy (string, defaults to "balanced"): the DAG builder strategy name. Supports:
    • flat: flat list of chunks
    • balanced: builds a balanced tree
    • trickle: builds a trickle tree
  • maxChildrenPerNode (positive integer, defaults to 174): the maximum children per node for the balanced and trickle DAG builder strategies
  • layerRepeat (positive integer, defaults to 4): (only applicable to the trickle DAG builder strategy). The maximum repetition of parent nodes for each layer of the tree.
  • reduceSingleLeafToSelf (boolean, defaults to false): optimization that, when a set of nodes reduces to a single node, reduces it to that node itself
  • dirBuilder (object): the options for the directory builder
    • hamt (object): the options for the HAMT sharded directory builder
      • bits (positive integer, defaults to 8): the number of bits at each bucket of the HAMT
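The `fixed` strategy simply slices the input into consecutive chunks of at most `maxChunkSize` bytes. A minimal sketch of the idea (not the module's actual chunker code, which operates on streams):

```js
// Split a buffer into fixed-size chunks of at most maxChunkSize bytes,
// mirroring what the "fixed" chunking strategy does conceptually.
function fixedChunker (buf, maxChunkSize = 262144) {
  const chunks = []
  for (let offset = 0; offset < buf.length; offset += maxChunkSize) {
    chunks.push(buf.slice(offset, offset + maxChunkSize))
  }
  return chunks
}

// 600000 bytes become two full 262144-byte chunks plus a 75712-byte tail
const chunks = fixedChunker(Buffer.alloc(600000))
console.log(chunks.map((c) => c.length)) // prints [ 262144, 262144, 75712 ]
```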

### Exporter

#### Exporter example

```js
const Exporter = require('ipfs-unixfs-engine').Exporter

// Create an export stream from the cid or IPFS path you want to export and a
// <dag or ipld-resolver instance> to fetch the file from
const filesStream = Exporter(<cid or ipfsPath>, <dag or ipld-resolver instance>)

// Pipe each file's content to stdout
filesStream.on('data', (file) => file.content.pipe(process.stdout))
```

#### Exporter API

```js
const Exporter = require('ipfs-unixfs-engine').Exporter

const filesStream = Exporter(<cid or ipfsPath>, <dag or ipld-resolver instance>)
```

Uses the given dag API or ipld-resolver instance to fetch IPFS UnixFS objects by their CID or IPFS path.

Creates a new readable stream in object mode that outputs objects of the form:

```js
{
  path: 'a name',
  content: (Buffer or Readable stream)
}
```

Errors are received as with any normal stream: listen for the `error` event.

## Contribute

Feel free to join in. All welcome. Open an issue!

This repository falls under the IPFS Code of Conduct.

## License

MIT

## Contributors

daviddias, dignifiedquire, fbaiodias, greenkeeper[bot], greenkeeperio-bot, hackergrrl, jbenet, nginnever, noffle, pgte, richardlitt
