
License: MIT License


compressing


The missing compressing and uncompressing lib for Node.js.

Currently supported:

  • tar
  • gzip
  • tgz
  • zip

Install

npm install compressing

Usage

Compress a single file

Using gzip as an example; tar, tgz, and zip work the same way.

promise style

const compressing = require('compressing');

// compress a file
compressing.gzip.compressFile('file/path/to/compress', 'path/to/destination.gz')
.then(compressDone)
.catch(handleError);

// compress a file buffer
compressing.gzip.compressFile(buffer, 'path/to/destination.gz')
.then(compressDone)
.catch(handleError);

// compress a stream
compressing.gzip.compressFile(stream, 'path/to/destination.gz')
.then(compressDone)
.catch(handleError);

stream style

const compressing = require('compressing');

new compressing.gzip.FileStream({ source: 'file/path/to/compress' })
  .on('error', handleError)
  .pipe(fs.createWriteStream('path/to/destination.gz'))
  .on('error', handleError);

// It's a transform stream, so you can pipe to it
fs.createReadStream('file/path/to/compress')
  .on('error', handleError)
  .pipe(new compressing.gzip.FileStream())
  .on('error', handleError)
  .pipe(fs.createWriteStream('path/to/destination.gz'))
  .on('error', handleError);

// Stream errors should be handled carefully; use pump to handle them in one place
const pump = require('pump');
const sourceStream = fs.createReadStream('file/path/to/compress');
const gzipStream = new compressing.gzip.FileStream();
const destStream = fs.createWriteStream('path/to/destination.gz');
pump(sourceStream, gzipStream, destStream, handleError);

Compress a dir

Using tar as an example; tgz and zip work the same way.

Note: gzip only supports compressing a single file. To compress a directory with gzip, use tgz instead.

promise style

const compressing = require('compressing');
compressing.tar.compressDir('dir/path/to/compress', 'path/to/destination.tar')
.then(compressDone)
.catch(handleError);

stream style

const compressing = require('compressing');

const tarStream = new compressing.tar.Stream();
tarStream.addEntry('dir/path/to/compress');

tarStream
  .on('error', handleError)
  .pipe(fs.createWriteStream('path/to/destination.tar'))
  .on('error', handleError);

// Stream errors should be handled carefully; use pump to handle them in one place
const tarStream = new compressing.tar.Stream();
tarStream.addEntry('dir/path/to/compress');
const destStream = fs.createWriteStream('path/to/destination.tar');
pump(tarStream, destStream, handleError);

Streams are powerful: you can add multiple entries to one stream:

const tarStream = new compressing.tar.Stream();
// dir
tarStream.addEntry('dir/path/to/compress');

// file
tarStream.addEntry('file/path/to/compress');

// buffer
tarStream.addEntry(buffer);

// stream
tarStream.addEntry(stream);

const destStream = fs.createWriteStream('path/to/destination.tar');
pump(tarStream, destStream, handleError);

Uncompress a file

promise style

const compressing = require('compressing');

// uncompress a file
compressing.tgz.uncompress('file/path/to/uncompress.tgz', 'path/to/destination/dir')
.then(uncompressDone)
.catch(handleError);

// uncompress a file buffer
compressing.tgz.uncompress(buffer, 'path/to/destination/dir')
.then(uncompressDone)
.catch(handleError);

// uncompress a stream
compressing.tgz.uncompress(stream, 'path/to/destination/dir')
.then(uncompressDone)
.catch(handleError);

Note: tar, tgz, and zip share the uncompressing API shown above, where destination should be a directory path. gzip is slightly different: its destination must be a file path or a writable stream.

Working with urllib is easy too. Let's download a tgz file and uncompress it to a directory:

const urllib = require('urllib');
const targetDir = require('os').tmpdir();
const compressing = require('compressing');

urllib.request('http://registry.npmjs.org/pedding/-/pedding-1.1.0.tgz', {
  streaming: true,
  followRedirect: true,
})
.then(result => compressing.tgz.uncompress(result.res, targetDir))
.then(() => console.log('uncompress done'))
.catch(console.error);

stream style

const compressing = require('compressing');
const fs = require('fs');
const path = require('path');
const mkdirp = require('mkdirp');

// destDir is the directory to extract into; handleError is your error handler
function onEntry(header, stream, next) {
  stream.on('end', next);

  // header.type => file | directory
  // header.name => path name

  if (header.type === 'file') {
    stream.pipe(fs.createWriteStream(path.join(destDir, header.name)));
  } else { // directory
    mkdirp(path.join(destDir, header.name), err => {
      if (err) return handleError(err);
      stream.resume();
    });
  }
}

new compressing.tgz.UncompressStream({ source: 'file/path/to/uncompress.tgz' })
  .on('error', handleError)
  .on('finish', handleFinish) // uncompressing is done
  .on('entry', onEntry);

// It's a writable stream, so you can pipe to it
fs.createReadStream('file/path/to/uncompress')
  .on('error', handleError)
  .pipe(new compressing.tgz.UncompressStream())
  .on('error', handleError)
  .on('finish', handleFinish) // uncompressing is done
  .on('entry', onEntry);

Note: tar, tgz, and zip share the uncompressing streaming API shown above: it is a writable stream, and entries are emitted one after another while uncompressing. gzip is slightly different: gzip.UncompressStream is a transform stream, so no entry event is emitted and you can simply pipe it to another stream.

This constraint comes from the gzip format itself: it can only compress and uncompress a single file.

new compressing.gzip.UncompressStream({ source: 'file/path/to/uncompress.gz' })
  .on('error', handleError)
  .pipe(fs.createWriteStream('path/to/dest/file'))
  .on('error', handleError);

API

compressFile

Use this API to compress a single file. This is a convenience method that wraps the FileStream API below, letting you handle errors in one place.

  • gzip.compressFile(source, dest, opts)
  • tar.compressFile(source, dest, opts)
  • tgz.compressFile(source, dest, opts)
  • zip.compressFile(source, dest, opts)

Params

  • source {String|Buffer|Stream} - source to be compressed, could be a file path, buffer, or a readable stream
  • dest {String|Stream} - compressing destination; could be a file path (e.g. /path/to/xx.tgz) or a writable stream.
  • opts {Object} - usually you don't need it

Returns a promise object.

compressDir

Use this API to compress a directory. This is a convenience method that wraps the Stream API below, letting you handle errors in one place.

Note: gzip does not have a compressDir method; use tgz instead.

  • tar.compressDir(source, dest, opts)
  • tgz.compressDir(source, dest, opts)
  • zip.compressDir(source, dest, opts)

Params

  • source {String|Buffer|Stream} - source to be compressed
  • dest {String|Stream} - compressing destination; could be a file path (e.g. /path/to/xx.tgz) or a writable stream.
  • opts {Object} - usually you don't need it

uncompress

Use this API to uncompress a file. This is a convenience method that wraps the UncompressStream API below, letting you handle errors in one place. RECOMMENDED.

  • tar.uncompress(source, dest, opts)
  • tgz.uncompress(source, dest, opts)
  • zip.uncompress(source, dest, opts)
  • gzip.uncompress(source, dest, opts)

Params

  • source {String|Buffer|Stream} - source to be uncompressed
  • dest {String|Stream} - uncompressing destination. When uncompressing tar, tgz and zip, it should be a directory path (eg. /path/to/xx). When uncompressing gzip, it should be a file path or a writable stream.
  • opts {Object} - usually you don't need it
    • opts.zipFileNameEncoding {String} - only works for the zip format; defaults to 'utf8'. Common non-UTF-8 encodings by language:

      • Korean: cp949, euc-kr
      • Japanese: sjis (shift_jis), cp932, euc-jp
      • Chinese: gbk, gb18030, gb2312, cp936, hkscs, big5, cp950

FileStream

The transform stream to compress a single file.

Note: If you are not very familiar with streams, just use the compressFile() API, so errors can be handled in one place.

  • new gzip.FileStream(opts)
  • new tar.FileStream(opts)
  • new tgz.FileStream(opts)
  • new zip.FileStream(opts)

Common params:

  • opts.source {String|Buffer|Stream} - source to be compressed, could be a file path, buffer, or a readable stream.

Gzip params:

  • opts.zlib - {Object} gzip.FileStream uses zlib to compress, pass this param to control the behavior of zlib.

Tar params:

  • opts.relativePath {String} - Adds a file from source into the compressed result file as opts.relativePath. Uncompression programs would extract the file from the compressed file as relativePath. If opts.source is a file path, opts.relativePath is optional, otherwise it's required.
  • opts.size {Number} - Tar compression requires the size of the file in advance. When opts.source is a stream, its size cannot be determined without loading all of its content into memory (the default behavior, which can be a very bad idea). Pass opts.size to avoid loading all data into memory, or a warning will be shown.
  • opts.suppressSizeWarning {Boolean} - Pass true to suppress the size warning mentioned.

Tgz params:

tgz.FileStream is a combination of tar.FileStream and gzip.FileStream, so the params are the combination of params of tar and gzip.

Zip params:

  • opts.relativePath {String} - Adds a file from source into the compressed result file as opts.relativePath. Uncompression programs would extract the file from the compressed file as relativePath. If opts.source is a file path, opts.relativePath is optional, otherwise it's required.
  • opts.yazl {Object} - zip.FileStream compression uses yazl, pass this param to control the behavior of yazl.

Stream

The readable stream to compress anything as you need.

Note: If you are not very familiar with streams, just use the compressFile() and compressDir() APIs, so errors can be handled in one place.

Gzip only supports compressing a single file, so gzip.Stream is not available.

Constructor

  • new tar.Stream()
  • new tgz.Stream()
  • new zip.Stream()

None of the constructors take options.

Instance methods

  • addEntry(entry, opts)

Params

  • entry {String|Buffer|Stream} - entry to compress; could be a file path, a directory path, a buffer, or a stream.
  • opts.relativePath {String} - uncompression programs will extract the file from the archive as opts.relativePath. If entry is a file path or a directory path, opts.relativePath is optional; otherwise it's required.
  • opts.ignoreBase {Boolean} - when entry is a directory path and opts.ignoreBase is true, the archive contains the directory's contents relative to that path, without the base directory itself.

UncompressStream

The writable stream to uncompress anything as you need.

Note: If you are not very familiar with streams, just use the uncompress() API, so errors can be handled in one place.

Gzip only supports compressing and uncompressing a single file, so gzip.UncompressStream is a transform stream, unlike the others.

Constructor

  • new gzip.UncompressStream(opts)
  • new tar.UncompressStream(opts)
  • new tgz.UncompressStream(opts)
  • new zip.UncompressStream(opts)

Common params:

  • opts.source {String|Buffer|Stream} - source to be uncompressed, could be a file path, buffer, or a readable stream.

CAUTION for zip.UncompressStream

Due to the design of the .zip file format, a .zip file cannot be interpreted without loading all of its data into memory.

Although the API is streaming-style (to keep it convenient), it still loads all data into memory.

https://github.com/thejoshwolfe/yauzl#no-streaming-unzip-api

Contributors


fengmk2, shaoshuai0102, popomore, semantic-release-bot, DiamondYuan, acyza, bytemain, rickyes, Ryqsky, songhn233, Infiltrator, ZeekoZhu, killagu, okaponta, ShadyZOZ

This project follows the git-contributor spec, auto updated at Thu Aug 03 2023 01:39:37 GMT+0800.


compressing's Issues

unable to suppress size warning of tgz.Stream

class TgzStream extends BaseStream {
  constructor(opts) {
    super(opts);
    const tarStream = this._tarStream = new tar.Stream();

We should create the internal tar stream and gzip stream with the opts parameter.

It was my mistake: suppressSizeWarning should be passed to addEntry, and a typo in the warning message misled me in the wrong direction.

Cannot find module 'compressing' - Windows 7

I'm trying to compress files from the console. I installed the module globally and put this code in a js file:

const compressing = require("compressing");

I re-installed the module many times, but I always get this error (screenshot omitted):

Thanks for your help.

Cannot uncompress a Unix file on macOS

I tried to uncompress a Unix executable on my Mac, but the extracted file's type is plain text.
The first screenshot shows the type I want; the second shows the result of uncompressing. (screenshots omitted)

memory load

I currently have this code, but it seems that memory usage grows every time it is executed:

await compressing.zip.compressDir(`${pathFile}/${x.path}`, `${pathFile}/${Fixtures.csvQliksenseOnline.fileOperationCenter}.zip`);

file's ctime & mtime kept current time when uncompress

When uncompressing a file, the extracted files' creation and modification times are set to the current time, losing the files' original timestamps.

using unzip from command line:

[admin@lapras /tmp]
$unzip ./60000001_1.8.0.5.amr -d ./600001_1.8.0.5
Archive:  ./60000001_1.8.0.5.amr
  inflating: ./600001_1.8.0.5/Manifest.xml
  inflating: ./600001_1.8.0.5/CERT.json
  inflating: ./600001_1.8.0.5/60000001.tar

[admin@lapras /tmp]
$cd 600001_1.8.0.5/

[admin@ /tmp/600001_1.8.0.5]
$ll
total 1080
-rw-r--r-- 1 admin admin 1095680 Feb  1 00:02 60000001.tar
-rw-r--r-- 1 admin admin     727 Feb  1 00:02 CERT.json
-rw-r--r-- 1 admin admin     132 Feb  1 00:02 Manifest.xml

using uncompress method:

const amrExtractPath = path.join(systemTempDir, amrFileName.replace(path.extname(amrFileName), ''));
await compressing.zip.uncompress(amrDestPath, amrExtractPath);

and the result:

[admin@lapras /tmp/600001_1.8.0.5]
$ll ../60000001_1.8.0.5
total 1084
drwxrwxr-x 3 admin admin    4096 Feb 17 22:11 60000001
-rw-rw-r-- 1 admin admin 1095680 Feb 17 22:26 60000001.tar
-rw-rw-r-- 1 admin admin     727 Feb 17 22:26 CERT.json
-rw-rw-r-- 1 admin admin     132 Feb 17 22:26 Manifest.xml

zip.Stream's addEntry cannot be used across multiple event-loop ticks

const { zip } = require('compressing');
const fs = require('fs');
const zipStream = new zip.Stream();
zipStream.addEntry('test.html');
setTimeout(() => {
  zipStream.addEntry('yarn.lock');
  setTimeout(() => {
    zipStream.addEntry('package.json');
    setTimeout(() => {
      zipStream.pipe(fs.createWriteStream('test.zip'));
    }, 3000);
  }, 3000);
}, 3000);

This code produces an archive containing only test.html. The reason is that every addEntry call triggers finalize, which calls end on the underlying yazl ZipFile. Once end has been called, the ZipFile closes the stream after processing the last pending file if no other entries are queued, so compression finishes early; addEntry calls made after the stream has closed have no effect.
To be safe, call addEntry for all entries synchronously in one tick, or use yazl directly for asynchronous additions.

compressing.zip.uncompress deep children files would be lost

Hi, man !

what I do:

// The front end uploads a `prd.zip` file. I receive it, unzip it, and move it to another folder.
// I use the following code.
const reader = fs.createReadStream(ctx.request.files['file']['path']);
const filePath = `${path.resolve(__dirname, '../../www/assets')}/ddmccdn/`;
compressing.zip.uncompress(reader, filePath)

What I want:

Under the folder ddmccdn/, I can find all unzipped files.

What I get:

  1. When I use a stream as the compressing.zip.uncompress source param, files three levels deep are lost
  2. When I use a file path as the compressing.zip.uncompress source param, files four levels deep are lost

The following is my project:
https://github.com/woden0415/prdreview1

Looking forward to your reply!

Files go missing when calling .addEntry(stream, { relativePath }) with multiple streams from async functions

Problem

I want to create a .tgz from two streams, but files go missing when TgzStream.addEntry(stream, { relativePath }) is invoked from asynchronous functions.

Reproduction

I have written two test cases; the second one reproduces the issue.

Put the test code below in test/tgz/stream.test.js and run npm test.
Then uncompress the generated .tgz file and check its content yourself.

Test code
  // Generated .tgz file consists of `xx.log` and `yy.log`.
  it.only('.addEntry(stream, { relativePath }) with multi streams', done => {
    const destFile = path.join(os.tmpdir(), uuid.v4() + '.tgz');
    const fileStream = fs.createWriteStream(destFile);
    console.log('dest', destFile);

    mm(console, 'warn', msg => {
      assert(msg === 'You should specify the size of streamming data by opts.size to prevent all streaming data from loading into memory. If you are sure about memory cost, pass opts.supressSizeWarning: true to suppress this warning');
    });

    const { Readable } = require('stream');
    const tgzStream = new TgzStream();

    const stream = new Readable({
      encoding: 'utf8',
      read() {
        [ '123213', 'asdasdasd' ].forEach(str => {
          this.push(str);
          this.push('\n');
        });
        this.push(null);
      },
    });

    tgzStream.addEntry(stream, { relativePath: 'xx.log' });

    const stream2 = new Readable({
      encoding: 'utf8',
      read() {
        [ '98765', 'mnbv' ].forEach(str => {
          this.push(str);
          this.push('\n');
        });
        this.push(null);
      },
    });

    tgzStream.addEntry(stream2, { relativePath: 'yy.log' });

    pump(tgzStream, fileStream, err => {
      assert(!err);
      assert(fs.existsSync(destFile));
      done();
    });
  });

  // Generated .tgz file only has `xx.log`.
  it.only('.addEntry(stream, { relativePath }) with multi streams by async functions', done => {
    const destFile = path.join(os.tmpdir(), uuid.v4() + '.tgz');
    const fileStream = fs.createWriteStream(destFile);
    console.log('dest', destFile);

    mm(console, 'warn', msg => {
      assert(msg === 'You should specify the size of streamming data by opts.size to prevent all streaming data from loading into memory. If you are sure about memory cost, pass opts.supressSizeWarning: true to suppress this warning');
    });

    const { Readable } = require('stream');
    const tgzStream = new TgzStream();

    Promise.resolve().then(() => {
      const stream = new Readable({
        encoding: 'utf8',
        read() {
          [ '123213', 'asdasdasd' ].forEach(str => {
            this.push(str);
            this.push('\n');
          });
          this.push(null);
        },
      });

      tgzStream.addEntry(stream, { relativePath: 'xx.log' });
    });

    const delay = timeout => new Promise(resolve => setTimeout(resolve, timeout));

    delay(0).then(() => {
      const stream = new Readable({
        encoding: 'utf8',
        read() {
          [ '98765', 'mnbv' ].forEach(str => {
            this.push(str);
            this.push('\n');
          });
          this.push(null);
        },
      });

      tgzStream.addEntry(stream, { relativePath: 'yy.log' });
    });

    pump(tgzStream, fileStream, err => {
      assert(!err);
      assert(fs.existsSync(destFile));
      done();
    });
  });

Expected

A .tgz file containing the two files xx.log and yy.log.

Environments

  • node: v8.9.1
  • compressing: 1.3.1

.tar file cannot be uncompressed by browser or uncompressing software

Some code is below:

blobList.forEach((blob, index) => {
  tarStream.addEntry(blob.data, { relativePath: `${blob.name}_${index + 1}${blob.type}` });
});

But when I download this .tar file, uncompressing software cannot open it, because the sub-files all end up with the same file name; the relativePath passed to tarStream.addEntry has no effect. (screenshot omitted)

Windows tgz and Linux tar zxvf: path error

Compressing a dir on Windows that contains dir\a.js and dir\b.js with compressing.tgz, uploading the tgz to Linux, and running tar zxvf xx.tgz gives an unexpected result: files literally named dir\a.js and dir\b.js.

I think the problem is that \ is not converted to / on Windows; the code is in _addDirEntry() in lib/tar/stream.js.
I modified it as below and it works well:
I modify like below and works well:

_addDirEntry(entry, opts) {
    fs.readdir(entry, (err, files) => {
      if (err) return this.emit('error', err);

      const relativePath = opts.relativePath || '';
      files.forEach(fileOrDir => {
        const newOpts = utils.clone(opts);
        if (opts.ignoreBase) {
          newOpts.relativePath = path.join(relativePath, fileOrDir).replace(/\\/g, '/');
        } else {
          newOpts.relativePath = path.join(relativePath, path.basename(entry), fileOrDir).replace(/\\/g, '/');
        }
        newOpts.ignoreBase = true;
        this.addEntry(path.join(entry, fileOrDir).replace(/\\/g, '/'), newOpts);
      });
      this._onEntryFinish();
    });
  }

I don't know whether this is a good idea, or whether I can fix it via some config. If not, I hope this problem can be fixed as soon as possible. Best wishes to you ^ _ ^

TarStreamError: size mismatch

Hello, I'm facing a problem with the compressDir method: I sometimes get this error when trying to compress a directory, but not always.
Lib version: 1.5.0
Code to execute:

compressing.tar.compressDir(inPath, outPath)
  .then(() => {
    return resolve(bundle);
  })
  .catch(error => {
    console.log(error);
    return reject(error);
  });

Error output:

{ TarStreamError: size mismatch
 at Sink.<anonymous> (/centum/trace_network_api/node_modules/tar-stream/pack.js:175:23)
 at Sink.f (/centum/trace_network_api/node_modules/once/once.js:25:25)
 at Sink.onfinish (/centum/trace_network_api/node_modules/end-of-stream/index.js:31:27)
at emitNone (events.js:111:20)
at Sink.emit (events.js:208:7)
at finishMaybe (/centum/trace_network_api/node_modules/tar-stream/node_modules/readable-stream/lib/_stream_writable.js:630:14)
 at endWritable (/centum/trace_network_api/node_modules/tar-stream/node_modules/readable-stream/lib/_stream_writable.js:638:3)
at Sink.Writable.end (/centum/trace_network_api/node_modules/tar-stream/node_modules/readable-stream/lib/_stream_writable.js:594:41)
at ReadStream.onend (_stream_readable.js:595:10)
at Object.onceWrapper (events.js:313:30)
at emitNone (events.js:111:20)
at ReadStream.emit (events.js:208:7)
at endReadableNT (_stream_readable.js:1064:12)
at args.(anonymous function) (/usr/lib/node_modules/pm2/node_modules/event-loop-inspector/index.js:138:29)
at _combinedTickCallback (internal/process/next_tick.js:139:11)
at process._tickDomainCallback (internal/process/next_tick.js:219:9) name: 'TarStreamError' }

Any suggestions? I think I need to pass some options describing my directory, but I didn't find anything helpful in the documentation.
Thanks in advance!

uncompress error: Error: incorrect header check

pack.zip was downloaded via an axios request piped to createWriteStream.

  compressing.tgz.uncompress(path.join(__dirname, '../../pack.zip'), 'package').then(res => {
    console.log('uncompress success')
    // after uncompressing, restart the app
    app.relaunch()
    app.exit(0)
  }).catch(err => {
    // log the error
    console.log(`uncompress error: ${err.toString()}`)
  })

This error was reported after startup.

Can not use [email protected]+ in electron

Using compressing to uncompress a zip in Electron throws:
Uncaught ReferenceError: setImmediate is not defined

Imported in preload.js:

var compressing = require("compressing");
var fs = require("fs");
var path = require("path");
var request = require("request");

Bug: Cannot uncompress link file

hello:

When I uncompressed a tar.gz file by running compressing.tgz.uncompress('./test.tar.gz', 'bin');, it extracted the symbolic link files in the tar.gz package as folders (the idle3):

mac@Titan bin % ls -al       
drwxr-xr-x   2 mac  staff        64 Jan 25 16:30 idle3
-rwxr-xr-x   1 mac  staff       117 Jan 25 16:30 idle3.9

But actually it's a link file:

mac@Titan bin % ls -al
lrwxr-xr-x  1 mac  staff    7 Oct 12 16:49 idle3 -> idle3.9
-rwxr-xr-x  1 mac  staff  117 Oct 12 16:49 idle3.9

Cannot manually trigger the end of compression

My requirement is to provide a Node.js service that compresses network resources and serves the result to a front-end page for download.
Compressing while the network resources are still downloading gives a better user experience.
My code is as follows:

const tarStream = new compressing.tar.Stream();
tarStream.addEntry(stream, {
  relativePath: fileName,
  size: contentLength, // from http.request response headers content-length
});

I need to request multiple resources and must set size, but the size is only known after each request returns. I can't wait for all resource requests to return; that would take too much time.
If I send the second request after the first returns and call addEntry after the second returns, compression ends early and the second resource is not included.
I need to be able to control the end of compression explicitly.
I tried modifying the source code as follows:
/lib/tar/stream.js

constructor(opts) {
  super(opts);

  this._waitingEntries = [];
  this._processing = false;
  this._autoFinalize = false; // +
  this._init(opts);
}

_onEntryFinish(err) {
  if (err) return this.emit('error', err);

  this._processing = false;
  const waitingEntry = this._waitingEntries.shift();
  if (waitingEntry) {
    this.addEntry.apply(this, waitingEntry);
  } else {
    if (this._autoFinalize) {  // +
      this._finalize();
    }  // +
  }
}

finalize() {  // +
  this._autoFinalize = true;  // +
  if (!this._processing && this._waitingEntries.length === 0) {  // +
    this._finalize();  // +
  }  // +
}  // +

After I have added all resources with addEntry, I call:

tarStream.finalize()

This fulfills my needs.
I recommend modifying the source to add a manual trigger that signals all resources have been added.

Thanks.

Suggestion: support compressing and uncompressing archives containing asar files under Electron

As the title says: because asar files are treated as directories by Electron's fs module, uncompressing an archive that contains an asar file throws an exception. The original-fs module must be used to handle them.

I suggest adding detection logic for the fs implementation, or a configuration option that accepts a custom fs object.

I can provide a PR if needed.

Garbled file names when uncompressing files with Chinese names

The code is as follows:

compress() {
  const compressing = require('compressing');
  // uncompress
  compressing.zip.uncompress('build.zip', 'nodejs-compressing-demo3')
    .then(() => {
      console.log('success');
    })
    .catch(err => {
      console.error(err);
    });
},

When the test zip contains files with Chinese file names, the extracted file names are garbled.

compressing.zip.uncompress

When I call compressing.zip.uncompress, it throws: ZipUncompressStreamError: end of central directory record signature not found

Fix: empty folders cannot be packed into a zip

Add the _addDirEntry method below to lib/zip/stream.js. This is a temporary fix; I hope the author can provide a proper fix and publish an updated npm package.
This only fixes zip packing; other formats would need similar treatment.

// two packages must be imported at the top
const fs = require('fs');
const utils = require('../utils');

_addDirEntry(entry, opts) {
    // create the directory entry first
    this._zipfile.addEmptyDirectory(opts.relativePath || path.basename(entry), opts);
    fs.readdir(entry, (err, files) => {
      if (err) return this.emit('error', err);
      const relativePath = opts.relativePath || '';
      files.forEach(fileOrDir => {
        const newOpts = utils.clone(opts);
        if (opts.ignoreBase) {
          newOpts.relativePath = path.join(relativePath, fileOrDir);
        } else {
          newOpts.relativePath = path.join(relativePath, path.basename(entry), fileOrDir);
        }
        newOpts.ignoreBase = true;
        this.addEntry(path.join(entry, fileOrDir), newOpts);
      });
      this._onEntryFinish();
    });
  }

Top-level directory in the compressed archive

Is it possible to produce an archive without a top-level directory, so that extracting yields exactly the files I passed to addEntry? Right now every archive I produce has a top-level directory.

CI workflow eslint check issue

The test file in my last PR had formatting problems. VS Code didn't report them at the time, and I wondered whether test files were exempt from formatting; when I reopened the file later, it was covered in red squiggles.
The CI workflow runs eslint . --fix, and even with --fix it didn't catch them.
@fengmk2 (screenshot omitted)
