robertklep / node-mbox
mbox file parser for Node.js
License: MIT License
var mbox = new Mbox('blackbox.psd');
mbox.on('message', function(msg) {
console.log('got a message', msg);
});
Any file produces exactly 1 message, even if it does not contain the text 'From '. The result is:
msg = 'From ' + contents of blackbox.psd
I'm unable to slow down the "message" event so that it waits for my processing to finish. The examples don't pass a "done()" callback to the "message" event handler for async coordination.
So if my message processing takes a moment, the messages build up and I have to either drop them or hold them in memory. I'm operating on a large 40 GB mbox file.
Please advise
Hi, the current implementation splits the mbox file into emls by looking for lines that start with 'From '. I had emls whose body text contained lines starting with 'From ' that had nothing to do with the mbox separator. That caused bugs: broken messages. The check needs to validate more of that line, for example that it ends with a date. I've attached an mbox file reproducing the issue.
test.mbox.zip
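A stricter separator check could validate the whole "From " line, since a real mbox separator typically looks like "From sender@example.com Thu Jan  1 00:00:00 2015", i.e. it ends in an asctime-style date, while "From " lines inside a message body usually do not. The regex below is a sketch based on the common mbox layout, not part of node-mbox itself:

```javascript
// Sketch: match "From <sender> <asctime date>" separator lines.
// Day-of-week and month as 3 letters, day padded with space or zero,
// hh:mm:ss, an optional timezone offset, then a 4-digit year.
const FROM_LINE = /^From \S+ +\w{3} \w{3} [ \d]\d \d\d:\d\d:\d\d( [+-]\d{4})? \d{4}/;

function isMboxSeparator(line) {
  return FROM_LINE.test(line);
}

console.log(isMboxSeparator('From alice@example.com Thu Jan  1 00:00:00 2015')); // true
console.log(isMboxSeparator('From my point of view, this is wrong'));            // false
```

Strictly speaking, writers should quote body lines starting with "From " as ">From " (the mboxrd convention), but a parser cannot rely on every producer doing that, so validating the date portion is a reasonable defensive measure.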
If a message is bigger than the V8 string length limit (Math.pow(2, 28), ~260 MB), the parser will crash with the following error:
node_modules/node-mbox/src/mbox.js:100
stream.emit('message', 'From ' + chunks.join(''));
^
RangeError: Invalid string length
at Array.join (native)
at SBMH.<anonymous> (node_modules/node-mbox/src/mbox.js:100:49)
at emitMany (events.js:146:13)
at SBMH.emit (events.js:223:7)
at SBMH._sbmh_feed (node_modules/streamsearch/lib/sbmh.js:159:14)
at SBMH.push (node_modules/streamsearch/lib/sbmh.js:56:14)
at MboxStream._transform (node_modules/node-mbox/src/mbox.js:119:17)
at MboxStream.Transform._read (_stream_transform.js:186:10)
at MboxStream.Transform._write (_stream_transform.js:174:12)
at doWrite (_stream_writable.js:387:12)
A potential mitigation strategy:
https://gist.github.com/tpreusse/bc2ccf6fc6bf4ff563cdd336c1728db2/revisions#diff-05028f8355c8d7379cb42b5feac6429f
(throw error by default, skip or emit partial via options)
However, the ultimate solution would be a new API that streams chunks (e.g. to mailparser) as they come in. mailparser can probably handle almost unlimited attachment sizes with its attachment streams.
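The mitigation described above (throw by default, skip or emit partial via options) could look roughly like this. This is a self-contained sketch; makeMessageCollector, the limit value, and the onPartial callback are illustrative assumptions, not existing node-mbox options:

```javascript
// Sketch: cap the buffered size of a single message so that, instead
// of crashing on chunks.join('') with "Invalid string length", the
// collector emits the partial content and drops the remainder.
function makeMessageCollector(limit, { onMessage, onPartial }) {
  let chunks = [];
  let size = 0;
  let overflowed = false;

  return {
    push(chunk) {
      if (overflowed) return;          // already skipping this oversized message
      size += chunk.length;
      chunks.push(chunk);
      if (size > limit) {
        overflowed = true;             // emit what we have instead of crashing
        onPartial(chunks.join(''));
        chunks = [];
      }
    },
    end() {
      if (!overflowed) onMessage(chunks.join(''));
      chunks = [];
      size = 0;
      overflowed = false;
    },
  };
}

// Demo with a tiny limit so the overflow path is easy to see:
const complete = [];
const partial = [];
const collector = makeMessageCollector(10, {
  onMessage: (m) => complete.push(m),
  onPartial: (m) => partial.push(m),
});
collector.push('short');
collector.end();                        // fits: emitted as a complete message
collector.push('this is a very long message body');
collector.end();                        // over the limit: emitted as a partial
```

In a real fix the limit would sit well below V8's string length ceiling (e.g. 256 MB), and the default behaviour would be to raise a descriptive error rather than silently truncate.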
The streaming example given in the documentation doesn't seem to work; it throws a "stream.on is not a function" error. What am I missing here?
const mbox = new Mbox({ streaming : true });
// the 'message' event emits a stream
mbox.on('message', function(stream) {
  stream.on('data', function(chunk) {
    ...
  }).on('end', function() {
    ...
  });
});
process.stdin.pipe(mbox);
I am trying to pipe mbox into mailparser, but it does not work. I'm new to Node.js streams, but it seems that mbox does not support piping its output to another stream. Could you please provide some hints?