tomas / needle
Nimble, streamable HTTP client for Node.js. With proxy, iconv, cookie, deflate & multipart support.
Home Page: https://www.npmjs.com/package/needle
License: MIT License
I used needle to upload a file to my server.
Is there any way to track upload progress?
var data = {
fileUpload: { file: file, content_type: 'application/octet-stream' }
};
needle.post(config.URL+api, data, { multipart: true}, function(err, resp, body) {
...
});
Why is that?
To reproduce:
var needle = require('needle');
var url = 'http://example.com/api/oauth2/token';
needle.post(url, 'username=example&password=example', {json: false}, function(err, response, body) {console.log(err); })
Here's a block of code using the library pgte/nock (I think the version is irrelevant) and needle:
var needle = require("needle");
var nock = require("nock");
nock("http://www.abc.com").get("/abc").reply(200, (new Array(1024 * 1024 + 1)).join("."));
needle.get("http://www.abc.com/abc", function(e, r, b) { console.log("done, %d", b.length); });
The request gets intercepted by nock and a 1 MB response is sent. In needle 0.6.6 this code behaves as expected, printing done, 1048576. In needle 0.7.0 and up, the response never arrives (and node doesn't know to wait, apparently, so the program exits without printing anything).
This could be related to this issue in nock, but it's nevertheless a clear regression in needle.
When specifying that needle should follow redirects, it only responds to status codes 301 and 302, but not to 303 See Other, which is an acceptable redirect status code.
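A minimal sketch of what the extended check might look like (an assumed patch, not the library's current code; per the HTTP spec a 303 response should be followed with a GET, which matches needle's existing redirect behaviour):

```javascript
// Hypothetical patch: include 303 See Other in the list of followable codes.
var REDIRECT_CODES = [301, 302, 303];

function shouldFollow(statusCode, location) {
  // Follow only when the code is a known redirect and a Location header exists.
  return REDIRECT_CODES.indexOf(statusCode) !== -1 && !!location;
}
```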
Hi tomas,
Sending boolean values as part of JSON data turns them into strings on the receiving side.
For example: data = {id:100, fname:'john', status:{isactive:true}}
Here isactive:true becomes isactive:'true' (the string 'true').
Is this expected behaviour or a bug?
In lib/needle.js, lines 110 & 111:
post_data = (typeof(data) === "string") ? data : stringify(data);
config.headers['Content-Type'] = 'application/x-www-form-urlencoded';
If I change it as follows, it works as expected:
post_data = (typeof(data) === "string") ? data : JSON.stringify(data);
config.headers['Content-Type'] = 'application/json';
If it's a bug, can you please fix it?
There may be a bug when using needle's digest authorization: it doesn't work for RFC 2069, it only supports RFC 2617.
auth.js line 46 should be:
if (typeof challenge.qop === 'string') {
cnonce = md5(Math.random().toString(36)).substr(0, 8);
nc = digest.update_nc(nc);
resp = resp.concat(nc, cnonce);
resp = resp.concat(challenge.qop, ha2);
} else {
resp = resp.concat(ha2);
}
and
var ha1 = md5(user + ':' + challenge.realm + ':' + pass), ha2 = md5(method
+ ':' + path), resp = [ ha1, challenge.nonce ];
These values can't be passed in as function parameters; they should be taken from the response headers.
Hi there,
Something I noticed is the callback's third argument, which is a shortcut to response.body. The node.js community has pretty much settled on the 2-argument callback, which in turn spawned a whole ecosystem of utilities that make this assumption (e.g. node-async and Q).
What is the rationale behind this? And would you mind it if I made a patch that detects the arity of the callback and behave in a standard-compliant way again if the caller requests so?
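A sketch of what such a patch might do (hypothetical, not needle code; it relies on Function.prototype.length reporting the number of declared parameters):

```javascript
// Dispatch based on the callback's declared arity.
function invoke(callback, err, resp, body) {
  if (callback.length === 2) {
    return callback(err, resp);       // standard Node-style (err, result)
  }
  return callback(err, resp, body);   // needle's current (err, resp, body)
}
```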
BTW I must be overwhelming you a bit with all these requests but I think there really is a place in the node ecosystem for this library, since it is one of the few libraries that actually is pleasant to use. :)
The 'delete' API call is documented as 'needle.delete(url, [options], callback);' while it should be 'needle.delete(url, data, [options], callback);'
Love the simplicity of needle, but it looks like needle doesn't really support 0.4.x. Since I'm hosting on heroku, I'm stuck with node 0.4.7.
The main issues are:
I'll fork and submit a pull request for the latter two, but I'm not sure about the best course of action to fix the first issue.
I'm using needle.post() and passing in a callback function. The callback function is getting called twice: the first time the body is a JavaScript object (as expected), and the second time the body is a JSON string. I'm using the latest version of needle.
Some servers do not declare a charset in the HTTP headers; it's better to also sniff it from the HTML/XML <meta> tag.
I've done so using the charset module:
var encoding = 'utf8'; // defaults to 'utf8'
if (opts.encoding) {
encoding = opts.encoding;
console.info("Using encoding from options: " + encoding);
}
else {
// detect charset from header or body (the first 1Kb)
var cs = charset(res.headers, body.toString('utf8', 0, 1024));
if (cs) {
encoding = cs;
console.info("Detected encoding: " + encoding);
}
}
var str = iconv.decode(body, encoding);
For example, one can add a filename containing a " or Unicode characters, resulting in an invalid request.
Not 100% sure, but when I was using this against the Moodstocks API, the digest auth was failing. I am pretty sure it was because the first authorization response sets nc=2 instead of 1. I simply set nc to 0 in the digest.generate function and it all worked fine for me.
Just been looking at a request made from needle 0.5.9. It looks like the header's host value is missing the port, any chance this could be added?
Looks like the offending line may be lib/needle.js line 172:
opts.headers['Host'] = proxy ? url.parse(uri).hostname : remote.hostname;
Edit: Adding a thanks in advance/just for looking :)
ISO-8859-1 was the default charset for HTML4, but it's now UTF-8 for HTML5. I think it's a better idea to set the default charset to UTF-8. Besides, when detecting the charset in parse_content_type(), the charset is not always the second parameter, so it's better to match against the whole header.
parse_content_type: function(header) {
if (!header || header == '') return {};
var charset = 'utf-8';
var arr = header.split(';');
try { charset = header.match(/charset=([^;]+)/)[1] } catch (e) { /* not found */ }
return { type: arr[0], charset: charset };
},
Use of readFileSync is a very bad idea, as it will block the whole process. Instead, the client should accept a Stream (and internally can createReadStream from any passed file paths). Ideally this stream would be piped straight into the request object, but even buffering it internally would be preferable to the current blocking behaviour.
Correct me if I'm wrong, but it appears as if the resp.body is only unzipped if you use the callback approach and not when you're using the streaming approach:
This is where the data is pushed on the stream: https://github.com/tomas/needle/blob/master/lib/needle.js#L252
And this is where the compressed data is decoded:
https://github.com/tomas/needle/blob/master/lib/needle.js#L274
Is this conclusion correct? Is there a reason that this approach is chosen (a buffering-approach instead of a streams-approach)?
Anyway, I would like to make an attempt at patching this tomorrow.
What I was thinking: getting the stream to decompress would then be a matter of piping config.out through zlib.
Allow the filename= of a part to be set to something other than the file's basename.
Hello. I am trying to make 10 simultaneous POST requests to get some data from ElasticSearch:
arrayWithFiveElements.forEach(function (country) {
needle.post('http://localhost:9200/' + country + '/test_document1/_search?size=1000',
'{"query": {"match_all": {}}}', function (err, response, body) {
populateList(err, testData, body);
});
needle.post('http://localhost:9200/' + country + '/test_document2/_search?size=1000',
'{"query": {"match_all": {}}}', function (err, response, body) {
populateList(err, testData, body);
});
});
and on the 9th and 10th request I always get this error:
Trace: [ERROR] No hits returned by Elastic Search for D1
at populateMonthList (/opt/ddd/backend/backend.js:48:21)
at /opt/ddd/backend/backend.js:74:17
at error_stop (/opt/ddd/backend/node_modules/needle/lib/needle.js:206:9)
at ClientRequest.<anonymous> (/opt/delivery-dashboard/backend/node_modules/needle/lib/needle.js:342:7)
at ClientRequest.EventEmitter.emit (events.js:95:17)
at Socket.socketErrorListener (http.js:1547:9)
at Socket.EventEmitter.emit (events.js:95:17)
at onwriteError (_stream_writable.js:239:10)
at onwrite (_stream_writable.js:257:5)
at WritableState.onwrite (_stream_writable.js:97:5)
{
"code": "EPIPE",
"errno": "EPIPE",
"syscall": "write"
}
Trace: [ERROR] No hits returned by Elastic Search for D2
at populateRWAMonthList (/opt/ddd/backend/backend.js:59:21)
at error_stop (/opt/ddd/backend/node_modules/needle/lib/needle.js:206:9)
at ClientRequest.<anonymous> (/opt/ddd/backend/node_modules/needle/lib/needle.js:342:7)
at ClientRequest.EventEmitter.emit (events.js:95:17)
at Socket.socketErrorListener (http.js:1547:9)
at Socket.EventEmitter.emit (events.js:95:17)
at onwriteError (_stream_writable.js:239:10)
at onwrite (_stream_writable.js:257:5)
at WritableState.onwrite (_stream_writable.js:97:5)
at fireErrorCallbacks (net.js:437:13)
{
"code": "EPIPE",
"errno": "EPIPE",
"syscall": "write"
}
The other 8 requests are totally fine.
Do you happen to know the solution to this issue?
I have this URL: http://panoramaproductions.electionemail.com/lt.php?nl=26&c=86&m=210&s=463b49e18dfd5a13da71ff10dc210d70&l=open
The callback gets fired first with an error from here:
request.on('error', function(err) {
debug('Request error', err);
if (timer) clearTimeout(timer);
error_stop(err || new Error('Unknown error when making request.'));
});
and the error is { [Error: Parse Error] bytesParsed: 311, code: 'HPE_INVALID_CONSTANT' }
But it will also be fired, with no error, from here:
// Only aggregate the full body if a callback was requested.
if (callback) {
resp.body = [];
resp.bytes = 0;
// Count the amount of (raw) bytes passed using a PassThrough stream.
[...]
callback(null, resp, resp.body);
});
}
It may not seem like a real issue, but this bug makes needle problematic when used with async, because it triggers async's error:
if (called) throw new Error("Callback was already called.");
needle : 0.7.2
node : v0.10.19
OS: ubuntu x64 with kernel 3.8.0-29-generic
Needle seems to send the wrong content-length if using umlauts in POSTs.
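A likely cause worth checking (a guess, not confirmed from needle's source): Content-Length must count bytes, and with multi-byte UTF-8 characters such as umlauts the byte count differs from the character count:

```javascript
var body = 'Grüße';

console.log(body.length);                     // 5 characters
console.log(Buffer.byteLength(body, 'utf8')); // 7 bytes -> the correct Content-Length
```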
I have been using needle lately and have come upon a dilemma: is it possible to both cap the download length and avoid having to deal with stream chunks myself?
From the source code it seems that passing a callback will cause buffering, but we have no way of knowing how much data has been buffered so far (and thus no way to stop it). On the other hand, if we handle the readable stream ourselves, I believe the parser and decoder won't kick in (they need the full buffer anyway).
Any suggestions?
PS: many thanks for this package. I come from a curl background, so understandably I took some features for granted that aren't here.
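One possible workaround, sketched below (an assumed approach; `maxBytes` is a hypothetical limit, not a needle option): consume the stream yourself, count bytes as they arrive, and bail out once the cap is hit:

```javascript
function downloadCapped(source, maxBytes, done) {
  var chunks = [];
  var bytes = 0;
  var finished = false;

  source.on('data', function(chunk) {
    if (finished) return;
    bytes += chunk.length;
    if (bytes > maxBytes) {
      finished = true;
      if (source.destroy) source.destroy();  // stop pulling data
      return done(new Error('Response exceeded ' + maxBytes + ' bytes'));
    }
    chunks.push(chunk);
  });

  source.on('end', function() {
    if (!finished) done(null, Buffer.concat(chunks));
  });
}
```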
Needle still parses JSON and XML data even when no callback is provided. This causes the stream to be in object mode.
Is this expected behavior? In my mind, parsing shouldn't happen in streaming mode. (But obviously this can trivially be disabled by passing in parsed: false.)
getCall(0) still works, though.
sinon.stub(needle, 'post');
needle.post(...);
needle.post.firstCall // undefined
needle.post.getCall(0) // Object
Without a reverse proxy it works fine.
If you specify a reverse proxy whose port number doesn't match the port in the request URL, the port number in the Host header of the GET request is incorrect.
Example:
url request is http://testserver.testdomain.com/image.jpg
proxy = http://proxyserver.testdomain.com:8080
GET http://testserver.testdomain.com/image.jpg
Header:
Host: http://testserver.testdomain.com:8080
Needle.js
178 remote = proxy ? url.parse(proxy) : url.parse(uri);
182 opts.port = remote.port || (remote.protocol == 'https:' ? 443 : 80);
187 if (!opts.headers['Host']) {
188 opts.headers['Host'] = proxy ? url.parse(uri).hostname : remote.hostname;
189 if (opts.port != 80 && opts.port != 443)
190 opts.headers['Host'] += ':' + opts.port;
191 }
I wrote this into my code a little while back and I thought it worked at one point. Did a recent update kill this property?
When I'm using the 'debug' module, which requires setting the DEBUG environment variable, the 'needle' module outputs many debugging messages of its own, so I can't find the debugging messages I want.
Sorry, my English is not good, so I don't know if you can understand.
I am trying to work via a proxy, but I got this exception.
How can I solve it?
Error: Encoding not recognized: '"UTF-8"' (searched as: '"utf8"')
at Object.module.exports.getCodec (/home/nodeuser/node_modules/needle/node_modules/iconv-lite/index.js:36:23)
at Object.module.exports.fromEncoding (/home/nodeuser/node_modules/needle/node_modules/iconv-lite/index.js:7:22)
at Object.Needle.response_end (/home/nodeuser/node_modules/needle/lib/needle.js:240:38)
at IncomingMessage. (/home/nodeuser/node_modules/needle/lib/needle.js:187:16)
at IncomingMessage.EventEmitter.emit (events.js:117:20)
at _stream_readable.js:870:14
at process._tickCallback (node.js:415:13)
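The header apparently arrived as `charset="UTF-8"` (quotes included), and the encoding lookup then fails on the literal quoted string. A sketch of normalizing the value before handing it to iconv-lite (an assumed fix):

```javascript
function cleanCharset(value) {
  // Strip surrounding quotes and whitespace so '"UTF-8"' becomes 'utf-8'.
  return value.replace(/['"]/g, '').trim().toLowerCase();
}

console.log(cleanCharset('"UTF-8"')); // 'utf-8'
```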
We are running into issues where some parts of the response body are garbled upon decoding, and more annoyingly it does not always happen to the same page, not even to the same text. So it's difficult for us to determine whether this is a problem in needle or in iconv-lite.
For example: http://www.huanqiukexue.com/html/newgc/2014/1215/25011.html is a page with charset gb2312, and upon multiple fetch we see different result on decode.
(the text between the question-mark symbols is garbled; hopefully it shows up the same on all OSes)
attempt 1:
2005年,美国加利福尼亚大学河滨分校的心理学家索尼娅•柳�┟锥�斯基(Sonja Lyubomirsky)和她的同事一起,对多项研究的结论进行了综述,这些研究显示幸福与成功呈正比。
attempt 2:
2005年,美国加利福尼亚大学河滨分校的心理学家索尼娅•柳博米尔斯基(Sonja Lyubomirsky)和她的同事一起,对多项研究的结论进行了综述,这些研究显示幸福与成功呈正比。
There are also instances where both attempts result in the same garbled text.
attempt 1:
那些能够产生“心流”的任务不能太单调沉闷,�荒苋萌瞬�生挫败感;它还要具有充分的挑战性,要求一个人全神贯注。
attempt 2:
那些能够产生“心流”的任务不能太单调沉闷,�荒苋萌瞬�生挫败感;它还要具有充分的挑战性,要求一个人全神贯注。
and we use needle similar to this:
stream.on('data', function(chunk) {
if (chunk === null) {
return;
}
raw.push(chunk);
});
stream.on('end', function(err) {
body = Buffer.concat(raw).toString();
});
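A plausible explanation (not confirmed): if each chunk is decoded separately, any multi-byte character that straddles a chunk boundary gets corrupted, and since chunk boundaries differ between downloads, the garbled spot moves around. Demonstrated below with UTF-8; the same applies to gb2312 decoded via iconv-lite:

```javascript
var whole = Buffer.from('心流');   // two characters, 3 bytes each in UTF-8
var a = whole.slice(0, 4);         // boundary falls in the middle of '流'
var b = whole.slice(4);

var perChunk = a.toString() + b.toString();          // replacement characters appear
var concatFirst = Buffer.concat([a, b]).toString();  // decoded correctly

console.log(perChunk === '心流');    // false
console.log(concatFirst === '心流'); // true
```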
Hi, I'm trying to use this module to fetch a URL, but I get wrong output.
For example with www.baidu.com (Chinese), the Chinese text is all unreadably encoded.
needle.defaults({...}) does not set a property unless it's already defined in the initial defaults object.
I cannot set anything other than the 9 properties in the initial defaults object, because the defaults() function ignores all other properties.
Why...?
The ability to pass the value of agent to include in the http.request options would be welcomed.
When passing an undefined URL to needle, it crashes on:
needle/lib/needle.js:79
if (uri.indexOf('http') == -1) uri = 'http://' + uri;
^
TypeError: Cannot call method 'indexOf' of undefined
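A sketch of a friendlier guard before that line (an assumed patch; `normalizeUri` is a hypothetical name):

```javascript
function normalizeUri(uri) {
  if (typeof uri !== 'string') {
    // Fail with a clear message instead of a TypeError deep inside the library.
    throw new TypeError('URL must be a string, got ' + typeof uri);
  }
  return uri.indexOf('http') === -1 ? 'http://' + uri : uri;
}

console.log(normalizeUri('example.com')); // 'http://example.com'
```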
I sent a POST request using needle like this:
needle.post(SERVICE_SOCKET + '/verify',
{
message: {
value1: 1.2456,
value2: -100,
value3: 100,
}
},
{},
function (err, resp) {
done(err);
});
The receiver then got an object like this:
{
value1: '1.2456',
value2: '-100',
value3: '100'
}
But I was expecting the following (as I get it using restify):
{
value1: 1.2456,
value2: -100,
value3: 100
}
As we are using both needle and restify clients I have to handle both data formats which is quite dissatisfying. Am I missing an option or is this a bug?
I specify the content type header in the options object as shown below.
var options = {
compressed : true,
headers : {
'Content-Type' : 'application/soap+xml"'
}
}
I then make a POST request with a string as the data argument. Needle overrides my content type with application/x-www-form-urlencoded.
I can get around this with the following hack at line 114 of needle.js
if ( config.headers['Content-Type'] === undefined ) {
config.headers['Content-Type'] = content_type;
}
I am using the needle API to make a POST request to a REST API and it always returns a 401 error, whereas if I use the same credentials to make a GET request it is successful. I am not sure what I am doing wrong, or whether it is a problem with the needle API. Below is my code:
var needle = require('needle');
var async = require('async');
var credentials = {
username: "myusername",
password:"mypassword"
}
var data = {
name:"Test Repo",
}
needle.post('https://bitbucket.org/api/1.0/repositories/accountName/repoName/issues/10/comments',credentials,data,function(err,resp,body){
console.log(resp.statusCode) //always returns 401 error.
});
needle.get('https://bitbucket.org/api/1.0/repositories/accountName/repoName/issues/10/comments',credentials,function(err,resp,body){
console.log(resp.statusCode) // always returns 200.
});
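A likely cause (a guess from the code shown): needle.post takes `(url, data, options, callback)`, so here the credentials object is being sent as the request body and the real body lands in the options slot. A minimal mock with the same argument order illustrates the mix-up (`post` below is a stand-in, not the real library):

```javascript
// Stand-in with needle.post's argument order: (url, data, options, callback).
function post(url, data, options) {
  return {
    body: data,
    authenticated: !!(options && options.username && options.password)
  };
}

var creds = { username: 'myusername', password: 'mypassword' };
var data = { name: 'Test Repo' };

// As written in the issue: credentials become the body, so auth never happens.
console.log(post('https://bitbucket.org/...', creds, data).authenticated); // false
// Corrected order: data is the body, credentials ride in options.
console.log(post('https://bitbucket.org/...', data, creds).authenticated); // true
```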
Needle seems to require stream.Transform which is new to Node v0.10.
Using Node 0.8.26:
util.js:538
ctor.prototype = Object.create(superCtor.prototype, {
^
TypeError: Cannot read property 'prototype' of undefined
at exports.inherits (util.js:538:43)
at Object.<anonymous> (/path/node_modules/needle/lib/decoder.js:5:1)
at Module._compile (module.js:449:26)
at Object.Module._extensions..js (module.js:467:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Module.require (module.js:362:17)
at require (module.js:378:17)
at Object.<anonymous> (/path/node_modules/needle/lib/needle.js:17:19)
at Module._compile (module.js:449:26)
Hello...
I am Heepan Kim.
First of all, thank you for sharing your code.
Original source code, line 208:
// if redirect code is found, send a GET request to that location if enabled via 'follow' option
if ([301, 302].indexOf(resp.statusCode) != -1 && headers.location) {
if (count <= config.follow)
return self.send_request(++count, 'GET', url.resolve(uri, headers.location), config, null, callback);
else if (config.follow > 0)
return callback(new Error('Max redirects reached. Possible loop in: ' + headers.location));
}
This code does not work with cookie-based servers, so I suggest the following change:
// if redirect code is found, send a GET request to that location if enabled via 'follow' option
if ([301, 302].indexOf(resp.statusCode) != -1 && headers.location) {
if (count <= config.follow) {
// --------------------------------------------------
// must set cookie
var cookies = headers['set-cookie'];
if ( cookies ) {
config.base_opts.headers.Cookie = cookies[0];
}
// --------------------------------------------------
return self.send_request(++count, 'GET', url.resolve(uri, headers.location), config, null, callback);
}
else if (config.follow > 0)
return callback(new Error('Max redirects reached. Possible loop in: ' + headers.location));
}
When I run needle with the proxy option, I can only get needle to work when setting the proxy to http and not https.
Sample options are shown below.
rpurl = "https://endpointurl.com";
options = {
auth: 'auto',
proxy: 'http://sampleproxy.com:8080',
multipart: true
};
In this example, needle sends http protocol to the proxy even though the endpoint url is requiring https. So the proxy gets http and is required to translate to https. This is causing us some problems in negotiating cipher suites. We have a work around right now but would prefer to be able to run end to end https from needle to the proxy to the endpoint url.
When I try "proxy: 'https://sampleproxy.com:8080'", I receive the following error.
Error:
[Error: 140735290954512:error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol:../deps/openssl/openssl/ssl/s23_clnt.c:766:
Note that when running curl behind a proxy, curl initiates the https session with the proxy which simply passes the session out to the endpoint url. Curl appears to use the /etc/environment file proxy settings.
I would like to forward requests for "https://google.at" to "https://localhost:4500".
For example, the Facebook API has data-free POST endpoints.
Hello,
I am trying to download and save a binary file (an image).
needle.get(url, { output: path }, function (err, resp, body) {
});
But I have got the error:
{ [Error: ENOENT, open '$PATH_TO_FILE$FILENAME']
errno: 34,
code: 'ENOENT',
path: '$PATH_TO_FILE$FILENAME' }
Could you suggest why I am getting this?
P.S. I have existing directories on the file path.
By the way, is there a possibility to enable creating missing directories recursively?
resulting in servers parsing incorrect data.
When setting data=null on a needle.delete, nginx returns a 411 error (Content-Length not set). All works fine if you set data=' ' (note the space inside the quotes).
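A sketch of why the space works and what a fix might look like (a guess, not confirmed from needle's source): with a null body no Content-Length is written, and nginx rejects length-less DELETEs with 411, so the header could be set to 0 explicitly:

```javascript
function contentLength(data) {
  if (data == null) return 0; // send 'Content-Length: 0' rather than omitting the header
  return Buffer.byteLength(String(data), 'utf8');
}

console.log(contentLength(null)); // 0
console.log(contentLength(' '));  // 1 (why the one-space workaround passes)
```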
When I send a POST request I'm getting back a Buffer in the request body. How do I handle this?
{ ...
resume: [Function],
read: [Function],
bytes: 0,
body: < Buffer > }
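When the body arrives as a Buffer (for example when needle can't infer a text encoding, or parsing is disabled), you can convert or parse it yourself:

```javascript
// Stand-in for the buffer you'd find in resp.body.
var body = Buffer.from('{"ok":true}');

var text = body.toString('utf8'); // '{"ok":true}'
var parsed = JSON.parse(text);

console.log(parsed.ok); // true
```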
I was using needle in my app and CI was reporting Error: global leaks detected: self, request_opts, protocol. On drilling down, it appeared to originate from the needle module. I switched to request and it's fine now.
If the .post() method is called with an empty data object and the multipart option set to true, then in the callback function the err param will be null and resp will be [Error: Empty multipart body. Invalid data.]
Example:
needle.post('http://localhost', {}, { multipart: true }, function(err, resp, body) {
console.log(err); // null
console.log(resp); // [Error: Empty multipart body. Invalid data.]
});
but it should be, as it is required to send JSON objects in the request.
I use
needle.get('some url'
,{"timeout":20000,"headers" : {"Depth" : "1", "Content-Type": 'text/xml; charset="utf-8"'}}
,function(err,res){
....do something....
}).pipe(out);
When downloading a large file and the network is interrupted, the callback is not fired, and the timeout does not work either.