devfans / digest-fetch
digest auth request plugin for fetch/node-fetch
License: MIT License
So apparently in the package.json file, the order of the entries in exports matters: the default key must always be last. This is causing build failures for me, so can we simply swap the order of types and default?
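For reference, a sketch of an exports map with the conditions in a valid order; the file paths below are placeholders, not the package's actual layout:

```json
{
  "exports": {
    ".": {
      "types": "./types/index.d.ts",
      "require": "./digest-fetch-src.js",
      "default": "./digest-fetch-src.js"
    }
  }
}
```

Node resolves export conditions in the order they appear, which is why default has to come after the more specific conditions.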
DigestClient.prototype.fetch currently expects its arguments in the form (url: string, init: RequestInit), but the fetch API also supports arguments in the form (request: RequestInfo, init: undefined). This means that when passing a proxied request to a client's fetch implementation, all of the original request's headers are lost.
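A sketch (not the library's actual API) of normalizing fetch-style arguments so headers on a Request-like object survive the (url, init) call convention:

```javascript
// Sketch: accept both (url, init) and (request, init) shapes and merge the
// request's headers into init so they are not silently dropped.
function normalizeFetchArgs(input, init) {
  if (typeof input === 'string') return { url: input, init: init || {} };
  // Assume a Request-like object exposing .url and an iterable .headers
  const carried = {};
  for (const [name, value] of input.headers) carried[name] = value;
  return {
    url: input.url,
    init: { ...init, headers: { ...carried, ...(init && init.headers) } },
  };
}
```

With something like this in front of the client, explicit init headers still win, but the proxied request's headers are carried along.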
When using {logger: console} I get:
{"headers": {"Authorization": "Digest username=\"xxx\",realm=\"xxx\",nonce=\"xxx\",uri=\"/xxx\",opaque=\"xxx\",qop=\"auth\",algorithm=\"MD5\",response=\"xxx\",nc=00000001,cnonce=\"xxx\""}}
According to the specification, the values for qop and algorithm must NOT be quoted:
https://www.rfc-editor.org/rfc/rfc7616#section-3.4
Version 1.3.0
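The required behavior can be sketched as a serializer that emits qop, algorithm, and nc as unquoted tokens (an illustrative sketch, not the library's actual code):

```javascript
// Illustrative sketch: per RFC 7616 section 3.4, qop, algorithm, and nc are
// emitted as unquoted tokens, while realm, nonce, uri, response, etc. remain
// quoted strings.
function serializeDigest(fields) {
  const unquoted = new Set(['qop', 'algorithm', 'nc']);
  return 'Digest ' + Object.entries(fields)
    .map(([key, value]) => (unquoted.has(key) ? `${key}=${value}` : `${key}="${value}"`))
    .join(',');
}
```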
I'm trying to talk to my Ultimaker 3 using its API, which requires digest authentication. Their WWW-Authenticate header:
Digest qop=auth, realm="Jedi-API", nonce="aadab65eef44b3a2f6e3744de03fb88a"
Example:
const DigestFetch = require("digest-fetch");
const digestOptions = {
cnonceSize: 32, // length of cnonce, default: 32
logger: console, // logger for debug, default: none
algorithm: "MD5", // algorithm to be used, 'MD5' or 'MD5-sess'
};
// ... Retrieving username and password...
const client = new DigestFetch(username, password, digestOptions);
const apiURL = `http://192.168.2.1/api/v1`;
client.fetch(`${apiURL}/auth/check/${token.id}`);
This logs:
requesting with auth carried
{ headers:
{ Authorization:
'Digest username="fd236e26a0d38d70696f3fd10cd1976f",realm="Jedi-API",nonce="93198b509a9023be76f3ceefdc12b7ac",uri="/api/v1/auth/verify",algorithm="MD5",response="796df59e0a932e9a38fb4282a02ac820",nc=00000002,cnonce="6c932f77981d7580f98d494e23235711"' } }
But I'm always getting this response:
{ message: 'Auth qop response required.' }
Is there any way to include this qop?
I also tried using request-promise, which worked out of the box:
const response = await rp
.get(`${apiURL}/auth/verify`)
.auth(username, password, false);
The docs on the Ultimaker API are only available on the printer itself, but there is some info here:
https://community.ultimaker.com/topic/15574-inside-the-ultimaker-3-day-2-remote-access-part-1/
Please let me know if I can provide more information.
I'm trying to use it in a serverless project with node/typescript.
I'm using:
const DigestFetch = require('digest-fetch');
const client = new DigestFetch(process.env.recording_user, process.env.recording_password);
client.fetch(url, options)
  .then(resp => resp.json())
  .then(data => {
    console.log(data);
  })
  .catch(e => console.error(e));
and it fails with
TypeError: fetch is not a function at DigestClient.fetch (\.webpac:\node_modules\digest-fetch\digest-fetch-src.js:43:1)
I don't know if this is a bug or if I'm misunderstanding how to use it.
I'm using this in a small typescript project with node V12 and node fetch 2.5.7, and it failed to work for me.
Turns out that it was doing something like this:
// In digest-fetch:
addAuth(){
.... // Lots of complex stuff
options.headers.Authorization = xxx;
}
// Later in node-fetch, request constructor:
new Headers(init.headers) // where init is the options as recreated in digest-fetch
The problem here is that options.headers is of type Headers, and the Headers constructor, when given another Headers instance, will only copy over what's in the internal headers map, not additional properties assigned onto the object. In short, the generated Authorization header never got applied to the request. I got this to work again with this little tweak:
const _addAuth = digestClient.addAuth;
digestClient.addAuth = (url, options) => {
const transformedOptions = _addAuth.call(digestClient, url, options);
if (transformedOptions.headers.Authorization) {
transformedOptions.headers.set("Authorization", transformedOptions.headers.Authorization);
}
return transformedOptions;
};
I'm not much of a hero with JS but I can probably issue a PR over the weekend that tests and fixes this.
If the server challenge includes qop="auth, auth-int", digest-fetch just copies this into the qop part of the Authorization header. This is wrong (it should be the qop actually used, not the list of possibilities).
I just hard-wired it in addAuth
const qop = "auth"; // only one we support
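A slightly more general sketch of the same idea (a hypothetical helper, not part of digest-fetch): pick one supported qop from the challenge's comma-separated list:

```javascript
// Hypothetical helper: the server's challenge qop may be a comma-separated
// list like 'auth, auth-int'; choose a single value we actually support.
function pickQop(challengeQop, supported = ['auth']) {
  if (!challengeQop) return null;
  const offered = challengeQop
    .split(',')
    .map((s) => s.trim().replace(/^"|"$/g, ''));
  return supported.find((q) => offered.includes(q)) || null;
}
```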
I was looking for a library to do the heavy lifting of a SHA-256 digest authentication using node-fetch. This looked promising but it seems it does not support the SHA-256 algorithm. Any chance this will be added in the (near) future?
Sorry for asking a question by opening an issue, but I didn't know where else to ask.
I'd like to know whether, once we make the first request (which will in fact be two requests, since the first one returns the realm/nonce that is then used to hash the username/password), subsequent requests to the same domain will be able to reuse the authentication, or whether every request will in fact be two requests underneath.
That is of course provided we reuse the client
instance.
Thanks
Is it possible to retrieve the Authorization header that gets created after providing params like username, password, algorithm, etc.?
As a result, even npm install --production will end up installing what are declared as devDependencies (e.g. babel).
I'm accessing a 3rd-party API that uses digest auth. When calling it through Postman I am able to get the results I expect. Looking at the response logged, it looks like I'm getting the response back after the challenge; however, I don't actually get the body back. Not sure if there is something I'm doing wrong or if it's a bug. Below is my code:
const digestOptions = {
  logger: console,
  algorithm: 'MD5-sess',
  statusCode: 401,
  basic: false
}
const client = new DigestFetch(config.erpAuth.userid, config.erpAuth.password, digestOptions);
try {
  var response = await client.fetch(config.erpURLS.batches);
  var headers = response.headers;
  var body = response.body;
  ctx.status = 200;
  ctx.body = JSON.stringify(body);
} catch (err) {
  ctx.status = 500;
  ctx.body = err;
}
sopserver2 | requesting with auth carried
sopserver2 | {
sopserver2 | headers: {
sopserver2 | Authorization: Digest username="[email protected]",realm="Digest",nonce="+Upgraded+v1b29877ce5f60a4419254904708e6706b291017ff7a84d50178c2a873142c876250f53122b1d3f99958b573d7e15f16ef7950834e6f910bf0",uri="/ST-107565/ODataV4/Company('HTKA%202019-5-1')/Job_Card?tenant=T1002653.dynamicstocloud.com",qop="auth",algorithm="MD5-sess",response="c2d1eb69a34a2794ed3f5c049a84bb75",nc=00000001,cnonce="ed4c74e48c9d7c86cfab2e75997acd47"
sopserver2 | }
sopserver2 | }
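Incidentally, with node-fetch the response body is a stream; a sketch of consuming it before serializing (assuming a node-fetch style Response, which is an assumption here):

```javascript
// A minimal sketch, assuming a node-fetch style Response: the body is a
// stream, so it must be consumed via .text()/.json() before it can be
// serialized; JSON.stringify(response.body) will not yield the payload.
async function readBody(response) {
  const text = await response.text(); // buffers the entire body stream
  try {
    return JSON.parse(text);
  } catch {
    return text; // not JSON; return the raw text
  }
}
```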
Any plans to add typescript support?
TypeError: fetch is not a function
at DigestClient.fetch (https://bwvouy.csb.app/node_modules/digest-fetch/digest-fetch-src.js:48:24)
at onSubmit (https://bwvouy.csb.app/src/App.js:58:23)
at eval (https://bwvouy.csb.app/node_modules/formik/dist/formik.esm.js:957:12)
at eval (https://bwvouy.csb.app/node_modules/formik/dist/formik.esm.js:1195:24)
at eval (https://bwvouy.csb.app/node_modules/formik/dist/formik.esm.js:891:32)
From the node-fetch readme: node-fetch from v3 is an ESM-only module; you are not able to import it with require().
Which basically means I can't use node-fetch@v3 and digest-fetch at the same time.
Workaround:
npm i node-fetch@cjs node-fetchv3@npm:node-fetch@latest
Then digest-fetch uses the old v2 CommonJS API, and within your own code you can import node-fetch@v3 via:
import fetch from 'node-fetchv3'
Hi, I have been working with a Hikvision camera to send PTZ commands through Postman, and I have successfully tested it in Postman using digest auth as shown below. But after following the instructions from this link and implementing it in the system, I got an unauthorized error.
If you want to use the package in a TypeScript project without running into trouble, you should import it as follows:
import DigestClient = require('digest-fetch');
When a server responds with a redirect, the underlying fetch API appears to eagerly fetch the redirect (does not allow manual handling) and doesn't give a chance to digest-fetch to update the nc and cnonce in the Authorization header. I think this is because the fetch API re-uses the headers, namely the Authorization header, as a constant string. That same header is used for the follow-up GET request on the 303 redirection. The server barks that the nc has not been increased.
As https://en.wikipedia.org/wiki/Digest_access_authentication says:
[...] the client may make another request, reusing the server nonce value (the server only issues a new nonce for each "401" response) but providing a new client nonce (cnonce). For subsequent requests, the hexadecimal request counter (nc) must be greater than the last value it used [...]
I consider this a bug in digest-fetch but worse in the Fetch API (of Chrome and Firefox anyway).
I have tried not to provide a String header, but an object with a toString method, hoping that this would be called every time a request is made. So, instead of:
options.headers.Authorization = digest;
I did:
options.headers.Authorization = {
toString: function() {
alert(new Error("Gotcha! " + digest));
return digest;
}
}
Yet sadly, that gets called only once. I suspect the core fetch implementation stores the header as a string, so the toString() method is never called again.
Is there any workaround for this problem?
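One possible workaround sketch, assuming a node-fetch based client where redirect: 'manual' is honored (browser fetch may not allow this): follow redirects yourself, so the digest client can regenerate nc/cnonce on each hop. The helper below is hypothetical:

```javascript
// Hypothetical workaround: disable automatic redirect following so each hop
// goes back through the digest client, which can then update nc/cnonce.
async function fetchFollowingRedirects(client, url, options = {}, maxHops = 5) {
  let current = url;
  for (let hop = 0; hop < maxHops; hop++) {
    const resp = await client.fetch(current, { ...options, redirect: 'manual' });
    if (resp.status < 300 || resp.status >= 400) return resp;
    // Resolve the Location header relative to the current URL.
    current = new URL(resp.headers.get('location'), current).toString();
    // A 303 turns the follow-up request into a GET without a body.
    options = { ...options, method: 'GET', body: undefined };
  }
  throw new Error('Too many redirects');
}
```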
The line is here.
A fix could be to remove them everywhere except for realm.
[email protected]
[email protected]
[email protected]
[email protected]
[email protected]
[email protected]
@babel/[email protected]
@babel/[email protected]
[email protected]
In my use case I'm using a digest-fetch client to make requests to the MongoDB Atlas API from a Lambda function via the serverless framework. Testing locally, everything works fine, and node-fetch is listed as a dependency in my package.json.
// webpack.config.js
const path = require('path');
const slsw = require('serverless-webpack');
const nodeExternals = require('webpack-node-externals');

const { NODE_ENV = 'local' } = process.env;
const isLocal = NODE_ENV === 'local';

module.exports = {
  entry: slsw.lib.entries,
  target: 'node',
  mode: isLocal ? 'development' : 'production',
  optimization: {
    minimize: false,
  },
  performance: {
    hints: false,
  },
  externals: [nodeExternals()],
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: [
          {
            loader: 'babel-loader',
          },
        ],
      },
    ],
  },
  output: {
    libraryTarget: 'commonjs2',
    path: path.join(__dirname, '.webpack'),
    filename: '[name].js',
    sourceMapFilename: '[file].map',
  },
};
Using webpack to handle the dependency tree, it seems that node-fetch is not included in the build. When executed, I see the following error: Error: Cannot find module 'node-fetch'\nRequire stack:\n- /var/task/node_modules/digest-fetch/digest-fetch-src.js
I believe webpack isn't packaging node-fetch for me because it's not listed as a production dependency of digest-fetch.
Was node-fetch left as a dev dependency for a reason? If so, is there a recommended, generic approach for specifying this sub-dependency for compiler tools? I'm opening this issue because if node-fetch is required for digest-fetch to work, then it should be listed as a production dependency.
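A minimal sketch of the consumer-side workaround: declare node-fetch as a direct dependency in your own package.json so bundlers pick it up (the version ranges below are only examples, not the package's actual requirements):

```json
{
  "dependencies": {
    "digest-fetch": "^1.1.0",
    "node-fetch": "^2.6.0"
  }
}
```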
With the following code:
const DigestFetch = require('digest-fetch')
const fs = require('fs')
const user = 'xxx'
const pass = 'xxx'
const uri = 'xxx'
const client = new DigestFetch(user, pass)
const options = {}
async function doFetch() {
client.fetch(uri, options)
.then(async resp => {
console.log('resp.body :>> ', resp.body);
const filename = "picture.jpg"
const fileContents = Buffer.from(resp.body._readableState.buffer.head.data, 'base64')
console.log('fileContents :>> ', fileContents); // outputs a buffer that's about 1k in size
fs.writeFile(filename, fileContents, (err) => {
if (err) return console.error(err)
console.log('file saved to ', filename)
return resp.body
})
})
.catch(err => console.log('err :>> ', err));
}
doFetch()
resp.body is:
PassThrough {
_readableState: ReadableState {
objectMode: false,
highWaterMark: 16384,
buffer: BufferList { head: [Object], tail: [Object], length: 1 },
length: 920,
pipes: [],
flowing: null,
ended: false,
endEmitted: false,
reading: true,
constructed: true,
sync: false,
needReadable: true,
emittedReadable: false,
readableListening: false,
resumeScheduled: false,
errorEmitted: false,
emitClose: true,
autoDestroy: true,
destroyed: false,
errored: null,
closed: false,
closeEmitted: false,
defaultEncoding: 'utf8',
awaitDrainWriters: null,
multiAwaitDrain: false,
readingMore: false,
dataEmitted: false,
decoder: null,
encoding: null,
[Symbol(kPaused)]: null
},
_events: [Object: null prototype] {
prefinish: [Function: prefinish],
unpipe: [Function: onunpipe],
error: [ [Function: onerror], [Function (anonymous)] ],
close: [Function: bound onceWrapper] { listener: [Function: onclose] },
finish: [Function: bound onceWrapper] { listener: [Function: onfinish] }
},
_eventsCount: 5,
_maxListeners: undefined,
_writableState: WritableState {
objectMode: false,
highWaterMark: 16384,
finalCalled: false,
needDrain: false,
ending: false,
ended: false,
finished: false,
destroyed: false,
decodeStrings: true,
defaultEncoding: 'utf8',
length: 0,
writing: false,
corked: 0,
sync: false,
bufferProcessing: false,
onwrite: [Function: bound onwrite],
writecb: null,
writelen: 0,
afterWriteTickInfo: null,
buffered: [],
bufferedIndex: 0,
allBuffers: true,
allNoop: true,
pendingcb: 0,
constructed: true,
prefinished: false,
errorEmitted: false,
emitClose: true,
autoDestroy: true,
errored: null,
closed: false,
closeEmitted: false,
[Symbol(kOnFinished)]: []
},
allowHalfOpen: true,
[Symbol(kCapture)]: false,
[Symbol(kCallback)]: null
}
I was expecting resp.body to return the image (as a buffer, likely). When I dig into the resp.body PassThrough object, I get a buffer in the _readableState.buffer.head.data area, but it seems to only be about 1k, which matches what I see: this code gives me a 1k image containing only the first few pixels.
How do I ensure the connection provides the entire image?
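A minimal sketch, assuming a node-fetch style streaming body: collect every chunk until the stream ends instead of reading its internal buffer directly, which only exposes whatever happens to be buffered at that instant.

```javascript
// Drain a readable stream into a single Buffer by waiting for 'end'.
function streamToBuffer(stream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    stream.on('data', (chunk) => chunks.push(chunk));
    stream.on('end', () => resolve(Buffer.concat(chunks)));
    stream.on('error', reject);
  });
}
```

With this, streamToBuffer(resp.body) yields the full image, which can then be handed to fs.writeFile.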
I'm trying to send a PATCH request using the MongoDB API but I'm getting {"size":0,"timeout":0} in response.
Could anyone suggest how I can send a request body with the PATCH method?
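For what it's worth, {"size":0,"timeout":0} is typically what you see when serializing a node-fetch Response object itself rather than its body. A hedged sketch of sending a PATCH body and reading the actual response (the client shape here is an assumption):

```javascript
// Hypothetical helper: send a JSON body via PATCH and consume the response
// body with .json(), instead of stringifying the Response object itself.
async function patchJson(client, url, payload) {
  const resp = await client.fetch(url, {
    method: 'PATCH',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  return resp.json(); // read the body, not the Response wrapper
}
```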
This project is using crypto-js v3 which is reported as high severity by snyk, see https://snyk.io/vuln/SNYK-JS-CRYPTOJS-548472.
Can we get an update to v4?