nearform / fast-jwt

Fast JSON Web Token implementation
Home Page: https://nearform.github.io/fast-jwt/
License: Other
createVerifier() accepts an async key() option to fetch the key. We should pass in the decoded token to better support JWKS.
Line 165 in d98ba73
This code should utilize a timing safe equal check.
In createVerifier, we accept a key as a function. However, we do not cache the fetched key. Adding caching would likely speed things up quite a bit in the case of remote JWKS keys.
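A simple memoization of the key fetcher could look roughly like this; cacheKeyFetcher, the TTL, and the fetchKey signature are assumptions for illustration, not fast-jwt's actual API:

```javascript
// Hypothetical sketch of caching the fetched key by kid; names and the
// TTL default are assumptions, not part of fast-jwt.
function cacheKeyFetcher (fetchKey, ttlMs = 60000) {
  const cache = new Map() // kid -> { key, expiresAt }

  return async function cachedKey (decodedToken) {
    const kid = (decodedToken && decodedToken.header && decodedToken.header.kid) || 'default'
    const hit = cache.get(kid)
    if (hit && hit.expiresAt > Date.now()) return hit.key // cache hit

    const key = await fetchKey(decodedToken) // e.g. a remote JWKS lookup
    cache.set(kid, { key, expiresAt: Date.now() + ttlMs })
    return key
  }
}
```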
The release workflow is failing because it's missing a build step (npm ci). The build step can be added as a parameter to the optic-release-automation-action under build-command.
As part of this ticket we should also remove the unnecessary ci script from package.json (and replace its usage with the appropriate lint and test scripts).
When trying to sign a payload, the sub property always gets overwritten, even when it's not passed to createSigner.
Reproduction steps:
const { createSigner, createDecoder } = require('fast-jwt')

const sign = createSigner({ key: 'secret' })
const decode = createDecoder()

const token = sign({ id: 'id1', sub: 'test' })
const decoded = decode(token)
console.log(decoded.sub)
Hey there,
I just read your blog article and noticed the mention of Node's crypto slowing things down, and the IPC attempts to improve the performance.
I use this module in some other applications and I've had good performance with it - but I'm dealing with significantly larger chunks of data. Not sure if the increased context switching for such small payloads would yield a net positive or negative effect (I suspect negative based on your IPC experience), but I figured I'd drop this here just in case you wanted to give it a try:
The problem is that verifying tokens on every request is expensive from a resources point of view.
Caching the tokens without any verification mechanism is also risky, as tokens could be treated as valid when they shouldn't be (e.g. the key used to sign the token has expired, yet the token is still stored in the cache).
In order to prevent that, I was playing with the idea of a reverse leaky bucket algorithm or a time-based cache.
The first option would be something similar to:
The token gets verified (signature check) and stored in a cache together with a "credit value", e.g. 10. Every time the token is used, the credit value gets decreased by 1. When the value reaches 0, the token gets verified again and re-stored in the cache.
That way, users of the service with more active connections will trigger more verifications. Though it is not a full-fledged solution, it would balance speed and security: a credit of 1 enforces a verification every time the token is used.
The second option is a timed cache:
The token gets verified and stored in a cache. Alongside the token, we store a time for which the token is valid that does not change depending on usage. This way we only verify the token once every X time units. Once the X time units lapse, a worker process (it cannot be done on a timer, or your server could choke the CPU every X time units if you have a large volume of tokens) will re-verify the signatures of all the tokens and evict the ones that cannot be verified (e.g. the public key has expired, and so on).
Both options, for the sake of speed, should store the parsed token.
The reason for that is to make sure that when an auth consumer wants to check any of the JWT claims, we don't need to parse the token again (it is already parsed).
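The first option (the "reverse leaky bucket") can be sketched roughly like this; verifyWithCredit, verifySignature and CREDIT are names invented for illustration:

```javascript
// Sketch of the credit-based cache idea: each entry carries a credit;
// when it runs out, the signature is fully re-verified.
const CREDIT = 10
const cache = new Map() // token -> { payload, credit }

function verifyWithCredit (token, verifySignature) {
  const entry = cache.get(token)
  if (entry && entry.credit > 0) {
    entry.credit-- // cheap path: spend one credit instead of verifying
    return entry.payload
  }
  const payload = verifySignature(token) // full (expensive) verification
  cache.set(token, { payload, credit: CREDIT })
  return payload
}
```

With CREDIT set to 1 this degenerates to verifying on every use, which matches the security/speed trade-off described above.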
As titled, this regressed a bit in #31
Unreleased commits have been found which are pending release, please publish the changes.
Issue generated by github-actions-notify-release.
I think it would be good to have these functions generically typed. If we know the shape of the payload, we can pass it here and get better type completion.
createSigner
createVerifier
Would you accept a PR for this?
Keys that are password-protected and used with the ES* algorithms (ES256, ES384 and ES512) are currently not recognised by the library.
As titled, from time to time it's needed.
Hi!
I'm working on a feature where I need to set a different expiration for each token I sign.
I tried to set an "exp" field, but it didn't work:
const { createSigner } = require('fast-jwt')

const signSync = createSigner({ key: 'secret' })
const token = signSync({
  a: 1,
  exp: Math.floor(Date.now() / 1000) + 60 * 60
})
Looking at the code, I saw that "exp" is overwritten by "undefined" if "expiresIn" is not set.
const finalPayload = {
  ...payload,
  ...fixedPayload,
  iat: noTimestamp ? undefined : Math.floor(iat / 1000),
  exp: expiresIn ? Math.floor((iat + expiresIn) / 1000) : undefined,
  nbf: notBefore ? Math.floor((iat + notBefore) / 1000) : undefined
}
Would it be possible to change "finalPayload" to use exp: expiresIn ? Math.floor((iat + expiresIn) / 1000) : payload.exp, or to have some other way of passing a custom "exp"?
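The suggested fallback could look like this (a sketch, not the library's current code; buildFinalPayload is a name invented here):

```javascript
// Sketch of the suggested fallback: keep a caller-supplied exp when
// expiresIn is not set, instead of overwriting it with undefined.
function buildFinalPayload (payload, iat, { expiresIn, notBefore, noTimestamp } = {}) {
  return {
    ...payload,
    iat: noTimestamp ? undefined : Math.floor(iat / 1000),
    exp: expiresIn ? Math.floor((iat + expiresIn) / 1000) : payload.exp,
    nbf: notBefore ? Math.floor((iat + notBefore) / 1000) : undefined
  }
}
```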
Thanks!
Update the fast-jwt repository to use the updated release selection configuration in optic-release-automation-action.
As suggested by @panva, add the test vectors from https://tools.ietf.org/html/rfc7520 to the test suite in order to verify that the crypto implementation is correct.
The project has been renamed.
Try out and use https://github.com/pkgjs/action in the CI workflow.
Let @dominykas know when we've done this as he's the author of that action
Add support to sign and verify EdDSA tokens.
This also implies ensuring the hashKey method in utils.js chooses the right algorithm.
Supported curves and their related cache key hash algorithms are:

EdDSA curve | Cache key hash
---|---
Ed25519 | sha512
Ed448 | shake256(..., 114)
Hi, if anyone's still monitoring this... how would caching work if you are verifying access tokens in an AWS Lambda function? I don't think there's a way to cache the function (and its memoized data) that is created by createVerifier.
In order to support fastify-auth0-verify, which sends public keys as x509 certificates (acquired remotely from a well-known URL containing a JWK set), it would be great if the fast-jwt library supported them for token verification as well.
At this moment, if we run the following scenario, the library throws an error:
const { createVerifier } = require('fast-jwt')
const publicKey = `
-----BEGIN CERTIFICATE-----
MIIFAzCCAuugAwIBAgIUVVehdczwTU8GW39JULd9pYi43ZkwDQYJKoZIhvcNAQEL
BQAwETEPMA0GA1UEAwwGdW51c2VkMB4XDTIxMTEzMDA5MDYyMVoXDTIxMTIzMDA5
MDYyMVowETEPMA0GA1UEAwwGdW51c2VkMIICIjANBgkqhkiG9w0BAQEFAAOCAg8A
MIICCgKCAgEAwvS4K8Jnx2b1nuab16/cEslS5HqvKS/Qs7+9X3B9RdC5fT//lyWQ
1lj4Ag9rX+ewzfBjZksgtfYAp5gmXJl/+E+dYEIry2XkC2AJZ9voqTa+hld6a8YW
OmvITSNc8GinP73gXBcwcv20Ligyg3c7LTn5ZSaNwJixpgsi9/3qz0WK9ArylHgD
LQ4hEEKMicYvSWoC6VnnWANKZ0EphyXYplie0EdEbcoje0+7qncu64Mm2LwYaFW3
RR1ZwrmzCUjX2MV3/L2h/gI3kSIgrkxSZS0gdcvFO9uAu5tfcJbpJj9NR+ynH4CL
Rl8OT8F4lyCdO55QTJUMUDb1zLDurVLOWYvfxVejsvmz3/tQy+/T9uV2dFBzOx9I
AbcI+kjGSCF+APi/ShDnME8nnkxKMTC/NO2wC2sAqgwhH+4fl1Lb2lyIUrwmWVEG
JBajQ26j/iGUGGlpZW3dWHa7NKtVE5bS1U5HBBWuqPlyMqhr4Pa053uJ678ywGpk
yQan1E00YP67u36HAP8hNZtXWPTvHRasek8nEeKcemfFcu563eqaPjbCkieE0yUe
6m6OU5tD8TM+Gbce1OievtPEcJrctb+xoFZNZ0k4gTUIqyJ3F/n3o1x5wYFdf/wi
9feoTOr2kfhFQqarFPUZAgKqOKANRZdsl54TkApTmpWvtrpXOfYoUlUCAwEAAaNT
MFEwHQYDVR0OBBYEFDwRuiSKlx21kCz3yeZGxM+Qw9b0MB8GA1UdIwQYMBaAFDwR
uiSKlx21kCz3yeZGxM+Qw9b0MA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQEL
BQADggIBAAWS2FZI+ISovVC1gIbK+JAMlTGPpuawMe2N/38zLK1TUdA10rLqyQbF
31Sv/wj6IV6PDkuDhH3BzB04gQ0ZrIorqYG9wR3Y4ekcCr7pCkTbKo18I6TI7p3U
PN1t7W1VBt6PeXsXob+uhORhVtJH8+qqQswRlwobp9tF0xELJWHqs2JbWrfikb1R
tKv9IpsTXIyxab6iBGew4NLiGLEpk03ghjQLFWxC4/yvcF0TqZmSMO1IXDjSiK8t
6iBgLJFdyhSV7BTmHOV4ibdaEHdAfWmm4WvyQnHUZHIg4YgQuiyykqBHS1CLTIW4
sUjdDPJNTS7DKsKHrZUPnaOwQTRkkhjwC5tL6Fal+o8z1ogzbWJhhGeqq+KX6/Xb
K/NGMUhMdexh3fPmJ3wlEI2Ck7uni3CpPGnwckcoFpccwQjnkFj4TxhWDtf1yLxr
ne8EcVGQ9uuwsJjVboaujCovHChaRailpbBIV5Sc789iyLSZf+ylHv4dJcQ8UNJX
Wxqt7yJcfPnPGA3WGbvaJJuKtsWREE+Mf3ex7HL/RpX+6FX10m5GlwGKTNd0h2b8
qFPjP1tPZemAqxQntUkEGizSYxQ1bajAgrMjXeVNqz3Bah3WE7+pFbsH39h7fhtJ
L4HPfzHnmwS227fhAllgr8d7gc9vPgzLi9hKq6PMHqHRTPlytNMl
-----END CERTIFICATE-----
`
const token = 'eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsImtpZCI6IktFWSJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiYWRtaW4iOnRydWUsImlzcyI6Imh0dHBzOi8vbG9jYWxob3N0LyJ9.vTYivYeaq09ZqDltEg7ELc6CXAWAg78LjmG-g7pqEm9xCIJ9amCS9tGfo_bwnAdr-VYb96vAVsZQfWVROQExGZCj6OxDxrZNcwN0Dv62axRhT2TrDKG9qzZMt_Lt92oLTVG0o3FAM8v_ZztjA5u2AMYWAA4xHuuj5Jf1ZbSIL7L0J5MJ62yg1xY2pV_5jUoVORBLo2XW7WtUkYZRrq4_tsAE5LgwSF83SPkScAF2p-MYOtz3RsjlAfGSAj5WyF4MnGCuQeC4jxH_UrpIf43cQpVliA-vRKr3hH_mPrnU-S8hI-acM69z_yfO3P28H_cn7Lc3sg6MGKJhuM4us1BWfYafDxdqbSaIvjKNCXaPxWSLgwOhEmjovNfluPRWnNR6CT3qEg3g7Mkobj1QKIbw8bO0UzpKBZHQEqLP_MJnHlGEG8m0tHIpD3GKJnVmlepX-0w1DtE02hdYOlr40E-LfOlTAFpMHkPvCsO6LdDkGILAvtng0qUXmHsKkCw18BgdS9_z9e9NqSOmuCxqeEdq41rFgjdKXjb8qCiTdDsip65zq__onsL_ugG-oHOBurzvmkClVY6H4JiKv-BIPueZHwe-SYxdb2aBzgaS85calY_zf2Otmy1E2FE-0D4V3OwJ2JaJGcvSnDcWdHC2BVCQQ4U4bEuASX_EJv52mn-R6r8'
const verifier = createVerifier({ key: publicKey, algorithms: ['RS256'] })
const payload = verifier(token)
console.log(payload)
// throws: TokenError: Invalid public key provided for algorithms RS256.
npm: 8.6.0
node: 17.8.0
Pull request created to fix this.
I'm not convinced this would be useful for end users, and it can possibly create an attack vector.
I've created #216 before opening an issue first, since it's just a minor fix.
Anyway, in that fix, I created a new interface:
interface PrivateKey {
  key: string | Buffer
  passphrase: string | undefined
}
instead of using Node.js crypto's PrivateKeyInput.
I'll make the change if it's better to use the Node.js-provided interface.
Hi!
Was looking through the crypto of this module and one thing I was wondering about was the rationale for the algorithm used to protect the cache:
Lines 45 to 59 in b0893cf
Looking through the usage in the code, it seems that the key into the cache is always the hash of the full JWT token. The algorithm for hashing the tokens changes depending on the algorithm used in the header of the token. The accompanying blog post mentions:
To guarantee that the cache data is not compromised in case of unauthorized memory access, tokens are indexed in cache using a hashing algorithm which is chosen depending on the token encryption. For instance, HS384 encrypted token are hashed using the SHA384 algorithm, while EdDSA encrypted tokens with Ed448 curve are hashed using the SHAKE256 algorithm.
However, it doesn't mention why it's important to change the hash function.
The same function could be used for all key types, simplifying the code with fewer branches, by using an HMAC with a random key generated on startup (fine, since this is an in-memory cache). This would also protect against potential timing attacks in the lookup, since the key into the cache now uses a secret (I don't immediately see how one could find a distinguisher to perform such an attack against the current implementation, but maybe there's something I couldn't think of):
const { createHmac, randomBytes } = require('crypto')

const blindingKey = randomBytes(64) // Key / block size for BLAKE2s-256

module.exports = function key (data) {
  return createHmac('blake2s256', blindingKey).update(data).digest('hex')
}
Something like SipHash would also be an ideal candidate for this and would yield much smaller keys, saving memory. However, I don't think that's available in core.
Obviously there's a reason why you wrote it the way you did, and I could have completely misunderstood!
Looking at the benchmarks, I think most of the performance is coming from caching verified tokens in an LRU cache (I didn't do any profiling myself on any JWT library). Isn't it better to just wrap caching around jose or node-jsonwebtoken? (I chose jose because of EdDSA support; although jsonwebtoken is widely used and has many stars, I didn't like the code.) I would appreciate some insight about this.
Thanks
The following type definition should be added to the src/index.d.ts file:
export interface JwtHeader {
  alg: string | Algorithm;
  typ?: string | undefined;
  cty?: string | undefined;
  crit?: Array<string | Exclude<keyof JwtHeader, 'crit'>> | undefined;
  kid?: string | undefined;
  jku?: string | undefined;
  x5u?: string | string[] | undefined;
  'x5t#S256'?: string | undefined;
  x5t?: string | undefined;
  x5c?: string | string[] | undefined;
}
This type is used in [fastify-jwt](https://github.com/fastify/fastify-jwt), specifically as part of this PR: fastify/fastify-jwt#184
It was suggested to move it from fastify-jwt to here, as it makes more sense, and then import it in fastify-jwt along with the other types already imported from this library.
@panva says
what if the keys i resolve are coming from a remote JWKS uri, a key drops out, i shouldn't accept those tokens anymore, regardless of their exp.
Currently, it's not possible to run the benchmark scripts.
When you try to run them (for example with npm run benchmark:decode), Node.js will refuse to execute the files with the following error:
Error [ERR_REQUIRE_ESM]: require() of ES Module /Users/hails/src/github.com/nearform/fast-jwt/node_modules/cronometro/dist/index.js from /Users/hails/src/github.com/nearform/fast-jwt/benchmarks/utils.js not supported.
Instead change the require of index.js in /Users/hails/src/github.com/nearform/fast-jwt/benchmarks/utils.js to a dynamic import() which is available in all CommonJS modules.
at Object.<anonymous> (/Users/hails/src/github.com/nearform/fast-jwt/benchmarks/utils.js:3:20)
at Object.<anonymous> (/Users/hails/src/github.com/nearform/fast-jwt/benchmarks/decode.js:3:39) {
code: 'ERR_REQUIRE_ESM'
}
The problem occurs because cronometro is defined as an ES module and is not compatible with CommonJS. Instead of just using require, we should use the dynamic import approach, as described in this gist by @ShogunPanda.
We should support fetching keys from a JWKS URL and cache them.
See #62 for the caching bit.
The README states "This module is experimental". What is needed to push this library to a non-experimental phase? I recently integrated fast-jwt with hapi for use in our microservices, and all unit and integration tests passed. I love the simplicity of this lib and would be willing to help move this effort along.
Consider the following scenario:
const signedTokenAlgoHS = 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhIjoxfQ.57TF7smP9XDhIexBqPC-F1toZReYZLWb_YRU5tv0sxM'
verify(signedTokenAlgoHS, {
  noTimestamp: true,
  algorithms: ['RS256', 'HS256']
})
// Will result in:
// Error: Invalid public key provided for algorithms RS256, HS256.
If the algorithms property is defined on the options, verifier.js will check whether the supplied algorithm list is compatible with the algorithm of the provided token. However, the checkAreCompatibleAlgorithms function only checks the first algorithm in the list:
Lines 17 to 20 in 634783b
In our scenario above, this will be marked as not valid (a TokenError will be thrown), as it'll try to match 'RS256' (expected) against 'HS256' (actual).
What should happen instead is that the method loops through all of the expected algorithms and only throws the TokenError if none of them match the actual one.
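The proposed behavior can be sketched as follows; the parameter names are assumptions, not the function's actual signature:

```javascript
// Sketch of the proposed fix: only throw when NONE of the expected
// algorithms is supported by the key, instead of checking just the first.
function checkAreCompatibleAlgorithms (expected, supportedByKey) {
  const anyCompatible = expected.some((algorithm) => supportedByKey.includes(algorithm))
  if (!anyCompatible) {
    throw new Error(`Invalid public key provided for algorithms ${expected.join(', ')}.`)
  }
}
```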
This is the current behavior that causes problems:
const notBefore = 1000
const token = createSigner({ key: 'secret', notBefore })({ a: 1 })
const verify = createVerifier({ key: 'secret', cache: true })

try {
  verify(token)
} catch (err) {
  // throws with a 'FAST_JWT_INACTIVE' code, alright
  console.error(err)
}

setTimeout(() => {
  try {
    verify(token)
  } catch (err) {
    // also throws with 'FAST_JWT_INACTIVE', but should not
    console.error(err)
  }
}, 2000)
The fact is, the current tests are passing because of an uneven behavior when comparing time between cache items and token properties. They use a timestamp where now === min, so they are able to pass, but this is unlikely to happen in real life.
// Check the cache with `now > min`
if (typeof value !== 'undefined' && (min === 0 || now > min) && (max === 0 || now <= max)) {
But:
// check the claim with `now >= adjusted `
const valid = greater ? now >= adjusted : now <= adjusted
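Aligning the two checks could look like this; isActiveInCache is a name invented here, and the exact fix is an assumption:

```javascript
// Sketch of aligning the cache check with the claim check: use the same
// inclusive comparison (now >= min) in both places.
function isActiveInCache (now, min, max) {
  return (min === 0 || now >= min) && (max === 0 || now <= max)
}
```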
The library currently supports keys that are either a string, a Buffer or a function, but the key could also be an Object representing a password-protected private key, see the crypto documentation.
The format of the key is the following:
{
  key: "password_protected_private_key",
  passphrase: "secret"
}
I will create a PR for this issue.
Hi there,
When I set clockTolerance in createVerifier, it does not have any effect.
As a second point, the documentation says that this value is added to the current timestamp. That is enough to validate the expiry, but in my case I would also like it to affect the notBefore property of the token.
The use case is a token provider generating a token with nbf set to the current time while the validation server's clock is slightly lagging behind (even by a couple of seconds), which causes a token validation error.
I can send a PR fixing the tolerance for both expiry and notBefore if you are interested.
// classic expiry case, fails
const expiresIn = 1000
const token1 = createSigner({ key: 'secret', expiresIn })({ a: 1 })

setTimeout(() => {
  try {
    createVerifier({ key: 'secret', clockTolerance: 50000 })(token1)
  } catch (err) {
    console.error(err)
  }
}, 1000)

// notBefore case, fails as well
const clockTimestamp = Date.now() + 1000
const notBefore = 1
const token2 = createSigner({ key: 'secret', clockTimestamp, notBefore })({ a: 1 })

try {
  createVerifier({ key: 'secret', clockTolerance: 50000 })(token2)
} catch (err) {
  console.error(err)
}
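Applying the tolerance to both checks could be sketched like this; validateTimes and the error strings are assumptions, not fast-jwt's current code:

```javascript
// Sketch: widen the validity window on both sides by clockTolerance,
// so it covers both exp and nbf.
function validateTimes (now, exp, nbf, clockTolerance = 0) {
  if (typeof nbf === 'number' && now + clockTolerance < nbf) {
    throw new Error('FAST_JWT_INACTIVE') // still before nbf, even with tolerance
  }
  if (typeof exp === 'number' && now - clockTolerance >= exp) {
    throw new Error('FAST_JWT_EXPIRED') // past exp, even with tolerance
  }
}
```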
When verifying a token with the allowedIss option, I think it should throw an error if the payload has no iss value. Shouldn't it?
// file: test/example.spec.js
const { test } = require('tap')
const { createSigner, createVerifier } = require('../src')

const signSync = createSigner({ key: 'secret' })
const verifySync = createVerifier({ key: 'secret', allowedIss: 'foo.bar' })

// passing
test('token payload with iss value', async (t) => {
  const tokenPayloadWithIss = signSync({ a: 1, b: 2, c: 3, iss: 'bad' })
  t.throws(
    () => {
      verifySync(tokenPayloadWithIss, (err) => {
        console.log(err)
      })
    },
    {
      code: 'FAST_JWT_INVALID_CLAIM_VALUE',
      message: 'The iss claim value is not allowed.'
    }
  )
  t.end()
})

// failing
test('token payload without iss value', async (t) => {
  const tokenPayloadWithoutIss = signSync({ a: 1, b: 2, c: 3 })
  t.throws(
    () => {
      verifySync(tokenPayloadWithoutIss, (err) => {
        console.log(err)
      })
    },
    {
      code: 'FAST_JWT_INVALID_CLAIM_VALUE',
      message: 'The iss claim value is not allowed.'
    }
  )
  t.end()
})
The same also happens when verifying with allowedJti, allowedAud, allowedSub, and allowedNonce.
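The expected behavior can be sketched as follows; checkAllowedClaim is a name invented here, and rejecting missing claims is the proposal, not the current implementation:

```javascript
// Sketch of the behavior asked for: when an allowed-values option is
// configured, a missing claim is rejected as well as a wrong one.
function checkAllowedClaim (payload, claim, allowedValues) {
  const value = payload[claim]
  if (value === undefined || !allowedValues.includes(value)) {
    throw new Error(`The ${claim} claim value is not allowed.`)
  }
}
```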
Add a Notify Twitter workflow that will push a tweet to @NearFormOSS upon a successful release.
To remove an EBADENGINE warning from npm/yarn.
Pull request: #218
The JWT spec mandates that the payloads MUST be objects.
From: https://tools.ietf.org/html/rfc7519#section-7.2
- Verify that the resulting octet sequence is a UTF-8-encoded
representation of a completely valid JSON object conforming to
RFC 7159 [RFC7159]; let the JWT Claims Set be this JSON object.
We should remove the json option from the decoder and throw if the typ is not JWT, in:
Line 29 in 4f7fbfb
In the signer:
Lines 113 to 117 in 4f7fbfb
We recently integrated fast-jwt with an API implemented in restify due to some performance issues in token verification. However, even expired tokens are still being allowed. See the implementation below:
const verifyWithCallback = createVerifier({
  key: getKey,
  complete: true,
  cache: true,
  ignoreExpiration: false,
  algorithms: ['HS256', 'HS384', 'HS512', 'RS256']
})

let encodedToken = token
console.log('type: ' + typeof encodedToken)

await verifyWithCallback(encodedToken, (err, sections) => {
  if (err) {
    console.log(err)
    res.send(400)
    return
  } else {
    const decode = createDecoder()
    let payload = decode(encodedToken)
    ...
Looking at the verify.js file, I saw a line there that terminates the execution if the type of the payload is string. I'm not sure if that has a bearing.
createVerifier({ key: 'foobar', checkTyp: 'jwt' })('eyJ0eXAiOjEsImFsZyI6IkhTMjU2In0.eyJpYXQiOjE1OTk1NzM1NTN9.H8YZrUYEhiK3A8RZdFVyS6JP_ymnZQFAkLuFnl1TIbg')
TypeError: (header.typ || "").toLowerCase is not a function
at verifyToken (/Users/panva/.panvajs/node_modules/fast-jwt/src/verifier.js:176:42)
at verify (/Users/panva/.panvajs/node_modules/fast-jwt/src/verifier.js:276:7)
This is related to my comment here. As a consumer of JWTs I don't control, I would expect to get a consistent error type for invalid JWTs.
When signing, the algorithm option should not default to HS256 but rather be autodetected depending on the key provided.
Keys stored under /benchmarks are being flagged as a security vulnerability. Removing them from the published package should solve it.
I'll open a PR.