
Not guarding against giant-JSON DoS attacks? #212

Closed
jtlapp opened this issue Jun 5, 2016 · 11 comments

Comments

@jtlapp

jtlapp commented Jun 5, 2016

I posted this to [email protected] 7 days ago but never got anything but an automated response. I figure that because DoS attacks are intractable anyway, there's no harm in posting here.

Scanning through the code and glancing through the dependent module "jws", it appears to me that clients could send monster-sized JSON objects inside JWTs, and the module will always decode the JSON, eating up both clock cycles and memory. It seems to me that there should be options to guard against this. Some possible options:

(1) Allow for enforcing a size limit on JWTs (or header/payload).
(2) Allow for verifying the signature before decoding JSON.

There may be another solution having to do with with_indifferent_access, mentioned in a discussion of a Ruby library, but I didn't follow it: https://github.com/nov/json-jwt

Also, I've been making a general assessment of JWTs and am so far finding them too insecure to use for anything important. Any input you can provide on StackExchange would be helpful: http://security.stackexchange.com/questions/125135/what-scenarios-really-benefit-from-signed-jwts

@jfromaniello
Member

Thanks for sharing this concern here, and sorry we didn't respond on the whitehat address (I will check what happened).

I will think more about this:

(1) Allow for enforcing a size limit on JWTs (or header/payload).

This would be a great fit for this library. We could add a maxLength option to the verify function. What would be a good default for this option?
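
For illustration, a hypothetical sketch of how such an option might be used (maxLength is only a proposal here, not an actual option of this library):

jwt.verify(token, secret, { maxLength: 4096 }, function (err, payload) {
  // with the proposed option, an oversized token would fail fast here,
  // e.g. with a JsonWebTokenError, before any JSON is parsed
});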

(2) Allow for verifying the signature before decoding JSON.

I am going to check this but I was under the impression that this was the case. We decode the header first, then verify the signature, then decode the payload.
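
For reference, a minimal sketch of that order of operations for an HS256 token, using Node's crypto directly (this is not the library's actual code, and it assumes Node 15.7+ for 'base64url' support):

const crypto = require('crypto');

// sketch: check the signature before JSON-parsing the payload
function verifyThenDecode(token, secret) {
  const [headerB64, payloadB64, signatureB64] = token.split('.');
  const expected = crypto
    .createHmac('sha256', secret)
    .update(headerB64 + '.' + payloadB64)
    .digest('base64url');
  // real code should use a constant-time comparison (crypto.timingSafeEqual)
  if (expected !== signatureB64) {
    throw new Error('invalid signature');
  }
  // only parse the payload JSON once the signature checks out
  return JSON.parse(Buffer.from(payloadB64, 'base64url').toString('utf8'));
}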

@jtlapp
Author

jtlapp commented Jul 15, 2016

Quick note: You have to be able to decode first, because in some applications the key may be a function of the user identified in the payload.

@jfromaniello
Member

You are right, although generally it is based not on the "user identifier" but on the "issuer".

This library doesn't take care of that, as it expects the payload and the secret in the form of a string, but there are plenty of examples doing what you suggest.

An alternative could be to change this API to accept options.secret as a function and to validate the length before calling the secret function.
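
Something like this hypothetical shape, for illustration (neither maxLength nor a secret callback was part of the API; lookupKeyByIssuer is a stand-in name):

// hypothetical API sketch, not the library's actual signature
jwt.verify(token, {
  maxLength: 4096, // checked before the secret function is ever called
  secret: function (header, payload, done) {
    lookupKeyByIssuer(payload.iss, done); // resolve the key by issuer
  }
}, callback);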

@jtlapp
Author

jtlapp commented Jul 15, 2016

Probably the most robust thing to do would be to validate the JSON, putting limits on the number of unquoted, unescaped colons and braces. It's really hard to come up with a single regex that is mindful of both quotes and character escapes, though; it can't be done for CSV, for example. So this sort of validation would cost more clock cycles than would be ideal.

I think the correct solution lies in the JSON library. We should be able to set limits on the parser, such as the maximum number of objects to generate.

@jtlapp
Author

jtlapp commented Jul 15, 2016

If you use a streaming JSON parser, you could count objects or otherwise verify objects during the parse and abort an excessive parse.
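
JSON.parse cannot abort mid-parse, but its reviver can at least bound the number of values the application keeps; a sketch under that caveat (parseWithLimit is an illustrative helper, not an existing API):

// sketch: reject JSON that produced too many values; the reviver only runs
// after the full parse, so this bounds retained objects, not parse CPU;
// a true streaming parser would be needed to abort the parse itself
function parseWithLimit(text, maxNodes) {
  let nodes = 0;
  return JSON.parse(text, function (key, value) {
    if (++nodes > maxNodes) {
      throw new Error('too many JSON values');
    }
    return value;
  });
}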

@jfromaniello
Member

JSON.parse is quite limited, but it has an optional reviver function. Example:

const propertiesToParse = {
  issuer: {
    isValid: value => typeof value === 'string' && value.length < 2000
  }
};

const reviver = (key, value) => {
  // JSON.parse calls the reviver one final time with an empty-string key
  // for the root value; returning undefined there would discard the
  // entire result, so pass the root through
  if (key === '') {
    return value;
  }

  const validator = propertiesToParse[key];
  if (!validator) {
    // skip this property since it is not whitelisted
    return undefined;
  }
  if (!validator.isValid(value)) {
    // skip this property since it is not valid according to the validator
    return undefined;
  }

  // this property is whitelisted and valid
  return value;
};

// test payload
const json = JSON.stringify({ issuer: 'foo', foo: 'bar' });

console.log(JSON.parse(json, reviver));

// output:
// > { issuer: 'foo' }

@jlitowitz

jlitowitz commented Jan 19, 2017

How about something like this? In verify.js, immediately after

if (!jwtString) {
  return done(new JsonWebTokenError('jwt must be provided'));
}

add something like:

if (options.maxLength && options.maxLength > 0 && jwtString.length > options.maxLength) {
  return done(new JsonWebTokenError('jwt too long'));
}

This way it would be up to the developer of any given application to determine what a sensible maximum length is for an encoded JWT string, while staying backward compatible with existing code. Just a thought.

@panva
Contributor

panva commented Jun 25, 2019

Huge request body sizes in Node.js are usually dealt with either at the TLS-terminating proxy, such as Nginx, Apache, or Caddy, or at the request body parser level. Also, closing as stale.
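
For example, at the body parser level in express (the limit option is real; the value shown is illustrative):

const express = require('express');
const app = express();
// requests with bodies over 100kb are rejected with a 413
// before they ever reach a route handler
app.use(express.json({ limit: '100kb' }));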

@panva panva closed this as completed Jun 25, 2019
@jtlapp
Author

jtlapp commented Jun 25, 2019

It's not the absolute size of the request that's the problem. It's the fact that a JSON request consumes a lot of clock cycles to parse. A parser should be able to handle anything thrown at it, as the programmer intends, but a web server should be more discriminating about what it processes.

@jtlapp
Author

jtlapp commented Jun 25, 2019

Especially when the upper bound on size is knowably small. The server stack could do a lot more to protect itself.

@panva
Contributor

panva commented Jun 25, 2019

Going off the title here

Not guarding against giant-JSON DoS attacks?

Something "giant" has to get through a webserver and bodyparser first. Note that most body parsers for the node ecosystem (either express or koa) have defaults set to 100kb or 56kb

JavaScript's JSON.parse cannot be passed any options to parse "with a limit", and not sending arbitrarily large JWTs to verify can easily be done at the application level.
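
A minimal application-level guard, for illustration (the bound and the surrounding express handler are assumptions):

const MAX_JWT_LENGTH = 4096; // illustrative bound

app.post('/protected', function (req, res) {
  const token = req.headers.authorization;
  // reject oversized or missing tokens before jwt.verify ever runs
  if (typeof token !== 'string' || token.length > MAX_JWT_LENGTH) {
    return res.status(401).send('invalid token');
  }
  jwt.verify(token, secret, function (err, payload) {
    // ...
  });
});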
