What are the steps required to reproduce the bug?
1. Set up a REST endpoint with JSON as the format.
2. Submit a very large JSON payload to the endpoint.
What behavior were you expecting?
An error message that does not expose the internal details of the fatal error externally.
What happened instead?
The following out-of-memory error:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 51118159 bytes) in reporoot/vendor/symfony/serializer/Encoder/JsonDecode.php on line 88
Details
OS: Ubuntu 14.04
Web server: Apache 2.2
PHP: 5.6.32
Drupal: 8.4.0
We’d ideally like to see a way to prevent the internal details of the fatal error from being exposed externally. This may be achievable by comparing the size of the payload to the available memory in the active PHP process before attempting to decode it.
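The suggested pre-check can be sketched in plain PHP. This is a minimal sketch, not Drupal core code: both function names are hypothetical, and the 4x safety factor is an arbitrary assumption, since the real memory cost of decoding a payload cannot be known in advance.

```php
<?php

// Hypothetical helper: convert a php.ini-style memory_limit value
// (e.g. "256M") into bytes so it can be compared with a payload size.
function memory_limit_to_bytes($limit) {
  if ($limit === '-1') {
    return PHP_INT_MAX; // "-1" means no limit.
  }
  $value = (int) $limit;
  switch (strtoupper(substr($limit, -1))) {
    case 'G': return $value * 1024 ** 3;
    case 'M': return $value * 1024 ** 2;
    case 'K': return $value * 1024;
    default:  return $value; // Plain byte count, no suffix.
  }
}

// Hypothetical pre-check: refuse to decode when the payload, inflated
// by an assumed safety factor, would not fit in the remaining memory.
function payload_fits_in_memory($payload_bytes, $safety_factor = 4) {
  $limit = memory_limit_to_bytes((string) ini_get('memory_limit'));
  $available = $limit - memory_get_usage(TRUE);
  return $payload_bytes * $safety_factor < $available;
}
```

With such a check in front of the decoder, an endpoint could return a clean 413 Payload Too Large response instead of letting the decode exhaust the memory limit and leak a fatal error externally.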
Comments
Comment #2
cilefen (Institute for Advanced Study) commented:
What is the display_errors PHP setting?
Comment #3
Wim Leers

Comment #4
malik.kotob commented:
@cilefen This is encountered whether display_errors is set to true or false.
Comment #5
dawehner commented:
When a client asks me "we need to be protected against attackers", I start by telling them that it is not that easy. There are many vectors: typical problems are, for example, DDoS and cache-bypassing attacks, but then far along the line there is this. Sure, we could try to fix it in PHP, but do we really believe this is a good place to fix it? I would argue that filtering at the server level is the only proper way to fix this particular problem.
Comment #6
Wim Leers commented:
I agree that there is only so much we can do here; Drupal can never protect against all attack vectors. I do see two simple things we can do to address the most extreme cases, though:

- A max_request_body_size setting in rest.settings, set to e.g. 1 megabyte by default (if we do this, the above would not be necessary, of course).

It'd help to know whether in @malik.kotob's case the request body was just "big" (say, half a megabyte) or "impossibly enormous" (say, 20 megabytes). Can you provide that information, @malik.kotob?
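A rest.settings entry for the proposed limit might look like the following sketch. Note that max_request_body_size is the suggestion under discussion here, not an existing Drupal core setting:

```yaml
# Hypothetical addition to rest.settings.yml; this key does not exist
# in Drupal core, it is the proposal under discussion.
max_request_body_size: 1048576  # bytes (the suggested 1 MB default)
```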
Comment #7
malik.kotob commented:
I agree that we can't seek out every attack vector, but if one presents itself I don't see any reason not to contribute a fix that could help someone else in the community. That being said, if this isn't the right place to do it, then I definitely wouldn't argue in favor of it.
@Wim Leers:
1. This would help with some requests, but wouldn't prevent the error when the body is smaller than the memory limit yet larger than what is actually available in memory. Maybe you're not suggesting that it would; I just wanted to point that out in case.
2. I see this as a very viable solution. In our case the request body was enormous, about 50 MB. We are, however, accepting requests of ~30 MB without issue, which is a valid use case for us. The trouble with filtering only at the server level is that we may not want to limit an admin, for example, from uploading a 50 MB file. We truly want to impose this limit only on our custom endpoint.
Comment #8
malik.kotob commented:

Comment #9
Wim Leers commented:
The problem is that it's impossible to determine how much memory decoding a payload will consume…
This would most definitely be MUCH larger than what I would consider a sane default. 30 MB request bodies, wow, what?! Does this include base64-encoded files?
If that's the case, that's trivial?
Comment #10
malik.kotob commented:
Pardon my ignorance, but is there a way to make a conservative estimate? If not, I think the one potential solution you outlined could work well.

It does, unfortunately, and the problem with using the above snippet, which I think you're saying could be added to our custom resource endpoint that extends ResourceBase, is that the memory issue is hit before our post implementation is reached (in rest's handle method in RequestHandler.php). That's why my thought was that if it's not possible to conservatively calculate how much memory the decoding will take, introducing the max_request_body_size setting would give us something to work with upstream.

Comment #11
Wim Leers commented:
No, because that'd require Drupal to estimate the algorithmic complexity/memory consumption of external libraries…

Your number one problem is that you're using a base64-encoded payload. That's a big red flag.

While not pretty, you could handle that by modifying the _controller that is specified in \Drupal\rest\Plugin\ResourceBase::getBaseRoute(), because you can override that method. That way the complexity remains tied to your rather unfortunate (but required, I'm sure!) @RestResource plugin, rather than making the core REST module more complex. I hope that makes sense to you as well!
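A hedged sketch of that suggestion, assuming Drupal 8's rest module API. The module, plugin class, and controller names below are all hypothetical, and this is framework-bound code rather than a runnable standalone example:

```php
<?php

namespace Drupal\mymodule\Plugin\rest\resource;

use Drupal\rest\Plugin\ResourceBase;

/**
 * Hypothetical resource plugin that swaps in its own controller so it
 * can reject oversized payloads before they are ever decoded.
 */
class MyLargePayloadResource extends ResourceBase {

  /**
   * {@inheritdoc}
   */
  protected function getBaseRoute($canonical_path, $method) {
    $route = parent::getBaseRoute($canonical_path, $method);
    // Point the route at a custom controller (hypothetical class) that
    // checks Content-Length before delegating to the normal handler.
    $route->setDefault('_controller', '\Drupal\mymodule\Controller\SizeCheckingRequestHandler::handle');
    return $route;
  }

}
```

This keeps the size check scoped to the one resource that accepts huge payloads, which is exactly the containment the comment above argues for.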
Comment #12
malik.kotob commented:
Thanks for the explanation and tip!