Limit Request Size? #7752
Replies: 4 comments
-
We might be able to wrap the stream reader in an interface that ensures …
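One way to sketch such a wrapper in plain Python (hypothetical names, not DRF's actual API): a stream proxy that tracks a byte budget and fails once reads exceed it, so a parser pulling from it can never buffer an unbounded body.

```python
import io

class RequestDataTooBig(Exception):
    """Raised when more bytes than allowed are read from the wrapped stream."""

class LimitedStream:
    """Sketch of a limiting stream wrapper: delegates read() to the
    underlying stream, but raises once the cumulative bytes read
    exceed max_bytes."""

    def __init__(self, stream, max_bytes):
        self._stream = stream
        self._remaining = max_bytes

    def read(self, size=-1):
        chunk = self._stream.read(size)
        self._remaining -= len(chunk)
        if self._remaining < 0:
            raise RequestDataTooBig("request body exceeds the configured limit")
        return chunk

# Usage: a body within the budget reads normally.
ok = LimitedStream(io.BytesIO(b"small body"), max_bytes=100)
print(ok.read())  # b'small body'
```

A real implementation would also need to wrap readline()/iteration for parsers that consume the stream incrementally, but the budget-tracking idea is the same.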
-
The issue is that the checks depend on the content type/parser used. For file uploads, for example, we don't want to use this flag, which involves different checks in the multipart and file-upload parsers. There is a very easy fix, though: since the out-of-memory and high-CPU conditions will most often come from the most-used parser, the JSON parser (and to some extent the form parser), we can add a check there, such as:
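A minimal sketch of that kind of check, in plain Python with a hypothetical function name (a real fix would live inside the JSON parser's parse method and would likely read the limit from settings.DATA_UPLOAD_MAX_MEMORY_SIZE):

```python
import io
import json

def parse_json_limited(stream, max_bytes):
    # Sketch of the proposed check: read at most max_bytes (plus one extra
    # byte so a body exactly at the limit still parses), and refuse to hand
    # an oversized payload to json.loads at all.
    data = stream.read(max_bytes + 1)
    if len(data) > max_bytes:
        raise ValueError("request body exceeds max_bytes; refusing to parse")
    return json.loads(data)

# Usage: a small body parses; an oversized one is rejected before decoding.
print(parse_json_limited(io.BytesIO(b'{"ok": true}'), max_bytes=1024))  # {'ok': True}
```

The key point is that the limit is enforced while reading the stream, before any decoding work happens, so a huge body can neither exhaust memory in the buffer nor burn CPU in the JSON decoder.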
At least for now, those are the checks I think would prevent 90% of possible attackers from crashing the server with large requests. Multipart requests would still fail, or maybe not; that depends on whether the call to Django's multipart parser honors the settings values.
-
It looks like …
I'm pretty sure Django still handles parsing those, so they should already be honoring the …
-
I am confused as to why, when I was using …
-
Django uses the settings DATA_UPLOAD_MAX_NUMBER_FIELDS and DATA_UPLOAD_MAX_MEMORY_SIZE to help defend against denial of service from suspiciously large requests. Django REST Framework's parsers do not honor the DATA_UPLOAD_MAX_MEMORY_SIZE setting in any way, since DRF never uses request.body directly but instead request.read(). DATA_UPLOAD_MAX_NUMBER_FIELDS is probably not honored either, but that one is less important.
Since the setting is not honored, a valid but huge JSON request can cause an unhandled MemoryError in the parsers, which is quite ugly.
I'm wondering if there's an easy way to honor this flag, or should we manually write/override the existing parsers to check this flag?