Huge Memory usage from Serializer #9361
Unanswered
PolSpock asked this question in Question & Answer
Replies: 1 comment 2 replies
-
Huge doesn't mean anything, so can you tell how much RAM your processes are actually using? I'm saying Python does not necessarily release memory back to the operating system, so in the end all your workers have become greedy and you are running out of memory. Lowering the number of workers to 1 will probably solve it, if your machine has enough memory to host at least one greedy process. Gunicorn has the …
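The comment above is cut off. It may be pointing at Gunicorn's worker-recycling options; the following is a minimal sketch of that approach, not the commenter's actual configuration, and every number in it is an assumption to tune for your workload:

```python
# gunicorn.conf.py -- sketch of worker recycling. Gunicorn restarts a
# worker after it has served max_requests requests, which returns the
# worker's memory to the OS even when CPython itself won't give it back.
workers = 2                # fewer workers -> fewer "greedy" processes
max_requests = 100         # recycle each worker after 100 requests
max_requests_jitter = 20   # add 0-20 extra requests at random so workers don't all restart at once
timeout = 300              # large uploads can take a while to process
```

Run with `gunicorn -c gunicorn.conf.py myproject.wsgi` (the module path is a placeholder).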
-
Hi,
I'm currently developing an interface with Django REST Framework where users have to upload huge JSON files: basically a GET request to fetch the forms and a POST request to upload the files.
These JSON files are merged into one big JSON array because they depend on each other.
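For concreteness, here is a small sketch of that merge step; the function name is made up and it assumes each uploaded file contains a JSON array:

```python
import json


def merge_uploaded_files(files):
    """Merge several uploaded JSON files (each a JSON array of items)
    into one big Python list. `files` is e.g. request.FILES.values()."""
    merged = []
    for f in files:
        # json.load reads the whole file into memory; for truly huge
        # files a streaming parser such as ijson would be gentler.
        merged.extend(json.load(f))
    return merged
```

In the POST handler this would be called as something like `merge_uploaded_files(request.FILES.values())`.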
Then I use `rest_framework.serializers.ModelSerializer` to transform my JSON array into an `OrderedDict`. Currently, this `OrderedDict` is used for processing without saving anything to a database. However, I'm encountering several memory issues. As I found in these issues, Django REST Framework does not seem to release memory automatically for serialized data after processing.
So if users upload several files in a short time, my server runs out of memory because of the serializer's large memory usage.
So how do I force the memory held by the serializer to be released? Or does anyone have a better design for my use case? (Upload JSON files -> merge into one big JSON array -> validate -> cast to `OrderedDict` -> process.)
Regards
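Since the question is marked unanswered, here is a minimal sketch of one direction: validate the merged array in chunks and explicitly drop references between chunks so the memory can be reused. `ItemSerializer`, `process_items`, and the chunk size are hypothetical stand-ins (the poster uses a `ModelSerializer`), and chunking only works if items can be validated independently:

```python
import gc

from rest_framework import serializers, status
from rest_framework.response import Response
from rest_framework.views import APIView


class ItemSerializer(serializers.Serializer):
    # Hypothetical fields; a plain Serializer keeps the sketch
    # self-contained, but a ModelSerializer would work the same way.
    name = serializers.CharField()
    payload = serializers.JSONField()


def process_items(items):
    """Hypothetical processing step. It should return only a small
    summary and never hold on to the validated items themselves."""
    return len(items)


class UploadView(APIView):
    def post(self, request):
        merged = request.data          # the big merged JSON array
        chunk_size = 1000              # an assumption; tune to your memory budget
        processed = 0

        for start in range(0, len(merged), chunk_size):
            serializer = ItemSerializer(
                data=merged[start:start + chunk_size], many=True
            )
            serializer.is_valid(raise_exception=True)
            processed += process_items(serializer.validated_data)

            # Drop every reference to the serializer and its validated
            # data so CPython can reuse that memory for the next chunk.
            del serializer
            gc.collect()

        return Response({"processed": processed}, status=status.HTTP_200_OK)
```

The caveat from the reply above still applies: even after `gc.collect()`, CPython may keep freed memory in its own allocator rather than returning it to the OS, so worker recycling (the Gunicorn sketch earlier) remains the more reliable backstop.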