Consider `FileResponse.chunk_size` when handling multiple ranges #2703

Conversation
starlette/responses.py (outdated)
```python
chunk = await file.read(min(self.chunk_size, end - start))
start += len(chunk)
await send({"type": "http.response.body", "body": chunk, "more_body": True})
if len(chunk) < self.chunk_size or start >= end:
```
Is the `len(chunk) < self.chunk_size` check necessary?
I'm too tired, I'll check tomorrow. Just leaving my own comment here.
It can be mathematically deduced that it is not required.
Yes, there is no need for the `len(chunk) < self.chunk_size` check.
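The deduction above can be sketched outside Starlette: each read requests `min(chunk_size, end - start)` bytes, so a short read can only happen at end of file, and checking `start >= end` alone terminates the loop. A minimal, hypothetical illustration (`read_range` is not a Starlette API; the names mirror the diff):

```python
import io

def read_range(file, start, end, chunk_size):
    """Yield the bytes of [start, end) in chunks of at most chunk_size."""
    file.seek(start)
    while start < end:
        # Never ask for more than the remaining range, so a short read
        # can only mean EOF; no len(chunk) < chunk_size check is needed.
        chunk = file.read(min(chunk_size, end - start))
        if not chunk:  # defensive: EOF before the range was exhausted
            break
        start += len(chunk)
        yield chunk

data = bytes(range(256))
chunks = list(read_range(io.BytesIO(data), 10, 250, 100))
# chunks have lengths 100, 100, 40 and concatenate to data[10:250]
```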
How do I get different behavior across Python versions with this diff? 🤔
Coverage is not able to understand that the
Co-authored-by: Frost Ming <[email protected]>
@abersheeran @frostming anything else here, or are we good?
lgtm
Changed the title: `FileResponse.chunk_size` on multiple ranges → `FileResponse.chunk_size` when handling multiple ranges
…ode#2703)

* Take in consideration the `FileResponse.chunk_size` on multiple ranges
* Update starlette/responses.py
* Update starlette/responses.py
* Update starlette/responses.py

Co-authored-by: Frost Ming <[email protected]>

---------

Co-authored-by: Frost Ming <[email protected]>
cc @frostming