#32326 closed Cleanup/optimization (wontfix)
Proposing a more concrete example of 'Streaming large CSV files'
| Reported by: | niauah | Owned by: | nobody |
|---|---|---|---|
| Component: | Documentation | Version: | 3.1 |
| Severity: | Normal | Keywords: | streamingresponse |
| Cc: | niauah | Triage Stage: | Unreviewed |
| Has patch: | no | Needs documentation: | no |
| Needs tests: | no | Patch needs improvement: | no |
| Easy pickings: | no | UI/UX: | no |
Description
In the current documentation, 'streaming large CSV files' is demonstrated with a simple 65536-element list, while the streaming feature is most useful in combination with a Python generator function. Here I propose a slightly modified example that uses a generator function with yield.
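For concreteness, the following is a minimal sketch of the kind of generator-based example being proposed. It reuses the Echo pseudo-buffer class that already appears in the documentation's example; the view name stream_large_csv and the exact row count are illustrative assumptions, not taken from the ticket or the current docs.

```python
import csv

from django.http import StreamingHttpResponse


class Echo:
    """A pseudo-buffer whose write() simply returns the value given to it."""

    def write(self, value):
        return value


def stream_large_csv(request):
    # Illustrative view; the name and the row count are not from the ticket.
    writer = csv.writer(Echo())

    def generate_rows():
        # Each row is produced (and serialized by csv.writer) one at a time,
        # only while the response body is being iterated.
        for idx in range(65536):
            yield writer.writerow(["Row {}".format(idx), str(idx)])

    return StreamingHttpResponse(generate_rows(), content_type="text/csv")
```

Wrapping the row production in a generator function makes the row-at-a-time behaviour explicit in the example itself.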
Change History (3)
follow-up: 2 comment:1 by Mariusz Felisiak, 4 years ago
| Resolution: | → wontfix |
|---|---|
| Status: | new → closed |
Thanks for this proposition; however, the current example is sufficient for the Django documentation, IMO. We already have extensive docs and cannot document every use case and non-Django caveat.
comment:2 by niauah, 4 years ago
Replying to Mariusz Felisiak:
> Thanks for this proposition; however, the current example is sufficient for the Django documentation, IMO. We already have extensive docs and cannot document every use case and non-Django caveat.
Thanks for the reply. I understand that the documentation cannot cover detailed non-Django caveats (e.g. issues with Python methods). If necessary, I can revise the patch and remove them.
However, in the original example:
rows = (["Row {}".format(idx), str(idx)] for idx in range(65536)) pseudo_buffer = Echo() writer = csv.writer(pseudo_buffer) response = StreamingHttpResponse((writer.writerow(row) for row in rows), content_type="text/csv")
the 65536-row list is fully traversed at declaration, rather than in a 'call-by-need' fashion.
IMO, providing an example built around a generator function with yield would better illustrate the 'streaming' nature of the response.
comment:3 by , 4 years ago
| Cc: | niauah added |
|---|---|