﻿id	summary	reporter	owner	description	type	status	component	version	severity	resolution	keywords	cc	stage	has_patch	needs_docs	needs_tests	needs_better_patch	easy	ui_ux
35415	Adding content_type to StreamingHttpResponse on Linux causes memory error after streaming around 1GB-2GB of data.	LouisB12345	nobody	"This bug took a few days to work out and was extremely annoying.
I'm running Django under ASGI and was trying to stream an on-the-fly zip file using StreamingHttpResponse. Note: I don't know whether this also occurs under WSGI.
I develop on Windows, and once the code appeared functional I tried it on a Linux VM I have set up.
There, the download failed almost every time: memory usage kept climbing, usually until around 1-2GB had been streamed. After eliminating other factors, I concluded that the bug occurs whenever I pass content_type= to StreamingHttpResponse.

You can replicate the bug on Linux with the code below: with content_type set, memory grows without bound; if you remove it, the stream works as expected.
{{{
from os.path import basename
import asyncio
import logging
import threading

import aiofiles
from django.contrib.auth.mixins import LoginRequiredMixin
from django.http import StreamingHttpResponse
from django.views import View
from guppy import hpy  # heap profiler, used to log memory usage per chunk batch

H = hpy()

LOGGER = logging.getLogger(__name__)


class DownloadSelectedFiles(LoginRequiredMixin, View):
    def get(self, request) -> StreamingHttpResponse:
        file_name = ""f.txt""
        response = StreamingHttpResponse(file_data(file_name), content_type=""application/octet-stream"")
        response[""Content-Disposition""] = f'attachment; filename=""{basename(file_name)}""'
        return response


async def file_data(file_path):
    async with aiofiles.open(file_path, ""rb"") as f:
        LOGGER.info(f""Current threads are {threading.active_count()} opening file {file_path}\n{H.heap()}"")
        teller = 0
        while chunk := await f.read(65536):
            teller += 1
            await asyncio.sleep(0)
            if teller % 1000 == 0:
                LOGGER.info(f""Current threads are {threading.active_count()} yielding chunk nr.{teller}\n{H.heap()}"")
            yield chunk
}}}
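For anyone who wants to poke at the generator side in isolation, here is a minimal, hypothetical sketch of the same chunked async-read pattern with the Django, aiofiles, and guppy dependencies stripped out (plain blocking reads plus asyncio.sleep(0) stand in for aiofiles; file_chunks, read_all, and demo are names I made up for this sketch):

```python
import asyncio
import os
import tempfile

CHUNK_SIZE = 65536  # same chunk size as the repro above


async def file_chunks(file_path, chunk_size=CHUNK_SIZE):
    # Plain blocking reads; asyncio.sleep(0) hands control back to the
    # event loop between chunks, mimicking the aiofiles-based version.
    with open(file_path, 'rb') as f:
        while chunk := f.read(chunk_size):
            await asyncio.sleep(0)
            yield chunk


async def read_all(file_path):
    # Consume the generator chunk by chunk, as the ASGI handler would.
    parts = [chunk async for chunk in file_chunks(file_path)]
    return b''.join(parts)


def demo():
    # Round-trip a temp file through the generator and verify the bytes.
    data = os.urandom(200_000)  # spans several 64 KiB chunks
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(data)
        path = tmp.name
    try:
        return asyncio.run(read_all(path)) == data
    finally:
        os.unlink(path)


print(demo())  # True
```

This only exercises the generator itself, so it will not reproduce the leak; the report ties the growth to StreamingHttpResponse with content_type set under ASGI, not to the generator.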

I have some images of the log output that show the difference."	Bug	closed	HTTP handling	5.0	Normal	invalid		LouisB12345	Unreviewed	0	0	0	0	0	0
