﻿id	summary	reporter	owner	description	type	status	component	version	severity	resolution	keywords	cc	stage	has_patch	needs_docs	needs_tests	needs_better_patch	easy	ui_ux
36526	bulk_update uses more memory than expected	Anže Pečar		"I recently tried to update a large number of objects with:

{{{
things = list(Thing.objects.all())  # A large number of objects, e.g. > 1_000_000
Thing.objects.bulk_update(things, [""description""], batch_size=300)
}}}

The first line above fits into the available memory (~2GB in my case), but the second line caused a SIGTERM, even though I had an additional 2GB of available memory. This was surprising: since all the objects to update were already loaded, I wasn't expecting `bulk_update` to need this much additional memory.

My solution was:

{{{
for batch in batched(things, 300):
    Thing.objects.bulk_update(batch, [""description""], batch_size=300)
}}}
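
For reference, `batched` in the snippet above is assumed to be `itertools.batched`, which only exists on Python 3.12+. On older versions an equivalent helper can be sketched with `itertools.islice`:

{{{
from itertools import islice

def batched(iterable, n):
    # Yield successive n-sized tuples, like itertools.batched (Python 3.12+).
    it = iter(iterable)
    while batch := tuple(islice(it, n)):
        yield batch
}}}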

The `bulk_update` call in the first example used 2.8GB of memory; in the second example it used only 62MB.

[https://github.com/anze3db/django-bulk-update-memory A GitHub repository that reproduces the problem with memray results.]

As we can see from the [https://github.com/user-attachments/assets/dd0bdcac-682f-4e79-aa25-aa5a4a2e6b9d memray flamegraph], the majority of the memory in my example (2.1GB) is used to prepare the WHEN statements for all the batches before any of them are executed. If we change this to generate the WHEN statements only for the current batch, memory consumption will be greatly reduced. I'd be happy to contribute this patch unless there are concerns about adding more compute between update queries and making the transactions longer. Let me know :)
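
To make the proposal concrete, here is a rough, self-contained sketch of the difference (all names below are illustrative placeholders, not Django's actual internals): the eager variant mirrors the current behaviour, where every batch's clauses are alive in memory before the first query runs; the lazy variant builds each batch's clauses just before its own query.

{{{
def build_when_clauses(batch, field):
    # Stand-in for the expensive per-object Case/When construction.
    return ['WHEN pk=%s THEN new %s' % (pk, field) for pk in batch]

def execute(clauses):
    # Stand-in for running one UPDATE query; returns the rows it would touch.
    return len(clauses)

def run_eagerly(batches, field):
    # Current behaviour: clauses for *all* batches are built up front,
    # so peak memory grows with the total number of objects.
    prepared = [build_when_clauses(batch, field) for batch in batches]
    return [execute(clauses) for clauses in prepared]

def run_lazily(batches, field):
    # Proposed behaviour: build each batch's clauses just before its query,
    # so peak memory grows only with batch_size.
    return [execute(build_when_clauses(batch, field)) for batch in batches]
}}}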

This might be related to https://code.djangoproject.com/ticket/31202, but I decided to open a new issue because the concern here is different: I wouldn't mind waiting longer for `bulk_update` to complete, whereas the SIGTERM caught me by surprise."	Uncategorized	new	Uncategorized	5.2	Normal				Unreviewed	0	0	0	0	0	0
