Opened 5 years ago

Closed 23 months ago

#21160 closed Bug (fixed)

in_bulk() fails on SQLite when passing more than 999 ids

Reported by: Andrei Picus Owned by: felixxm
Component: Database layer (models, ORM) Version: master
Severity: Normal Keywords: in_bulk sqlite 999 1000
Cc: Triage Stage: Ready for checkin
Has patch: yes Needs documentation: no
Needs tests: no Patch needs improvement: no
Easy pickings: no UI/UX: no

Description (last modified by Daniele Procida)

As we all know, SQLite doesn't allow more than 999 variables in a query, failing with the 'too many SQL variables' error if you pass in more. This is a problem when using the in_bulk() method.
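The failure mode can be demonstrated outside Django with the standard-library sqlite3 module (a standalone sketch, not the ticket's actual test case). Note that SQLite builds compiled with the default SQLITE_MAX_VARIABLE_NUMBER of 999 raise the error, while newer builds ship a higher limit and the query simply succeeds:

```python
import sqlite3

# Attempt a query with 1000 bound parameters. On SQLite builds using the
# default SQLITE_MAX_VARIABLE_NUMBER of 999 this raises
# OperationalError: too many SQL variables; newer builds may allow it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO item (id) VALUES (?)", [(i,) for i in range(1000)])

ids = list(range(1000))
placeholders = ", ".join("?" for _ in ids)
try:
    rows = conn.execute(
        "SELECT id FROM item WHERE id IN (%s)" % placeholders, ids
    ).fetchall()
    outcome = "query succeeded with %d rows" % len(rows)
except sqlite3.OperationalError as exc:
    outcome = "query failed: %s" % exc
print(outcome)
```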

I've looked through the code and found out that all backends define a bulk_batch_size ops feature and it is set to 999 in the SQLite backend.

This ops feature is only used in _batched_insert. I've written a patch that makes use of it in the in_bulk method as well. You can check it here:

I've also written two tests that test passing in a large number of ids and also the efficiency in this case. The efficiency test runs only on databases that don't support 1000 variables in a query.
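The idea behind the patch can be sketched in plain Python with sqlite3 (names here are illustrative, not Django's actual internals): split the requested ids into batches no larger than the backend's limit, run one query per batch, and merge the results.

```python
import sqlite3

BULK_BATCH_SIZE = 999  # mirrors the SQLite backend's bulk_batch_size ops feature


def in_bulk(conn, ids, batch_size=BULK_BATCH_SIZE):
    """Fetch rows keyed by id, querying in batches that fit SQLite's limit."""
    result = {}
    for start in range(0, len(ids), batch_size):
        batch = ids[start:start + batch_size]
        placeholders = ", ".join("?" for _ in batch)
        for (pk,) in conn.execute(
            "SELECT id FROM item WHERE id IN (%s)" % placeholders, batch
        ):
            result[pk] = pk
    return result


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO item (id) VALUES (?)", [(i,) for i in range(2500)])

found = in_bulk(conn, list(range(2500)))
print(len(found))  # 2500
```

With 2500 ids and a batch size of 999, this issues three queries instead of one oversized one, so it works regardless of the backend's parameter limit.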

Any comments are welcome.

Change History (11)

comment:1 Changed 5 years ago by Andrei Picus

Pull request:

For the life of me, I can't find a way to edit the original description.


comment:2 Changed 5 years ago by Tim Graham

Severity: Release blocker → Normal

comment:3 Changed 5 years ago by Daniele Procida

Description: modified (diff)

comment:4 Changed 5 years ago by Aymeric Augustin

Triage Stage: Unreviewed → Accepted

At first sight, the patch looks reasonable.

comment:5 Changed 5 years ago by Andrei Picus

Any other thoughts on this?

comment:6 Changed 5 years ago by Anssi Kääriäinen

Patch needs improvement: set

An updated patch can be found at:

The usage of bulk_batch_size() is incorrect in the patch. The method is documented to receive a list of objects, but for the use case in this ticket it receives an iterable of field values. We can either broaden the method's definition so that it accepts an iterable of field values as well as a list of objects, or add a new feature, "max_query_params", and use that instead.

I slightly prefer the approach of adding max_query_params to the database features. Then supports_1000_query_params can become a property calculated from max_query_params, and the base bulk_batch_size() can use the max_query_params feature as well.
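A minimal sketch of the suggested layout (class and attribute names are illustrative, loosely mirroring Django's DatabaseFeatures/DatabaseOperations split, not the committed code):

```python
class BaseDatabaseFeatures:
    # None means "no practical limit on query parameters".
    max_query_params = None

    @property
    def supports_1000_query_params(self):
        # Derived from max_query_params instead of hard-coded per backend.
        return self.max_query_params is None or self.max_query_params >= 1000


class BaseDatabaseOperations:
    def __init__(self, features):
        self.features = features

    def bulk_batch_size(self, params_per_object, objs):
        """Cap a batch so a single query never exceeds max_query_params."""
        limit = self.features.max_query_params
        if limit is None:
            return len(objs)
        return max(limit // max(params_per_object, 1), 1)


class SQLiteFeatures(BaseDatabaseFeatures):
    max_query_params = 999


ops = BaseDatabaseOperations(SQLiteFeatures())
print(ops.bulk_batch_size(3, list(range(5000))))  # 333
```

The benefit is that each backend declares one number, and both the boolean feature flag and the batch-size calculation fall out of it.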

comment:7 Changed 2 years ago by felixxm

Owner: changed from nobody to felixxm
Status: new → assigned

comment:8 Changed 23 months ago by GitHub <noreply@…>

In f42c7cc:

Refs #21160 -- Replaced DatabaseFeatures.supports_1000_query_parameters by a DatabaseFeatures.max_query_params.

comment:9 Changed 23 months ago by felixxm

Patch needs improvement: unset

comment:10 Changed 23 months ago by Tim Graham

Triage Stage: Accepted → Ready for checkin

comment:11 Changed 23 months ago by GitHub <noreply@…>

Resolution: fixed
Status: assigned → closed

In 1b6f05e9:

Fixed #21160 -- Fixed QuerySet.in_bulk() crash on SQLite when requesting more than 999 ids.

Thanks Andrei Picus and Anssi Kääriäinen for the initial patch
and Tim Graham for the review.
