Opened 3 months ago
Last modified 3 weeks ago
#36430 closed Cleanup/optimization
bulk_batch_size() special-cases single field on SQLite according to outdated limit — at Version 1
Reported by: | Jacob Walls | Owned by: | |
---|---|---|---|
Component: | Database layer (models, ORM) | Version: | dev |
Severity: | Normal | Keywords: | SQLITE_MAX_COMPOUND_SELECT, bulk_create |
Cc: | Sage Abdullah | Triage Stage: | Ready for checkin |
Has patch: | yes | Needs documentation: | no |
Needs tests: | no | Patch needs improvement: | no |
Easy pickings: | no | UI/UX: | no |
Description
On SQLite, bulk_batch_size() special-cases a field list of length 1 and applies the SQLITE_MAX_COMPOUND_SELECT limit to arrive at a value of 500.
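For reference, the logic in django/db/backends/sqlite3/operations.py looks roughly like this (paraphrased; docstrings and exact constants may differ by Django version):

```python
def bulk_batch_size(self, fields, objs):
    # Single-field inserts are capped at 500, the default value of
    # SQLITE_MAX_COMPOUND_SELECT, a limit that only mattered for the
    # old compound-SELECT insert syntax.
    if len(fields) == 1:
        return 500
    elif len(fields) > 1:
        # Multi-field inserts are instead limited by the number of
        # bound parameters available per statement.
        return self.connection.features.max_query_params // len(fields)
    else:
        return len(objs)
```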
I think this must date from before bulk inserts used VALUES syntax, which became available in SQLite in 2012 (see the discussion in Laravel).
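As a quick standalone check (my own illustration, not part of the original report), modern SQLite accepts a single-column VALUES insert of well over 500 rows, so SQLITE_MAX_COMPOUND_SELECT is not the binding limit there:

```python
import sqlite3

# 501 rows, one column each: 501 bound parameters, comfortably under
# the default SQLITE_MAX_VARIABLE_NUMBER, but more than the 500-row
# SQLITE_MAX_COMPOUND_SELECT limit of the pre-VALUES insert syntax.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
placeholders = ", ".join(["(?)"] * 501)
conn.execute(
    f"INSERT INTO t (name) VALUES {placeholders}",
    [f"row {i}" for i in range(501)],
)
assert conn.execute("SELECT COUNT(*) FROM t").fetchone()[0] == 501
```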
When the list of fields exceeds 1, we go through the elif branch and arrive at much higher limits. I'm pretty sure this shows that the limit of 500 for fields=["pk"] is overly protective and can just be removed.
I don't have a unit test to provide, but you can play with changing the limit from 500 to 501 and see that test_large_delete still passes (no trouble selecting 501 objects). (You can also adjust test_max_batch_size() to provide a similar demonstration; see below.)
(I found this while trying to refactor a different call site away from max_query_params, in the hopes of just calling bulk_batch_size(), until I saw how overly protective it was.)
I think this would be worth resolving before anyone invests effort in reading the SQLITE_MAX_COMPOUND_SELECT limit dynamically, given that the parameter may be irrelevant here.
Here's a rough test after all to demo: adjust test_max_batch_size in tests/bulk_create/tests.py so the batch size is computed from a single field rather than the full list ["name", "iso_two_letter", "description"]. The test then fails because bulk_create() was *more* efficient than expected.