Opened 9 months ago
Last modified 10 days ago
#36144 new Cleanup/optimization
DatabaseOperations.bulk_batch_size() should consider more database limits on SQLite and Oracle
| Reported by: | Sarah Boyce | Owned by: | |
|---|---|---|---|
| Component: | Database layer (models, ORM) | Version: | 5.2 |
| Severity: | Normal | Keywords: | |
| Cc: | Simon Charette | Triage Stage: | Accepted |
| Has patch: | no | Needs documentation: | no |
| Needs tests: | no | Patch needs improvement: | no |
| Easy pickings: | no | UI/UX: | no |
Description
DatabaseOperations.bulk_batch_size() is used to calculate the maximum batch size when doing operations such as bulk_update() and bulk_create().
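For context, backends cap the batch by overriding this hook. A minimal sketch, approximating (and simplifying) the SQLite backend's current logic, shows that the cap is derived from the bound-parameter limit alone:

```python
from django.db.backends.base.operations import BaseDatabaseOperations


class DatabaseOperations(BaseDatabaseOperations):
    def bulk_batch_size(self, fields, objs):
        # Each object consumes len(fields) bound parameters, so divide
        # the backend's parameter limit by the per-object cost. No
        # expression-depth or statement-size limits are considered.
        if len(fields) > 1:
            return self.connection.features.max_query_params // len(fields)
        return len(objs)
```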
When investigating the impact of composite primary keys on the maximum batch size calculation for bulk_update(), it became clear that more database limits need to be considered when calculating the maximum batch size in order to have a bulletproof solution.
One possible limit in play on SQLite is SQLITE_MAX_EXPR_DEPTH, which defaults to 1000 (see https://www.sqlite.org/limits.html#max_expr_depth).
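To illustrate why the parameter budget alone is not enough, here is a hypothetical reproduction (the model and field names are assumptions, not taken from the ticket):

```python
# Assumed model with a composite primary key, e.g.:
#   pk = models.CompositePrimaryKey("order_id", "line_no")
from myapp.models import OrderLine

lines = list(OrderLine.objects.all())
for line in lines:
    line.quantity += 1

# With a composite pk, each object's lookup in the batch filter expands
# to "(order_id = ? AND line_no = ?) OR ..."; every OR nests the
# expression tree one level deeper, so a large batch can exceed
# SQLITE_MAX_EXPR_DEPTH even while the bound-parameter count stays
# within max_query_params, and SQLite rejects the query with
# "Expression tree is too large (maximum depth 1000)".
OrderLine.objects.bulk_update(lines, ["quantity"])
```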
On Oracle, we found that a query could fail with the ambiguous message "ORA-00907: missing right parenthesis", which may be due to hitting some limit (possibly documented here: https://docs.oracle.com/en/database/oracle/oracle-database/23/lnpls/plsql-program-limits.html).
We may need to revisit the API design.
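One conceivable shape for a fix, purely illustrative and not a proposal from this ticket, is to clamp the batch by an expression-depth budget in addition to the parameter budget:

```python
from django.db.backends.base.operations import BaseDatabaseOperations


class DatabaseOperations(BaseDatabaseOperations):
    # Assumption: SQLite's default SQLITE_MAX_EXPR_DEPTH.
    max_expression_depth = 1000

    def bulk_batch_size(self, fields, objs):
        if not fields:
            return len(objs)
        param_cap = self.connection.features.max_query_params // len(fields)
        # Rough assumption: each object contributes a few nodes per field
        # to the generated expression tree (OR/AND chains, CASE branches).
        nodes_per_obj = 3 * len(fields)
        depth_cap = max(1, self.max_expression_depth // nodes_per_obj)
        return max(1, min(param_cap, depth_cap))
```

The difficulty is that the per-object expression cost depends on the query being built (plain vs. composite pk, number of updated fields), which may be why the hook's current (fields, objs) signature does not carry enough information.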
PR discussion: https://github.com/django/django/pull/19088#discussion_r1929940327
Ticket which sparked the discussion/discovery: #36118
Change History (2)
comment:1 by , 9 months ago
| Description: | modified (diff) |
|---|---|
comment:2 by , 9 months ago
| Triage Stage: | Unreviewed → Accepted |
|---|---|
| Version: | 5.1 → 5.2 |
Thank you, Sarah. I read the conversation and it makes sense.