Opened 3 weeks ago
Last modified 2 weeks ago
#36144 new Cleanup/optimization
DatabaseOperations.bulk_batch_size() should consider more database limits on SQLite and Oracle
Description (last modified by )
DatabaseOperations.bulk_batch_size() is used to calculate the maximum batch size for operations such as bulk_update() and bulk_create(). While investigating the impact of composite primary keys on the maximum batch size calculation for bulk_update(), it became clear that there are more database limits that need to be considered when calculating the maximum batch size in order to have a bulletproof solution.
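For context, here is a minimal sketch (not Django's actual internals) of how the value returned by the backend's bulk_batch_size() caps the batch size when objects are split for a bulk operation; the batch_objects() helper below is hypothetical:

```python
# Hypothetical helper illustrating how connection.ops.bulk_batch_size()
# caps the batch size used when splitting objects for a bulk operation.
from django.db import connections

def batch_objects(objs, fields, using="default"):
    """Split objs into batches no larger than the backend says it can handle."""
    max_batch_size = connections[using].ops.bulk_batch_size(fields, objs)
    batch_size = max(1, max_batch_size)
    return [objs[i:i + batch_size] for i in range(0, len(objs), batch_size)]
```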
One limit possibly in play on SQLite is SQLITE_MAX_EXPR_DEPTH, which defaults to 1000 (see https://www.sqlite.org/limits.html#max_expr_depth).
On Oracle, we found that a query could fail with the ambiguous message ORA-00907: missing right parenthesis, which may be due to hitting some limit (possibly one documented here: https://docs.oracle.com/en/database/oracle/oracle-database/23/lnpls/plsql-program-limits.html).
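As an illustration only, the sketch below shows what a batch size calculation that also accounts for these limits might look like. Every name and the per-row expression-depth estimate are assumptions made for the example, not a proposed patch:

```python
# Hypothetical sketch of a bulk_batch_size() calculation that considers both
# the bound-parameter limit and SQLite's SQLITE_MAX_EXPR_DEPTH (default 1000).
# The "expressions per row" estimate is a placeholder assumption.
def bulk_batch_size(fields, objs, max_query_params=999, max_expr_depth=1000):
    if not fields:
        return len(objs)
    # Each row contributes roughly len(fields) bound parameters.
    by_params = max_query_params // len(fields)
    # Each row also adds nesting to the generated expression tree, e.g. the
    # CASE/WHEN chains built by bulk_update() or composite-PK tuple filters.
    exprs_per_row = len(fields)  # assumption; the real cost depends on the SQL
    by_depth = max_expr_depth // exprs_per_row
    return max(1, min(by_params, by_depth))
```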
We may need to revisit the API design.
PR discussion: https://github.com/django/django/pull/19088#discussion_r1929940327
Ticket which sparked the discussion/discovery: #36118
According to the ticket's flags, the next step(s) to move this issue forward are:
- To provide a patch by sending a pull request. Claim the ticket when you start working so that someone else doesn't duplicate effort. Before sending a pull request, review your work against the patch review checklist. Check the "Has patch" flag on the ticket after sending a pull request, and include a link to the pull request in the ticket comment when making that update. The usual format is [https://github.com/django/django/pull/#### PR].
Change History (2)
comment:1 by , 3 weeks ago
Description: modified (diff)
comment:2 by , 2 weeks ago
Triage Stage: Unreviewed → Accepted
Version: 5.1 → 5.2
Thank you Sarah, I read the conversation and it makes sense.