Opened 5 weeks ago
Last modified 5 weeks ago
#36144 new Cleanup/optimization
DatabaseOperations.bulk_batch_size() should consider more database limits on SQLite and Oracle — at Initial Version
Description
DatabaseOperations.bulk_batch_size() is used to calculate the maximum batch size for operations such as bulk_update() and bulk_create(). While investigating the impact of composite primary keys on the maximum batch size calculation for bulk_update(), it became clear that more database limits need to be taken into account in this calculation for the solution to be bulletproof.
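To illustrate the kind of calculation involved, here is a minimal sketch (not Django's actual implementation, which lives per-backend under django.db.backends.*.operations) of a bulk_batch_size() that caps batches by SQLite's bound-parameter limit; the default limit of 999 is an assumption taken from older SQLite builds:

```python
# Illustrative sketch of a backend bulk_batch_size(): cap the batch so
# that rows * fields bound parameters stay under the host-parameter
# limit (SQLITE_MAX_VARIABLE_NUMBER, historically 999 by default).
def bulk_batch_size(fields, objs, max_query_params=999):
    """Return the largest batch size that keeps the number of bound
    parameters per statement within the database's limit."""
    if not fields:
        return len(objs)
    return min(max_query_params // len(fields), len(objs))
```

With three fields and 1,000 objects, for example, this yields batches of 333 rows (999 // 3), so each INSERT stays under the parameter limit.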
One possible limit in play on SQLite is SQLITE_MAX_EXPR_DEPTH, which defaults to 1000 (see https://www.sqlite.org/limits.html#max_expr_depth).
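A fix along these lines would take the most restrictive of several limits rather than only the parameter count. The sketch below is purely hypothetical: the constants and the per-row depth factor are assumptions for illustration, since the actual expression depth of the generated SQL (e.g. the nested CASE WHEN clauses bulk_update() builds) depends on the query shape:

```python
# Hypothetical sketch: size a batch by taking the minimum over several
# SQLite limits. The constants are compile-time defaults (assumptions),
# and DEPTH_PER_ROW is an assumed nesting cost per row in the generated SQL.
SQLITE_MAX_VARIABLE_NUMBER = 999   # assumed default host-parameter limit
SQLITE_MAX_EXPR_DEPTH = 1000       # assumed default expression-tree depth limit
DEPTH_PER_ROW = 3                  # assumed expression nesting added per row

def safe_batch_size(num_fields, num_objs):
    """Return a batch size respecting both the parameter-count and
    expression-depth limits (illustrative only)."""
    by_params = SQLITE_MAX_VARIABLE_NUMBER // max(num_fields, 1)
    by_depth = SQLITE_MAX_EXPR_DEPTH // DEPTH_PER_ROW
    return max(1, min(by_params, by_depth, num_objs))
```

The point is that the depth-based bound can be tighter than the parameter-based one, so considering only SQLITE_MAX_VARIABLE_NUMBER can still produce a batch that SQLite rejects.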
On Oracle, we found that a query could fail with the ambiguous message ORA-00907: missing right parenthesis, which may be due to hitting some limit (possibly documented here: https://docs.oracle.com/en/database/oracle/oracle-database/23/lnpls/plsql-program-limits.html).
We may need to revisit the API design.
PR discussion: https://github.com/django/django/pull/19088#discussion_r1929940327
According to the ticket's flags, the next step(s) to move this issue forward are:
- To provide a patch by sending a pull request. Claim the ticket when you start working so that someone else doesn't duplicate effort. Before sending a pull request, review your work against the patch review checklist. After sending a pull request, check the "Has patch" flag on the ticket and include a link to the pull request in the ticket comment when making that update. The usual format is: [https://github.com/django/django/pull/#### PR].