#36144 new Cleanup/optimization

DatabaseOperations.bulk_batch_size() should consider more database limits on SQLite and Oracle — at Initial Version

Reported by: Sarah Boyce
Owned by:
Component: Database layer (models, ORM)
Version: 5.2
Severity: Normal
Keywords:
Cc: Simon Charette
Triage Stage: Accepted
Has patch: no
Needs documentation: no
Needs tests: no
Patch needs improvement: no
Easy pickings: no
UI/UX: no

Description

DatabaseOperations.bulk_batch_size() is used to calculate the maximum batch size for operations such as bulk_update() and bulk_create().
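
For context, a minimal sketch of how the ORM consults this hook; the Book model, its fields, and the myapp module are illustrative assumptions, not part of this ticket:

{{{#!python
from django.db import connection

from myapp.models import Book  # hypothetical model with title/price fields

objs = [Book(title=f"Book {i}", price=i) for i in range(10_000)]
fields = [Book._meta.get_field("title"), Book._meta.get_field("price")]

# The backend reports how many objects fit in a single statement given the
# per-row column/parameter count; bulk_create()/bulk_update() then chunk
# objs into batches of at most this size.
max_batch_size = connection.ops.bulk_batch_size(fields, objs)
print(max_batch_size)
}}}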

When investigating the impact of composite primary keys on the maximum batch size calculation for bulk_update(), it became clear that more database limits need to be considered when calculating the maximum batch size in order to have a bulletproof solution.
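
For reference, the SQLite backend's current calculation only divides the bound-parameter limit by the number of columns per row and ignores other limits. A simplified, standalone paraphrase (an assumption based on django/db/backends/sqlite3/operations.py; the function name and shape are adapted for illustration):

{{{#!python
# Simplified paraphrase of the SQLite backend's bulk_batch_size(): only the
# bound-parameter limit is considered; limits such as SQLITE_MAX_EXPR_DEPTH
# are not taken into account.
def sqlite_bulk_batch_size(max_query_params, fields, objs):
    if len(fields) == 1:
        # Single-column case is capped at 500 (SQLITE_MAX_COMPOUND_SELECT).
        return 500
    if len(fields) > 1:
        return max_query_params // len(fields)
    return len(objs)

# e.g. 999 parameters and 4 columns per row -> batches of 249 objects
print(sqlite_bulk_batch_size(999, ["a", "b", "c", "d"], objs=[]))
}}}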

One possible limit in play on SQLite is SQLITE_MAX_EXPR_DEPTH, which defaults to 1000 (see https://www.sqlite.org/limits.html#max_expr_depth; a quick way to check the enforced value is sketched below).
On Oracle, we found that a query could fail with the ambiguous message ORA-00907: missing right parenthesis, which may be due to hitting some limit (possibly one of those documented at https://docs.oracle.com/en/database/oracle/oracle-database/23/lnpls/plsql-program-limits.html).
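
A minimal sketch, assuming Python 3.11+ (for Connection.getlimit()), of inspecting the expression-depth limit a given SQLite build actually enforces:

{{{#!python
import sqlite3

conn = sqlite3.connect(":memory:")
# Run-time value of the SQLITE_LIMIT_EXPR_DEPTH limit; the compile-time
# default for SQLITE_MAX_EXPR_DEPTH is 1000.
print(conn.getlimit(sqlite3.SQLITE_LIMIT_EXPR_DEPTH))
conn.close()
}}}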

We may need to revisit the API design.

PR discussion: https://github.com/django/django/pull/19088#discussion_r1929940327
