﻿id	summary	reporter	owner	description	type	status	component	version	severity	resolution	keywords	cc	stage	has_patch	needs_docs	needs_tests	needs_better_patch	easy	ui_ux
36430	bulk_batch_size() special-cases single field on SQLite according to outdated limit	Jacob Walls		"On SQLite, `bulk_batch_size()` [https://github.com/django/django/blob/1a744343999c9646912cee76ba0a2fa6ef5e6240/django/db/backends/sqlite3/operations.py#L49 special-cases] a field list of length 1 and applies the `SQLITE_MAX_COMPOUND_SELECT` limit to arrive at a value of 500.

I think this must date from before bulk inserts used `VALUES` syntax, which became available in SQLite in 2012, see [https://github.com/laravel/framework/issues/25262#issuecomment-414836191 discussion in Laravel].
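The `VALUES` claim is easy to check standalone. A minimal sketch, assuming a Python build linked against SQLite >= 3.8.8 (where large `VALUES` lists no longer count against `SQLITE_MAX_COMPOUND_SELECT`):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE t (pk INTEGER PRIMARY KEY)')

# Insert 501 rows in one statement using multi-row VALUES syntax; under
# the old compound-SELECT emulation this would have exceeded the 500-term
# SQLITE_MAX_COMPOUND_SELECT limit.
n = 501
placeholders = ', '.join(['(?)'] * n)
conn.execute('INSERT INTO t (pk) VALUES ' + placeholders, list(range(n)))

count = conn.execute('SELECT COUNT(*) FROM t').fetchone()[0]
```

On any reasonably recent SQLite this inserts all 501 rows without complaint, i.e. the 500 cap does not bite for `VALUES`-style bulk inserts.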

When the list of fields exceeds 1, we take the `elif` branch and arrive at much higher limits. I'm pretty sure this shows that the 500-row cap for `fields=[""pk""]` is overly protective and the special case can simply be removed.
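For reference, the branch logic being discussed can be sketched as a standalone function (a simplified sketch, not Django's exact code; the 32766 figure is an assumed modern default for `SQLITE_MAX_VARIABLE_NUMBER`):

```python
SQLITE_MAX_COMPOUND_SELECT = 500  # the outdated single-field cap
MAX_QUERY_PARAMS = 32766          # assumed modern SQLITE_MAX_VARIABLE_NUMBER

def bulk_batch_size(fields, objs):
    # Single-field batches are capped at 500 rows by the special case.
    if len(fields) == 1:
        return SQLITE_MAX_COMPOUND_SELECT
    # Multi-field batches get a far larger limit from max_query_params.
    elif len(fields) > 1:
        return MAX_QUERY_PARAMS // len(fields)
    return len(objs)
```

With these numbers, `bulk_batch_size(['pk'], objs)` returns 500 while `bulk_batch_size(['a', 'b'], objs)` returns 16383, which is the asymmetry described above.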

I don't have a unit test to provide, but you can play with changing the limit from 500 to 501 and see that `test_large_delete` still passes (no trouble selecting 501 objects). (You can also adjust `test_max_batch_size()` to provide a s

(I found this while trying to [https://github.com/django/django/pull/19502#discussion_r2118612666 refactor] a different call site away from `max_query_params` in the hopes of just calling `bulk_batch_size()` until I saw how overly protective it was.)

I think this would be a good idea to resolve before anyone invests effort in [https://github.com/django/django/pull/19427/files#r2062643293 reading the dynamic limit] for this potentially irrelevant param."	Cleanup/optimization	new	Database layer (models, ORM)	dev	Normal		SQLITE_MAX_COMPOUND_SELECT, bulk_create		Unreviewed	0	0	0	0	0	0
