Opened 14 years ago
Closed 9 years ago
#14131 closed Cleanup/optimization (fixed)
The pagination module should have some limit, or a warning should be given in the documentation
| Reported by: | Mike Lissner | Owned by: | Daniel Jilg |
|---|---|---|---|
| Component: | Documentation | Version: | dev |
| Severity: | Normal | Keywords: | |
| Cc: | winsmith@… | Triage Stage: | Accepted |
| Has patch: | yes | Needs documentation: | no |
| Needs tests: | no | Patch needs improvement: | no |
| Easy pickings: | no | UI/UX: | no |
Description
Perhaps I'm off base on this, but I was just analyzing my slow queries log, and I noticed that the pagination module produces some incredibly slow queries in MySQL when high page numbers are requested. This just took down my server, so I was thinking about possible solutions.
The pagination module uses MySQL's OFFSET clause, which, as I understand it, must scan past every row up to the OFFSET value before it can return results.
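As a small illustration (not Django's actual implementation), the offset a paginator hands to the database grows linearly with the page number, so the number of rows the database must scan and discard grows with it:

```python
def page_bounds(page_number, per_page):
    """Compute the (offset, limit) pair for a 1-based page number.

    The database must walk past `offset` rows before returning the
    page, so the cost of LIMIT/OFFSET grows with the page number.
    """
    offset = (page_number - 1) * per_page
    return offset, per_page  # i.e. ... LIMIT per_page OFFSET offset

print(page_bounds(1, 25))      # (0, 25): page 1 is cheap
print(page_bounds(10000, 25))  # (249975, 25): ~250,000 rows scanned first
```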
The two solutions I thought of for this were:
- Add something to the documentation that indicates how this might produce very slow queries.
- Add code to the pagination module to either:
- Solve the problem with smarter MySQL queries (not sure if/how this is possible, sorry); or
- Allow a max_pages attribute, which throws some kind of error when the max_page number is exceeded. It could even have a sane default, keeping people within normal limits.
Admittedly, 2b is the solution I'm coding up right now, but it would be nice if Django enforced, or at least hinted at, a solution to this problem.
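A minimal sketch of what option 2b could look like. All names here (`BoundedPaginator`, `max_pages`, `MaxPagesExceeded`) are hypothetical, and this is a plain-Python stand-in rather than Django's `Paginator` API:

```python
class MaxPagesExceeded(Exception):
    """Raised when a request asks for a page beyond the allowed limit."""


class BoundedPaginator:
    """Paginator that refuses page numbers above max_pages, keeping
    the resulting OFFSET (and therefore the query cost) bounded."""

    def __init__(self, object_list, per_page, max_pages=100):
        self.object_list = object_list
        self.per_page = per_page
        self.max_pages = max_pages  # sane default keeps offsets small

    def page(self, number):
        if number < 1 or number > self.max_pages:
            raise MaxPagesExceeded(
                f"page {number} outside allowed range 1..{self.max_pages}"
            )
        start = (number - 1) * self.per_page
        return self.object_list[start:start + self.per_page]


p = BoundedPaginator(list(range(1000)), per_page=10, max_pages=20)
print(p.page(2))  # items 10..19; p.page(21) raises MaxPagesExceeded
```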
I've also (FWIW) tested pagination on Google, Yahoo, and Bing, and all three limit their results (Bing: page 20, Google: page 100, and Yahoo: page 100).
Change History (15)
comment:1 by , 14 years ago
| Triage Stage: | Unreviewed → Accepted |
|---|---|
comment:2 by , 14 years ago
| Component: | Contrib apps → Core framework |
|---|---|
comment:3 by , 13 years ago
| Severity: | → Normal |
|---|---|
| Type: | → Cleanup/optimization |
comment:6 by , 9 years ago
| Component: | Core (Other) → Documentation |
|---|---|
comment:7 by , 9 years ago
| Cc: | added |
|---|---|
| Owner: | changed from | to |
| Status: | new → assigned |
comment:11 by , 9 years ago
| Has patch: | unset |
|---|---|
| Resolution: | fixed |
| Status: | closed → new |
Reopening per my comment on the PR that was missed before the merge: "I think you should at least describe the reason so that someone knowledgeable about their database can ascertain whether or not the warning applies to them."
comment:15 by , 9 years ago
| Resolution: | → fixed |
|---|---|
| Status: | new → closed |
I agree that this should be documented as a potential gotcha.
LIMIT/OFFSET is slow for the reasons you mentioned.
There are a number of ways around this problem. One approach is to index a table and use WHERE/LIMIT rather than LIMIT/OFFSET. I'm not sure where in Django this sort of optimization belongs. On the one hand, it's a problem shared by all queryset slicing operations that produce a LIMIT/OFFSET statement. On the other, it requires specific knowledge about the contents of the table (and an index), so it might be easier to pass the clue in to a custom bit here in Pagination. Or possibly I'm completely off base here.
More discussion of the same issue here:
http://stackoverflow.com/questions/1243952/how-can-i-speed-up-a-mysql-query-with-a-large-offset-in-the-limit-clause
and slides 12 to 14 discuss the WHERE/LIMIT approach here:
http://www.slideshare.net/Eweaver/efficient-pagination-using-mysql
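The WHERE/LIMIT (keyset) approach described above can be sketched as follows. This is a standalone illustration using SQLite in place of MySQL; the table and column names are made up, but the query shapes are the point: instead of skipping rows with OFFSET, remember the last key seen and seek on an indexed column.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entry (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO entry (id, body) VALUES (?, ?)",
    [(i, f"row {i}") for i in range(1, 101)],
)

PER_PAGE = 10

def page_by_offset(page_number):
    # LIMIT/OFFSET: the database walks past (page_number - 1) * PER_PAGE
    # rows before returning anything, so deep pages get slower.
    offset = (page_number - 1) * PER_PAGE
    return conn.execute(
        "SELECT id FROM entry ORDER BY id LIMIT ? OFFSET ?",
        (PER_PAGE, offset),
    ).fetchall()

def page_after(last_seen_id):
    # WHERE/LIMIT (keyset): the primary-key index lets the database
    # seek straight to the first row of the page, at any depth.
    return conn.execute(
        "SELECT id FROM entry WHERE id > ? ORDER BY id LIMIT ?",
        (last_seen_id, PER_PAGE),
    ).fetchall()

# Both forms return the same page 3 (ids 21..30); only the keyset
# form stays fast as the page number grows.
assert page_by_offset(3) == page_after(20)
```

Note the trade-off the comment raises: the keyset form needs to know which indexed column defines the ordering and what the last seen value was, which is exactly the table-specific knowledge a generic paginator doesn't have.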