Opened 5 years ago

Closed 5 years ago

#30206 closed Cleanup/optimization (duplicate)

Optimise paginator for tables with massive records

Reported by: M. Javidan Darugar Owned by: M. Javidan Darugar
Component: Core (Other) Version: 2.1
Severity: Normal Keywords:
Cc: Triage Stage: Unreviewed
Has patch: no Needs documentation: no
Needs tests: no Patch needs improvement: no
Easy pickings: no UI/UX: no

Description

I had a problem with the Paginator class when slicing a huge dataset. With millions of records it can take 30 seconds to reach the last page, so I came up with a solution to overcome this problem.

The problem is that, to slice the data, the current implementation runs the full query and slices it bottom:top. The generated SQL looks like SELECT ID, COL_1, ..., COL_N, ... WHERE ..., which puts a heavy burden on the database when slicing the data. To overcome this we can instead select only the primary keys, do the slicing step on that, and then fetch the records whose pk is in the sliced list. A very simple but very efficient solution. Using this approach I improved the performance of our project significantly: from 30 seconds down to only 2-3 seconds for 8 million records.
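The two-step query described above can be sketched in plain SQL via sqlite3 (the ticket gives no patch, so the table and column names `item`, `id`, `payload` and the function `fetch_page` are illustrative, not Django's actual code):

```python
import sqlite3

def fetch_page(conn, page_number, page_size):
    offset = (page_number - 1) * page_size
    # Step 1: slice over the primary key only, so the database pages
    # through a narrow (usually indexed) column instead of every
    # selected column.
    cur = conn.execute(
        "SELECT id FROM item ORDER BY id LIMIT ? OFFSET ?",
        (page_size, offset),
    )
    pks = [row[0] for row in cur.fetchall()]
    if not pks:
        return []
    # Step 2: fetch the full rows for just that page of primary keys.
    placeholders = ",".join("?" * len(pks))
    cur = conn.execute(
        f"SELECT id, payload FROM item WHERE id IN ({placeholders}) ORDER BY id",
        pks,
    )
    return cur.fetchall()

# Demo data: 100 small rows in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO item (id, payload) VALUES (?, ?)",
    [(i, f"row-{i}") for i in range(1, 101)],
)
page = fetch_page(conn, page_number=3, page_size=10)  # rows 21-30
```

In Django terms the equivalent would be slicing a `values_list('pk', flat=True)` queryset and then filtering with `pk__in`; the speedup comes from the OFFSET scan touching only the key column.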

In this ticket I propose the same approach to improve Django's Paginator class.

Change History (3)

comment:1 by M. Javidan Darugar, 5 years ago

Owner: changed from javidan.m.d@… to M. Javidan Darugar

comment:2 by M. Javidan Darugar, 5 years ago

Owner: changed from M. Javidan Darugar to M. Javidan Darugar

comment:3 by M. Javidan Darugar, 5 years ago

Resolution: duplicate
Status: assigned → closed