Opened 14 years ago
Closed 11 years ago
#15825 closed Bug (fixed)
File-based caching doesn't cull truly randomly
Reported by: | gkuenning | Owned by: | nobody |
---|---|---|---|
Component: | Core (Cache system) | Version: | dev |
Severity: | Normal | Keywords: | |
Cc: | Jaap Roes | Triage Stage: | Accepted |
Has patch: | yes | Needs documentation: | no |
Needs tests: | no | Patch needs improvement: | no |
Easy pickings: | no | UI/UX: | no |
Description
The culling algorithm in filebased.py uses an essentially random culling method, which is well known to approximate LRU. However, the randomness is improperly applied. For example, consider a frequently accessed page that, by luck, happens to sort first in the file list (due to an alphabetically low hash value). The culling code will always delete this file, while a truly random algorithm would only do so 1/n of the time.
The fix is easy: at approximately line 131 of filebased.py, "== 0" should be replaced with "== ranno", where ranno is chosen on the immediately preceding line (random.randint is inclusive at both ends, hence the - 1):
ranno = random.randint(0, self._cull_frequency - 1)
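For illustration, a minimal standalone sketch of the idea (not the ticket's actual patch; `choose_doomed`, `cache_dir`, and `cull_frequency` are made-up names standing in for the backend's internals):

```python
import os
import random

def choose_doomed(cache_dir, cull_frequency):
    """Return the cache entries to cull, using a random offset so that no
    single entry (e.g. the alphabetically first one) is always the victim."""
    entries = sorted(os.listdir(cache_dir))  # the backend sorts the list per #14250
    if cull_frequency <= 1:
        return entries  # cull everything
    # A random offset in [0, cull_frequency - 1]; with the offset fixed at 0,
    # the first entry in the sorted list would be doomed on every cull.
    offset = random.randint(0, cull_frequency - 1)
    return [name for i, name in enumerate(entries)
            if i % cull_frequency == offset]
```

With cull_frequency = 3 and entries a b c d e f, the original "== 0" test always dooms a and d; with the random offset it dooms {a, d}, {b, e}, or {c, f} with equal probability.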
Attachments (2)
Change History (11)
comment:1 by , 14 years ago
Easy pickings: | unset |
---|---|
Triage Stage: | Unreviewed → Accepted |
Version: | 1.2 → SVN |
comment:2 by , 14 years ago
If this is fixed, please note that the tests will need some subtle fixing, particularly FileBasedCacheTests.test_cull. The tests check for a specific number of deleted entries, but because of the way files are grouped into folders in the file-based backend, the number deleted can change once culling is randomized. The sorted() call just a few lines above was added for exactly this reason, to fix a difference that appeared across platforms; see #14250.
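One way to make the test tolerant of randomness is to assert a range rather than a pinned count. A purely hypothetical sketch (the helper name, key pattern, and bounds are illustrative, not the suite's actual code; it assumes initial_count exceeds the cache's max_entries so a cull actually happens):

```python
def assert_cull_in_range(test_case, cache, initial_count, cull_frequency):
    """Insert initial_count entries, then check that a cull removed roughly
    1/cull_frequency of them without pinning an exact number."""
    for i in range(initial_count):
        cache.set("cull-%d" % i, "value", 1000)
    surviving = sum(1 for i in range(initial_count)
                    if cache.get("cull-%d" % i) is not None)
    expected_culled = initial_count // cull_frequency
    # Entries are grouped into hash-prefix directories, so a cull pass may
    # remove somewhat more or fewer keys than the nominal 1/cull_frequency.
    test_case.assertLess(surviving, initial_count)
    test_case.assertGreaterEqual(surviving, initial_count - 2 * expected_culled)
```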
comment:3 by , 14 years ago
See also #15806. Not directly related, but it is right in that same part of the code and probably worth fixing at the same time.
by , 13 years ago
Attachment: | merged_patch.diff added |
---|
comment:4 by , 13 years ago
Has patch: | set |
---|---|
UI/UX: | unset |
I've added a patch against trunk AND a patch that also fixes #15806 (the latter is simply the result of a merge), to avoid merge conflicts.
I've removed sorted(), since it makes no sense with random culling. I've also replaced the nested for loops that traverse directories to delete paths with shutil.rmtree, which does the same thing.
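The simplification described reads roughly like this (paraphrased for illustration, not lifted from the attached diff):

```python
import shutil

def delete_doomed(doomed_dirs):
    """Remove each doomed hash-prefix directory tree in a single call,
    replacing the old nested os.walk()/os.remove() loop."""
    for topdir in doomed_dirs:
        shutil.rmtree(topdir, ignore_errors=True)
```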
The tests changed in two ways: "cull%d" became "cull-%d", and the initial insert count for the cull test was reduced from 49 to 34 so that there are 34 unique two-character prefixes after hashing (a quick check is sketched after the list below). This is admittedly a bit crude, but then #14250 was about as crude as this.
I see two other ways to solve this last problem with the tests:
- change the length of the directory name from 2 to 3 characters, so there are 4096 possible prefixes (conceptually the same hack, just from a different point of view);
- change the way culling works: delete exactly _num_entries / _cull_frequency FILES rather than directories (a bad option, since it will not scale to large cache sizes).
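A quick way to sanity-check the prefix count; note the real backend hashes the full cache key produced by make_key() (key prefix and version included), so the exact number can differ from this illustration:

```python
import hashlib

keys = ["cull-%d" % i for i in range(34)]
prefixes = {hashlib.md5(k.encode()).hexdigest()[:2] for k in keys}
print(len(prefixes))  # number of distinct first-level cache directories
```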
Waiting for review.
comment:6 by , 11 years ago
Just ran the latest patch "merge_patch" against the cache tests; they passed. Then I ran the full test suite with the patch applied, which produced a failure. Stack trace:
FAIL: test_cull (cache.tests.FileBasedCacheTests)
Traceback (most recent call last):
File "../django/tests/cache/tests.py", line 1085, in test_cull
self.perform_cull_test(35, 24)
File "../django/tests/cache/tests.py", line 495, in perform_cull_test
self.assertEqual(count, final_count)
AssertionError: 23 != 24
comment:7 by , 11 years ago
Cc: | added |
---|
comment:8 by , 11 years ago
Created a pull request that addresses this issue: https://github.com/django/django/pull/1514 (also #20536)
comment:9 by , 11 years ago
Resolution: | → fixed |
---|---|
Status: | new → closed |
The referenced line of code in trunk is now line 123:
doomed = [os.path.join(self._dir, k) for (i, k) in enumerate(filelist) if i % self._cull_frequency == 0]
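For comparison, a uniformly random cull can also be drawn directly with random.sample rather than picking a random offset into a modulo pattern. A minimal sketch of that approach (illustrative only, not necessarily what the pull request does):

```python
import os
import random

def cull_uniformly(cache_dir, cull_frequency):
    """Delete a uniformly random 1/cull_frequency of the cache files."""
    filelist = [os.path.join(cache_dir, name) for name in os.listdir(cache_dir)]
    if cull_frequency <= 1:
        doomed = filelist  # cull everything
    else:
        doomed = random.sample(filelist, len(filelist) // cull_frequency)
    for path in doomed:
        try:
            os.remove(path)
        except OSError:
            pass
```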