FileSystemStorage.modified_time() is not timezone aware
Reported by: James Aylett
Owned by: nobody
Has patch: no
Needs documentation: no
Needs tests: no
Patch needs improvement: no
I'm using the S3BotoStorage backend from django_storages, and collectstatic does not always upload new copies of static files. S3BotoStorage modified times come back in UTC (verified against Amazon's S3 console):
```python
from django.contrib.staticfiles import finders
from django.contrib.staticfiles.storage import staticfiles_storage as sf

f = list(finders.get_finders())[-1]       # last configured finder
pss = list(f.list([]))                    # (path, storage) pairs, no ignore patterns
path, source_storage = pss[-1]

sf._setup()
tlm = sf.modified_time(path)              # target (S3) modified time
slm = source_storage.modified_time(path)  # source (local) modified time
print tlm
print slm
```
```
2014-05-02 10:25:33
2014-05-02 09:57:57
```
However the local machine is in PDT (it's a Heroku dyno on an Amazon west coast instance somewhere):
```
>>> import os
>>> os.system('ls -l static/css')
total 8
-rw------- 1 u29977 29977 5838 2014-05-02 09:57 base.css
0
>>> os.system('date')
Fri May 2 11:39:28 PDT 2014
0
```
So the correct modified time in UTC for the local
static/css/base.css is 2014-05-02 16:57:57, or several hours *after* the target file on S3 was last modified.
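To make the skew concrete, here is a minimal sketch (mine, not from the ticket) of the underlying behaviour: with the process timezone forced to US/Pacific, fromtimestamp() renders the instant 2014-05-02 16:57:57 UTC as 09:57:57 local, and the resulting object carries no tzinfo at all.

```python
import os
import time
import datetime

# Assumption: a POSIX system where time.tzset() honours the TZ variable.
os.environ['TZ'] = 'US/Pacific'
time.tzset()

ts = 1399049877  # epoch value for 2014-05-02 16:57:57 UTC (09:57:57 PDT)

naive_local = datetime.datetime.fromtimestamp(ts)   # local wall clock
naive_utc = datetime.datetime.utcfromtimestamp(ts)  # UTC wall clock

print(naive_local.tzinfo)        # None -- the object is naive
print(naive_utc - naive_local)   # 7:00:00 while PDT is in effect
```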
Because FileSystemStorage isn't timezone aware, collectstatic's "newer than target" comparison fails, and only updates made more than seven hours (the current local offset from UTC) after the previous successful update actually get deployed.
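In essence, the freshness check reduces to something like the following paraphrase (the names are illustrative, not Django's actual identifiers); because both datetimes are naive, Python happily compares wall-clock values drawn from two different zones:

```python
import datetime

def is_up_to_date(target_mtime, source_mtime):
    # Paraphrase of collectstatic's skip condition: copy only when the
    # source is strictly newer than the already-deployed target.
    return target_mtime >= source_mtime

# Naive UTC wall clock from S3 vs naive PDT wall clock from the local
# filesystem -- the same instants as in the transcript above.
target = datetime.datetime(2014, 5, 2, 10, 25, 33)  # S3, UTC
source = datetime.datetime(2014, 5, 2, 9, 57, 57)   # local, PDT

print(is_up_to_date(target, source))  # True -- the newer file is skipped
```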
I'm pretty sure the actual problem is that datetime.datetime.fromtimestamp converts to local time but returns a naive object. I *believe* that forcing it into UTC by using …fromtimestamp(since_epoch, django.utils.timezone.utc) would be appropriate (in the methods at the end of django/core/files/storage.py), but timezones are fiddly things and I haven't yet worked out how to write a simple test for this that passes no matter which underlying TZ the machine is configured for.
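A hedged sketch of the suggested change, using the stdlib's datetime.timezone.utc as a stand-in for django.utils.timezone.utc; the function name mirrors Storage.modified_time(), but this is an illustration, not a patch. The assertions at the end are one way to make a test TZ-independent: an aware result must round-trip to the same epoch value whatever zone the machine is configured for.

```python
import os
import datetime
import tempfile

def modified_time(path):
    # Aware UTC datetime instead of the naive local-time object that
    # datetime.fromtimestamp() returns when called without a tzinfo.
    return datetime.datetime.fromtimestamp(os.path.getmtime(path),
                                           datetime.timezone.utc)

# TZ-independent check: whatever zone the machine uses, the aware
# result denotes the same instant as the raw epoch mtime.
with tempfile.NamedTemporaryFile() as f:
    mtime = modified_time(f.name)
    assert mtime.tzinfo is not None
    assert abs(mtime.timestamp() - os.path.getmtime(f.name)) < 1e-3
    print(mtime.tzinfo)  # UTC
```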