Opened 16 years ago
Closed 16 years ago
#8792 closed (wontfix)
Django's unit test system wipes out "custom sql" data before performing tests
Reported by: | deltoide | Owned by: | nobody |
---|---|---|---|
Component: | Testing framework | Version: | dev |
Severity: | | Keywords: | |
Cc: | | Triage Stage: | Unreviewed |
Has patch: | no | Needs documentation: | no |
Needs tests: | no | Patch needs improvement: | no |
Easy pickings: | no | UI/UX: | no |
Description
Django's unit test system issues a "flush" command (resulting in a truncate of all tables) just before performing tests. This causes problems when you have custom SQL scripts (<appname>/sql/<modelname>.sql), because all of that data is wiped out by the truncate and might be required for the tests.
Django's test execution path:
- table creation
- custom SQL execution (potential data insertion)
- flush (truncate all tables)
- perform tests (without the potential custom SQL data!)
I think it might be good to execute the custom SQL after the truncate has occurred ...
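For illustration (not from the original report; the app, table and column names are assumptions), a custom SQL file of the kind described here might look like this:

```sql
-- myapp/sql/category.sql: loaded by the "custom SQL execution" step above,
-- but its rows are then removed again by the flush/truncate step.
INSERT INTO myapp_category (name, slug) VALUES ('General', 'general');
INSERT INTO myapp_category (name, slug) VALUES ('News', 'news');
```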
Change History (8)
comment:1 by , 16 years ago
milestone: | 1.0 → post-1.0 |
---|
This won't be fixed in time for 1.0.
In the meantime, you should probably use a fixture instead of custom SQL; it's the better way of getting initial data into your database.
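As a concrete illustration of that suggestion (not part of the original comment; the myapp app and category model are hypothetical), the same initial data could live in a fixture such as myapp/fixtures/initial_data.json, which syncdb loads and the test framework reloads after its flush:

```json
[
  {
    "model": "myapp.category",
    "pk": 1,
    "fields": {"name": "General", "slug": "general"}
  }
]
```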
comment:2 by , 16 years ago
Thanks for the quick review.
In my particular case I can't use fixtures, because I need to link my data (it uses generic relations) to a content type id that I have to look up with a subselect query. Using fixtures would require hard-coding the content type id, and that's something I don't want to do because it's likely to change whenever a new model is added to my application ...
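A sketch of the kind of custom SQL this describes (the myapp_bookmark table and its columns are assumptions, not taken from the ticket):

```sql
-- myapp/sql/bookmark.sql: look the ContentType id up at insert time
-- with a subselect instead of hard-coding it in a fixture.
INSERT INTO myapp_bookmark (content_type_id, object_id, note)
VALUES (
    (SELECT id FROM django_content_type
     WHERE app_label = 'myapp' AND model = 'article'),
    1,
    'initial bookmark'
);
```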
comment:3 by , 16 years ago
Ah - #7052 rises again. For future reference - another way to solve this problem (rather than using raw SQL) is to use a post_syncdb trigger. You can register a Python method that listens for the post_syncdb signal, and as a result it will get executed as part of a flush. This is what django.contrib.contenttypes does to set up the content types in the first place. If your application has a management.py module that registers a signal handler in a similar way, you can get the effect of fixtures without needing raw SQL, and in a way that is compatible with the test framework.
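A minimal sketch of that approach, assuming a hypothetical myapp with Article and Bookmark models (Bookmark holding the generic relation), placed in myapp/management.py:

```python
# myapp/management.py -- the handler runs after syncdb and again when the
# flush command re-emits post_syncdb, so test databases get the data too.
from django.contrib.contenttypes.models import ContentType
from django.db.models.signals import post_syncdb

import myapp.models
from myapp.models import Article, Bookmark

def install_initial_data(sender, **kwargs):
    # Look the ContentType up through the ORM instead of hard-coding its id.
    article_ct = ContentType.objects.get_for_model(Article)
    Bookmark.objects.get_or_create(
        content_type=article_ct, object_id=1,
        defaults={'note': 'initial bookmark'},
    )

post_syncdb.connect(install_initial_data, sender=myapp.models)
```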
comment:5 by , 16 years ago (follow-up: comment 6)
Resolution: | → invalid |
---|---|
Status: | new → closed |
In 1.1 unit tests no longer flush, but instead use a transaction. So this is no longer valid.
comment:6 by , 16 years ago
Replying to jacob:
In 1.1 unit tests no longer flush, but instead use a transaction. So this is no longer valid.
Hmm, well flush is still used on MySQL/MyISAM since it doesn't support transactions. But hooking into the post_syncdb signal seems like a viable solution for the case where tests still flush the DB, so leaving closed.
comment:7 by , 16 years ago
Resolution: | invalid |
---|---|
Status: | closed → reopened |
Three reasons why I would like to reopen this ticket and see this implemented in Django core:
(a) Custom SQL may be used not only for storing initial data, but also to execute advanced SQL - creating views, stored procedures and other things that Django is not supposed to support directly, but which may be required in certain applications. This means the bug still needs to be fixed.
(b) Custom SQL is also a more convenient place to keep initial data: for many people it's easier to write SQL by hand, while writing JSON structures takes longer and is more time-consuming. And when you change your database structure, old fixtures become invalid, so there should be an easy upgrade path (I am not aware of such a mechanism existing).
(c) Keeping my initial data in both custom SQL and fixtures is a huge duplication, so I would like to avoid it.
It seems the only way to make things work properly is to connect to post_syncdb and execute my custom SQL manually, but then the question arises: isn't that exactly what Django is supposed to do - execute custom SQL after table creation?
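A sketch of the workaround described here, assuming a hypothetical myapp with a single myapp/sql/category.sql file; splitting on ';' is a simplification that only works for simple statement-per-line scripts, and commit_unless_managed() is the transaction API of that Django era:

```python
# myapp/management.py -- re-run the app's custom SQL whenever post_syncdb
# fires, so the data also survives the test framework's flush.
import os

from django.db import connection, transaction
from django.db.models.signals import post_syncdb

import myapp.models

SQL_PATH = os.path.join(os.path.dirname(myapp.models.__file__),
                        'sql', 'category.sql')

def rerun_custom_sql(sender, **kwargs):
    if not os.path.exists(SQL_PATH):
        return
    cursor = connection.cursor()
    for statement in open(SQL_PATH).read().split(';'):
        if statement.strip():
            cursor.execute(statement)
    transaction.commit_unless_managed()

post_syncdb.connect(rerun_custom_sql, sender=myapp.models)
```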
comment:8 by , 16 years ago
Resolution: | → wontfix |
---|---|
Status: | reopened → closed |
Please don't reopen tickets closed by a committer. The correct way to revisit an issue is to take it up on django-dev.
Also, see Karen's comment just above yours: the post_syncdb signal is the best place for this anyway. That custom SQL stuff has always been a hack, and anything that discourages its use in favor of something more robust is good in my book.