Opened 17 years ago
Closed 14 years ago
#4788 closed (duplicate)
Django selftests should skip tests bound to fail
Reported by: |  | Owned by: | Kevin Kubasik
---|---|---|---
Component: | Testing framework | Version: | dev
Severity: |  | Keywords: |
Cc: | devin@…, sciyoshi@…, Jay Hargis | Triage Stage: | Accepted
Has patch: | yes | Needs documentation: | no
Needs tests: | no | Patch needs improvement: | no
Easy pickings: | no | UI/UX: | no
Description
There are some cases where the selftests are bound to fail; for example, with MySQL and a transaction-enabled storage engine, the serialization selftests don't work because MySQL checks foreign key constraints prematurely.
Ideally, the test framework should allow such tests to be skipped properly, so that at the end of the run you get a bottom line like:
200 tests passed; 14 skipped; 1 failure
See also this discussion
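For illustration only, here is a minimal sketch of the kind of conditional skip being asked for, written against the skip support that unittest2 / Python 2.7's unittest later provided; the mysql_innodb_in_use helper and the test name are hypothetical, not Django code:

```python
# Sketch only: conditionally skip a serialization test on MySQL.
import unittest

from django.db import connection
from django.test import TestCase


def mysql_innodb_in_use():
    # Hypothetical helper: detect a MySQL backend, where a transactional
    # storage engine checks foreign keys too early for these fixtures.
    return connection.vendor == 'mysql'


class SerializationTests(TestCase):
    @unittest.skipIf(mysql_innodb_in_use(),
                     "MySQL checks FK constraints before the fixture is complete")
    def test_forward_references(self):
        # Would exercise serialization of forward references here.
        pass
```

With skipped tests counted, the runner's summary ends with something like "OK (skipped=1)", which is the bottom line the description asks for.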
Attachments (5)
Change History (23)
comment:1 by , 17 years ago
comment:3 by , 17 years ago
Owner: | changed from … to …
---|---
Status: | new → assigned
For reference - the MySQL tests aren't the only ones affected by this; strictly, the YAML parser tests should register as non-run/fail if you don't have PyYAML installed, and the markup tests should variously have known failures if docutils, markdown or textile are not available. At the moment, the YAML tests are skipped, and the markup tests become no-ops if their dependent libraries are not available. This leads to bugs like #5362 - my first impression with that test was to mark it 'worksforme', until I realized that the test was hiding the underlying problem.
comment:4 by , 17 years ago
Owner: | removed |
---|---|
Status: | assigned → new |
comment:5 by , 17 years ago
I'm quoting Russell's idea from django-developers so that it doesn't get lost:
However, my preferred solution would fix this at the output layer, rather than the test layer - i.e., let the tests run and fail, but filter the output against a list of known failures so that the failures are reported in the final output as "X tests passed (with Y known and acceptable failures)" rather than the current flood of stack traces.
comment:6 by , 17 years ago
Owner: | set to |
---|
comment:7 by , 16 years ago
Owner: | changed from … to …
---|
comment:8 by , 16 years ago
Cc: | added |
---|
by , 16 years ago
Attachment: | skipped_test_deco_r8069.diff added |
---|
added decorators for skipping tests
comment:9 by , 16 years ago
Has patch: | set |
---|---|
milestone: | → post-1.0 |
Needs documentation: | set |
Needs tests: | set |
Patch needs improvement: | set |
I've added a patch that defines decorators for conditionally skipping tests, plus an example usage, per Jason and Russell's discussion on IRC about #7611.
When a test is skipped, I'm raising a SkippedTest exception. Ideally these would be handled differently from other errors, but that would require practically rewriting unittest, as mtredinnick mentioned.
What we could do is filter this out at the output layer, per Russell's idea: check errors against SkippedTest and count those as a separate category in the output. Then we'd have to roll our own TestRunner instead of using unittest.TextTestRunner, which would overlap a bit with #7884.
That's the direction I'm leaning toward, but I thought I'd bring the topic up now to see whether that's the direction we need to be headed.
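Roughly what such a decorator might look like; SkippedTest and conditional_skip are the names mentioned for the attached patch, but this body is only an illustration, not the patch itself:

```python
import functools


class SkippedTest(Exception):
    """Raised by a test that should be counted as skipped, not failed."""


def conditional_skip(condition, reason=""):
    """Skip the decorated test whenever `condition()` is true."""
    def decorator(test_func):
        @functools.wraps(test_func)
        def wrapper(*args, **kwargs):
            if condition():
                raise SkippedTest(reason)
            return test_func(*args, **kwargs)
        return wrapper
    return decorator
```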
comment:10 by , 16 years ago
Patch needs improvement: | unset |
---|
I've implemented what I mentioned above, including decorators extending conditional_skip for the three cases mentioned in http://groups.google.com/group/django-developers/browse_thread/thread/f68628091d75f5c1
More could/should be added as needed.
I'm going to write up docs and tests for this shortly.
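Illustration only, reusing the conditional_skip sketch above: one way a "skip when a library is missing" decorator of the kind this comment describes could be built (the decorator name is hypothetical, not necessarily what the patch uses):

```python
def skip_unless_installed(module_name):
    """Skip the decorated test if `module_name` cannot be imported."""
    def missing():
        try:
            __import__(module_name)
        except ImportError:
            return True
        return False
    return conditional_skip(missing, "%s is not installed" % module_name)

# Hypothetical usage:
# @skip_unless_installed('yaml')
# def test_yaml_serializer(self):
#     ...
```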
by , 16 years ago
Attachment: | skipped_test_deco_r8075.diff added |
---|
by , 16 years ago
Attachment: | skipped_test_deco_r8127.diff added |
---|
Skip-test decorators, changes to the auth tests, and decorator tests and documentation.
comment:11 by , 16 years ago
Needs documentation: | unset |
---|---|
Needs tests: | unset |
I've finished up the patch, though it somewhat conflicts with TestCase's urls attribute.
I'm not quite sure how we should handle that. Requiring the views seems like a better solution than adding the URLs, for the reasons brought up in #7611.
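For context, the TestCase urls attribute referred to here let a test case install its own URLconf for the duration of its tests (it existed in Django at the time and was later removed in favour of overriding ROOT_URLCONF). A minimal sketch with a hypothetical module name:

```python
from django.test import TestCase


class ContribViewTests(TestCase):
    # Historical attribute: swap in a test-only URLconf while these tests run.
    urls = 'myapp.test_urls'  # hypothetical module defining the needed views

    def test_view_is_reachable(self):
        response = self.client.get('/test-view/')
        self.assertEqual(response.status_code, 200)
```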
comment:12 by , 16 years ago
Cc: | added |
---|
Attaching an updated patch for Django >= 1.0. I've also made the URLs use reverse lookups instead of hardcoded paths, since those views may be mounted at different URLs in projects.
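A small sketch of that reverse-lookup change: looking views up by name instead of hardcoding paths, so the tests still pass when a project mounts the views elsewhere. The URL name 'login' is an assumption for illustration:

```python
from django.test import TestCase
from django.urls import reverse  # django.core.urlresolvers.reverse in old Django


class LoginViewTests(TestCase):
    def test_login_page_renders(self):
        # Resolve the URL by name rather than assuming '/accounts/login/'.
        response = self.client.get(reverse('login'))
        self.assertEqual(response.status_code, 200)
```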
comment:13 by , 16 years ago
comment:14 by , 16 years ago
Cc: | added |
---|
comment:16 by , 16 years ago
Owner: | changed from … to …
---|
comment:18 by , 14 years ago
Resolution: | → duplicate |
---|---|
Status: | new → closed |
I'm going to close this in favor of #12991; unittest2 provides test skipping as a native feature, and the (soon to be in trunk) patch for that ticket introduces test-skipping calls.
Hooking "skipped" tests into Python's unittest framework is a little tricky. I tried to do it in the past for an entirely different project and was only partially successful (and not in any way that I would include in Django). But it will be interesting to see what you come up with.