﻿id	summary	reporter	owner	description	type	status	component	version	severity	resolution	keywords	cc	stage	has_patch	needs_docs	needs_tests	needs_better_patch	easy	ui_ux
25432	Django ORM race condition	Yuval Adam	nobody	"I've hit an interesting problem that isn't covered by the current Django documentation, and might even be a bug that Django can handle better. It started off as a SO question at http://stackoverflow.com/q/32661885/24545 but here's the gist of it.

After creating a new object {{{MyModel.objects.create(foo=goo)}}} and inserting it into the database, an immediately subsequent call to fetch that object may fail (i.e. {{{MyModel.objects.get(foo=goo)}}} will throw {{{DoesNotExist}}}). I have seen this happen in a test case where I made two subsequent API calls doing exactly this and saw a ~5% failure rate.

In most cases this might not be a problem, but I am using this query to make sure I'm not creating duplicate objects. This is essentially an UPSERT problem. In my case, the solution was to set {{{unique=True}}} on my {{{foo}}} field and attempt to create the object in any case, which naturally fails on a duplicate; I then catch the {{{IntegrityError}}} and fail gracefully. This way we rely on DB semantics, which guarantee no duplicates.
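The pattern is easy to demonstrate outside Django with plain sqlite3 (a minimal sketch: {{{mymodel}}} and {{{foo}}} stand in for the hypothetical model above, and the UNIQUE constraint plays the role of {{{unique=True}}}; this is not the actual application code):

{{{#!python
import sqlite3

# In-memory stand-in for the real database; 'foo' carries the
# uniqueness constraint described above.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE mymodel (id INTEGER PRIMARY KEY, foo TEXT UNIQUE)')

def create_if_absent(foo):
    # Always attempt the INSERT and let the database enforce
    # uniqueness, instead of checking with a SELECT first.
    try:
        with conn:  # commits on success, rolls back on error
            conn.execute('INSERT INTO mymodel (foo) VALUES (?)', (foo,))
        return True   # row was created
    except sqlite3.IntegrityError:
        return False  # duplicate; fail gracefully

create_if_absent('goo')  # inserts the row
create_if_absent('goo')  # hits the UNIQUE constraint, returns False
}}}

In Django terms the equivalent is wrapping {{{MyModel.objects.create(foo=goo)}}} in a {{{try/except IntegrityError}}} block; even {{{get_or_create}}} needs the same handling under concurrency.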

The relevant settings for this test case are: PostgreSQL, default Django transaction settings and no specific caching.

So, two questions here:

1. What happens if my application '''requires''' that any two transactions {{{a}}} and {{{b}}} behave such that {{{b}}} always sees fresh data that was written in {{{a}}}? How can I enforce this in Django?
2. How do we document this behavior in a better way? Whether or not this is possible, Django must be clearer about how such transactions are handled."	Bug	closed	Database layer (models, ORM)	1.8	Normal	invalid			Unreviewed	0	0	0	0	0	0
