#27267 closed Bug (needsinfo)
Renaming a primary key fails with "cannot drop constraint on table because other objects depend on it"
Reported by: | Melvyn Sopacua | Owned by: | nobody |
---|---|---|---|
Component: | Migrations | Version: | 1.8 |
Severity: | Normal | Keywords: | |
Cc: | Triage Stage: | Unreviewed | |
Has patch: | no | Needs documentation: | no |
Needs tests: | no | Patch needs improvement: | no |
Easy pickings: | no | UI/UX: | no |
Description
When Django creates a foreign key constraint, it uses a randomly generated string as part of the constraint name. As a side effect, the names of these constraints differ for each database created. The better fix may therefore be to get rid of this random string.
When one changes the type of a field that has foreign key references to it, the app can no longer be migrated. Attempts to use RunSQL fail because, as said, each database will have a different constraint name. This is most evident when trying to create the test database with the following bit in a migration:
```python
operations = [
    migrations.RunSQL(
        [
            'ALTER TABLE dpe_dpecharacter DROP CONSTRAINT {}'.format(FKEY_NAME),
        ],
    ),
]
```
So if the chosen path is not to get rid of the random string, we need a way to find out the actual name of the foreign key constraint, preferably wrapped as migrations.DropForeignKey().
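In the meantime, the constraint name can be looked up at migration time instead of being hard-coded. Below is a minimal sketch (not an existing Django API) using RunPython and the connection's introspection; the table name follows the example above and the dependency entry is a placeholder:

```python
from django.db import migrations


def drop_fk_constraints(apps, schema_editor):
    # Look up constraint names at run time, since they differ per database.
    connection = schema_editor.connection
    with connection.cursor() as cursor:
        constraints = connection.introspection.get_constraints(
            cursor, 'dpe_dpecharacter')
    for name, details in constraints.items():
        # Drop every foreign key constraint on the table; filter further if needed.
        if details['foreign_key']:
            schema_editor.execute(
                'ALTER TABLE dpe_dpecharacter DROP CONSTRAINT %s'
                % schema_editor.quote_name(name))


class Migration(migrations.Migration):

    dependencies = []  # fill in the real dependency here

    operations = [
        migrations.RunPython(drop_fk_constraints, migrations.RunPython.noop),
    ]
```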
The real-world case that triggered this was that the initial design used a CharField as primary key, but limitations in contrib.contenttypes and/or Mezzanine prevented this from working correctly, so an autogenerated primary key had to be used instead. Dropping the primary key itself is no issue, since its name is predictable and consistent across different incarnations of the database.
Change History (15)
comment:2 by , 8 years ago
I'll look into it more closely and reduce it to a bare minimum. If I remove the above drop statement, the migration doesn't work.
comment:3 by , 8 years ago
Version: | 1.10 → 1.8 |
---|
Updated version. This may be a 1.8 issue.
Just ran into the same beast with a different trigger. I renamed a primary key and ran into the foreign key constraint:
```python
migrations.RenameField(
    model_name='itemtype',
    old_name='item_id',
    new_name='item_type_id',
),
```
Output:
```
django.db.utils.InternalError: cannot drop constraint dpe_itemtype_pkey on table dpe_itemtype because other objects depend on it
DETAIL:  constraint dpe_it_itemtype_ptr_id_3b50b94341daa361_fk_dpe_itemtype_item_id on table dpe_itemtypepage depends on index dpe_itemtype_pkey
constraint dpe_itemb_item_type_id_5c80224be6efb2ca_fk_dpe_itemtype_item_id on table dpe_itembase depends on index dpe_itemtype_pkey
HINT:  Use DROP ... CASCADE to drop the dependent objects too.
```
comment:4 by , 8 years ago
Type: | Uncategorized → Bug |
---|
Could it be a duplicate of #22761? The same error message appears there. If not, can you provide a minimal set of models to reproduce?
comment:5 by , 8 years ago
I'm not using the to_field argument explicitly (but my to_field is not named 'id'), and that bug report is set up to be limited to a few cases. So far my experience is that any change to a model field that affects a foreign key pointing to it will blow up. This includes renaming the field, changing its type, or making a different field the primary key.
The minimal case is this:
```python
class ItemType(models.Model):
    item_id = models.AutoField(primary_key=True)
    name = models.CharField(max_length=63)


class Item(models.Model):
    item_type = models.ForeignKey(ItemType)
    name = models.CharField(max_length=63)
```
Now change ItemType.item_id to ItemType.item_type_id. If this doesn't blow up for you, let me know and I'll make a proper test case.
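For reproduction, the migration that makemigrations generates for that rename should contain an operation like the one below (a sketch; the dependency name is hypothetical):

```python
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('app_label', '0001_initial'),  # hypothetical dependency
    ]

    operations = [
        migrations.RenameField(
            model_name='itemtype',
            old_name='item_id',
            new_name='item_type_id',
        ),
    ]
```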
comment:6 by , 8 years ago
I tried the same RenameField operation from comment:3 but didn't get a crash when running it. Tested with PostgreSQL 9.3.
comment:7 by , 8 years ago
I tried for a few hours to reduce it to a test case and failed. There's something at play that I'm not considering (all relations have been set up identically to the models in the project). So it's a corner case, but I can't pinpoint what in my design is causing it. Taking a break and a fresh look in a few days.
comment:8 by , 8 years ago
Resolution: | → needsinfo |
---|---|
Status: | new → closed |
Summary: | Migrations: Need API to drop a foreign key constraint → Renaming a primary key fails with "cannot drop constraint on table because other objects depend on it" |
Thanks, please reopen if you are able to reproduce.
comment:9 by , 8 years ago
Resolution: | needsinfo |
---|---|
Status: | closed → new |
I'm not able to reproduce it, but I'm closer to the cause.
I cannot save an instance of the referenced model 'dpecharacter' - this is the initial model that had a varchar primary key which I dropped. When I try to save a model I run into an integrity error for the id field (the autogenerated id). I checked, and the sequence has correctly been created (Postgres) as dpe_dpecharacter_id_seq in the public schema, but apparently it isn't consulted.
When I try to regenerate the auto field, using a named auto field, makemigrations provides me with the "null field but no default" prompt. When I give a one-off default, migrate gives me:
django.db.utils.ProgrammingError: multiple default values specified for column "character_id" of table "dpe_dpecharacter"
The field declaration is character_id = models.AutoField(primary_key=True). The migration code is:
```python
# -*- coding: utf-8 -*-
# Generated by Django 1.10.1 on 2016-10-07 12:53
from __future__ import unicode_literals

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('dpe', '0012_incubatorspot_destined_for'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='dpecharacter',
            name='id',
        ),
        migrations.AddField(
            model_name='dpecharacter',
            name='character_id',
            field=models.AutoField(default=0, primary_key=True, serialize=False),
            preserve_default=False,
        ),
    ]
```
When I remove the default=0 argument from the field declaration, the migration goes through.
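In other words, the AddField operation that goes through looks roughly like this (the same migration as above, just without default=0):

```python
migrations.AddField(
    model_name='dpecharacter',
    name='character_id',
    field=models.AutoField(primary_key=True, serialize=False),
    preserve_default=False,
),
```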
The old sequence has been dropped and the new one created with the correct name. Yet, once again I run into the integrity error, this time for the new field name "character_id".
Two problems here:
- I don't know how I got into this situation.
- I don't seem to be able to get out of it.
What should I be looking for as to why the serial isn't being consulted?
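For reference, one way to inspect the column default in PostgreSQL (table and column names follow the report and may need adjusting):

```sql
-- Show the default expression for the primary key column.
SELECT column_default
FROM information_schema.columns
WHERE table_name = 'dpe_dpecharacter'
  AND column_name = 'id';

-- For a working serial column this normally reports something like:
--   nextval('dpe_dpecharacter_id_seq'::regclass)
```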
comment:10 by , 8 years ago
One step closer: the default in the column definition doesn't contain the nextval() function for the serial. So there are two weird points:
- Django prompts for a missing default, and given that the nextval() for the serial isn't being picked up, this seems logical.
- When a default is specified, suddenly psycopg2 complains about two defaults, so somewhere between that check and the SQL backend a default value is inserted *but* doesn't end up in the table definition.
comment:11 by , 8 years ago
And here's the cause:
```sql
-- % djmanage sqlmigrate dpe 0013_regenerate_primary_key_for_dpe_character
BEGIN;
--
-- Remove field id from dpecharacter
--
ALTER TABLE "dpe_dpecharacter" DROP COLUMN "id" CASCADE;
--
-- Add field character_id to dpecharacter
--
ALTER TABLE "dpe_dpecharacter" ADD COLUMN "character_id" serial NOT NULL PRIMARY KEY;
ALTER TABLE "dpe_dpecharacter" ALTER COLUMN "character_id" DROP DEFAULT;
COMMIT;
```
That's of course the preserve_default=False in the migration.
So that begs the question: Why does Django prompt for a default value for an AutoField?
comment:12 by , 8 years ago
Setting preserve_default to True or omitting the keyword argument still results in the DROP DEFAULT statement being emitted. I executed the statements from sqlmigrate without the DROP DEFAULT part to get out of the catch-22. As you can see in the initial comment, this was also a table that ran into the drop constraint error.
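Concretely, that amounts to the statements from the sqlmigrate output above without the DROP DEFAULT line, roughly:

```sql
BEGIN;
ALTER TABLE "dpe_dpecharacter" DROP COLUMN "id" CASCADE;
ALTER TABLE "dpe_dpecharacter" ADD COLUMN "character_id" serial NOT NULL PRIMARY KEY;
COMMIT;
```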
I'm still baffled as to what makes this model and the Item model different, so that we can get to the root cause.
For completeness here's the model definition:
```python
class DpeCharacter(
    uni_factory(max_length=32, verbose_name=_('name of character'))
):
    character_id = models.AutoField(primary_key=True)
    char_class = models.ForeignKey(
        CharacterClass,
        related_name='class_characters',
    )
    account = models.ForeignKey(
        'DpeAccount',
        models.CASCADE,
        related_name='my_characters',
        null=True,
        blank=True,
    )
    level = models.PositiveSmallIntegerField(
        verbose_name=_('current level'),
        validators=(
            MinValueValidator(1, 'Minimum level is 1'),
            MaxValueValidator(105, 'Maximum level is 105'),
        )
    )
    avatar = models.ImageField(
        verbose_name=_('avatar picture'),
        help_text=_('Square JPEG or PNG format, min 32x32, max 128x128 pixels'),
        upload_to=avatar_upload_to,
        height_field='avatar_height',
        width_field='avatar_width',
        null=True,
        blank=True,
    )
    avatar_height = models.PositiveSmallIntegerField(
        validators=(
            MinValueValidator(32, 'Avatar image too small.'),
            MaxValueValidator(128, 'Avatar image too tall.'),
        ),
        null=True,
        blank=True,
    )
    avatar_width = models.PositiveSmallIntegerField(
        validators=(
            MinValueValidator(32, 'Avatar image too narrow'),
            MaxValueValidator(128, 'Avatar image too wide'),
        ),
        null=True,
        blank=True,
    )

    def __str__(self):
        return '{} ({}-{})'.format(
            self.name, self.char_class.abbreviation, self.level
        )

    def clean(self):
        if self.avatar is not None:
            if self.avatar_height != self.avatar_width:
                raise ValidationError({'avatar': 'Image should be square'})

    class Meta:
        verbose_name = _('character')
        verbose_name_plural = _('characters')
        app_label = 'dpe'
```
And uni_factory:
```python
class UniquelyNamedItemManager(models.Manager):
    def get_by_natural_key(self, name):
        return self.get(name=name)


class PagedUniquelyNamedItemManager(PageManager, UniquelyNamedItemManager):
    pass


def uni_factory(**kwargs):
    defaults = {
        'verbose_name': _('name'),
        'help_text': _('must be unique'),
        'is_paged': False,
    }
    defaults.update(kwargs)
    if defaults.pop('is_paged', False):
        manager = PagedUniquelyNamedItemManager
    else:
        manager = UniquelyNamedItemManager

    class UniquelyNamedItemMixin(models.Model):
        objects = manager()
        name = UniqueNameField(**defaults)

        def __str__(self):
            return self.name

        def natural_key(self):
            return self.name

        class Meta(models.base.ModelBase):
            app_label = 'dpe'
            abstract = True

    return UniquelyNamedItemMixin
```
And finally, UniqueNameField:
```python
class UniqueNameField(CharField):
    title_transform = True
    max_length = 63

    def __init__(self, title_transform=True, *args, **kwargs):
        kwargs['unique'] = True
        kwargs['max_length'] = kwargs.get('max_length', self.max_length)
        self.title_transform = title_transform
        super(UniqueNameField, self).__init__(*args, **kwargs)

    def deconstruct(self):
        name, path, args, kwargs = super(CharField, self).deconstruct()
        kwargs['title_transform'] = self.title_transform
        return name, path, args, kwargs

    def pre_save(self, model_instance, add):
        name = super(UniqueNameField, self).pre_save(model_instance, add)
        if self.title_transform:
            name = name.title()
        setattr(model_instance, self.name, name)
        return name
```
comment:13 by , 8 years ago
I'm not sure how changing a CharField primary key to an AutoField primary key should work (is it possible to autopopulate the column using the sequence?), but the current behavior of prompting for a default and then erroring out with "multiple default values specified for column "character_id" of table "t27267_foo"" isn't good. I guess we could open a separate ticket for this issue. I'm not sure what else could be done with the information you provided.
comment:14 by , 8 years ago
Resolution: | → needsinfo |
---|---|
Status: | new → closed |
comment:15 by , 7 years ago
For what it is worth, I encountered the same error.
I am using Django 2.01 and Postgres 9.6
I have a workaround for the problem simply by declaring my migration to be non-atomic
atomic = False
Here is what it looks like in practice for me. For certain migrations, I initialize them with data; if I undo the migration, I also want to clear out that seed data in the right order, as all my OneToOneFields have models.PROTECT declared, like so:
guid = models.OneToOneField(Guid, on_delete=models.PROTECT, primary_key=True)
so my migration looks like:
```python
# Generated by Django 2.0.2 on 2018-03-07 23:52
from django.core import management
from django.db import migrations, models
import django.db.models.deletion

from dim.models import Guid, Role


def create_fixture(apps, schema_editor):
    management.call_command('dumpdata', 'dim.Role',
                            format='json', indent=4,
                            output='dim/fixtures/000002_dim_role.json')


def init_data(apps, schema_editor):
    Role.objects.create(guid=Guid.objects.create(), name="Basic Consumer")
    Role.objects.create(guid=Guid.objects.create(), name="Basic Publisher")


def clear_data(apps, schema_editor):
    import pdb
    pdb.set_trace()
    for role in Role.objects.all():
        guid = role.guid.guid
        role.delete(purge=True)
        Guid.objects.get(guid=guid).delete(purge=True)
    migrations.RunSQL("drop table dim_role cascade")


class Migration(migrations.Migration):

    atomic = False

    dependencies = [
        ('dim', '0001_initial'),
    ]

    operations = [
        migrations.CreateModel(
            name='Role',
            fields=[
                ('guid', models.OneToOneField(on_delete=django.db.models.deletion.PROTECT, primary_key=True, serialize=False, to='dim.Guid')),
                ('name', models.CharField(max_length=64, unique=True, verbose_name='name')),
                ('created_ts', models.DateTimeField(auto_now_add=True)),
                ('updated_ts', models.DateTimeField(auto_now=True, null=True)),
                ('deleted_ts', models.DateTimeField(null=True)),
            ],
            options={
                'verbose_name': 'role',
                'verbose_name_plural': 'roles',
                'db_table': 'dim_role',
            },
        ),
        migrations.RunPython(init_data, reverse_code=clear_data),
        migrations.RunPython(create_fixture, reverse_code=migrations.RunPython.noop),
    ]
```
Dropping a foreign key from a field should be done automatically if you convert the field to its underlying datatype.
For example, if you have a foreign key pointing to a CharField named foo, replacing the ForeignKey by a CharField should be detected as an AlterField operation when running makemigrations. The AlterField operation should take care of dropping the constraint on migrate.
For example, given these initial models:
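A minimal sketch of such models, with hypothetical Foo/Bar names (the foreign key targets the CharField via to_field):

```python
from django.db import models


class Foo(models.Model):
    # The CharField named "foo" that the foreign key points to.
    foo = models.CharField(max_length=63, unique=True)


class Bar(models.Model):
    # ForeignKey referencing Foo.foo.
    foo = models.ForeignKey(Foo, models.CASCADE, to_field='foo')
```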
Changing your definitions to:
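Continuing the sketch with the same hypothetical names, the ForeignKey is replaced by its underlying datatype:

```python
from django.db import models


class Foo(models.Model):
    foo = models.CharField(max_length=63, unique=True)


class Bar(models.Model):
    # The ForeignKey replaced by a plain CharField of the same type.
    foo = models.CharField(max_length=63)
```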
Running makemigrations should generate a migration file with an operation similar to the one sketched below. Running the migration containing this operation should then drop the foreign key constraint.
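A sketch of such an operation, names following the hypothetical models above:

```python
migrations.AlterField(
    model_name='bar',
    name='foo',
    field=models.CharField(max_length=63),
),
```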