9

I have an Alembic migration that declares a foreign key constraint like this:

    op.create_table(
        'that',
        ...
        Column('this_id', String),
        ForeignKeyConstraint(['this_id'], ['this.id']),
        ...
    )

My project has a requirement to support two databases, PostgreSQL and MySQL. Since the name of the constraint is not defined, each database generates it automatically: in MySQL it looks like this_ibfk_1 and in PostgreSQL like that_this_id_key.

Now I need to write a migration that will drop the constraint. But how can I reference it considering that I don't know its name?
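For illustration, the migration I would like to write looks roughly like this, except that the first argument to drop_constraint has to be the backend-specific generated name, so there is no single literal I can put there:

    def upgrade():
        # '???' stands for the auto-generated name: this_ibfk_1 on MySQL,
        # that_this_id_key on PostgreSQL
        op.drop_constraint('???', 'that', type_='foreignkey')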

2
  • For PostgreSQL check the view pg_constraint and find the name of the constraint you're looking for. Commented Jan 28, 2015 at 14:31
  • information_schema views should work for both. Commented Jan 28, 2015 at 15:04
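A sketch of that information_schema lookup, run from inside an Alembic migration (the table and column names are the ones from the question; only standard columns of table_constraints and key_column_usage are used):

    from alembic import op
    from sqlalchemy import text

    conn = op.get_bind()
    # Find the generated FK name on 'that' that involves column 'this_id'
    fk_name = conn.execute(text(
        """
        SELECT tc.constraint_name
        FROM information_schema.table_constraints tc
        JOIN information_schema.key_column_usage kcu
          ON tc.constraint_name = kcu.constraint_name
         AND tc.table_name = kcu.table_name
        WHERE tc.constraint_type = 'FOREIGN KEY'
          AND tc.table_name = 'that'
          AND kcu.column_name = 'this_id'
        """
    )).scalar()
    op.drop_constraint(fk_name, 'that', type_='foreignkey')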

6 Answers

6

This answer may be about 4 years late, but I just had this problem myself: Alembic would throw an error when dropping a constraint whose name is not known.
Here are a few ways to find it:

  • get an SQLAlchemy Table object to manipulate:

    from alembic import op
    from sqlalchemy import Table, MetaData

    meta = MetaData(bind=op.get_bind())
    # reflect the table so that its constraints are actually loaded
    my_table = Table('my_table_name', meta, autoload=True)
    my_table_constraints = list(my_table.constraints)

All of the constraints on your table are now listed in my_table_constraints. To get their names:

    my_constraint_names = [c.name for c in my_table_constraints]
  • "guess" the name of your constraint. For a Postgres database it will most likely be something along the lines of '<table>_<column>_fkey, or 'fk_<table>_<column>'.

  • For a PostgreSQL database, check the contents of the catalog tables. The table you're interested in is pg_constraint. If you use pgAdmin, this table is located under Servers > [Your server] > Databases > [Your db] > Catalogs > PostgreSQL Catalog (pg_catalog) > Tables > pg_constraint.
    If you cannot use a GUI tool or want to query it more precisely, this answer explains how to do it; a minimal query is also sketched below.
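If a raw query is more convenient than the GUI, something along these lines should work from a migration (pg_constraint's conname, conrelid and contype columns are standard; 'that' is the table from the question):

    from alembic import op
    from sqlalchemy import text

    conn = op.get_bind()
    # contype = 'f' restricts the result to foreign key constraints
    fk_names = [row[0] for row in conn.execute(text(
        "SELECT conname FROM pg_constraint "
        "WHERE conrelid = 'that'::regclass AND contype = 'f'"
    ))]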


1 Comment

Note that MetaData no longer accepts a bind argument in recent versions of SQLAlchemy.
5

To get the foreign key's name in Postgres, you can use psql:

psql -U <username> -h <DB hostname> <database name> 

After logging in to the SQL prompt, use \d to inspect a table:

\d <table name> 

And you will see the foreign key constraints.

When Alembic generates migration files, the foreign key's name defaults to None:

    op.create_foreign_key(None, 'table_name', 'other_table', ['other_table_id'], ['id'])
    op.drop_constraint(None, 'table_name', type_='foreignkey')

Simply replace None with the foreign key's name you looked up from psql.
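For example, with the Postgres name mentioned in the question (substitute whatever \d actually reports on your database):

    op.drop_constraint('that_this_id_key', 'that', type_='foreignkey')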

Best practice for foreign key creation with Alembic

When you use Alembic to automatically generate a revision file, it's better to assign the foreign key a name in the upgrade() and downgrade() functions before applying the revision. Any name you want can be used.

This makes sure the specified name is always used to create the foreign key, regardless of database type or version, so your migrations will never fail.
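A sketch with the question's tables; the name fk_that_this_id_this is just an arbitrary choice, the point is that it is explicit:

    # upgrade(): create the constraint with an explicit name
    op.create_foreign_key(
        'fk_that_this_id_this',
        'that', 'this',
        ['this_id'], ['id'],
    )

    # downgrade() (or a later migration): the name is now known on every backend
    op.drop_constraint('fk_that_this_id_this', 'that', type_='foreignkey')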

1 Comment

"Best practice" is to define a naming convention in the metadata..
4

You can use SQLAlchemy's inspect() function to get the table details; more details are in the SQLAlchemy documentation.

From the returned inspector, you can get the existing foreign keys of a given table using the get_foreign_keys() method. In that list, you can find your foreign key by checking the value of referred_table.

Hope this helps.
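A sketch of that approach inside a migration, using the question's tables; get_foreign_keys() returns dicts that include 'name' and 'referred_table' keys:

    from alembic import op
    from sqlalchemy import inspect

    def upgrade():
        conn = op.get_bind()
        inspector = inspect(conn)
        # Drop whichever FK on 'that' points at 'this', whatever it was named
        for fk in inspector.get_foreign_keys('that'):
            if fk['referred_table'] == 'this':
                op.drop_constraint(fk['name'], 'that', type_='foreignkey')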


0

As said in snakecharmerb's answer, define naming convention in the metadata to avoid this problem entirely: https://alembic.sqlalchemy.org/en/latest/naming.html
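In short, the convention described on that page is a dict of templates passed to MetaData; constraints created through models bound to it then get deterministic names on every backend (sketch below, templates as documented):

    from sqlalchemy import MetaData

    metadata = MetaData(naming_convention={
        "ix": "ix_%(column_0_label)s",
        "uq": "uq_%(table_name)s_%(column_0_name)s",
        "ck": "ck_%(table_name)s_%(constraint_name)s",
        "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
        "pk": "pk_%(table_name)s",
    })

With that in place, the foreign key from the question would be named fk_that_this_id_this and could be dropped by that name on both PostgreSQL and MySQL.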

1 Comment

While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. - From Review
0

I faced a similar issue with SQLite. This won't work with Postgres as-is, but I'm sure it could be adapted to support it.

First, I got a list of tables and made a migration using:

my_tables = ["user", "account", "session"] # etc, I just added these manually conn = op.get_bind() for table in my_tables: # And https://stackoverflow.com/a/5499071/557406 row = conn.execute( f""" SELECT sql FROM ( SELECT sql sql, type type, tbl_name tbl_name, name name FROM sqlite_master UNION ALL SELECT sql, type, tbl_name, name FROM sqlite_temp_master ) WHERE type == 'table' AND sql NOTNULL AND name NOT LIKE 'sqlite_%' AND tbl_name = '{table}' LIMIT 1 """ ).first() lines = [line.strip() for line in row[0].split("\n")] new_create = [line for line in lines if not line.startswith("FOREIGN KEY")] new_create[0] = new_create[0].replace(table, f"{table}_new") if new_create[-2].endswith(","): new_create[-2] = new_create[-2][:-1] # NOTE: If we rename a table first, the existing foreign keys # point to the new table conn.execute("PRAGMA foreign_keys=off;") conn.execute("BEGIN TRANSACTION;") conn.execute("\n".join(new_create)) conn.execute(f"INSERT INTO {table}_new SELECT * FROM {table};") conn.execute(f"DROP TABLE {table}") conn.execute(f"ALTER TABLE {table}_new RENAME TO {table};") conn.execute("COMMIT;") conn.execute("PRAGMA foreign_keys=on;") 

Note that I drop the lines that start with "FOREIGN KEY". These are the unnamed foreign keys; the named ones mention "FOREIGN KEY" in the CREATE statement but do not start with that text.

I then ran this migration.

I went back to my SQLAlchemy(app) and added a naming convention:

    metadata = sa.MetaData(
        naming_convention={
            "ix": "ix_%(column_0_label)s",
            "uq": "uq_%(table_name)s_%(column_0_name)s",
            "ck": "ck_%(table_name)s_%(constraint_name)s",
            "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
            "pk": "pk_%(table_name)s",
        }
    )
    db = SQLAlchemy(app=app, metadata=metadata)

Then I used Alembic to autogenerate a second migration. This detected all the foreign keys that had been dropped and recreated them following the naming convention. I also had to add conn.execute("PRAGMA foreign_keys=off;") and a corresponding "on" line to the autogenerated migration.

Also note that this does not work for other database types, so I used `if conn.engine.name == "sqlite":` to only run this on SQLite databases.


0

Ideally, configuring a metadata naming convention would prevent the problem of unnamed constraints ever arising. That said, testing on PostgreSQL and MariaDB with SQLAlchemy 2.0.40 and Alembic 1.15.2, I found that Alembic automatically detected and removed unnamed ForeignKeyConstraints.

SQLite does not assign a name to unnamed constraints. The Alembic documentation recommends using Batch Mode and then manually setting a naming convention and name in the autogenerated revision.

Batch mode is configured by setting a flag when calling context.configure in env.py:

    context.configure(
        connection=connection,
        target_metadata=target_metadata,
        render_as_batch=True,
    )

The "dropping" revision should then be edited like this (assuming that there is a table level constraint such that t2.fkcol references t1.id):

    def upgrade():
        # ### commands auto generated by Alembic - please adjust! ###
        naming_convention = {
            'fk': 'fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s',
        }
        with op.batch_alter_table(
            't28194665b', schema=None, naming_convention=naming_convention
        ) as batch_op:
            batch_op.drop_constraint('fk_t2_fkcol_t1', type_='foreignkey')

Batch mode will create a new table without the constraint, copy the data and then drop the original table.

