comment=bee044a1c187be7f98dbfa08cc3017b3b9362362

zzzeek-alembic-bee044a1c187/

zzzeek-alembic-bee044a1c187/.coveragerc
[run]
include=alembic/*
[report]
omit=alembic/testing/*

zzzeek-alembic-bee044a1c187/.gitignore
*.pyc
*.pyo
/build/
dist/
/docs/build/output/
*.orig
alembic.ini
.venv
*.egg-info
.coverage
coverage.xml
.tox
*.patch
/scratch
/scratch_test_*
/test_schema.db

zzzeek-alembic-bee044a1c187/.gitreview
[gerrit]
host=gerrit.sqlalchemy.org
project=sqlalchemy/alembic
defaultbranch=master
port=29418

zzzeek-alembic-bee044a1c187/.pre-commit-config.yaml
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/python/black/
  rev: 19.3b0
  hooks:
  - id: black
    args: [-l 79]
- repo: https://github.com/sqlalchemyorg/zimports/
  rev: master
  hooks:
  - id: zimports

zzzeek-alembic-bee044a1c187/CHANGES
=====
MOVED
=====

Please see:

    /docs/changelog.html

    /docs/build/changelog.rst

or

    http://alembic.sqlalchemy.org/en/latest/changelog.html

for the current CHANGES.

zzzeek-alembic-bee044a1c187/LICENSE
Copyright 2009-2019 Michael Bayer.

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

zzzeek-alembic-bee044a1c187/MANIFEST.in
recursive-include docs *.html *.css *.txt *.js *.jpg *.png *.py Makefile *.rst *.sty
recursive-include tests *.py *.dat
recursive-include alembic/templates *.mako README *.py
include README* LICENSE run_tests.py CHANGES* tox.ini
prune docs/build/output

zzzeek-alembic-bee044a1c187/README.rst
Alembic is a database migrations tool written by the author of
`SQLAlchemy <https://www.sqlalchemy.org>`_.

A migrations tool offers the following functionality: * Can emit ALTER statements to a database in order to change the structure of tables and other constructs * Provides a system whereby "migration scripts" may be constructed; each script indicates a particular series of steps that can "upgrade" a target database to a new version, and optionally a series of steps that can "downgrade" similarly, doing the same steps in reverse. * Allows the scripts to execute in some sequential manner. The goals of Alembic are: * Very open ended and transparent configuration and operation. A new Alembic environment is generated from a set of templates which is selected among a set of options when setup first occurs. The templates then deposit a series of scripts that define fully how database connectivity is established and how migration scripts are invoked; the migration scripts themselves are generated from a template within that series of scripts. The scripts can then be further customized to define exactly how databases will be interacted with and what structure new migration files should take. * Full support for transactional DDL. The default scripts ensure that all migrations occur within a transaction - for those databases which support this (Postgresql, Microsoft SQL Server), migrations can be tested with no need to manually undo changes upon failure. * Minimalist script construction. Basic operations like renaming tables/columns, adding/removing columns, changing column attributes can be performed through one line commands like alter_column(), rename_table(), add_constraint(). There is no need to recreate full SQLAlchemy Table structures for simple operations like these - the functions themselves generate minimalist schema structures behind the scenes to achieve the given DDL sequence. * "auto generation" of migrations. While real world migrations are far more complex than what can be automatically determined, Alembic can still eliminate the initial grunt work in generating new migration directives from an altered schema. The ``--autogenerate`` feature will inspect the current status of a database using SQLAlchemy's schema inspection capabilities, compare it to the current state of the database model as specified in Python, and generate a series of "candidate" migrations, rendering them into a new migration script as Python directives. The developer then edits the new file, adding additional directives and data migrations as needed, to produce a finished migration. Table and column level changes can be detected, with constraints and indexes to follow as well. * Full support for migrations generated as SQL scripts. Those of us who work in corporate environments know that direct access to DDL commands on a production database is a rare privilege, and DBAs want textual SQL scripts. Alembic's usage model and commands are oriented towards being able to run a series of migrations into a textual output file as easily as it runs them directly to a database. Care must be taken in this mode to not invoke other operations that rely upon in-memory SELECTs of rows - Alembic tries to provide helper constructs like bulk_insert() to help with data-oriented operations that are compatible with script-based DDL. * Non-linear, dependency-graph versioning. Scripts are given UUID identifiers similarly to a DVCS, and the linkage of one script to the next is achieved via human-editable markers within the scripts themselves. 
The structure of a set of migration files is considered as a directed-acyclic graph, meaning any migration file can be dependent on any other arbitrary set of migration files, or none at all. Through this open-ended system, migration files can be organized into branches, multiple roots, and mergepoints, without restriction. Commands are provided to produce new branches, roots, and merges of branches automatically. * Provide a library of ALTER constructs that can be used by any SQLAlchemy application. The DDL constructs build upon SQLAlchemy's own DDLElement base and can be used standalone by any application or script. * At long last, bring SQLite and its inablity to ALTER things into the fold, but in such a way that SQLite's very special workflow needs are accommodated in an explicit way that makes the most of a bad situation, through the concept of a "batch" migration, where multiple changes to a table can be batched together to form a series of instructions for a single, subsequent "move-and-copy" workflow. You can even use "move-and-copy" workflow for other databases, if you want to recreate a table in the background on a busy system. Documentation and status of Alembic is at https://alembic.sqlalchemy.org/ zzzeek-alembic-bee044a1c187/README.unittests.rst000066400000000000000000000214721353106760100213400ustar00rootroot00000000000000================================ SQLALCHEMY / ALEMBIC UNIT TESTS ================================ Note that Alembic uses a test framework that is mostly equivalent to the one that SQLAlchemy uses. While it is as of May, 2019 still vendored over (e.g. copied from SQLAlchemy into Alembic's source tree with local modifications), the potential plan is that Alembic will use SQLAlchemy's suite directly one support for older SQLAlchemy versions is dropped. This document is mostly copied directly from that of SQLAlchemy. Note that Alembic's test suite currently has "backend" tests (e.g., tests that require a real database) only for PostgreSQL and MySQL; other backends like Oracle and SQL Server are not required. Basic Test Running ================== A test target exists within the setup.py script. For basic test runs:: python setup.py test Running with Tox ================ For more elaborate CI-style test running, the tox script provided will run against various Python / database targets. For a basic run against Python 2.7 using an in-memory SQLite database:: tox -e py27-sqlite The tox runner contains a series of target combinations that can run against various combinations of databases. The test suite can be run against SQLite with "backend" tests also running against a PostgreSQL database:: tox -e py36-sqlite-postgresql Or to run just "backend" tests (NOTE: Alembic has no tests marked this way so this option is not important) against a MySQL databases:: tox -e py36-mysql-backendonly Running against backends other than SQLite requires that a database of that vendor be available at a specific URL. See "Setting Up Databases" below for details. The py.test Engine ================== Both the tox runner and the setup.py runner are using py.test to invoke the test suite. Within the realm of py.test, SQLAlchemy itself is adding a large series of option and customizations to the py.test runner using plugin points, to allow for SQLAlchemy's multiple database support, database setup/teardown and connectivity, multi process support, as well as lots of skip / database selection rules. 
Running tests with py.test directly grants more immediate control over database options and test selection. A generic py.test run looks like:: py.test -n4 Above, the full test suite will run against SQLite, using four processes. If the "-n" flag is not used, the pytest-xdist is skipped and the tests will run linearly, which will take a pretty long time. The py.test command line is more handy for running subsets of tests and to quickly allow for custom database connections. Example:: py.test --dburi=postgresql+psycopg2://scott:tiger@localhost/test test/sql/test_query.py Above will run the tests in the test/sql/test_query.py file (a pretty good file for basic "does this database work at all?" to start with) against a running PostgreSQL database at the given URL. The py.test frontend can also run tests against multiple kinds of databases at once - a large subset of tests are marked as "backend" tests, which will be run against each available backend, and additionally lots of tests are targeted at specific backends only, which only run if a matching backend is made available. For example, to run the test suite against both PostgreSQL and MySQL at the same time:: py.test -n4 --db postgresql --db mysql Setting Up Databases ==================== The test suite identifies several built-in database tags that run against a pre-set URL. These can be seen using --dbs:: $ py.test --dbs Available --db options (use --dburi to override) default sqlite:///:memory: firebird firebird://sysdba:masterkey@localhost//Users/classic/foo.fdb mssql mssql+pyodbc://scott:tiger@ms_2008 mssql_pymssql mssql+pymssql://scott:tiger@ms_2008 mysql mysql://scott:tiger@127.0.0.1:3306/test?charset=utf8 oracle oracle://scott:tiger@127.0.0.1:1521 oracle8 oracle://scott:tiger@127.0.0.1:1521/?use_ansi=0 pg8000 postgresql+pg8000://scott:tiger@127.0.0.1:5432/test postgresql postgresql://scott:tiger@127.0.0.1:5432/test postgresql_psycopg2cffi postgresql+psycopg2cffi://scott:tiger@127.0.0.1:5432/test pymysql mysql+pymysql://scott:tiger@127.0.0.1:3306/test?charset=utf8 sqlite sqlite:///:memory: sqlite_file sqlite:///querytest.db What those mean is that if you have a database running that can be accessed by the above URL, you can run the test suite against it using ``--db ``. The URLs are present in the ``setup.cfg`` file. You can make your own URLs by creating a new file called ``test.cfg`` and adding your own ``[db]`` section:: # test.cfg file [db] my_postgresql=postgresql://username:pass@hostname/dbname Above, we can now run the tests with ``my_postgresql``:: py.test --db my_postgresql We can also override the existing names in our ``test.cfg`` file, so that we can run with the tox runner also:: # test.cfg file [db] postgresql=postgresql://username:pass@hostname/dbname Now when we run ``tox -e py27-postgresql``, it will use our custom URL instead of the fixed one in setup.cfg. Database Configuration ====================== The test runner will by default create and drop tables within the default database that's in the database URL, *unless* the multiprocessing option is in use via the py.test "-n" flag, which invokes pytest-xdist. The multiprocessing option is **enabled by default** for both the tox runner and the setup.py frontend. When multiprocessing is used, the SQLAlchemy testing framework will create a new database for each process, and then tear it down after the test run is complete. So it will be necessary for the database user to have access to CREATE DATABASE in order for this to work. 
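For example, assuming the ``scott`` user from the example URLs above (this is
only a sketch - substitute whatever user your ``--dburi`` actually connects
as), the ability to create and drop scratch databases can be granted on
PostgreSQL with::

    postgres=# ALTER ROLE scott CREATEDB;

and on MySQL the simplest, if blunt, option is a global grant::

    mysql> GRANT ALL PRIVILEGES ON *.* TO 'scott'@'localhost';
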
Several tests require alternate usernames or schemas to be present, which are used to test dotted-name access scenarios. On some databases such as Oracle or Sybase, these are usernames, and others such as PostgreSQL and MySQL they are schemas. The requirement applies to all backends except SQLite and Firebird. The names are:: test_schema test_schema_2 (only used on PostgreSQL) Please refer to your vendor documentation for the proper syntax to create these namespaces - the database user must have permission to create and drop tables within these schemas. Its perfectly fine to run the test suite without these namespaces present, it only means that a handful of tests which expect them to be present will fail. Additional steps specific to individual databases are as follows:: POSTGRESQL: To enable unicode testing with JSONB, create the database with UTF8 encoding:: postgres=# create database test with owner=scott encoding='utf8' template=template0; To include tests for HSTORE, create the HSTORE type engine:: postgres=# \c test; You are now connected to database "test" as user "postgresql". test=# create extension hstore; CREATE EXTENSION Full-text search configuration should be set to English, else several tests of ``.match()`` will fail. This can be set (if it isn't so already) with: ALTER DATABASE test SET default_text_search_config = 'pg_catalog.english' ORACLE: a user named "test_schema" is created in addition to the default user. The primary database user needs to be able to create and drop tables, synonyms, and constraints within the "test_schema" user. For this to work fully, including that the user has the "REFERENCES" role in a remote schema for tables not yet defined (REFERENCES is per-table), it is required that the test the user be present in the "DBA" role: grant dba to scott; MSSQL: Tests that involve multiple connections require Snapshot Isolation ability implemented on the test database in order to prevent deadlocks that will occur with record locking isolation. This feature is only available with MSSQL 2005 and greater. You must enable snapshot isolation at the database level and set the default cursor isolation with two SQL commands: ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON ALTER DATABASE MyDatabase SET READ_COMMITTED_SNAPSHOT ON CONFIGURING LOGGING ------------------- SQLAlchemy logs its activity and debugging through Python's logging package. Any log target can be directed to the console with command line options, such as:: $ ./py.test test/orm/test_unitofwork.py -s \ --log-debug=sqlalchemy.pool --log-info=sqlalchemy.engine Above we add the py.test "-s" flag so that standard out is not suppressed. DEVELOPING AND TESTING NEW DIALECTS (SQLAlchemy Only) ------------------------------------------------------- See the file README.dialects.rst for detail on dialects. zzzeek-alembic-bee044a1c187/alembic/000077500000000000000000000000001353106760100171765ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/__init__.py000066400000000000000000000005011353106760100213030ustar00rootroot00000000000000from os import path import sys from . import context # noqa from . 
import op # noqa from .runtime import environment from .runtime import migration __version__ = "1.1.0" package_dir = path.abspath(path.dirname(__file__)) sys.modules["alembic.migration"] = migration sys.modules["alembic.environment"] = environment zzzeek-alembic-bee044a1c187/alembic/autogenerate/000077500000000000000000000000001353106760100216615ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/autogenerate/__init__.py000066400000000000000000000006571353106760100240020ustar00rootroot00000000000000from .api import _render_migration_diffs # noqa from .api import compare_metadata # noqa from .api import produce_migrations # noqa from .api import render_python_code # noqa from .api import RevisionContext # noqa from .compare import _produce_net_changes # noqa from .compare import comparators # noqa from .render import render_op_text # noqa from .render import renderers # noqa from .rewriter import Rewriter # noqa zzzeek-alembic-bee044a1c187/alembic/autogenerate/api.py000066400000000000000000000405731353106760100230150ustar00rootroot00000000000000"""Provide the 'autogenerate' feature which can produce migration operations automatically.""" import contextlib from sqlalchemy.engine.reflection import Inspector from . import compare from . import render from .. import util from ..operations import ops def compare_metadata(context, metadata): """Compare a database schema to that given in a :class:`~sqlalchemy.schema.MetaData` instance. The database connection is presented in the context of a :class:`.MigrationContext` object, which provides database connectivity as well as optional comparison functions to use for datatypes and server defaults - see the "autogenerate" arguments at :meth:`.EnvironmentContext.configure` for details on these. The return format is a list of "diff" directives, each representing individual differences:: from alembic.migration import MigrationContext from alembic.autogenerate import compare_metadata from sqlalchemy.schema import SchemaItem from sqlalchemy.types import TypeEngine from sqlalchemy import (create_engine, MetaData, Column, Integer, String, Table) import pprint engine = create_engine("sqlite://") engine.execute(''' create table foo ( id integer not null primary key, old_data varchar, x integer )''') engine.execute(''' create table bar ( data varchar )''') metadata = MetaData() Table('foo', metadata, Column('id', Integer, primary_key=True), Column('data', Integer), Column('x', Integer, nullable=False) ) Table('bat', metadata, Column('info', String) ) mc = MigrationContext.configure(engine.connect()) diff = compare_metadata(mc, metadata) pprint.pprint(diff, indent=2, width=20) Output:: [ ( 'add_table', Table('bat', MetaData(bind=None), Column('info', String(), table=), schema=None)), ( 'remove_table', Table(u'bar', MetaData(bind=None), Column(u'data', VARCHAR(), table=), schema=None)), ( 'add_column', None, 'foo', Column('data', Integer(), table=)), ( 'remove_column', None, 'foo', Column(u'old_data', VARCHAR(), table=None)), [ ( 'modify_nullable', None, 'foo', u'x', { 'existing_server_default': None, 'existing_type': INTEGER()}, True, False)]] :param context: a :class:`.MigrationContext` instance. :param metadata: a :class:`~sqlalchemy.schema.MetaData` instance. .. seealso:: :func:`.produce_migrations` - produces a :class:`.MigrationScript` structure based on metadata comparison. 
""" migration_script = produce_migrations(context, metadata) return migration_script.upgrade_ops.as_diffs() def produce_migrations(context, metadata): """Produce a :class:`.MigrationScript` structure based on schema comparison. This function does essentially what :func:`.compare_metadata` does, but then runs the resulting list of diffs to produce the full :class:`.MigrationScript` object. For an example of what this looks like, see the example in :ref:`customizing_revision`. .. versionadded:: 0.8.0 .. seealso:: :func:`.compare_metadata` - returns more fundamental "diff" data from comparing a schema. """ autogen_context = AutogenContext(context, metadata=metadata) migration_script = ops.MigrationScript( rev_id=None, upgrade_ops=ops.UpgradeOps([]), downgrade_ops=ops.DowngradeOps([]), ) compare._populate_migration_script(autogen_context, migration_script) return migration_script def render_python_code( up_or_down_op, sqlalchemy_module_prefix="sa.", alembic_module_prefix="op.", render_as_batch=False, imports=(), render_item=None, ): """Render Python code given an :class:`.UpgradeOps` or :class:`.DowngradeOps` object. This is a convenience function that can be used to test the autogenerate output of a user-defined :class:`.MigrationScript` structure. """ opts = { "sqlalchemy_module_prefix": sqlalchemy_module_prefix, "alembic_module_prefix": alembic_module_prefix, "render_item": render_item, "render_as_batch": render_as_batch, } autogen_context = AutogenContext(None, opts=opts) autogen_context.imports = set(imports) return render._indent( render._render_cmd_body(up_or_down_op, autogen_context) ) def _render_migration_diffs(context, template_args): """legacy, used by test_autogen_composition at the moment""" autogen_context = AutogenContext(context) upgrade_ops = ops.UpgradeOps([]) compare._produce_net_changes(autogen_context, upgrade_ops) migration_script = ops.MigrationScript( rev_id=None, upgrade_ops=upgrade_ops, downgrade_ops=upgrade_ops.reverse(), ) render._render_python_into_templatevars( autogen_context, migration_script, template_args ) class AutogenContext(object): """Maintains configuration and state that's specific to an autogenerate operation.""" metadata = None """The :class:`~sqlalchemy.schema.MetaData` object representing the destination. This object is the one that is passed within ``env.py`` to the :paramref:`.EnvironmentContext.configure.target_metadata` parameter. It represents the structure of :class:`.Table` and other objects as stated in the current database model, and represents the destination structure for the database being examined. While the :class:`~sqlalchemy.schema.MetaData` object is primarily known as a collection of :class:`~sqlalchemy.schema.Table` objects, it also has an :attr:`~sqlalchemy.schema.MetaData.info` dictionary that may be used by end-user schemes to store additional schema-level objects that are to be compared in custom autogeneration schemes. """ connection = None """The :class:`~sqlalchemy.engine.base.Connection` object currently connected to the database backend being compared. This is obtained from the :attr:`.MigrationContext.bind` and is utimately set up in the ``env.py`` script. """ dialect = None """The :class:`~sqlalchemy.engine.Dialect` object currently in use. This is normally obtained from the :attr:`~sqlalchemy.engine.base.Connection.dialect` attribute. """ imports = None """A ``set()`` which contains string Python import directives. The directives are to be rendered into the ``${imports}`` section of a script template. 
The set is normally empty and can be modified within hooks such as the :paramref:`.EnvironmentContext.configure.render_item` hook. .. versionadded:: 0.8.3 .. seealso:: :ref:`autogen_render_types` """ migration_context = None """The :class:`.MigrationContext` established by the ``env.py`` script.""" def __init__( self, migration_context, metadata=None, opts=None, autogenerate=True ): if ( autogenerate and migration_context is not None and migration_context.as_sql ): raise util.CommandError( "autogenerate can't use as_sql=True as it prevents querying " "the database for schema information" ) if opts is None: opts = migration_context.opts self.metadata = metadata = ( opts.get("target_metadata", None) if metadata is None else metadata ) if ( autogenerate and metadata is None and migration_context is not None and migration_context.script is not None ): raise util.CommandError( "Can't proceed with --autogenerate option; environment " "script %s does not provide " "a MetaData object or sequence of objects to the context." % (migration_context.script.env_py_location) ) include_symbol = opts.get("include_symbol", None) include_object = opts.get("include_object", None) object_filters = [] if include_symbol: def include_symbol_filter( object_, name, type_, reflected, compare_to ): if type_ == "table": return include_symbol(name, object_.schema) else: return True object_filters.append(include_symbol_filter) if include_object: object_filters.append(include_object) self._object_filters = object_filters self.migration_context = migration_context if self.migration_context is not None: self.connection = self.migration_context.bind self.dialect = self.migration_context.dialect self.imports = set() self.opts = opts self._has_batch = False @util.memoized_property def inspector(self): return Inspector.from_engine(self.connection) @contextlib.contextmanager def _within_batch(self): self._has_batch = True yield self._has_batch = False def run_filters(self, object_, name, type_, reflected, compare_to): """Run the context's object filters and return True if the targets should be part of the autogenerate operation. This method should be run for every kind of object encountered within an autogenerate operation, giving the environment the chance to filter what objects should be included in the comparison. The filters here are produced directly via the :paramref:`.EnvironmentContext.configure.include_object` and :paramref:`.EnvironmentContext.configure.include_symbol` functions, if present. """ for fn in self._object_filters: if not fn(object_, name, type_, reflected, compare_to): return False else: return True @util.memoized_property def sorted_tables(self): """Return an aggregate of the :attr:`.MetaData.sorted_tables` collection(s). For a sequence of :class:`.MetaData` objects, this concatenates the :attr:`.MetaData.sorted_tables` collection for each individual :class:`.MetaData` in the order of the sequence. It does **not** collate the sorted tables collections. .. versionadded:: 0.9.0 """ result = [] for m in util.to_list(self.metadata): result.extend(m.sorted_tables) return result @util.memoized_property def table_key_to_table(self): """Return an aggregate of the :attr:`.MetaData.tables` dictionaries. The :attr:`.MetaData.tables` collection is a dictionary of table key to :class:`.Table`; this method aggregates the dictionary across multiple :class:`.MetaData` objects into one dictionary. 
Duplicate table keys are **not** supported; if two :class:`.MetaData` objects contain the same table key, an exception is raised. .. versionadded:: 0.9.0 """ result = {} for m in util.to_list(self.metadata): intersect = set(result).intersection(set(m.tables)) if intersect: raise ValueError( "Duplicate table keys across multiple " "MetaData objects: %s" % (", ".join('"%s"' % key for key in sorted(intersect))) ) result.update(m.tables) return result class RevisionContext(object): """Maintains configuration and state that's specific to a revision file generation operation.""" def __init__( self, config, script_directory, command_args, process_revision_directives=None, ): self.config = config self.script_directory = script_directory self.command_args = command_args self.process_revision_directives = process_revision_directives self.template_args = { "config": config # Let templates use config for # e.g. multiple databases } self.generated_revisions = [self._default_revision()] def _to_script(self, migration_script): template_args = {} for k, v in self.template_args.items(): template_args.setdefault(k, v) if getattr(migration_script, "_needs_render", False): autogen_context = self._last_autogen_context # clear out existing imports if we are doing multiple # renders autogen_context.imports = set() if migration_script.imports: autogen_context.imports.update(migration_script.imports) render._render_python_into_templatevars( autogen_context, migration_script, template_args ) return self.script_directory.generate_revision( migration_script.rev_id, migration_script.message, refresh=True, head=migration_script.head, splice=migration_script.splice, branch_labels=migration_script.branch_label, version_path=migration_script.version_path, depends_on=migration_script.depends_on, **template_args ) def run_autogenerate(self, rev, migration_context): self._run_environment(rev, migration_context, True) def run_no_autogenerate(self, rev, migration_context): self._run_environment(rev, migration_context, False) def _run_environment(self, rev, migration_context, autogenerate): if autogenerate: if self.command_args["sql"]: raise util.CommandError( "Using --sql with --autogenerate does not make any sense" ) if set(self.script_directory.get_revisions(rev)) != set( self.script_directory.get_revisions("heads") ): raise util.CommandError("Target database is not up to date.") upgrade_token = migration_context.opts["upgrade_token"] downgrade_token = migration_context.opts["downgrade_token"] migration_script = self.generated_revisions[-1] if not getattr(migration_script, "_needs_render", False): migration_script.upgrade_ops_list[-1].upgrade_token = upgrade_token migration_script.downgrade_ops_list[ -1 ].downgrade_token = downgrade_token migration_script._needs_render = True else: migration_script._upgrade_ops.append( ops.UpgradeOps([], upgrade_token=upgrade_token) ) migration_script._downgrade_ops.append( ops.DowngradeOps([], downgrade_token=downgrade_token) ) self._last_autogen_context = autogen_context = AutogenContext( migration_context, autogenerate=autogenerate ) if autogenerate: compare._populate_migration_script( autogen_context, migration_script ) if self.process_revision_directives: self.process_revision_directives( migration_context, rev, self.generated_revisions ) hook = migration_context.opts["process_revision_directives"] if hook: hook(migration_context, rev, self.generated_revisions) for migration_script in self.generated_revisions: migration_script._needs_render = True def _default_revision(self): op = 
ops.MigrationScript( rev_id=self.command_args["rev_id"] or util.rev_id(), message=self.command_args["message"], upgrade_ops=ops.UpgradeOps([]), downgrade_ops=ops.DowngradeOps([]), head=self.command_args["head"], splice=self.command_args["splice"], branch_label=self.command_args["branch_label"], version_path=self.command_args["version_path"], depends_on=self.command_args["depends_on"], ) return op def generate_scripts(self): for generated_revision in self.generated_revisions: yield self._to_script(generated_revision) zzzeek-alembic-bee044a1c187/alembic/autogenerate/compare.py000066400000000000000000001047221353106760100236670ustar00rootroot00000000000000import contextlib import logging import re from sqlalchemy import event from sqlalchemy import schema as sa_schema from sqlalchemy import types as sqltypes from sqlalchemy.engine.reflection import Inspector from sqlalchemy.util import OrderedSet from alembic.ddl.base import _fk_spec from .render import _user_defined_render from .. import util from ..operations import ops from ..util import compat from ..util import sqla_compat log = logging.getLogger(__name__) def _populate_migration_script(autogen_context, migration_script): upgrade_ops = migration_script.upgrade_ops_list[-1] downgrade_ops = migration_script.downgrade_ops_list[-1] _produce_net_changes(autogen_context, upgrade_ops) upgrade_ops.reverse_into(downgrade_ops) comparators = util.Dispatcher(uselist=True) def _produce_net_changes(autogen_context, upgrade_ops): connection = autogen_context.connection include_schemas = autogen_context.opts.get("include_schemas", False) inspector = Inspector.from_engine(connection) default_schema = connection.dialect.default_schema_name if include_schemas: schemas = set(inspector.get_schema_names()) # replace default schema name with None schemas.discard("information_schema") # replace the "default" schema with None schemas.discard(default_schema) schemas.add(None) else: schemas = [None] comparators.dispatch("schema", autogen_context.dialect.name)( autogen_context, upgrade_ops, schemas ) @comparators.dispatch_for("schema") def _autogen_for_tables(autogen_context, upgrade_ops, schemas): inspector = autogen_context.inspector conn_table_names = set() version_table_schema = ( autogen_context.migration_context.version_table_schema ) version_table = autogen_context.migration_context.version_table for s in schemas: tables = set(inspector.get_table_names(schema=s)) if s == version_table_schema: tables = tables.difference( [autogen_context.migration_context.version_table] ) conn_table_names.update(zip([s] * len(tables), tables)) metadata_table_names = OrderedSet( [(table.schema, table.name) for table in autogen_context.sorted_tables] ).difference([(version_table_schema, version_table)]) _compare_tables( conn_table_names, metadata_table_names, inspector, upgrade_ops, autogen_context, ) def _compare_tables( conn_table_names, metadata_table_names, inspector, upgrade_ops, autogen_context, ): default_schema = inspector.bind.dialect.default_schema_name # tables coming from the connection will not have "schema" # set if it matches default_schema_name; so we need a list # of table names from local metadata that also have "None" if schema # == default_schema_name. 
Most setups will be like this anyway but # some are not (see #170) metadata_table_names_no_dflt_schema = OrderedSet( [ (schema if schema != default_schema else None, tname) for schema, tname in metadata_table_names ] ) # to adjust for the MetaData collection storing the tables either # as "schemaname.tablename" or just "tablename", create a new lookup # which will match the "non-default-schema" keys to the Table object. tname_to_table = dict( ( no_dflt_schema, autogen_context.table_key_to_table[ sa_schema._get_table_key(tname, schema) ], ) for no_dflt_schema, (schema, tname) in zip( metadata_table_names_no_dflt_schema, metadata_table_names ) ) metadata_table_names = metadata_table_names_no_dflt_schema for s, tname in metadata_table_names.difference(conn_table_names): name = "%s.%s" % (s, tname) if s else tname metadata_table = tname_to_table[(s, tname)] if autogen_context.run_filters( metadata_table, tname, "table", False, None ): upgrade_ops.ops.append( ops.CreateTableOp.from_table(metadata_table) ) log.info("Detected added table %r", name) modify_table_ops = ops.ModifyTableOps(tname, [], schema=s) comparators.dispatch("table")( autogen_context, modify_table_ops, s, tname, None, metadata_table, ) if not modify_table_ops.is_empty(): upgrade_ops.ops.append(modify_table_ops) removal_metadata = sa_schema.MetaData() for s, tname in conn_table_names.difference(metadata_table_names): name = sa_schema._get_table_key(tname, s) exists = name in removal_metadata.tables t = sa_schema.Table(tname, removal_metadata, schema=s) if not exists: event.listen( t, "column_reflect", # fmt: off autogen_context.migration_context.impl. _compat_autogen_column_reflect (inspector), # fmt: on ) inspector.reflecttable(t, None) if autogen_context.run_filters(t, tname, "table", True, None): modify_table_ops = ops.ModifyTableOps(tname, [], schema=s) comparators.dispatch("table")( autogen_context, modify_table_ops, s, tname, t, None ) if not modify_table_ops.is_empty(): upgrade_ops.ops.append(modify_table_ops) upgrade_ops.ops.append(ops.DropTableOp.from_table(t)) log.info("Detected removed table %r", name) existing_tables = conn_table_names.intersection(metadata_table_names) existing_metadata = sa_schema.MetaData() conn_column_info = {} for s, tname in existing_tables: name = sa_schema._get_table_key(tname, s) exists = name in existing_metadata.tables t = sa_schema.Table(tname, existing_metadata, schema=s) if not exists: event.listen( t, "column_reflect", # fmt: off autogen_context.migration_context.impl. 
_compat_autogen_column_reflect(inspector), # fmt: on ) inspector.reflecttable(t, None) conn_column_info[(s, tname)] = t for s, tname in sorted(existing_tables, key=lambda x: (x[0] or "", x[1])): s = s or None name = "%s.%s" % (s, tname) if s else tname metadata_table = tname_to_table[(s, tname)] conn_table = existing_metadata.tables[name] if autogen_context.run_filters( metadata_table, tname, "table", False, conn_table ): modify_table_ops = ops.ModifyTableOps(tname, [], schema=s) with _compare_columns( s, tname, conn_table, metadata_table, modify_table_ops, autogen_context, inspector, ): comparators.dispatch("table")( autogen_context, modify_table_ops, s, tname, conn_table, metadata_table, ) if not modify_table_ops.is_empty(): upgrade_ops.ops.append(modify_table_ops) def _make_index(params, conn_table): ix = sa_schema.Index( params["name"], *[conn_table.c[cname] for cname in params["column_names"]], unique=params["unique"] ) if "duplicates_constraint" in params: ix.info["duplicates_constraint"] = params["duplicates_constraint"] return ix def _make_unique_constraint(params, conn_table): uq = sa_schema.UniqueConstraint( *[conn_table.c[cname] for cname in params["column_names"]], name=params["name"] ) if "duplicates_index" in params: uq.info["duplicates_index"] = params["duplicates_index"] return uq def _make_foreign_key(params, conn_table): tname = params["referred_table"] if params["referred_schema"]: tname = "%s.%s" % (params["referred_schema"], tname) options = params.get("options", {}) const = sa_schema.ForeignKeyConstraint( [conn_table.c[cname] for cname in params["constrained_columns"]], ["%s.%s" % (tname, n) for n in params["referred_columns"]], onupdate=options.get("onupdate"), ondelete=options.get("ondelete"), deferrable=options.get("deferrable"), initially=options.get("initially"), name=params["name"], ) # needed by 0.7 conn_table.append_constraint(const) return const @contextlib.contextmanager def _compare_columns( schema, tname, conn_table, metadata_table, modify_table_ops, autogen_context, inspector, ): name = "%s.%s" % (schema, tname) if schema else tname metadata_cols_by_name = dict( (c.name, c) for c in metadata_table.c if not c.system ) conn_col_names = dict((c.name, c) for c in conn_table.c) metadata_col_names = OrderedSet(sorted(metadata_cols_by_name)) for cname in metadata_col_names.difference(conn_col_names): if autogen_context.run_filters( metadata_cols_by_name[cname], cname, "column", False, None ): modify_table_ops.ops.append( ops.AddColumnOp.from_column_and_tablename( schema, tname, metadata_cols_by_name[cname] ) ) log.info("Detected added column '%s.%s'", name, cname) for colname in metadata_col_names.intersection(conn_col_names): metadata_col = metadata_cols_by_name[colname] conn_col = conn_table.c[colname] if not autogen_context.run_filters( metadata_col, colname, "column", False, conn_col ): continue alter_column_op = ops.AlterColumnOp(tname, colname, schema=schema) comparators.dispatch("column")( autogen_context, alter_column_op, schema, tname, colname, conn_col, metadata_col, ) if alter_column_op.has_changes(): modify_table_ops.ops.append(alter_column_op) yield for cname in set(conn_col_names).difference(metadata_col_names): if autogen_context.run_filters( conn_table.c[cname], cname, "column", True, None ): modify_table_ops.ops.append( ops.DropColumnOp.from_column_and_tablename( schema, tname, conn_table.c[cname] ) ) log.info("Detected removed column '%s.%s'", name, cname) class _constraint_sig(object): def md_name_to_sql_name(self, context): return self.name 
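    # The equality and hashing methods below delegate to the wrapped
    # constraint object (self.const), so that the signature wrappers
    # produced by the subclasses further down (_uq_constraint_sig,
    # _ix_constraint_sig, _fk_constraint_sig) can be collected into sets
    # and used as dictionary keys during comparison.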
def __eq__(self, other): return self.const == other.const def __ne__(self, other): return self.const != other.const def __hash__(self): return hash(self.const) class _uq_constraint_sig(_constraint_sig): is_index = False is_unique = True def __init__(self, const): self.const = const self.name = const.name self.sig = tuple(sorted([col.name for col in const.columns])) @property def column_names(self): return [col.name for col in self.const.columns] class _ix_constraint_sig(_constraint_sig): is_index = True def __init__(self, const): self.const = const self.name = const.name self.sig = tuple(sorted([col.name for col in const.columns])) self.is_unique = bool(const.unique) def md_name_to_sql_name(self, context): return sqla_compat._get_index_final_name(context.dialect, self.const) @property def column_names(self): return sqla_compat._get_index_column_names(self.const) class _fk_constraint_sig(_constraint_sig): def __init__(self, const, include_options=False): self.const = const self.name = const.name ( self.source_schema, self.source_table, self.source_columns, self.target_schema, self.target_table, self.target_columns, onupdate, ondelete, deferrable, initially, ) = _fk_spec(const) self.sig = ( self.source_schema, self.source_table, tuple(self.source_columns), self.target_schema, self.target_table, tuple(self.target_columns), ) if include_options: self.sig += ( (None if onupdate.lower() == "no action" else onupdate.lower()) if onupdate else None, (None if ondelete.lower() == "no action" else ondelete.lower()) if ondelete else None, # convert initially + deferrable into one three-state value "initially_deferrable" if initially and initially.lower() == "deferred" else "deferrable" if deferrable else "not deferrable", ) @comparators.dispatch_for("table") def _compare_indexes_and_uniques( autogen_context, modify_ops, schema, tname, conn_table, metadata_table ): inspector = autogen_context.inspector is_create_table = conn_table is None is_drop_table = metadata_table is None # 1a. get raw indexes and unique constraints from metadata ... if metadata_table is not None: metadata_unique_constraints = set( uq for uq in metadata_table.constraints if isinstance(uq, sa_schema.UniqueConstraint) ) metadata_indexes = set(metadata_table.indexes) else: metadata_unique_constraints = set() metadata_indexes = set() conn_uniques = conn_indexes = frozenset() supports_unique_constraints = False unique_constraints_duplicate_unique_indexes = False if conn_table is not None: # 1b. ... and from connection, if the table exists if hasattr(inspector, "get_unique_constraints"): try: conn_uniques = inspector.get_unique_constraints( tname, schema=schema ) supports_unique_constraints = True except NotImplementedError: pass except TypeError: # number of arguments is off for the base # method in SQLAlchemy due to the cache decorator # not being present pass else: for uq in conn_uniques: if uq.get("duplicates_index"): unique_constraints_duplicate_unique_indexes = True try: conn_indexes = inspector.get_indexes(tname, schema=schema) except NotImplementedError: pass # 2. convert conn-level objects from raw inspector records # into schema objects if is_drop_table: # for DROP TABLE uniques are inline, don't need them conn_uniques = set() else: conn_uniques = set( _make_unique_constraint(uq_def, conn_table) for uq_def in conn_uniques ) conn_indexes = set(_make_index(ix, conn_table) for ix in conn_indexes) # 2a. 
if the dialect dupes unique indexes as unique constraints # (mysql and oracle), correct for that if unique_constraints_duplicate_unique_indexes: _correct_for_uq_duplicates_uix( conn_uniques, conn_indexes, metadata_unique_constraints, metadata_indexes, ) # 3. give the dialect a chance to omit indexes and constraints that # we know are either added implicitly by the DB or that the DB # can't accurately report on autogen_context.migration_context.impl.correct_for_autogen_constraints( conn_uniques, conn_indexes, metadata_unique_constraints, metadata_indexes, ) # 4. organize the constraints into "signature" collections, the # _constraint_sig() objects provide a consistent facade over both # Index and UniqueConstraint so we can easily work with them # interchangeably metadata_unique_constraints = set( _uq_constraint_sig(uq) for uq in metadata_unique_constraints ) metadata_indexes = set(_ix_constraint_sig(ix) for ix in metadata_indexes) conn_unique_constraints = set( _uq_constraint_sig(uq) for uq in conn_uniques ) conn_indexes = set(_ix_constraint_sig(ix) for ix in conn_indexes) # 5. index things by name, for those objects that have names metadata_names = dict( (c.md_name_to_sql_name(autogen_context), c) for c in metadata_unique_constraints.union(metadata_indexes) if c.name is not None ) conn_uniques_by_name = dict((c.name, c) for c in conn_unique_constraints) conn_indexes_by_name = dict((c.name, c) for c in conn_indexes) conn_names = dict( (c.name, c) for c in conn_unique_constraints.union(conn_indexes) if c.name is not None ) doubled_constraints = dict( (name, (conn_uniques_by_name[name], conn_indexes_by_name[name])) for name in set(conn_uniques_by_name).intersection( conn_indexes_by_name ) ) # 6. index things by "column signature", to help with unnamed unique # constraints. conn_uniques_by_sig = dict((uq.sig, uq) for uq in conn_unique_constraints) metadata_uniques_by_sig = dict( (uq.sig, uq) for uq in metadata_unique_constraints ) metadata_indexes_by_sig = dict((ix.sig, ix) for ix in metadata_indexes) unnamed_metadata_uniques = dict( (uq.sig, uq) for uq in metadata_unique_constraints if uq.name is None ) # assumptions: # 1. a unique constraint or an index from the connection *always* # has a name. # 2. an index on the metadata side *always* has a name. # 3. a unique constraint on the metadata side *might* have a name. # 4. The backend may double up indexes as unique constraints and # vice versa (e.g. MySQL, Postgresql) def obj_added(obj): if obj.is_index: if autogen_context.run_filters( obj.const, obj.name, "index", False, None ): modify_ops.ops.append(ops.CreateIndexOp.from_index(obj.const)) log.info( "Detected added index '%s' on %s", obj.name, ", ".join(["'%s'" % obj.column_names]), ) else: if not supports_unique_constraints: # can't report unique indexes as added if we don't # detect them return if is_create_table or is_drop_table: # unique constraints are created inline with table defs return if autogen_context.run_filters( obj.const, obj.name, "unique_constraint", False, None ): modify_ops.ops.append( ops.AddConstraintOp.from_constraint(obj.const) ) log.info( "Detected added unique constraint '%s' on %s", obj.name, ", ".join(["'%s'" % obj.column_names]), ) def obj_removed(obj): if obj.is_index: if obj.is_unique and not supports_unique_constraints: # many databases double up unique constraints # as unique indexes. 
without that list we can't # be sure what we're doing here return if autogen_context.run_filters( obj.const, obj.name, "index", True, None ): modify_ops.ops.append(ops.DropIndexOp.from_index(obj.const)) log.info( "Detected removed index '%s' on '%s'", obj.name, tname ) else: if is_create_table or is_drop_table: # if the whole table is being dropped, we don't need to # consider unique constraint separately return if autogen_context.run_filters( obj.const, obj.name, "unique_constraint", True, None ): modify_ops.ops.append( ops.DropConstraintOp.from_constraint(obj.const) ) log.info( "Detected removed unique constraint '%s' on '%s'", obj.name, tname, ) def obj_changed(old, new, msg): if old.is_index: if autogen_context.run_filters( new.const, new.name, "index", False, old.const ): log.info( "Detected changed index '%s' on '%s':%s", old.name, tname, ", ".join(msg), ) modify_ops.ops.append(ops.DropIndexOp.from_index(old.const)) modify_ops.ops.append(ops.CreateIndexOp.from_index(new.const)) else: if autogen_context.run_filters( new.const, new.name, "unique_constraint", False, old.const ): log.info( "Detected changed unique constraint '%s' on '%s':%s", old.name, tname, ", ".join(msg), ) modify_ops.ops.append( ops.DropConstraintOp.from_constraint(old.const) ) modify_ops.ops.append( ops.AddConstraintOp.from_constraint(new.const) ) for added_name in sorted(set(metadata_names).difference(conn_names)): obj = metadata_names[added_name] obj_added(obj) for existing_name in sorted(set(metadata_names).intersection(conn_names)): metadata_obj = metadata_names[existing_name] if existing_name in doubled_constraints: conn_uq, conn_idx = doubled_constraints[existing_name] if metadata_obj.is_index: conn_obj = conn_idx else: conn_obj = conn_uq else: conn_obj = conn_names[existing_name] if conn_obj.is_index != metadata_obj.is_index: obj_removed(conn_obj) obj_added(metadata_obj) else: msg = [] if conn_obj.is_unique != metadata_obj.is_unique: msg.append( " unique=%r to unique=%r" % (conn_obj.is_unique, metadata_obj.is_unique) ) if conn_obj.sig != metadata_obj.sig: msg.append( " columns %r to %r" % (conn_obj.sig, metadata_obj.sig) ) if msg: obj_changed(conn_obj, metadata_obj, msg) for removed_name in sorted(set(conn_names).difference(metadata_names)): conn_obj = conn_names[removed_name] if not conn_obj.is_index and conn_obj.sig in unnamed_metadata_uniques: continue elif removed_name in doubled_constraints: if ( conn_obj.sig not in metadata_indexes_by_sig and conn_obj.sig not in metadata_uniques_by_sig ): conn_uq, conn_idx = doubled_constraints[removed_name] obj_removed(conn_uq) obj_removed(conn_idx) else: obj_removed(conn_obj) for uq_sig in unnamed_metadata_uniques: if uq_sig not in conn_uniques_by_sig: obj_added(unnamed_metadata_uniques[uq_sig]) def _correct_for_uq_duplicates_uix( conn_unique_constraints, conn_indexes, metadata_unique_constraints, metadata_indexes, ): # dedupe unique indexes vs. constraints, since MySQL / Oracle # doesn't really have unique constraints as a separate construct. # but look in the metadata and try to maintain constructs # that already seem to be defined one way or the other # on that side. This logic was formerly local to MySQL dialect, # generalized to Oracle and others. 
See #276 metadata_uq_names = set( [ cons.name for cons in metadata_unique_constraints if cons.name is not None ] ) unnamed_metadata_uqs = set( [ _uq_constraint_sig(cons).sig for cons in metadata_unique_constraints if cons.name is None ] ) metadata_ix_names = set( [cons.name for cons in metadata_indexes if cons.unique] ) conn_ix_names = dict( (cons.name, cons) for cons in conn_indexes if cons.unique ) uqs_dupe_indexes = dict( (cons.name, cons) for cons in conn_unique_constraints if cons.info["duplicates_index"] ) for overlap in uqs_dupe_indexes: if overlap not in metadata_uq_names: if ( _uq_constraint_sig(uqs_dupe_indexes[overlap]).sig not in unnamed_metadata_uqs ): conn_unique_constraints.discard(uqs_dupe_indexes[overlap]) elif overlap not in metadata_ix_names: conn_indexes.discard(conn_ix_names[overlap]) @comparators.dispatch_for("column") def _compare_nullable( autogen_context, alter_column_op, schema, tname, cname, conn_col, metadata_col, ): # work around SQLAlchemy issue #3023 if metadata_col.primary_key: return metadata_col_nullable = metadata_col.nullable conn_col_nullable = conn_col.nullable alter_column_op.existing_nullable = conn_col_nullable if conn_col_nullable is not metadata_col_nullable: alter_column_op.modify_nullable = metadata_col_nullable log.info( "Detected %s on column '%s.%s'", "NULL" if metadata_col_nullable else "NOT NULL", tname, cname, ) @comparators.dispatch_for("column") def _setup_autoincrement( autogen_context, alter_column_op, schema, tname, cname, conn_col, metadata_col, ): if metadata_col.table._autoincrement_column is metadata_col: alter_column_op.kw["autoincrement"] = True elif metadata_col.autoincrement is True: alter_column_op.kw["autoincrement"] = True elif metadata_col.autoincrement is False: alter_column_op.kw["autoincrement"] = False @comparators.dispatch_for("column") def _compare_type( autogen_context, alter_column_op, schema, tname, cname, conn_col, metadata_col, ): conn_type = conn_col.type alter_column_op.existing_type = conn_type metadata_type = metadata_col.type if conn_type._type_affinity is sqltypes.NullType: log.info( "Couldn't determine database type " "for column '%s.%s'", tname, cname, ) return if metadata_type._type_affinity is sqltypes.NullType: log.info( "Column '%s.%s' has no type within " "the model; can't compare", tname, cname, ) return isdiff = autogen_context.migration_context._compare_type( conn_col, metadata_col ) if isdiff: alter_column_op.modify_type = metadata_type log.info( "Detected type change from %r to %r on '%s.%s'", conn_type, metadata_type, tname, cname, ) def _render_server_default_for_compare( metadata_default, metadata_col, autogen_context ): rendered = _user_defined_render( "server_default", metadata_default, autogen_context ) if rendered is not False: return rendered if isinstance(metadata_default, sa_schema.DefaultClause): if isinstance(metadata_default.arg, compat.string_types): metadata_default = metadata_default.arg else: metadata_default = str( metadata_default.arg.compile( dialect=autogen_context.dialect, compile_kwargs={"literal_binds": True}, ) ) if isinstance(metadata_default, compat.string_types): if metadata_col.type._type_affinity is sqltypes.String: metadata_default = re.sub(r"^'|'$", "", metadata_default) return repr(metadata_default) else: return metadata_default else: return None @comparators.dispatch_for("column") def _compare_server_default( autogen_context, alter_column_op, schema, tname, cname, conn_col, metadata_col, ): metadata_default = metadata_col.server_default conn_col_default = 
conn_col.server_default if conn_col_default is None and metadata_default is None: return False rendered_metadata_default = _render_server_default_for_compare( metadata_default, metadata_col, autogen_context ) rendered_conn_default = ( conn_col.server_default.arg.text if conn_col.server_default else None ) alter_column_op.existing_server_default = conn_col_default isdiff = autogen_context.migration_context._compare_server_default( conn_col, metadata_col, rendered_metadata_default, rendered_conn_default, ) if isdiff: alter_column_op.modify_server_default = metadata_default log.info("Detected server default on column '%s.%s'", tname, cname) @comparators.dispatch_for("column") def _compare_column_comment( autogen_context, alter_column_op, schema, tname, cname, conn_col, metadata_col, ): if not sqla_compat._dialect_supports_comments(autogen_context.dialect): return metadata_comment = metadata_col.comment conn_col_comment = conn_col.comment if conn_col_comment is None and metadata_comment is None: return False alter_column_op.existing_comment = conn_col_comment if conn_col_comment != metadata_comment: alter_column_op.modify_comment = metadata_comment log.info("Detected column comment '%s.%s'", tname, cname) @comparators.dispatch_for("table") def _compare_foreign_keys( autogen_context, modify_table_ops, schema, tname, conn_table, metadata_table, ): # if we're doing CREATE TABLE, all FKs are created # inline within the table def if conn_table is None or metadata_table is None: return inspector = autogen_context.inspector metadata_fks = set( fk for fk in metadata_table.constraints if isinstance(fk, sa_schema.ForeignKeyConstraint) ) conn_fks = inspector.get_foreign_keys(tname, schema=schema) backend_reflects_fk_options = conn_fks and "options" in conn_fks[0] conn_fks = set(_make_foreign_key(const, conn_table) for const in conn_fks) # give the dialect a chance to correct the FKs to match more # closely autogen_context.migration_context.impl.correct_for_autogen_foreignkeys( conn_fks, metadata_fks ) metadata_fks = set( _fk_constraint_sig(fk, include_options=backend_reflects_fk_options) for fk in metadata_fks ) conn_fks = set( _fk_constraint_sig(fk, include_options=backend_reflects_fk_options) for fk in conn_fks ) conn_fks_by_sig = dict((c.sig, c) for c in conn_fks) metadata_fks_by_sig = dict((c.sig, c) for c in metadata_fks) metadata_fks_by_name = dict( (c.name, c) for c in metadata_fks if c.name is not None ) conn_fks_by_name = dict( (c.name, c) for c in conn_fks if c.name is not None ) def _add_fk(obj, compare_to): if autogen_context.run_filters( obj.const, obj.name, "foreign_key_constraint", False, compare_to ): modify_table_ops.ops.append( ops.CreateForeignKeyOp.from_constraint(const.const) ) log.info( "Detected added foreign key (%s)(%s) on table %s%s", ", ".join(obj.source_columns), ", ".join(obj.target_columns), "%s." % obj.source_schema if obj.source_schema else "", obj.source_table, ) def _remove_fk(obj, compare_to): if autogen_context.run_filters( obj.const, obj.name, "foreign_key_constraint", True, compare_to ): modify_table_ops.ops.append( ops.DropConstraintOp.from_constraint(obj.const) ) log.info( "Detected removed foreign key (%s)(%s) on table %s%s", ", ".join(obj.source_columns), ", ".join(obj.target_columns), "%s." % obj.source_schema if obj.source_schema else "", obj.source_table, ) # so far it appears we don't need to do this by name at all. 
# SQLite doesn't preserve constraint names anyway for removed_sig in set(conn_fks_by_sig).difference(metadata_fks_by_sig): const = conn_fks_by_sig[removed_sig] if removed_sig not in metadata_fks_by_sig: compare_to = ( metadata_fks_by_name[const.name].const if const.name in metadata_fks_by_name else None ) _remove_fk(const, compare_to) for added_sig in set(metadata_fks_by_sig).difference(conn_fks_by_sig): const = metadata_fks_by_sig[added_sig] if added_sig not in conn_fks_by_sig: compare_to = ( conn_fks_by_name[const.name].const if const.name in conn_fks_by_name else None ) _add_fk(const, compare_to) @comparators.dispatch_for("table") def _compare_table_comment( autogen_context, modify_table_ops, schema, tname, conn_table, metadata_table, ): if not sqla_compat._dialect_supports_comments(autogen_context.dialect): return # if we're doing CREATE TABLE, comments will be created inline # with the create_table op. if conn_table is None or metadata_table is None: return if conn_table.comment is None and metadata_table.comment is None: return if metadata_table.comment is None and conn_table.comment is not None: modify_table_ops.ops.append( ops.DropTableCommentOp( tname, existing_comment=conn_table.comment, schema=schema ) ) elif metadata_table.comment != conn_table.comment: modify_table_ops.ops.append( ops.CreateTableCommentOp( tname, metadata_table.comment, existing_comment=conn_table.comment, schema=schema, ) ) zzzeek-alembic-bee044a1c187/alembic/autogenerate/render.py000066400000000000000000000654541353106760100235300ustar00rootroot00000000000000import re from mako.pygen import PythonPrinter from sqlalchemy import schema as sa_schema from sqlalchemy import sql from sqlalchemy import types as sqltypes from .. import util from ..operations import ops from ..util import compat from ..util import sqla_compat from ..util.compat import string_types from ..util.compat import StringIO MAX_PYTHON_ARGS = 255 try: from sqlalchemy.sql.naming import conv def _render_gen_name(autogen_context, name): if isinstance(name, conv): return _f_name(_alembic_autogenerate_prefix(autogen_context), name) else: return name except ImportError: def _render_gen_name(autogen_context, name): return name def _indent(text): text = re.compile(r"^", re.M).sub(" ", text).strip() text = re.compile(r" +$", re.M).sub("", text) return text def _render_python_into_templatevars( autogen_context, migration_script, template_args ): imports = autogen_context.imports for upgrade_ops, downgrade_ops in zip( migration_script.upgrade_ops_list, migration_script.downgrade_ops_list ): template_args[upgrade_ops.upgrade_token] = _indent( _render_cmd_body(upgrade_ops, autogen_context) ) template_args[downgrade_ops.downgrade_token] = _indent( _render_cmd_body(downgrade_ops, autogen_context) ) template_args["imports"] = "\n".join(sorted(imports)) default_renderers = renderers = util.Dispatcher() def _render_cmd_body(op_container, autogen_context): buf = StringIO() printer = PythonPrinter(buf) printer.writeline( "# ### commands auto generated by Alembic - please adjust! 
###" ) if not op_container.ops: printer.writeline("pass") else: for op in op_container.ops: lines = render_op(autogen_context, op) for line in lines: printer.writeline(line) printer.writeline("# ### end Alembic commands ###") return buf.getvalue() def render_op(autogen_context, op): renderer = renderers.dispatch(op) lines = util.to_list(renderer(autogen_context, op)) return lines def render_op_text(autogen_context, op): return "\n".join(render_op(autogen_context, op)) @renderers.dispatch_for(ops.ModifyTableOps) def _render_modify_table(autogen_context, op): opts = autogen_context.opts render_as_batch = opts.get("render_as_batch", False) if op.ops: lines = [] if render_as_batch: with autogen_context._within_batch(): lines.append( "with op.batch_alter_table(%r, schema=%r) as batch_op:" % (op.table_name, op.schema) ) for t_op in op.ops: t_lines = render_op(autogen_context, t_op) lines.extend(t_lines) lines.append("") else: for t_op in op.ops: t_lines = render_op(autogen_context, t_op) lines.extend(t_lines) return lines else: return ["pass"] @renderers.dispatch_for(ops.CreateTableCommentOp) def _render_create_table_comment(autogen_context, op): templ = ( "{prefix}create_table_comment(\n" "{indent}'{tname}',\n" "{indent}{comment},\n" "{indent}existing_comment={existing},\n" "{indent}schema={schema}\n" ")" ) return templ.format( prefix=_alembic_autogenerate_prefix(autogen_context), tname=op.table_name, comment="'%s'" % op.comment if op.comment is not None else None, existing="'%s'" % op.existing_comment if op.existing_comment is not None else None, schema="'%s'" % op.schema if op.schema is not None else None, indent=" ", ) @renderers.dispatch_for(ops.DropTableCommentOp) def _render_drop_table_comment(autogen_context, op): templ = ( "{prefix}drop_table_comment(\n" "{indent}'{tname}',\n" "{indent}existing_comment={existing},\n" "{indent}schema={schema}\n" ")" ) return templ.format( prefix=_alembic_autogenerate_prefix(autogen_context), tname=op.table_name, existing="'%s'" % op.existing_comment if op.existing_comment is not None else None, schema="'%s'" % op.schema if op.schema is not None else None, indent=" ", ) @renderers.dispatch_for(ops.CreateTableOp) def _add_table(autogen_context, op): table = op.to_table() args = [ col for col in [ _render_column(col, autogen_context) for col in table.columns ] if col ] + sorted( [ rcons for rcons in [ _render_constraint(cons, autogen_context) for cons in table.constraints ] if rcons is not None ] ) if len(args) > MAX_PYTHON_ARGS: args = "*[" + ",\n".join(args) + "]" else: args = ",\n".join(args) text = "%(prefix)screate_table(%(tablename)r,\n%(args)s" % { "tablename": _ident(op.table_name), "prefix": _alembic_autogenerate_prefix(autogen_context), "args": args, } if op.schema: text += ",\nschema=%r" % _ident(op.schema) comment = sqla_compat._comment_attribute(table) if comment: text += ",\ncomment=%r" % _ident(comment) for k in sorted(op.kw): text += ",\n%s=%r" % (k.replace(" ", "_"), op.kw[k]) text += "\n)" return text @renderers.dispatch_for(ops.DropTableOp) def _drop_table(autogen_context, op): text = "%(prefix)sdrop_table(%(tname)r" % { "prefix": _alembic_autogenerate_prefix(autogen_context), "tname": _ident(op.table_name), } if op.schema: text += ", schema=%r" % _ident(op.schema) text += ")" return text @renderers.dispatch_for(ops.CreateIndexOp) def _add_index(autogen_context, op): index = op.to_index() has_batch = autogen_context._has_batch if has_batch: tmpl = ( "%(prefix)screate_index(%(name)r, [%(columns)s], " "unique=%(unique)r%(kwargs)s)" ) else: 
tmpl = ( "%(prefix)screate_index(%(name)r, %(table)r, [%(columns)s], " "unique=%(unique)r%(schema)s%(kwargs)s)" ) text = tmpl % { "prefix": _alembic_autogenerate_prefix(autogen_context), "name": _render_gen_name(autogen_context, index.name), "table": _ident(index.table.name), "columns": ", ".join( _get_index_rendered_expressions(index, autogen_context) ), "unique": index.unique or False, "schema": (", schema=%r" % _ident(index.table.schema)) if index.table.schema else "", "kwargs": ( ", " + ", ".join( [ "%s=%s" % (key, _render_potential_expr(val, autogen_context)) for key, val in index.kwargs.items() ] ) ) if len(index.kwargs) else "", } return text @renderers.dispatch_for(ops.DropIndexOp) def _drop_index(autogen_context, op): has_batch = autogen_context._has_batch if has_batch: tmpl = "%(prefix)sdrop_index(%(name)r)" else: tmpl = ( "%(prefix)sdrop_index(%(name)r, " "table_name=%(table_name)r%(schema)s)" ) text = tmpl % { "prefix": _alembic_autogenerate_prefix(autogen_context), "name": _render_gen_name(autogen_context, op.index_name), "table_name": _ident(op.table_name), "schema": ((", schema=%r" % _ident(op.schema)) if op.schema else ""), } return text @renderers.dispatch_for(ops.CreateUniqueConstraintOp) def _add_unique_constraint(autogen_context, op): return [_uq_constraint(op.to_constraint(), autogen_context, True)] @renderers.dispatch_for(ops.CreateForeignKeyOp) def _add_fk_constraint(autogen_context, op): args = [repr(_render_gen_name(autogen_context, op.constraint_name))] if not autogen_context._has_batch: args.append(repr(_ident(op.source_table))) args.extend( [ repr(_ident(op.referent_table)), repr([_ident(col) for col in op.local_cols]), repr([_ident(col) for col in op.remote_cols]), ] ) kwargs = [ "referent_schema", "onupdate", "ondelete", "initially", "deferrable", "use_alter", ] if not autogen_context._has_batch: kwargs.insert(0, "source_schema") for k in kwargs: if k in op.kw: value = op.kw[k] if value is not None: args.append("%s=%r" % (k, value)) return "%(prefix)screate_foreign_key(%(args)s)" % { "prefix": _alembic_autogenerate_prefix(autogen_context), "args": ", ".join(args), } @renderers.dispatch_for(ops.CreatePrimaryKeyOp) def _add_pk_constraint(constraint, autogen_context): raise NotImplementedError() @renderers.dispatch_for(ops.CreateCheckConstraintOp) def _add_check_constraint(constraint, autogen_context): raise NotImplementedError() @renderers.dispatch_for(ops.DropConstraintOp) def _drop_constraint(autogen_context, op): if autogen_context._has_batch: template = "%(prefix)sdrop_constraint" "(%(name)r, type_=%(type)r)" else: template = ( "%(prefix)sdrop_constraint" "(%(name)r, '%(table_name)s'%(schema)s, type_=%(type)r)" ) text = template % { "prefix": _alembic_autogenerate_prefix(autogen_context), "name": _render_gen_name(autogen_context, op.constraint_name), "table_name": _ident(op.table_name), "type": op.constraint_type, "schema": (", schema=%r" % _ident(op.schema)) if op.schema else "", } return text @renderers.dispatch_for(ops.AddColumnOp) def _add_column(autogen_context, op): schema, tname, column = op.schema, op.table_name, op.column if autogen_context._has_batch: template = "%(prefix)sadd_column(%(column)s)" else: template = "%(prefix)sadd_column(%(tname)r, %(column)s" if schema: template += ", schema=%(schema)r" template += ")" text = template % { "prefix": _alembic_autogenerate_prefix(autogen_context), "tname": tname, "column": _render_column(column, autogen_context), "schema": schema, } return text @renderers.dispatch_for(ops.DropColumnOp) def 
_drop_column(autogen_context, op): schema, tname, column_name = op.schema, op.table_name, op.column_name if autogen_context._has_batch: template = "%(prefix)sdrop_column(%(cname)r)" else: template = "%(prefix)sdrop_column(%(tname)r, %(cname)r" if schema: template += ", schema=%(schema)r" template += ")" text = template % { "prefix": _alembic_autogenerate_prefix(autogen_context), "tname": _ident(tname), "cname": _ident(column_name), "schema": _ident(schema), } return text @renderers.dispatch_for(ops.AlterColumnOp) def _alter_column(autogen_context, op): tname = op.table_name cname = op.column_name server_default = op.modify_server_default type_ = op.modify_type nullable = op.modify_nullable comment = op.modify_comment autoincrement = op.kw.get("autoincrement", None) existing_type = op.existing_type existing_nullable = op.existing_nullable existing_comment = op.existing_comment existing_server_default = op.existing_server_default schema = op.schema indent = " " * 11 if autogen_context._has_batch: template = "%(prefix)salter_column(%(cname)r" else: template = "%(prefix)salter_column(%(tname)r, %(cname)r" text = template % { "prefix": _alembic_autogenerate_prefix(autogen_context), "tname": tname, "cname": cname, } if existing_type is not None: text += ",\n%sexisting_type=%s" % ( indent, _repr_type(existing_type, autogen_context), ) if server_default is not False: rendered = _render_server_default(server_default, autogen_context) text += ",\n%sserver_default=%s" % (indent, rendered) if type_ is not None: text += ",\n%stype_=%s" % (indent, _repr_type(type_, autogen_context)) if nullable is not None: text += ",\n%snullable=%r" % (indent, nullable) if comment is not False: text += ",\n%scomment=%r" % (indent, comment) if existing_comment is not None: text += ",\n%sexisting_comment=%r" % (indent, existing_comment) if nullable is None and existing_nullable is not None: text += ",\n%sexisting_nullable=%r" % (indent, existing_nullable) if autoincrement is not None: text += ",\n%sautoincrement=%r" % (indent, autoincrement) if server_default is False and existing_server_default: rendered = _render_server_default( existing_server_default, autogen_context ) text += ",\n%sexisting_server_default=%s" % (indent, rendered) if schema and not autogen_context._has_batch: text += ",\n%sschema=%r" % (indent, schema) text += ")" return text class _f_name(object): def __init__(self, prefix, name): self.prefix = prefix self.name = name def __repr__(self): return "%sf(%r)" % (self.prefix, _ident(self.name)) def _ident(name): """produce a __repr__() object for a string identifier that may use quoted_name() in SQLAlchemy 0.9 and greater. The issue worked around here is that quoted_name() doesn't have very good repr() behavior by itself when unicode is involved. 
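    E.g., the intended behavior is roughly (a minimal sketch)::

        from sqlalchemy.sql.elements import quoted_name

        _ident(quoted_name("some_table", None))  # plain 'some_table' string
        _ident("some_table")                      # returned unchanged
        _ident(None)                              # passed through as None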
""" if name is None: return name elif isinstance(name, sql.elements.quoted_name): if compat.py2k: # the attempt to encode to ascii here isn't super ideal, # however we are trying to cut down on an explosion of # u'' literals only when py2k + SQLA 0.9, in particular # makes unit tests testing code generation very difficult try: return name.encode("ascii") except UnicodeError: return compat.text_type(name) else: return compat.text_type(name) elif isinstance(name, compat.string_types): return name def _render_potential_expr( value, autogen_context, wrap_in_text=True, is_server_default=False ): if isinstance(value, sql.ClauseElement): if wrap_in_text: template = "%(prefix)stext(%(sql)r)" else: template = "%(sql)r" return template % { "prefix": _sqlalchemy_autogenerate_prefix(autogen_context), "sql": autogen_context.migration_context.impl.render_ddl_sql_expr( value, is_server_default=is_server_default ), } else: return repr(value) def _get_index_rendered_expressions(idx, autogen_context): return [ repr(_ident(getattr(exp, "name", None))) if isinstance(exp, sa_schema.Column) else _render_potential_expr(exp, autogen_context) for exp in idx.expressions ] def _uq_constraint(constraint, autogen_context, alter): opts = [] has_batch = autogen_context._has_batch if constraint.deferrable: opts.append(("deferrable", str(constraint.deferrable))) if constraint.initially: opts.append(("initially", str(constraint.initially))) if not has_batch and alter and constraint.table.schema: opts.append(("schema", _ident(constraint.table.schema))) if not alter and constraint.name: opts.append( ("name", _render_gen_name(autogen_context, constraint.name)) ) if alter: args = [repr(_render_gen_name(autogen_context, constraint.name))] if not has_batch: args += [repr(_ident(constraint.table.name))] args.append(repr([_ident(col.name) for col in constraint.columns])) args.extend(["%s=%r" % (k, v) for k, v in opts]) return "%(prefix)screate_unique_constraint(%(args)s)" % { "prefix": _alembic_autogenerate_prefix(autogen_context), "args": ", ".join(args), } else: args = [repr(_ident(col.name)) for col in constraint.columns] args.extend(["%s=%r" % (k, v) for k, v in opts]) return "%(prefix)sUniqueConstraint(%(args)s)" % { "prefix": _sqlalchemy_autogenerate_prefix(autogen_context), "args": ", ".join(args), } def _user_autogenerate_prefix(autogen_context, target): prefix = autogen_context.opts["user_module_prefix"] if prefix is None: return "%s." % target.__module__ else: return prefix def _sqlalchemy_autogenerate_prefix(autogen_context): return autogen_context.opts["sqlalchemy_module_prefix"] or "" def _alembic_autogenerate_prefix(autogen_context): if autogen_context._has_batch: return "batch_op." 
else: return autogen_context.opts["alembic_module_prefix"] or "" def _user_defined_render(type_, object_, autogen_context): if "render_item" in autogen_context.opts: render = autogen_context.opts["render_item"] if render: rendered = render(type_, object_, autogen_context) if rendered is not False: return rendered return False def _render_column(column, autogen_context): rendered = _user_defined_render("column", column, autogen_context) if rendered is not False: return rendered opts = [] if column.server_default: rendered = _render_server_default( column.server_default, autogen_context ) if rendered: opts.append(("server_default", rendered)) if ( column.autoincrement is not None and column.autoincrement != sqla_compat.AUTOINCREMENT_DEFAULT ): opts.append(("autoincrement", column.autoincrement)) if column.nullable is not None: opts.append(("nullable", column.nullable)) if column.system: opts.append(("system", column.system)) comment = sqla_compat._comment_attribute(column) if comment: opts.append(("comment", "%r" % comment)) # TODO: for non-ascii colname, assign a "key" return "%(prefix)sColumn(%(name)r, %(type)s, %(kw)s)" % { "prefix": _sqlalchemy_autogenerate_prefix(autogen_context), "name": _ident(column.name), "type": _repr_type(column.type, autogen_context), "kw": ", ".join(["%s=%s" % (kwname, val) for kwname, val in opts]), } def _render_server_default(default, autogen_context, repr_=True): rendered = _user_defined_render("server_default", default, autogen_context) if rendered is not False: return rendered if isinstance(default, sa_schema.DefaultClause): if isinstance(default.arg, compat.string_types): default = default.arg else: return _render_potential_expr( default.arg, autogen_context, is_server_default=True ) if isinstance(default, string_types) and repr_: default = repr(re.sub(r"^'|'$", "", default)) return default def _repr_type(type_, autogen_context): rendered = _user_defined_render("type", type_, autogen_context) if rendered is not False: return rendered if hasattr(autogen_context.migration_context, "impl"): impl_rt = autogen_context.migration_context.impl.render_type( type_, autogen_context ) else: impl_rt = None mod = type(type_).__module__ imports = autogen_context.imports if mod.startswith("sqlalchemy.dialects"): dname = re.match(r"sqlalchemy\.dialects\.(\w+)", mod).group(1) if imports is not None: imports.add("from sqlalchemy.dialects import %s" % dname) if impl_rt: return impl_rt else: return "%s.%r" % (dname, type_) elif impl_rt: return impl_rt elif mod.startswith("sqlalchemy."): if "_render_%s_type" % type_.__visit_name__ in globals(): fn = globals()["_render_%s_type" % type_.__visit_name__] return fn(type_, autogen_context) else: prefix = _sqlalchemy_autogenerate_prefix(autogen_context) return "%s%r" % (prefix, type_) else: prefix = _user_autogenerate_prefix(autogen_context, type_) return "%s%r" % (prefix, type_) def _render_ARRAY_type(type_, autogen_context): return _render_type_w_subtype( type_, autogen_context, "item_type", r"(.+?\()" ) def _render_type_w_subtype( type_, autogen_context, attrname, regexp, prefix=None ): outer_repr = repr(type_) inner_type = getattr(type_, attrname, None) if inner_type is None: return False inner_repr = repr(inner_type) inner_repr = re.sub(r"([\(\)])", r"\\\1", inner_repr) sub_type = _repr_type(getattr(type_, attrname), autogen_context) outer_type = re.sub(regexp + inner_repr, r"\1%s" % sub_type, outer_repr) if prefix: return "%s%s" % (prefix, outer_type) mod = type(type_).__module__ if mod.startswith("sqlalchemy.dialects"): dname 
= re.match(r"sqlalchemy\.dialects\.(\w+)", mod).group(1) return "%s.%s" % (dname, outer_type) elif mod.startswith("sqlalchemy"): prefix = _sqlalchemy_autogenerate_prefix(autogen_context) return "%s%s" % (prefix, outer_type) else: return None _constraint_renderers = util.Dispatcher() def _render_constraint(constraint, autogen_context): try: renderer = _constraint_renderers.dispatch(constraint) except ValueError: util.warn("No renderer is established for object %r" % constraint) return "[Unknown Python object %r]" % constraint else: return renderer(constraint, autogen_context) @_constraint_renderers.dispatch_for(sa_schema.PrimaryKeyConstraint) def _render_primary_key(constraint, autogen_context): rendered = _user_defined_render("primary_key", constraint, autogen_context) if rendered is not False: return rendered if not constraint.columns: return None opts = [] if constraint.name: opts.append( ("name", repr(_render_gen_name(autogen_context, constraint.name))) ) return "%(prefix)sPrimaryKeyConstraint(%(args)s)" % { "prefix": _sqlalchemy_autogenerate_prefix(autogen_context), "args": ", ".join( [repr(c.name) for c in constraint.columns] + ["%s=%s" % (kwname, val) for kwname, val in opts] ), } def _fk_colspec(fk, metadata_schema): """Implement a 'safe' version of ForeignKey._get_colspec() that won't fail if the remote table can't be resolved. """ colspec = fk._get_colspec() tokens = colspec.split(".") tname, colname = tokens[-2:] if metadata_schema is not None and len(tokens) == 2: table_fullname = "%s.%s" % (metadata_schema, tname) else: table_fullname = ".".join(tokens[0:-1]) if ( not fk.link_to_name and fk.parent is not None and fk.parent.table is not None ): # try to resolve the remote table in order to adjust for column.key. # the FK constraint needs to be rendered in terms of the column # name. 
parent_metadata = fk.parent.table.metadata if table_fullname in parent_metadata.tables: col = parent_metadata.tables[table_fullname].c.get(colname) if col is not None: colname = _ident(col.name) colspec = "%s.%s" % (table_fullname, colname) return colspec def _populate_render_fk_opts(constraint, opts): if constraint.onupdate: opts.append(("onupdate", repr(constraint.onupdate))) if constraint.ondelete: opts.append(("ondelete", repr(constraint.ondelete))) if constraint.initially: opts.append(("initially", repr(constraint.initially))) if constraint.deferrable: opts.append(("deferrable", repr(constraint.deferrable))) if constraint.use_alter: opts.append(("use_alter", repr(constraint.use_alter))) @_constraint_renderers.dispatch_for(sa_schema.ForeignKeyConstraint) def _render_foreign_key(constraint, autogen_context): rendered = _user_defined_render("foreign_key", constraint, autogen_context) if rendered is not False: return rendered opts = [] if constraint.name: opts.append( ("name", repr(_render_gen_name(autogen_context, constraint.name))) ) _populate_render_fk_opts(constraint, opts) apply_metadata_schema = constraint.parent.metadata.schema return ( "%(prefix)sForeignKeyConstraint([%(cols)s], " "[%(refcols)s], %(args)s)" % { "prefix": _sqlalchemy_autogenerate_prefix(autogen_context), "cols": ", ".join( "%r" % _ident(f.parent.name) for f in constraint.elements ), "refcols": ", ".join( repr(_fk_colspec(f, apply_metadata_schema)) for f in constraint.elements ), "args": ", ".join( ["%s=%s" % (kwname, val) for kwname, val in opts] ), } ) @_constraint_renderers.dispatch_for(sa_schema.UniqueConstraint) def _render_unique_constraint(constraint, autogen_context): rendered = _user_defined_render("unique", constraint, autogen_context) if rendered is not False: return rendered return _uq_constraint(constraint, autogen_context, False) @_constraint_renderers.dispatch_for(sa_schema.CheckConstraint) def _render_check_constraint(constraint, autogen_context): rendered = _user_defined_render("check", constraint, autogen_context) if rendered is not False: return rendered # detect the constraint being part of # a parent type which is probably in the Table already. # ideally SQLAlchemy would give us more of a first class # way to detect this. if ( constraint._create_rule and hasattr(constraint._create_rule, "target") and isinstance(constraint._create_rule.target, sqltypes.TypeEngine) ): return None opts = [] if constraint.name: opts.append( ("name", repr(_render_gen_name(autogen_context, constraint.name))) ) return "%(prefix)sCheckConstraint(%(sqltext)s%(opts)s)" % { "prefix": _sqlalchemy_autogenerate_prefix(autogen_context), "opts": ", " + (", ".join("%s=%s" % (k, v) for k, v in opts)) if opts else "", "sqltext": _render_potential_expr( constraint.sqltext, autogen_context, wrap_in_text=False ), } @renderers.dispatch_for(ops.ExecuteSQLOp) def _execute_sql(autogen_context, op): if not isinstance(op.sqltext, string_types): raise NotImplementedError( "Autogenerate rendering of SQL Expression language constructs " "not supported here; please use a plain SQL string" ) return "op.execute(%r)" % op.sqltext renderers = default_renderers.branch() zzzeek-alembic-bee044a1c187/alembic/autogenerate/rewriter.py000066400000000000000000000126341353106760100241040ustar00rootroot00000000000000from alembic import util from alembic.operations import ops class Rewriter(object): """A helper object that allows easy 'rewriting' of ops streams. 
The :class:`.Rewriter` object is intended to be passed along to the :paramref:`.EnvironmentContext.configure.process_revision_directives` parameter in an ``env.py`` script. Once constructed, any number of "rewrites" functions can be associated with it, which will be given the opportunity to modify the structure without having to have explicit knowledge of the overall structure. The function is passed the :class:`.MigrationContext` object and ``revision`` tuple that are passed to the :paramref:`.Environment Context.configure.process_revision_directives` function normally, and the third argument is an individual directive of the type noted in the decorator. The function has the choice of returning a single op directive, which normally can be the directive that was actually passed, or a new directive to replace it, or a list of zero or more directives to replace it. .. seealso:: :ref:`autogen_rewriter` - usage example .. versionadded:: 0.8 """ _traverse = util.Dispatcher() _chained = None def __init__(self): self.dispatch = util.Dispatcher() def chain(self, other): """Produce a "chain" of this :class:`.Rewriter` to another. This allows two rewriters to operate serially on a stream, e.g.:: writer1 = autogenerate.Rewriter() writer2 = autogenerate.Rewriter() @writer1.rewrites(ops.AddColumnOp) def add_column_nullable(context, revision, op): op.column.nullable = True return op @writer2.rewrites(ops.AddColumnOp) def add_column_idx(context, revision, op): idx_op = ops.CreateIndexOp( 'ixc', op.table_name, [op.column.name]) return [ op, idx_op ] writer = writer1.chain(writer2) :param other: a :class:`.Rewriter` instance :return: a new :class:`.Rewriter` that will run the operations of this writer, then the "other" writer, in succession. """ wr = self.__class__.__new__(self.__class__) wr.__dict__.update(self.__dict__) wr._chained = other return wr def rewrites(self, operator): """Register a function as rewriter for a given type. The function should receive three arguments, which are the :class:`.MigrationContext`, a ``revision`` tuple, and an op directive of the type indicated. 
E.g.:: @writer1.rewrites(ops.AddColumnOp) def add_column_nullable(context, revision, op): op.column.nullable = True return op """ return self.dispatch.dispatch_for(operator) def _rewrite(self, context, revision, directive): try: _rewriter = self.dispatch.dispatch(directive) except ValueError: _rewriter = None yield directive else: for r_directive in util.to_list( _rewriter(context, revision, directive) ): yield r_directive def __call__(self, context, revision, directives): self.process_revision_directives(context, revision, directives) if self._chained: self._chained(context, revision, directives) @_traverse.dispatch_for(ops.MigrationScript) def _traverse_script(self, context, revision, directive): upgrade_ops_list = [] for upgrade_ops in directive.upgrade_ops_list: ret = self._traverse_for(context, revision, directive.upgrade_ops) if len(ret) != 1: raise ValueError( "Can only return single object for UpgradeOps traverse" ) upgrade_ops_list.append(ret[0]) directive.upgrade_ops = upgrade_ops_list downgrade_ops_list = [] for downgrade_ops in directive.downgrade_ops_list: ret = self._traverse_for( context, revision, directive.downgrade_ops ) if len(ret) != 1: raise ValueError( "Can only return single object for DowngradeOps traverse" ) downgrade_ops_list.append(ret[0]) directive.downgrade_ops = downgrade_ops_list @_traverse.dispatch_for(ops.OpContainer) def _traverse_op_container(self, context, revision, directive): self._traverse_list(context, revision, directive.ops) @_traverse.dispatch_for(ops.MigrateOperation) def _traverse_any_directive(self, context, revision, directive): pass def _traverse_for(self, context, revision, directive): directives = list(self._rewrite(context, revision, directive)) for directive in directives: traverser = self._traverse.dispatch(directive) traverser(self, context, revision, directive) return directives def _traverse_list(self, context, revision, directives): dest = [] for directive in directives: dest.extend(self._traverse_for(context, revision, directive)) directives[:] = dest def process_revision_directives(self, context, revision, directives): self._traverse_list(context, revision, directives) zzzeek-alembic-bee044a1c187/alembic/command.py000066400000000000000000000377561353106760100212100ustar00rootroot00000000000000import os from . import autogenerate as autogen from . import util from .runtime.environment import EnvironmentContext from .script import ScriptDirectory def list_templates(config): """List available templates. :param config: a :class:`.Config` object. """ config.print_stdout("Available templates:\n") for tempname in os.listdir(config.get_template_directory()): with open( os.path.join(config.get_template_directory(), tempname, "README") ) as readme: synopsis = next(readme) config.print_stdout("%s - %s", tempname, synopsis) config.print_stdout("\nTemplates are used via the 'init' command, e.g.:") config.print_stdout("\n alembic init --template generic ./scripts") def init(config, directory, template="generic"): """Initialize a new scripts directory. :param config: a :class:`.Config` object. :param directory: string path of the target directory :param template: string name of the migration environment template to use. 
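    E.g., the programmatic equivalent of ``alembic init alembic``
    (a minimal sketch; the ``alembic.ini`` path is an assumption)::

        from alembic import command
        from alembic.config import Config

        command.init(Config("alembic.ini"), "alembic", template="generic")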
""" if os.access(directory, os.F_OK) and os.listdir(directory): raise util.CommandError( "Directory %s already exists and is not empty" % directory ) template_dir = os.path.join(config.get_template_directory(), template) if not os.access(template_dir, os.F_OK): raise util.CommandError("No such template %r" % template) if not os.access(directory, os.F_OK): util.status( "Creating directory %s" % os.path.abspath(directory), os.makedirs, directory, ) versions = os.path.join(directory, "versions") util.status( "Creating directory %s" % os.path.abspath(versions), os.makedirs, versions, ) script = ScriptDirectory(directory) for file_ in os.listdir(template_dir): file_path = os.path.join(template_dir, file_) if file_ == "alembic.ini.mako": config_file = os.path.abspath(config.config_file_name) if os.access(config_file, os.F_OK): util.msg("File %s already exists, skipping" % config_file) else: script._generate_template( file_path, config_file, script_location=directory ) elif os.path.isfile(file_path): output_file = os.path.join(directory, file_) script._copy_file(file_path, output_file) util.msg( "Please edit configuration/connection/logging " "settings in %r before proceeding." % config_file ) def revision( config, message=None, autogenerate=False, sql=False, head="head", splice=False, branch_label=None, version_path=None, rev_id=None, depends_on=None, process_revision_directives=None, ): """Create a new revision file. :param config: a :class:`.Config` object. :param message: string message to apply to the revision; this is the ``-m`` option to ``alembic revision``. :param autogenerate: whether or not to autogenerate the script from the database; this is the ``--autogenerate`` option to ``alembic revision``. :param sql: whether to dump the script out as a SQL string; when specified, the script is dumped to stdout. This is the ``--sql`` option to ``alembic revision``. :param head: head revision to build the new revision upon as a parent; this is the ``--head`` option to ``alembic revision``. :param splice: whether or not the new revision should be made into a new head of its own; is required when the given ``head`` is not itself a head. This is the ``--splice`` option to ``alembic revision``. :param branch_label: string label to apply to the branch; this is the ``--branch-label`` option to ``alembic revision``. :param version_path: string symbol identifying a specific version path from the configuration; this is the ``--version-path`` option to ``alembic revision``. :param rev_id: optional revision identifier to use instead of having one generated; this is the ``--rev-id`` option to ``alembic revision``. :param depends_on: optional list of "depends on" identifiers; this is the ``--depends-on`` option to ``alembic revision``. :param process_revision_directives: this is a callable that takes the same form as the callable described at :paramref:`.EnvironmentContext.configure.process_revision_directives`; will be applied to the structure generated by the revision process where it can be altered programmatically. Note that unlike all the other parameters, this option is only available via programmatic use of :func:`.command.revision` .. 
versionadded:: 0.9.0 """ script_directory = ScriptDirectory.from_config(config) command_args = dict( message=message, autogenerate=autogenerate, sql=sql, head=head, splice=splice, branch_label=branch_label, version_path=version_path, rev_id=rev_id, depends_on=depends_on, ) revision_context = autogen.RevisionContext( config, script_directory, command_args, process_revision_directives=process_revision_directives, ) environment = util.asbool(config.get_main_option("revision_environment")) if autogenerate: environment = True if sql: raise util.CommandError( "Using --sql with --autogenerate does not make any sense" ) def retrieve_migrations(rev, context): revision_context.run_autogenerate(rev, context) return [] elif environment: def retrieve_migrations(rev, context): revision_context.run_no_autogenerate(rev, context) return [] elif sql: raise util.CommandError( "Using --sql with the revision command when " "revision_environment is not configured does not make any sense" ) if environment: with EnvironmentContext( config, script_directory, fn=retrieve_migrations, as_sql=sql, template_args=revision_context.template_args, revision_context=revision_context, ): script_directory.run_env() scripts = [script for script in revision_context.generate_scripts()] if len(scripts) == 1: return scripts[0] else: return scripts def merge(config, revisions, message=None, branch_label=None, rev_id=None): """Merge two revisions together. Creates a new migration file. .. versionadded:: 0.7.0 :param config: a :class:`.Config` instance :param message: string message to apply to the revision :param branch_label: string label name to apply to the new revision :param rev_id: hardcoded revision identifier instead of generating a new one. .. seealso:: :ref:`branches` """ script = ScriptDirectory.from_config(config) template_args = { "config": config # Let templates use config for # e.g. multiple databases } return script.generate_revision( rev_id or util.rev_id(), message, refresh=True, head=revisions, branch_labels=branch_label, **template_args ) def upgrade(config, revision, sql=False, tag=None): """Upgrade to a later version. :param config: a :class:`.Config` instance. :param revision: string revision target or range for --sql mode :param sql: if True, use ``--sql`` mode :param tag: an arbitrary "tag" that can be intercepted by custom ``env.py`` scripts via the :meth:`.EnvironmentContext.get_tag_argument` method. """ script = ScriptDirectory.from_config(config) starting_rev = None if ":" in revision: if not sql: raise util.CommandError("Range revision not allowed") starting_rev, revision = revision.split(":", 2) def upgrade(rev, context): return script._upgrade_revs(revision, rev) with EnvironmentContext( config, script, fn=upgrade, as_sql=sql, starting_rev=starting_rev, destination_rev=revision, tag=tag, ): script.run_env() def downgrade(config, revision, sql=False, tag=None): """Revert to a previous version. :param config: a :class:`.Config` instance. :param revision: string revision target or range for --sql mode :param sql: if True, use ``--sql`` mode :param tag: an arbitrary "tag" that can be intercepted by custom ``env.py`` scripts via the :meth:`.EnvironmentContext.get_tag_argument` method. 
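    E.g., to revert the most recent revision programmatically (a minimal
    sketch; the ``alembic.ini`` path is an assumption)::

        from alembic import command
        from alembic.config import Config

        command.downgrade(Config("alembic.ini"), "-1")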
""" script = ScriptDirectory.from_config(config) starting_rev = None if ":" in revision: if not sql: raise util.CommandError("Range revision not allowed") starting_rev, revision = revision.split(":", 2) elif sql: raise util.CommandError( "downgrade with --sql requires :" ) def downgrade(rev, context): return script._downgrade_revs(revision, rev) with EnvironmentContext( config, script, fn=downgrade, as_sql=sql, starting_rev=starting_rev, destination_rev=revision, tag=tag, ): script.run_env() def show(config, rev): """Show the revision(s) denoted by the given symbol. :param config: a :class:`.Config` instance. :param revision: string revision target """ script = ScriptDirectory.from_config(config) if rev == "current": def show_current(rev, context): for sc in script.get_revisions(rev): config.print_stdout(sc.log_entry) return [] with EnvironmentContext(config, script, fn=show_current): script.run_env() else: for sc in script.get_revisions(rev): config.print_stdout(sc.log_entry) def history(config, rev_range=None, verbose=False, indicate_current=False): """List changeset scripts in chronological order. :param config: a :class:`.Config` instance. :param rev_range: string revision range :param verbose: output in verbose mode. :param indicate_current: indicate current revision. ..versionadded:: 0.9.9 """ script = ScriptDirectory.from_config(config) if rev_range is not None: if ":" not in rev_range: raise util.CommandError( "History range requires [start]:[end], " "[start]:, or :[end]" ) base, head = rev_range.strip().split(":") else: base = head = None environment = ( util.asbool(config.get_main_option("revision_environment")) or indicate_current ) def _display_history(config, script, base, head, currents=()): for sc in script.walk_revisions( base=base or "base", head=head or "heads" ): if indicate_current: sc._db_current_indicator = sc.revision in currents config.print_stdout( sc.cmd_format( verbose=verbose, include_branches=True, include_doc=True, include_parents=True, ) ) def _display_history_w_current(config, script, base, head): def _display_current_history(rev, context): if head == "current": _display_history(config, script, base, rev, rev) elif base == "current": _display_history(config, script, rev, head, rev) else: _display_history(config, script, base, head, rev) return [] with EnvironmentContext(config, script, fn=_display_current_history): script.run_env() if base == "current" or head == "current" or environment: _display_history_w_current(config, script, base, head) else: _display_history(config, script, base, head) def heads(config, verbose=False, resolve_dependencies=False): """Show current available heads in the script directory. :param config: a :class:`.Config` instance. :param verbose: output in verbose mode. :param resolve_dependencies: treat dependency version as down revisions. """ script = ScriptDirectory.from_config(config) if resolve_dependencies: heads = script.get_revisions("heads") else: heads = script.get_revisions(script.get_heads()) for rev in heads: config.print_stdout( rev.cmd_format( verbose, include_branches=True, tree_indicators=False ) ) def branches(config, verbose=False): """Show current branch points. :param config: a :class:`.Config` instance. :param verbose: output in verbose mode. 
""" script = ScriptDirectory.from_config(config) for sc in script.walk_revisions(): if sc.is_branch_point: config.print_stdout( "%s\n%s\n", sc.cmd_format(verbose, include_branches=True), "\n".join( "%s -> %s" % ( " " * len(str(sc.revision)), rev_obj.cmd_format( False, include_branches=True, include_doc=verbose ), ) for rev_obj in ( script.get_revision(rev) for rev in sc.nextrev ) ), ) def current(config, verbose=False, head_only=False): """Display the current revision for a database. :param config: a :class:`.Config` instance. :param verbose: output in verbose mode. :param head_only: deprecated; use ``verbose`` for additional output. """ script = ScriptDirectory.from_config(config) if head_only: util.warn("--head-only is deprecated", stacklevel=3) def display_version(rev, context): if verbose: config.print_stdout( "Current revision(s) for %s:", util.obfuscate_url_pw(context.connection.engine.url), ) for rev in script.get_all_current(rev): config.print_stdout(rev.cmd_format(verbose)) return [] with EnvironmentContext(config, script, fn=display_version): script.run_env() def stamp(config, revision, sql=False, tag=None): """'stamp' the revision table with the given revision; don't run any migrations. :param config: a :class:`.Config` instance. :param revision: target revision. :param sql: use ``--sql`` mode :param tag: an arbitrary "tag" that can be intercepted by custom ``env.py`` scripts via the :class:`.EnvironmentContext.get_tag_argument` method. """ script = ScriptDirectory.from_config(config) starting_rev = None if ":" in revision: if not sql: raise util.CommandError("Range revision not allowed") starting_rev, revision = revision.split(":", 2) def do_stamp(rev, context): return script._stamp_revs(revision, rev) with EnvironmentContext( config, script, fn=do_stamp, as_sql=sql, destination_rev=revision, starting_rev=starting_rev, tag=tag, ): script.run_env() def edit(config, rev): """Edit revision script(s) using $EDITOR. :param config: a :class:`.Config` instance. :param rev: target revision. """ script = ScriptDirectory.from_config(config) if rev == "current": def edit_current(rev, context): if not rev: raise util.CommandError("No current revisions") for sc in script.get_revisions(rev): util.edit(sc.path) return [] with EnvironmentContext(config, script, fn=edit_current): script.run_env() else: revs = script.get_revisions(rev) if not revs: raise util.CommandError( "No revision files indicated by symbol '%s'" % rev ) for sc in revs: util.edit(sc.path) zzzeek-alembic-bee044a1c187/alembic/config.py000066400000000000000000000440151353106760100210210ustar00rootroot00000000000000from argparse import ArgumentParser import inspect import os import sys from . import command from . import package_dir from . import util from .util import compat from .util.compat import SafeConfigParser class Config(object): r"""Represent an Alembic configuration. Within an ``env.py`` script, this is available via the :attr:`.EnvironmentContext.config` attribute, which in turn is available at ``alembic.context``:: from alembic import context some_param = context.config.get_main_option("my option") When invoking Alembic programatically, a new :class:`.Config` can be created by passing the name of an .ini file to the constructor:: from alembic.config import Config alembic_cfg = Config("/path/to/yourapp/alembic.ini") With a :class:`.Config` object, you can then run Alembic commands programmatically using the directives in :mod:`alembic.command`. The :class:`.Config` object can also be constructed without a filename. 
Values can be set programmatically, and new sections will be created as needed:: from alembic.config import Config alembic_cfg = Config() alembic_cfg.set_main_option("script_location", "myapp:migrations") alembic_cfg.set_main_option("sqlalchemy.url", "postgresql://foo/bar") alembic_cfg.set_section_option("mysection", "foo", "bar") .. warning:: When using programmatic configuration, make sure the ``env.py`` file in use is compatible with the target configuration; including that the call to Python ``logging.fileConfig()`` is omitted if the programmatic configuration doesn't actually include logging directives. For passing non-string values to environments, such as connections and engines, use the :attr:`.Config.attributes` dictionary:: with engine.begin() as connection: alembic_cfg.attributes['connection'] = connection command.upgrade(alembic_cfg, "head") :param file\_: name of the .ini file to open. :param ini_section: name of the main Alembic section within the .ini file :param output_buffer: optional file-like input buffer which will be passed to the :class:`.MigrationContext` - used to redirect the output of "offline generation" when using Alembic programmatically. :param stdout: buffer where the "print" output of commands will be sent. Defaults to ``sys.stdout``. .. versionadded:: 0.4 :param config_args: A dictionary of keys and values that will be used for substitution in the alembic config file. The dictionary as given is **copied** to a new one, stored locally as the attribute ``.config_args``. When the :attr:`.Config.file_config` attribute is first invoked, the replacement variable ``here`` will be added to this dictionary before the dictionary is passed to ``SafeConfigParser()`` to parse the .ini file. .. versionadded:: 0.7.0 :param attributes: optional dictionary of arbitrary Python keys/values, which will be populated into the :attr:`.Config.attributes` dictionary. .. versionadded:: 0.7.5 .. seealso:: :ref:`connection_sharing` """ def __init__( self, file_=None, ini_section="alembic", output_buffer=None, stdout=sys.stdout, cmd_opts=None, config_args=util.immutabledict(), attributes=None, ): """Construct a new :class:`.Config` """ self.config_file_name = file_ self.config_ini_section = ini_section self.output_buffer = output_buffer self.stdout = stdout self.cmd_opts = cmd_opts self.config_args = dict(config_args) if attributes: self.attributes.update(attributes) cmd_opts = None """The command-line options passed to the ``alembic`` script. Within an ``env.py`` script this can be accessed via the :attr:`.EnvironmentContext.config` attribute. .. versionadded:: 0.6.0 .. seealso:: :meth:`.EnvironmentContext.get_x_argument` """ config_file_name = None """Filesystem path to the .ini file in use.""" config_ini_section = None """Name of the config file section to read basic configuration from. Defaults to ``alembic``, that is the ``[alembic]`` section of the .ini file. This value is modified using the ``-n/--name`` option to the Alembic runnier. """ @util.memoized_property def attributes(self): """A Python dictionary for storage of additional state. This is a utility dictionary which can include not just strings but engines, connections, schema objects, or anything else. Use this to pass objects into an env.py script, such as passing a :class:`sqlalchemy.engine.base.Connection` when calling commands from :mod:`alembic.command` programmatically. .. versionadded:: 0.7.5 .. 
seealso:: :ref:`connection_sharing` :paramref:`.Config.attributes` """ return {} def print_stdout(self, text, *arg): """Render a message to standard out. When :meth:`.Config.print_stdout` is called with additional args those arguments will formatted against the provided text, otherwise we simply output the provided text verbatim. e.g.:: >>> config.print_stdout('Some text %s', 'arg') Some Text arg """ if arg: output = compat.text_type(text) % arg else: output = compat.text_type(text) util.write_outstream(self.stdout, output, "\n") @util.memoized_property def file_config(self): """Return the underlying ``ConfigParser`` object. Direct access to the .ini file is available here, though the :meth:`.Config.get_section` and :meth:`.Config.get_main_option` methods provide a possibly simpler interface. """ if self.config_file_name: here = os.path.abspath(os.path.dirname(self.config_file_name)) else: here = "" self.config_args["here"] = here file_config = SafeConfigParser(self.config_args) if self.config_file_name: file_config.read([self.config_file_name]) else: file_config.add_section(self.config_ini_section) return file_config def get_template_directory(self): """Return the directory where Alembic setup templates are found. This method is used by the alembic ``init`` and ``list_templates`` commands. """ return os.path.join(package_dir, "templates") def get_section(self, name): """Return all the configuration options from a given .ini file section as a dictionary. """ return dict(self.file_config.items(name)) def set_main_option(self, name, value): """Set an option programmatically within the 'main' section. This overrides whatever was in the .ini file. :param name: name of the value :param value: the value. Note that this value is passed to ``ConfigParser.set``, which supports variable interpolation using pyformat (e.g. ``%(some_value)s``). A raw percent sign not part of an interpolation symbol must therefore be escaped, e.g. ``%%``. The given value may refer to another value already in the file using the interpolation format. """ self.set_section_option(self.config_ini_section, name, value) def remove_main_option(self, name): self.file_config.remove_option(self.config_ini_section, name) def set_section_option(self, section, name, value): """Set an option programmatically within the given section. The section is created if it doesn't exist already. The value here will override whatever was in the .ini file. :param section: name of the section :param name: name of the value :param value: the value. Note that this value is passed to ``ConfigParser.set``, which supports variable interpolation using pyformat (e.g. ``%(some_value)s``). A raw percent sign not part of an interpolation symbol must therefore be escaped, e.g. ``%%``. The given value may refer to another value already in the file using the interpolation format. """ if not self.file_config.has_section(section): self.file_config.add_section(section) self.file_config.set(section, name, value) def get_section_option(self, section, name, default=None): """Return an option from the given section of the .ini file. """ if not self.file_config.has_section(section): raise util.CommandError( "No config file %r found, or file has no " "'[%s]' section" % (self.config_file_name, section) ) if self.file_config.has_option(section, name): return self.file_config.get(section, name) else: return default def get_main_option(self, name, default=None): """Return an option from the 'main' section of the .ini file. 
This defaults to being a key from the ``[alembic]`` section, unless the ``-n/--name`` flag were used to indicate a different section. """ return self.get_section_option(self.config_ini_section, name, default) class CommandLine(object): def __init__(self, prog=None): self._generate_args(prog) def _generate_args(self, prog): def add_options(parser, positional, kwargs): kwargs_opts = { "template": ( "-t", "--template", dict( default="generic", type=str, help="Setup template for use with 'init'", ), ), "message": ( "-m", "--message", dict( type=str, help="Message string to use with 'revision'" ), ), "sql": ( "--sql", dict( action="store_true", help="Don't emit SQL to database - dump to " "standard output/file instead. See docs on " "offline mode.", ), ), "tag": ( "--tag", dict( type=str, help="Arbitrary 'tag' name - can be used by " "custom env.py scripts.", ), ), "head": ( "--head", dict( type=str, help="Specify head revision or @head " "to base new revision on.", ), ), "splice": ( "--splice", dict( action="store_true", help="Allow a non-head revision as the " "'head' to splice onto", ), ), "depends_on": ( "--depends-on", dict( action="append", help="Specify one or more revision identifiers " "which this revision should depend on.", ), ), "rev_id": ( "--rev-id", dict( type=str, help="Specify a hardcoded revision id instead of " "generating one", ), ), "version_path": ( "--version-path", dict( type=str, help="Specify specific path from config for " "version file", ), ), "branch_label": ( "--branch-label", dict( type=str, help="Specify a branch label to apply to the " "new revision", ), ), "verbose": ( "-v", "--verbose", dict(action="store_true", help="Use more verbose output"), ), "resolve_dependencies": ( "--resolve-dependencies", dict( action="store_true", help="Treat dependency versions as down revisions", ), ), "autogenerate": ( "--autogenerate", dict( action="store_true", help="Populate revision script with candidate " "migration operations, based on comparison " "of database to model.", ), ), "head_only": ( "--head-only", dict( action="store_true", help="Deprecated. Use --verbose for " "additional output", ), ), "rev_range": ( "-r", "--rev-range", dict( action="store", help="Specify a revision range; " "format is [start]:[end]", ), ), "indicate_current": ( "-i", "--indicate-current", dict( action="store_true", help="Indicate the current revision", ), ), } positional_help = { "directory": "location of scripts directory", "revision": "revision identifier", "revisions": "one or more revisions, or 'heads' for all heads", } for arg in kwargs: if arg in kwargs_opts: args = kwargs_opts[arg] args, kw = args[0:-1], args[-1] parser.add_argument(*args, **kw) for arg in positional: if arg == "revisions": subparser.add_argument( arg, nargs="+", help=positional_help.get(arg) ) else: subparser.add_argument(arg, help=positional_help.get(arg)) parser = ArgumentParser(prog=prog) parser.add_argument( "-c", "--config", type=str, default="alembic.ini", help="Alternate config file", ) parser.add_argument( "-n", "--name", type=str, default="alembic", help="Name of section in .ini file to " "use for Alembic config", ) parser.add_argument( "-x", action="append", help="Additional arguments consumed by " "custom env.py scripts, e.g. 
-x " "setting1=somesetting -x setting2=somesetting", ) parser.add_argument( "--raiseerr", action="store_true", help="Raise a full stack trace on error", ) subparsers = parser.add_subparsers() for fn in [getattr(command, n) for n in dir(command)]: if ( inspect.isfunction(fn) and fn.__name__[0] != "_" and fn.__module__ == "alembic.command" ): spec = compat.inspect_getargspec(fn) if spec[3]: positional = spec[0][1 : -len(spec[3])] kwarg = spec[0][-len(spec[3]) :] else: positional = spec[0][1:] kwarg = [] # parse first line(s) of helptext without a line break help_ = fn.__doc__ if help_: help_text = [] for line in help_.split("\n"): if not line.strip(): break else: help_text.append(line.strip()) else: help_text = "" subparser = subparsers.add_parser( fn.__name__, help=" ".join(help_text) ) add_options(subparser, positional, kwarg) subparser.set_defaults(cmd=(fn, positional, kwarg)) self.parser = parser def run_cmd(self, config, options): fn, positional, kwarg = options.cmd try: fn( config, *[getattr(options, k, None) for k in positional], **dict((k, getattr(options, k, None)) for k in kwarg) ) except util.CommandError as e: if options.raiseerr: raise else: util.err(str(e)) def main(self, argv=None): options = self.parser.parse_args(argv) if not hasattr(options, "cmd"): # see http://bugs.python.org/issue9253, argparse # behavior changed incompatibly in py3.3 self.parser.error("too few arguments") else: cfg = Config( file_=options.config, ini_section=options.name, cmd_opts=options, ) self.run_cmd(cfg, options) def main(argv=None, prog=None, **kwargs): """The console runner function for Alembic.""" CommandLine(prog=prog).main(argv=argv) if __name__ == "__main__": main() zzzeek-alembic-bee044a1c187/alembic/context.py000066400000000000000000000003031353106760100212300ustar00rootroot00000000000000from .runtime.environment import EnvironmentContext # create proxy functions for # each method on the EnvironmentContext class. EnvironmentContext.create_module_class_proxy(globals(), locals()) zzzeek-alembic-bee044a1c187/alembic/ddl/000077500000000000000000000000001353106760100177415ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/ddl/__init__.py000066400000000000000000000002711353106760100220520ustar00rootroot00000000000000from . import mssql # noqa from . import mysql # noqa from . import oracle # noqa from . import postgresql # noqa from . import sqlite # noqa from .impl import DefaultImpl # noqa zzzeek-alembic-bee044a1c187/alembic/ddl/base.py000066400000000000000000000141171353106760100212310ustar00rootroot00000000000000import functools from sqlalchemy import Integer from sqlalchemy import types as sqltypes from sqlalchemy.ext.compiler import compiles from sqlalchemy.schema import Column from sqlalchemy.schema import DDLElement from sqlalchemy.sql.elements import quoted_name from ..util.sqla_compat import _columns_for_constraint # noqa from ..util.sqla_compat import _find_columns # noqa from ..util.sqla_compat import _fk_spec # noqa from ..util.sqla_compat import _is_type_bound # noqa from ..util.sqla_compat import _table_for_constraint # noqa class AlterTable(DDLElement): """Represent an ALTER TABLE statement. Only the string name and optional schema name of the table is required, not a full Table object. 
""" def __init__(self, table_name, schema=None): self.table_name = table_name self.schema = schema class RenameTable(AlterTable): def __init__(self, old_table_name, new_table_name, schema=None): super(RenameTable, self).__init__(old_table_name, schema=schema) self.new_table_name = new_table_name class AlterColumn(AlterTable): def __init__( self, name, column_name, schema=None, existing_type=None, existing_nullable=None, existing_server_default=None, existing_comment=None, ): super(AlterColumn, self).__init__(name, schema=schema) self.column_name = column_name self.existing_type = ( sqltypes.to_instance(existing_type) if existing_type is not None else None ) self.existing_nullable = existing_nullable self.existing_server_default = existing_server_default self.existing_comment = existing_comment class ColumnNullable(AlterColumn): def __init__(self, name, column_name, nullable, **kw): super(ColumnNullable, self).__init__(name, column_name, **kw) self.nullable = nullable class ColumnType(AlterColumn): def __init__(self, name, column_name, type_, **kw): super(ColumnType, self).__init__(name, column_name, **kw) self.type_ = sqltypes.to_instance(type_) class ColumnName(AlterColumn): def __init__(self, name, column_name, newname, **kw): super(ColumnName, self).__init__(name, column_name, **kw) self.newname = newname class ColumnDefault(AlterColumn): def __init__(self, name, column_name, default, **kw): super(ColumnDefault, self).__init__(name, column_name, **kw) self.default = default class AddColumn(AlterTable): def __init__(self, name, column, schema=None): super(AddColumn, self).__init__(name, schema=schema) self.column = column class DropColumn(AlterTable): def __init__(self, name, column, schema=None): super(DropColumn, self).__init__(name, schema=schema) self.column = column class ColumnComment(AlterColumn): def __init__(self, name, column_name, comment, **kw): super(ColumnComment, self).__init__(name, column_name, **kw) self.comment = comment @compiles(RenameTable) def visit_rename_table(element, compiler, **kw): return "%s RENAME TO %s" % ( alter_table(compiler, element.table_name, element.schema), format_table_name(compiler, element.new_table_name, element.schema), ) @compiles(AddColumn) def visit_add_column(element, compiler, **kw): return "%s %s" % ( alter_table(compiler, element.table_name, element.schema), add_column(compiler, element.column, **kw), ) @compiles(DropColumn) def visit_drop_column(element, compiler, **kw): return "%s %s" % ( alter_table(compiler, element.table_name, element.schema), drop_column(compiler, element.column.name, **kw), ) @compiles(ColumnNullable) def visit_column_nullable(element, compiler, **kw): return "%s %s %s" % ( alter_table(compiler, element.table_name, element.schema), alter_column(compiler, element.column_name), "DROP NOT NULL" if element.nullable else "SET NOT NULL", ) @compiles(ColumnType) def visit_column_type(element, compiler, **kw): return "%s %s %s" % ( alter_table(compiler, element.table_name, element.schema), alter_column(compiler, element.column_name), "TYPE %s" % format_type(compiler, element.type_), ) @compiles(ColumnName) def visit_column_name(element, compiler, **kw): return "%s RENAME %s TO %s" % ( alter_table(compiler, element.table_name, element.schema), format_column_name(compiler, element.column_name), format_column_name(compiler, element.newname), ) @compiles(ColumnDefault) def visit_column_default(element, compiler, **kw): return "%s %s %s" % ( alter_table(compiler, element.table_name, element.schema), alter_column(compiler, 
element.column_name), "SET DEFAULT %s" % format_server_default(compiler, element.default) if element.default is not None else "DROP DEFAULT", ) def quote_dotted(name, quote): """quote the elements of a dotted name""" if isinstance(name, quoted_name): return quote(name) result = ".".join([quote(x) for x in name.split(".")]) return result def format_table_name(compiler, name, schema): quote = functools.partial(compiler.preparer.quote) if schema: return quote_dotted(schema, quote) + "." + quote(name) else: return quote(name) def format_column_name(compiler, name): return compiler.preparer.quote(name) def format_server_default(compiler, default): return compiler.get_column_default_string( Column("x", Integer, server_default=default) ) def format_type(compiler, type_): return compiler.dialect.type_compiler.process(type_) def alter_table(compiler, name, schema): return "ALTER TABLE %s" % format_table_name(compiler, name, schema) def drop_column(compiler, name): return "DROP COLUMN %s" % format_column_name(compiler, name) def alter_column(compiler, name): return "ALTER COLUMN %s" % format_column_name(compiler, name) def add_column(compiler, column, **kw): return "ADD COLUMN %s" % compiler.get_column_specification(column, **kw) zzzeek-alembic-bee044a1c187/alembic/ddl/impl.py000066400000000000000000000350371353106760100212640ustar00rootroot00000000000000from sqlalchemy import schema from sqlalchemy import text from sqlalchemy import types as sqltypes from . import base from .. import util from ..util import sqla_compat from ..util.compat import string_types from ..util.compat import text_type from ..util.compat import with_metaclass class ImplMeta(type): def __init__(cls, classname, bases, dict_): newtype = type.__init__(cls, classname, bases, dict_) if "__dialect__" in dict_: _impls[dict_["__dialect__"]] = cls return newtype _impls = {} class DefaultImpl(with_metaclass(ImplMeta)): """Provide the entrypoint for major migration operations, including database-specific behavioral variances. While individual SQL/DDL constructs already provide for database-specific implementations, variances here allow for entirely different sequences of operations to take place for a particular migration, such as SQL Server's special 'IDENTITY INSERT' step for bulk inserts. """ __dialect__ = "default" transactional_ddl = False command_terminator = ";" def __init__( self, dialect, connection, as_sql, transactional_ddl, output_buffer, context_opts, ): self.dialect = dialect self.connection = connection self.as_sql = as_sql self.literal_binds = context_opts.get("literal_binds", False) self.output_buffer = output_buffer self.memo = {} self.context_opts = context_opts if transactional_ddl is not None: self.transactional_ddl = transactional_ddl if self.literal_binds: if not self.as_sql: raise util.CommandError( "Can't use literal_binds setting without as_sql mode" ) @classmethod def get_by_dialect(cls, dialect): return _impls[dialect.name] def static_output(self, text): self.output_buffer.write(text_type(text + "\n\n")) self.output_buffer.flush() def requires_recreate_in_batch(self, batch_op): """Return True if the given :class:`.BatchOperationsImpl` would need the table to be recreated and copied in order to proceed. Normally, only returns True on SQLite when operations other than add_column are present. """ return False def prep_table_for_batch(self, table): """perform any operations needed on a table before a new one is created to replace it in batch mode. 
the PG dialect uses this to drop constraints on the table before the new one uses those same names. """ @property def bind(self): return self.connection def _exec( self, construct, execution_options=None, multiparams=(), params=util.immutabledict(), ): if isinstance(construct, string_types): construct = text(construct) if self.as_sql: if multiparams or params: # TODO: coverage raise Exception("Execution arguments not allowed with as_sql") if self.literal_binds and not isinstance( construct, schema.DDLElement ): compile_kw = dict(compile_kwargs={"literal_binds": True}) else: compile_kw = {} self.static_output( text_type( construct.compile(dialect=self.dialect, **compile_kw) ) .replace("\t", " ") .strip() + self.command_terminator ) else: conn = self.connection if execution_options: conn = conn.execution_options(**execution_options) return conn.execute(construct, *multiparams, **params) def execute(self, sql, execution_options=None): self._exec(sql, execution_options) def alter_column( self, table_name, column_name, nullable=None, server_default=False, name=None, type_=None, schema=None, autoincrement=None, comment=False, existing_comment=None, existing_type=None, existing_server_default=None, existing_nullable=None, existing_autoincrement=None, ): if autoincrement is not None or existing_autoincrement is not None: util.warn( "autoincrement and existing_autoincrement " "only make sense for MySQL", stacklevel=3, ) if nullable is not None: self._exec( base.ColumnNullable( table_name, column_name, nullable, schema=schema, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, existing_comment=existing_comment, ) ) if server_default is not False: self._exec( base.ColumnDefault( table_name, column_name, server_default, schema=schema, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, existing_comment=existing_comment, ) ) if type_ is not None: self._exec( base.ColumnType( table_name, column_name, type_, schema=schema, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, existing_comment=existing_comment, ) ) if comment is not False: self._exec( base.ColumnComment( table_name, column_name, comment, schema=schema, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, existing_comment=existing_comment, ) ) # do the new name last ;) if name is not None: self._exec( base.ColumnName( table_name, column_name, name, schema=schema, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, ) ) def add_column(self, table_name, column, schema=None): self._exec(base.AddColumn(table_name, column, schema=schema)) def drop_column(self, table_name, column, schema=None, **kw): self._exec(base.DropColumn(table_name, column, schema=schema)) def add_constraint(self, const): if const._create_rule is None or const._create_rule(self): self._exec(schema.AddConstraint(const)) def drop_constraint(self, const): self._exec(schema.DropConstraint(const)) def rename_table(self, old_table_name, new_table_name, schema=None): self._exec( base.RenameTable(old_table_name, new_table_name, schema=schema) ) def create_table(self, table): table.dispatch.before_create( table, self.connection, checkfirst=False, _ddl_runner=self ) self._exec(schema.CreateTable(table)) table.dispatch.after_create( table, self.connection, checkfirst=False, 
_ddl_runner=self ) for index in table.indexes: self._exec(schema.CreateIndex(index)) with_comment = ( sqla_compat._dialect_supports_comments(self.dialect) and not self.dialect.inline_comments ) comment = sqla_compat._comment_attribute(table) if comment and with_comment: self.create_table_comment(table) for column in table.columns: comment = sqla_compat._comment_attribute(column) if comment and with_comment: self.create_column_comment(column) def drop_table(self, table): self._exec(schema.DropTable(table)) def create_index(self, index): self._exec(schema.CreateIndex(index)) def create_table_comment(self, table): self._exec(schema.SetTableComment(table)) def drop_table_comment(self, table): self._exec(schema.DropTableComment(table)) def create_column_comment(self, column): self._exec(schema.SetColumnComment(column)) def drop_index(self, index): self._exec(schema.DropIndex(index)) def bulk_insert(self, table, rows, multiinsert=True): if not isinstance(rows, list): raise TypeError("List expected") elif rows and not isinstance(rows[0], dict): raise TypeError("List of dictionaries expected") if self.as_sql: for row in rows: self._exec( table.insert(inline=True).values( **dict( ( k, sqla_compat._literal_bindparam( k, v, type_=table.c[k].type ) if not isinstance( v, sqla_compat._literal_bindparam ) else v, ) for k, v in row.items() ) ) ) else: # work around http://www.sqlalchemy.org/trac/ticket/2461 if not hasattr(table, "_autoincrement_column"): table._autoincrement_column = None if rows: if multiinsert: self._exec(table.insert(inline=True), multiparams=rows) else: for row in rows: self._exec(table.insert(inline=True).values(**row)) def compare_type(self, inspector_column, metadata_column): conn_type = inspector_column.type metadata_type = metadata_column.type metadata_impl = metadata_type.dialect_impl(self.dialect) if isinstance(metadata_impl, sqltypes.Variant): metadata_impl = metadata_impl.impl.dialect_impl(self.dialect) # work around SQLAlchemy bug "stale value for type affinity" # fixed in 0.7.4 metadata_impl.__dict__.pop("_type_affinity", None) if hasattr(metadata_impl, "compare_against_backend"): comparison = metadata_impl.compare_against_backend( self.dialect, conn_type ) if comparison is not None: return not comparison if conn_type._compare_type_affinity(metadata_impl): comparator = _type_comparators.get(conn_type._type_affinity, None) return comparator and comparator(metadata_impl, conn_type) else: return True def compare_server_default( self, inspector_column, metadata_column, rendered_metadata_default, rendered_inspector_default, ): return rendered_inspector_default != rendered_metadata_default def correct_for_autogen_constraints( self, conn_uniques, conn_indexes, metadata_unique_constraints, metadata_indexes, ): pass def render_ddl_sql_expr(self, expr, is_server_default=False, **kw): """Render a SQL expression that is typically a server default, index expression, etc. .. versionadded:: 1.0.11 """ compile_kw = dict( compile_kwargs={"literal_binds": True, "include_table": False} ) return text_type(expr.compile(dialect=self.dialect, **compile_kw)) def _compat_autogen_column_reflect(self, inspector): return self.autogen_column_reflect def correct_for_autogen_foreignkeys(self, conn_fks, metadata_fks): pass def autogen_column_reflect(self, inspector, table, column_info): """A hook that is attached to the 'column_reflect' event for when a Table is reflected from the database during the autogenerate process. Dialects can elect to modify the information gathered here. 
""" def start_migrations(self): """A hook called when :meth:`.EnvironmentContext.run_migrations` is called. Implementations can set up per-migration-run state here. """ def emit_begin(self): """Emit the string ``BEGIN``, or the backend-specific equivalent, on the current connection context. This is used in offline mode and typically via :meth:`.EnvironmentContext.begin_transaction`. """ self.static_output("BEGIN" + self.command_terminator) def emit_commit(self): """Emit the string ``COMMIT``, or the backend-specific equivalent, on the current connection context. This is used in offline mode and typically via :meth:`.EnvironmentContext.begin_transaction`. """ self.static_output("COMMIT" + self.command_terminator) def render_type(self, type_obj, autogen_context): return False def _string_compare(t1, t2): return t1.length is not None and t1.length != t2.length def _numeric_compare(t1, t2): return (t1.precision is not None and t1.precision != t2.precision) or ( t1.precision is not None and t1.scale is not None and t1.scale != t2.scale ) def _integer_compare(t1, t2): t1_small_or_big = ( "S" if isinstance(t1, sqltypes.SmallInteger) else "B" if isinstance(t1, sqltypes.BigInteger) else "I" ) t2_small_or_big = ( "S" if isinstance(t2, sqltypes.SmallInteger) else "B" if isinstance(t2, sqltypes.BigInteger) else "I" ) return t1_small_or_big != t2_small_or_big def _datetime_compare(t1, t2): return t1.timezone != t2.timezone _type_comparators = { sqltypes.String: _string_compare, sqltypes.Numeric: _numeric_compare, sqltypes.Integer: _integer_compare, sqltypes.DateTime: _datetime_compare, } zzzeek-alembic-bee044a1c187/alembic/ddl/mssql.py000066400000000000000000000212021353106760100214470ustar00rootroot00000000000000from sqlalchemy import types as sqltypes from sqlalchemy.ext.compiler import compiles from sqlalchemy.schema import Column from sqlalchemy.schema import CreateIndex from sqlalchemy.sql.expression import ClauseElement from sqlalchemy.sql.expression import Executable from .base import AddColumn from .base import alter_column from .base import alter_table from .base import ColumnDefault from .base import ColumnName from .base import ColumnNullable from .base import ColumnType from .base import format_column_name from .base import format_server_default from .base import format_table_name from .base import format_type from .base import RenameTable from .impl import DefaultImpl from .. 
import util class MSSQLImpl(DefaultImpl): __dialect__ = "mssql" transactional_ddl = True batch_separator = "GO" def __init__(self, *arg, **kw): super(MSSQLImpl, self).__init__(*arg, **kw) self.batch_separator = self.context_opts.get( "mssql_batch_separator", self.batch_separator ) def _exec(self, construct, *args, **kw): result = super(MSSQLImpl, self)._exec(construct, *args, **kw) if self.as_sql and self.batch_separator: self.static_output(self.batch_separator) return result def emit_begin(self): self.static_output("BEGIN TRANSACTION" + self.command_terminator) def emit_commit(self): super(MSSQLImpl, self).emit_commit() if self.as_sql and self.batch_separator: self.static_output(self.batch_separator) def alter_column( self, table_name, column_name, nullable=None, server_default=False, name=None, type_=None, schema=None, existing_type=None, existing_server_default=None, existing_nullable=None, **kw ): if nullable is not None and existing_type is None: if type_ is not None: existing_type = type_ # the NULL/NOT NULL alter will handle # the type alteration type_ = None else: raise util.CommandError( "MS-SQL ALTER COLUMN operations " "with NULL or NOT NULL require the " "existing_type or a new type_ be passed." ) super(MSSQLImpl, self).alter_column( table_name, column_name, nullable=nullable, type_=type_, schema=schema, existing_type=existing_type, existing_nullable=existing_nullable, **kw ) if server_default is not False: if existing_server_default is not False or server_default is None: self._exec( _ExecDropConstraint( table_name, column_name, "sys.default_constraints" ) ) if server_default is not None: super(MSSQLImpl, self).alter_column( table_name, column_name, schema=schema, server_default=server_default, ) if name is not None: super(MSSQLImpl, self).alter_column( table_name, column_name, schema=schema, name=name ) def create_index(self, index): # this likely defaults to None if not present, so get() # should normally not return the default value. 
being # defensive in any case mssql_include = index.kwargs.get("mssql_include", None) or () for col in mssql_include: if col not in index.table.c: index.table.append_column(Column(col, sqltypes.NullType)) self._exec(CreateIndex(index)) def bulk_insert(self, table, rows, **kw): if self.as_sql: self._exec( "SET IDENTITY_INSERT %s ON" % self.dialect.identifier_preparer.format_table(table) ) super(MSSQLImpl, self).bulk_insert(table, rows, **kw) self._exec( "SET IDENTITY_INSERT %s OFF" % self.dialect.identifier_preparer.format_table(table) ) else: super(MSSQLImpl, self).bulk_insert(table, rows, **kw) def drop_column(self, table_name, column, **kw): drop_default = kw.pop("mssql_drop_default", False) if drop_default: self._exec( _ExecDropConstraint( table_name, column, "sys.default_constraints" ) ) drop_check = kw.pop("mssql_drop_check", False) if drop_check: self._exec( _ExecDropConstraint( table_name, column, "sys.check_constraints" ) ) drop_fks = kw.pop("mssql_drop_foreign_key", False) if drop_fks: self._exec(_ExecDropFKConstraint(table_name, column)) super(MSSQLImpl, self).drop_column(table_name, column, **kw) class _ExecDropConstraint(Executable, ClauseElement): def __init__(self, tname, colname, type_): self.tname = tname self.colname = colname self.type_ = type_ class _ExecDropFKConstraint(Executable, ClauseElement): def __init__(self, tname, colname): self.tname = tname self.colname = colname @compiles(_ExecDropConstraint, "mssql") def _exec_drop_col_constraint(element, compiler, **kw): tname, colname, type_ = element.tname, element.colname, element.type_ # from http://www.mssqltips.com/sqlservertip/1425/\ # working-with-default-constraints-in-sql-server/ # TODO: needs table formatting, etc. return """declare @const_name varchar(256) select @const_name = [name] from %(type)s where parent_object_id = object_id('%(tname)s') and col_name(parent_object_id, parent_column_id) = '%(colname)s' exec('alter table %(tname_quoted)s drop constraint ' + @const_name)""" % { "type": type_, "tname": tname, "colname": colname, "tname_quoted": format_table_name(compiler, tname, None), } @compiles(_ExecDropFKConstraint, "mssql") def _exec_drop_col_fk_constraint(element, compiler, **kw): tname, colname = element.tname, element.colname return """declare @const_name varchar(256) select @const_name = [name] from sys.foreign_keys fk join sys.foreign_key_columns fkc on fk.object_id=fkc.constraint_object_id where fkc.parent_object_id = object_id('%(tname)s') and col_name(fkc.parent_object_id, fkc.parent_column_id) = '%(colname)s' exec('alter table %(tname_quoted)s drop constraint ' + @const_name)""" % { "tname": tname, "colname": colname, "tname_quoted": format_table_name(compiler, tname, None), } @compiles(AddColumn, "mssql") def visit_add_column(element, compiler, **kw): return "%s %s" % ( alter_table(compiler, element.table_name, element.schema), mssql_add_column(compiler, element.column, **kw), ) def mssql_add_column(compiler, column, **kw): return "ADD %s" % compiler.get_column_specification(column, **kw) @compiles(ColumnNullable, "mssql") def visit_column_nullable(element, compiler, **kw): return "%s %s %s %s" % ( alter_table(compiler, element.table_name, element.schema), alter_column(compiler, element.column_name), format_type(compiler, element.existing_type), "NULL" if element.nullable else "NOT NULL", ) @compiles(ColumnDefault, "mssql") def visit_column_default(element, compiler, **kw): # TODO: there can also be a named constraint # with ADD CONSTRAINT here return "%s ADD DEFAULT %s FOR %s" % ( 
alter_table(compiler, element.table_name, element.schema), format_server_default(compiler, element.default), format_column_name(compiler, element.column_name), ) @compiles(ColumnName, "mssql") def visit_rename_column(element, compiler, **kw): return "EXEC sp_rename '%s.%s', %s, 'COLUMN'" % ( format_table_name(compiler, element.table_name, element.schema), format_column_name(compiler, element.column_name), format_column_name(compiler, element.newname), ) @compiles(ColumnType, "mssql") def visit_column_type(element, compiler, **kw): return "%s %s %s" % ( alter_table(compiler, element.table_name, element.schema), alter_column(compiler, element.column_name), format_type(compiler, element.type_), ) @compiles(RenameTable, "mssql") def visit_rename_table(element, compiler, **kw): return "EXEC sp_rename '%s', %s" % ( format_table_name(compiler, element.table_name, element.schema), format_table_name(compiler, element.new_table_name, None), ) zzzeek-alembic-bee044a1c187/alembic/ddl/mysql.py000066400000000000000000000361511353106760100214660ustar00rootroot00000000000000import re from sqlalchemy import schema from sqlalchemy import types as sqltypes from sqlalchemy.ext.compiler import compiles from .base import alter_table from .base import AlterColumn from .base import ColumnDefault from .base import ColumnName from .base import ColumnNullable from .base import ColumnType from .base import format_column_name from .base import format_server_default from .impl import DefaultImpl from .. import util from ..autogenerate import compare from ..util.compat import string_types from ..util.sqla_compat import _is_mariadb from ..util.sqla_compat import _is_type_bound class MySQLImpl(DefaultImpl): __dialect__ = "mysql" transactional_ddl = False def alter_column( self, table_name, column_name, nullable=None, server_default=False, name=None, type_=None, schema=None, existing_type=None, existing_server_default=None, existing_nullable=None, autoincrement=None, existing_autoincrement=None, comment=False, existing_comment=None, **kw ): if name is not None or self._is_mysql_allowed_functional_default( type_ if type_ is not None else existing_type, server_default ): self._exec( MySQLChangeColumn( table_name, column_name, schema=schema, newname=name if name is not None else column_name, nullable=nullable if nullable is not None else existing_nullable if existing_nullable is not None else True, type_=type_ if type_ is not None else existing_type, default=server_default if server_default is not False else existing_server_default, autoincrement=autoincrement if autoincrement is not None else existing_autoincrement, comment=comment if comment is not False else existing_comment, ) ) elif ( nullable is not None or type_ is not None or autoincrement is not None or comment is not False ): self._exec( MySQLModifyColumn( table_name, column_name, schema=schema, newname=name if name is not None else column_name, nullable=nullable if nullable is not None else existing_nullable if existing_nullable is not None else True, type_=type_ if type_ is not None else existing_type, default=server_default if server_default is not False else existing_server_default, autoincrement=autoincrement if autoincrement is not None else existing_autoincrement, comment=comment if comment is not False else existing_comment, ) ) elif server_default is not False: self._exec( MySQLAlterDefault( table_name, column_name, server_default, schema=schema ) ) def drop_constraint(self, const): if isinstance(const, schema.CheckConstraint) and _is_type_bound(const): 
return super(MySQLImpl, self).drop_constraint(const) def _is_mysql_allowed_functional_default(self, type_, server_default): return ( type_ is not None and type_._type_affinity is sqltypes.DateTime and server_default is not None ) def compare_server_default( self, inspector_column, metadata_column, rendered_metadata_default, rendered_inspector_default, ): # partially a workaround for SQLAlchemy issue #3023; if the # column were created without "NOT NULL", MySQL may have added # an implicit default of '0' which we need to skip # TODO: this is not really covered anymore ? if ( metadata_column.type._type_affinity is sqltypes.Integer and inspector_column.primary_key and not inspector_column.autoincrement and not rendered_metadata_default and rendered_inspector_default == "'0'" ): return False elif inspector_column.type._type_affinity is sqltypes.Integer: rendered_inspector_default = ( re.sub(r"^'|'$", "", rendered_inspector_default) if rendered_inspector_default is not None else None ) return rendered_inspector_default != rendered_metadata_default elif rendered_inspector_default and rendered_metadata_default: # adjust for "function()" vs. "FUNCTION" as can occur particularly # for the CURRENT_TIMESTAMP function on newer MariaDB versions # SQLAlchemy MySQL dialect bundles ON UPDATE into the server # default; adjust for this possibly being present. onupdate_ins = re.match( r"(.*) (on update.*?)(?:\(\))?$", rendered_inspector_default.lower(), ) onupdate_met = re.match( r"(.*) (on update.*?)(?:\(\))?$", rendered_metadata_default.lower(), ) if onupdate_ins: if not onupdate_met: return True elif onupdate_ins.group(2) != onupdate_met.group(2): return True rendered_inspector_default = onupdate_ins.group(1) rendered_metadata_default = onupdate_met.group(1) return re.sub( r"(.*?)(?:\(\))?$", r"\1", rendered_inspector_default.lower() ) != re.sub( r"(.*?)(?:\(\))?$", r"\1", rendered_metadata_default.lower() ) else: return rendered_inspector_default != rendered_metadata_default def correct_for_autogen_constraints( self, conn_unique_constraints, conn_indexes, metadata_unique_constraints, metadata_indexes, ): # TODO: if SQLA 1.0, make use of "duplicates_index" # metadata removed = set() for idx in list(conn_indexes): if idx.unique: continue # MySQL puts implicit indexes on FK columns, even if # composite and even if MyISAM, so can't check this too easily. # the name of the index may be the column name or it may # be the name of the FK constraint. for col in idx.columns: if idx.name == col.name: conn_indexes.remove(idx) removed.add(idx.name) break for fk in col.foreign_keys: if fk.name == idx.name: conn_indexes.remove(idx) removed.add(idx.name) break if idx.name in removed: break # then remove indexes from the "metadata_indexes" # that we've removed from reflected, otherwise they come out # as adds (see #202) for idx in list(metadata_indexes): if idx.name in removed: metadata_indexes.remove(idx) def _legacy_correct_for_dupe_uq_uix( self, conn_unique_constraints, conn_indexes, metadata_unique_constraints, metadata_indexes, ): # then dedupe unique indexes vs. constraints, since MySQL # doesn't really have unique constraints as a separate construct. # but look in the metadata and try to maintain constructs # that already seem to be defined one way or the other # on that side. 
See #276 metadata_uq_names = set( [ cons.name for cons in metadata_unique_constraints if cons.name is not None ] ) unnamed_metadata_uqs = set( [ compare._uq_constraint_sig(cons).sig for cons in metadata_unique_constraints if cons.name is None ] ) metadata_ix_names = set( [cons.name for cons in metadata_indexes if cons.unique] ) conn_uq_names = dict( (cons.name, cons) for cons in conn_unique_constraints ) conn_ix_names = dict( (cons.name, cons) for cons in conn_indexes if cons.unique ) for overlap in set(conn_uq_names).intersection(conn_ix_names): if overlap not in metadata_uq_names: if ( compare._uq_constraint_sig(conn_uq_names[overlap]).sig not in unnamed_metadata_uqs ): conn_unique_constraints.discard(conn_uq_names[overlap]) elif overlap not in metadata_ix_names: conn_indexes.discard(conn_ix_names[overlap]) def correct_for_autogen_foreignkeys(self, conn_fks, metadata_fks): conn_fk_by_sig = dict( (compare._fk_constraint_sig(fk).sig, fk) for fk in conn_fks ) metadata_fk_by_sig = dict( (compare._fk_constraint_sig(fk).sig, fk) for fk in metadata_fks ) for sig in set(conn_fk_by_sig).intersection(metadata_fk_by_sig): mdfk = metadata_fk_by_sig[sig] cnfk = conn_fk_by_sig[sig] # MySQL considers RESTRICT to be the default and doesn't # report on it. if the model has explicit RESTRICT and # the conn FK has None, set it to RESTRICT if ( mdfk.ondelete is not None and mdfk.ondelete.lower() == "restrict" and cnfk.ondelete is None ): cnfk.ondelete = "RESTRICT" if ( mdfk.onupdate is not None and mdfk.onupdate.lower() == "restrict" and cnfk.onupdate is None ): cnfk.onupdate = "RESTRICT" class MySQLAlterDefault(AlterColumn): def __init__(self, name, column_name, default, schema=None): super(AlterColumn, self).__init__(name, schema=schema) self.column_name = column_name self.default = default class MySQLChangeColumn(AlterColumn): def __init__( self, name, column_name, schema=None, newname=None, type_=None, nullable=None, default=False, autoincrement=None, comment=False, ): super(AlterColumn, self).__init__(name, schema=schema) self.column_name = column_name self.nullable = nullable self.newname = newname self.default = default self.autoincrement = autoincrement self.comment = comment if type_ is None: raise util.CommandError( "All MySQL CHANGE/MODIFY COLUMN operations " "require the existing type." 
) self.type_ = sqltypes.to_instance(type_) class MySQLModifyColumn(MySQLChangeColumn): pass @compiles(ColumnNullable, "mysql") @compiles(ColumnName, "mysql") @compiles(ColumnDefault, "mysql") @compiles(ColumnType, "mysql") def _mysql_doesnt_support_individual(element, compiler, **kw): raise NotImplementedError( "Individual alter column constructs not supported by MySQL" ) @compiles(MySQLAlterDefault, "mysql") def _mysql_alter_default(element, compiler, **kw): return "%s ALTER COLUMN %s %s" % ( alter_table(compiler, element.table_name, element.schema), format_column_name(compiler, element.column_name), "SET DEFAULT %s" % format_server_default(compiler, element.default) if element.default is not None else "DROP DEFAULT", ) @compiles(MySQLModifyColumn, "mysql") def _mysql_modify_column(element, compiler, **kw): return "%s MODIFY %s %s" % ( alter_table(compiler, element.table_name, element.schema), format_column_name(compiler, element.column_name), _mysql_colspec( compiler, nullable=element.nullable, server_default=element.default, type_=element.type_, autoincrement=element.autoincrement, comment=element.comment, ), ) @compiles(MySQLChangeColumn, "mysql") def _mysql_change_column(element, compiler, **kw): return "%s CHANGE %s %s %s" % ( alter_table(compiler, element.table_name, element.schema), format_column_name(compiler, element.column_name), format_column_name(compiler, element.newname), _mysql_colspec( compiler, nullable=element.nullable, server_default=element.default, type_=element.type_, autoincrement=element.autoincrement, comment=element.comment, ), ) def _render_value(compiler, expr): if isinstance(expr, string_types): return "'%s'" % expr else: return compiler.sql_compiler.process(expr) def _mysql_colspec( compiler, nullable, server_default, type_, autoincrement, comment ): spec = "%s %s" % ( compiler.dialect.type_compiler.process(type_), "NULL" if nullable else "NOT NULL", ) if autoincrement: spec += " AUTO_INCREMENT" if server_default is not False and server_default is not None: spec += " DEFAULT %s" % _render_value(compiler, server_default) if comment: spec += " COMMENT %s" % compiler.sql_compiler.render_literal_value( comment, sqltypes.String() ) return spec @compiles(schema.DropConstraint, "mysql") def _mysql_drop_constraint(element, compiler, **kw): """Redefine SQLAlchemy's drop constraint to raise errors for invalid constraint type.""" constraint = element.element if isinstance( constraint, ( schema.ForeignKeyConstraint, schema.PrimaryKeyConstraint, schema.UniqueConstraint, ), ): return compiler.visit_drop_constraint(element, **kw) elif isinstance(constraint, schema.CheckConstraint): # note that SQLAlchemy as of 1.2 does not yet support # DROP CONSTRAINT for MySQL/MariaDB, so we implement fully # here. 
if _is_mariadb(compiler.dialect): return "ALTER TABLE %s DROP CONSTRAINT %s" % ( compiler.preparer.format_table(constraint.table), compiler.preparer.format_constraint(constraint), ) else: return "ALTER TABLE %s DROP CHECK %s" % ( compiler.preparer.format_table(constraint.table), compiler.preparer.format_constraint(constraint), ) else: raise NotImplementedError( "No generic 'DROP CONSTRAINT' in MySQL - " "please specify constraint type" ) zzzeek-alembic-bee044a1c187/alembic/ddl/oracle.py000066400000000000000000000063731353106760100215710ustar00rootroot00000000000000from sqlalchemy.ext.compiler import compiles from sqlalchemy.sql import sqltypes from .base import AddColumn from .base import alter_table from .base import ColumnComment from .base import ColumnDefault from .base import ColumnName from .base import ColumnNullable from .base import ColumnType from .base import format_column_name from .base import format_server_default from .base import format_type from .impl import DefaultImpl class OracleImpl(DefaultImpl): __dialect__ = "oracle" transactional_ddl = False batch_separator = "/" command_terminator = "" def __init__(self, *arg, **kw): super(OracleImpl, self).__init__(*arg, **kw) self.batch_separator = self.context_opts.get( "oracle_batch_separator", self.batch_separator ) def _exec(self, construct, *args, **kw): result = super(OracleImpl, self)._exec(construct, *args, **kw) if self.as_sql and self.batch_separator: self.static_output(self.batch_separator) return result def emit_begin(self): self._exec("SET TRANSACTION READ WRITE") def emit_commit(self): self._exec("COMMIT") @compiles(AddColumn, "oracle") def visit_add_column(element, compiler, **kw): return "%s %s" % ( alter_table(compiler, element.table_name, element.schema), add_column(compiler, element.column, **kw), ) @compiles(ColumnNullable, "oracle") def visit_column_nullable(element, compiler, **kw): return "%s %s %s" % ( alter_table(compiler, element.table_name, element.schema), alter_column(compiler, element.column_name), "NULL" if element.nullable else "NOT NULL", ) @compiles(ColumnType, "oracle") def visit_column_type(element, compiler, **kw): return "%s %s %s" % ( alter_table(compiler, element.table_name, element.schema), alter_column(compiler, element.column_name), "%s" % format_type(compiler, element.type_), ) @compiles(ColumnName, "oracle") def visit_column_name(element, compiler, **kw): return "%s RENAME COLUMN %s TO %s" % ( alter_table(compiler, element.table_name, element.schema), format_column_name(compiler, element.column_name), format_column_name(compiler, element.newname), ) @compiles(ColumnDefault, "oracle") def visit_column_default(element, compiler, **kw): return "%s %s %s" % ( alter_table(compiler, element.table_name, element.schema), alter_column(compiler, element.column_name), "DEFAULT %s" % format_server_default(compiler, element.default) if element.default is not None else "DEFAULT NULL", ) @compiles(ColumnComment, "oracle") def visit_column_comment(element, compiler, **kw): ddl = "COMMENT ON COLUMN {table_name}.{column_name} IS {comment}" comment = compiler.sql_compiler.render_literal_value( (element.comment if element.comment is not None else ""), sqltypes.String(), ) return ddl.format( table_name=element.table_name, column_name=element.column_name, comment=comment, ) def alter_column(compiler, name): return "MODIFY %s" % format_column_name(compiler, name) def add_column(compiler, column, **kw): return "ADD %s" % compiler.get_column_specification(column, **kw) 
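# --- illustrative sketch (not part of Alembic itself) -----------------------
# The dialect modules above (mssql.py, mysql.py, oracle.py) all follow the
# same extension pattern: a DefaultImpl subclass declaring ``__dialect__`` is
# registered by ImplMeta and later retrieved via DefaultImpl.get_by_dialect(),
# while @compiles hooks override the DDL strings emitted for that dialect.
# The "mydialect" name and the RENAME TABLE syntax below are hypothetical
# placeholders used only to show the shape of such an extension.

from sqlalchemy.ext.compiler import compiles

from alembic.ddl.base import RenameTable, format_table_name
from alembic.ddl.impl import DefaultImpl


class MyDialectImpl(DefaultImpl):
    """Hypothetical impl; picked up for connections whose dialect name
    is 'mydialect' via the ImplMeta registry."""

    __dialect__ = "mydialect"
    transactional_ddl = False


@compiles(RenameTable, "mydialect")
def _mydialect_rename_table(element, compiler, **kw):
    # assume this backend spells the operation RENAME TABLE ... TO ...
    return "RENAME TABLE %s TO %s" % (
        format_table_name(compiler, element.table_name, element.schema),
        format_table_name(compiler, element.new_table_name, element.schema),
    )
# ---------------------------------------------------------------------------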
zzzeek-alembic-bee044a1c187/alembic/ddl/postgresql.py000066400000000000000000000416021353106760100225210ustar00rootroot00000000000000import logging import re from sqlalchemy import Column from sqlalchemy import Numeric from sqlalchemy import text from sqlalchemy import types as sqltypes from sqlalchemy.dialects.postgresql import BIGINT from sqlalchemy.dialects.postgresql import ExcludeConstraint from sqlalchemy.dialects.postgresql import INTEGER from sqlalchemy.sql.expression import ColumnClause from sqlalchemy.sql.expression import UnaryExpression from sqlalchemy.types import NULLTYPE from .base import alter_column from .base import alter_table from .base import AlterColumn from .base import ColumnComment from .base import compiles from .base import format_table_name from .base import format_type from .base import RenameTable from .impl import DefaultImpl from .. import util from ..autogenerate import render from ..operations import ops from ..operations import schemaobj from ..operations.base import BatchOperations from ..operations.base import Operations from ..util import compat from ..util import sqla_compat log = logging.getLogger(__name__) class PostgresqlImpl(DefaultImpl): __dialect__ = "postgresql" transactional_ddl = True def prep_table_for_batch(self, table): for constraint in table.constraints: if constraint.name is not None: self.drop_constraint(constraint) def compare_server_default( self, inspector_column, metadata_column, rendered_metadata_default, rendered_inspector_default, ): # don't do defaults for SERIAL columns if ( metadata_column.primary_key and metadata_column is metadata_column.table._autoincrement_column ): return False conn_col_default = rendered_inspector_default defaults_equal = conn_col_default == rendered_metadata_default if defaults_equal: return False if None in (conn_col_default, rendered_metadata_default): return not defaults_equal if compat.py2k: # look for a python 2 "u''" string and filter m = re.match(r"^u'(.*)'$", rendered_metadata_default) if m: rendered_metadata_default = "'%s'" % m.group(1) # check for unquoted string and quote for PG String types if ( not isinstance(inspector_column.type, Numeric) and metadata_column.server_default is not None and isinstance( metadata_column.server_default.arg, compat.string_types ) and not re.match(r"^'.*'$", rendered_metadata_default) ): rendered_metadata_default = "'%s'" % rendered_metadata_default return not self.connection.scalar( "SELECT %s = %s" % (conn_col_default, rendered_metadata_default) ) def alter_column( self, table_name, column_name, nullable=None, server_default=False, name=None, type_=None, schema=None, autoincrement=None, existing_type=None, existing_server_default=None, existing_nullable=None, existing_autoincrement=None, **kw ): using = kw.pop("postgresql_using", None) if using is not None and type_ is None: raise util.CommandError( "postgresql_using must be used with the type_ parameter" ) if type_ is not None: self._exec( PostgresqlColumnType( table_name, column_name, type_, schema=schema, using=using, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, ) ) super(PostgresqlImpl, self).alter_column( table_name, column_name, nullable=nullable, server_default=server_default, name=name, schema=schema, autoincrement=autoincrement, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, existing_autoincrement=existing_autoincrement, **kw ) def autogen_column_reflect(self, 
inspector, table, column_info): if column_info.get("default") and isinstance( column_info["type"], (INTEGER, BIGINT) ): seq_match = re.match( r"nextval\('(.+?)'::regclass\)", column_info["default"] ) if seq_match: info = inspector.bind.execute( text( "select c.relname, a.attname " "from pg_class as c join " "pg_depend d on d.objid=c.oid and " "d.classid='pg_class'::regclass and " "d.refclassid='pg_class'::regclass " "join pg_class t on t.oid=d.refobjid " "join pg_attribute a on a.attrelid=t.oid and " "a.attnum=d.refobjsubid " "where c.relkind='S' and c.relname=:seqname" ), seqname=seq_match.group(1), ).first() if info: seqname, colname = info if colname == column_info["name"]: log.info( "Detected sequence named '%s' as " "owned by integer column '%s(%s)', " "assuming SERIAL and omitting", seqname, table.name, colname, ) # sequence, and the owner is this column, # its a SERIAL - whack it! del column_info["default"] def correct_for_autogen_constraints( self, conn_unique_constraints, conn_indexes, metadata_unique_constraints, metadata_indexes, ): conn_indexes_by_name = dict((c.name, c) for c in conn_indexes) doubled_constraints = set( index for index in conn_indexes if index.info.get("duplicates_constraint") ) for ix in doubled_constraints: conn_indexes.remove(ix) for idx in list(metadata_indexes): if idx.name in conn_indexes_by_name: continue exprs = idx.expressions for expr in exprs: while isinstance(expr, UnaryExpression): expr = expr.element if not isinstance(expr, Column): util.warn( "autogenerate skipping functional index %s; " "not supported by SQLAlchemy reflection" % idx.name ) metadata_indexes.discard(idx) def render_type(self, type_, autogen_context): mod = type(type_).__module__ if not mod.startswith("sqlalchemy.dialects.postgresql"): return False if hasattr(self, "_render_%s_type" % type_.__visit_name__): meth = getattr(self, "_render_%s_type" % type_.__visit_name__) return meth(type_, autogen_context) return False def _render_HSTORE_type(self, type_, autogen_context): return render._render_type_w_subtype( type_, autogen_context, "text_type", r"(.+?\(.*text_type=)" ) def _render_ARRAY_type(self, type_, autogen_context): return render._render_type_w_subtype( type_, autogen_context, "item_type", r"(.+?\()" ) def _render_JSON_type(self, type_, autogen_context): return render._render_type_w_subtype( type_, autogen_context, "astext_type", r"(.+?\(.*astext_type=)" ) def _render_JSONB_type(self, type_, autogen_context): return render._render_type_w_subtype( type_, autogen_context, "astext_type", r"(.+?\(.*astext_type=)" ) class PostgresqlColumnType(AlterColumn): def __init__(self, name, column_name, type_, **kw): using = kw.pop("using", None) super(PostgresqlColumnType, self).__init__(name, column_name, **kw) self.type_ = sqltypes.to_instance(type_) self.using = using @compiles(RenameTable, "postgresql") def visit_rename_table(element, compiler, **kw): return "%s RENAME TO %s" % ( alter_table(compiler, element.table_name, element.schema), format_table_name(compiler, element.new_table_name, None), ) @compiles(PostgresqlColumnType, "postgresql") def visit_column_type(element, compiler, **kw): return "%s %s %s %s" % ( alter_table(compiler, element.table_name, element.schema), alter_column(compiler, element.column_name), "TYPE %s" % format_type(compiler, element.type_), "USING %s" % element.using if element.using else "", ) @compiles(ColumnComment, "postgresql") def visit_column_comment(element, compiler, **kw): ddl = "COMMENT ON COLUMN {table_name}.{column_name} IS {comment}" comment = ( 
compiler.sql_compiler.render_literal_value( element.comment, sqltypes.String() ) if element.comment is not None else "NULL" ) return ddl.format( table_name=element.table_name, column_name=element.column_name, comment=comment, ) @Operations.register_operation("create_exclude_constraint") @BatchOperations.register_operation( "create_exclude_constraint", "batch_create_exclude_constraint" ) @ops.AddConstraintOp.register_add_constraint("exclude_constraint") class CreateExcludeConstraintOp(ops.AddConstraintOp): """Represent a create exclude constraint operation.""" constraint_type = "exclude" def __init__( self, constraint_name, table_name, elements, where=None, schema=None, _orig_constraint=None, **kw ): self.constraint_name = constraint_name self.table_name = table_name self.elements = elements self.where = where self.schema = schema self._orig_constraint = _orig_constraint self.kw = kw @classmethod def from_constraint(cls, constraint): constraint_table = sqla_compat._table_for_constraint(constraint) return cls( constraint.name, constraint_table.name, [(expr, op) for expr, name, op in constraint._render_exprs], where=constraint.where, schema=constraint_table.schema, _orig_constraint=constraint, deferrable=constraint.deferrable, initially=constraint.initially, using=constraint.using, ) def to_constraint(self, migration_context=None): if self._orig_constraint is not None: return self._orig_constraint schema_obj = schemaobj.SchemaObjects(migration_context) t = schema_obj.table(self.table_name, schema=self.schema) excl = ExcludeConstraint( *self.elements, name=self.constraint_name, where=self.where, **self.kw ) for expr, name, oper in excl._render_exprs: t.append_column(Column(name, NULLTYPE)) t.append_constraint(excl) return excl @classmethod def create_exclude_constraint( cls, operations, constraint_name, table_name, *elements, **kw ): """Issue an alter to create an EXCLUDE constraint using the current migration context. .. note:: This method is Postgresql specific, and additionally requires at least SQLAlchemy 1.0. e.g.:: from alembic import op op.create_exclude_constraint( "user_excl", "user", ("period", '&&'), ("group", '='), where=("group != 'some group'") ) Note that the expressions work the same way as that of the ``ExcludeConstraint`` object itself; if plain strings are passed, quoting rules must be applied manually. :param name: Name of the constraint. :param table_name: String name of the source table. :param elements: exclude conditions. :param where: SQL expression or SQL string with optional WHERE clause. :param deferrable: optional bool. If set, emit DEFERRABLE or NOT DEFERRABLE when issuing DDL for this constraint. :param initially: optional string. If set, emit INITIALLY when issuing DDL for this constraint. :param schema: Optional schema name to operate within. .. versionadded:: 0.9.0 """ op = cls(constraint_name, table_name, elements, **kw) return operations.invoke(op) @classmethod def batch_create_exclude_constraint( cls, operations, constraint_name, *elements, **kw ): """Issue a "create exclude constraint" instruction using the current batch migration context. .. note:: This method is Postgresql specific, and additionally requires at least SQLAlchemy 1.0. .. versionadded:: 0.9.0 .. 
seealso:: :meth:`.Operations.create_exclude_constraint` """ kw["schema"] = operations.impl.schema op = cls(constraint_name, operations.impl.table_name, elements, **kw) return operations.invoke(op) @render.renderers.dispatch_for(CreateExcludeConstraintOp) def _add_exclude_constraint(autogen_context, op): return _exclude_constraint(op.to_constraint(), autogen_context, alter=True) @render._constraint_renderers.dispatch_for(ExcludeConstraint) def _render_inline_exclude_constraint(constraint, autogen_context): rendered = render._user_defined_render( "exclude", constraint, autogen_context ) if rendered is not False: return rendered return _exclude_constraint(constraint, autogen_context, False) def _postgresql_autogenerate_prefix(autogen_context): imports = autogen_context.imports if imports is not None: imports.add("from sqlalchemy.dialects import postgresql") return "postgresql." def _exclude_constraint(constraint, autogen_context, alter): opts = [] has_batch = autogen_context._has_batch if constraint.deferrable: opts.append(("deferrable", str(constraint.deferrable))) if constraint.initially: opts.append(("initially", str(constraint.initially))) if constraint.using: opts.append(("using", str(constraint.using))) if not has_batch and alter and constraint.table.schema: opts.append(("schema", render._ident(constraint.table.schema))) if not alter and constraint.name: opts.append( ("name", render._render_gen_name(autogen_context, constraint.name)) ) if alter: args = [ repr(render._render_gen_name(autogen_context, constraint.name)) ] if not has_batch: args += [repr(render._ident(constraint.table.name))] args.extend( [ "(%s, %r)" % ( _render_potential_column(sqltext, autogen_context), opstring, ) for sqltext, name, opstring in constraint._render_exprs ] ) if constraint.where is not None: args.append( "where=%s" % render._render_potential_expr( constraint.where, autogen_context ) ) args.extend(["%s=%r" % (k, v) for k, v in opts]) return "%(prefix)screate_exclude_constraint(%(args)s)" % { "prefix": render._alembic_autogenerate_prefix(autogen_context), "args": ", ".join(args), } else: args = [ "(%s, %r)" % (_render_potential_column(sqltext, autogen_context), opstring) for sqltext, name, opstring in constraint._render_exprs ] if constraint.where is not None: args.append( "where=%s" % render._render_potential_expr( constraint.where, autogen_context ) ) args.extend(["%s=%r" % (k, v) for k, v in opts]) return "%(prefix)sExcludeConstraint(%(args)s)" % { "prefix": _postgresql_autogenerate_prefix(autogen_context), "args": ", ".join(args), } def _render_potential_column(value, autogen_context): if isinstance(value, ColumnClause): template = "%(prefix)scolumn(%(name)r)" return template % { "prefix": render._sqlalchemy_autogenerate_prefix(autogen_context), "name": value.name, } else: return render._render_potential_expr( value, autogen_context, wrap_in_text=False ) zzzeek-alembic-bee044a1c187/alembic/ddl/sqlite.py000066400000000000000000000102571353106760100216210ustar00rootroot00000000000000import re from .impl import DefaultImpl from .. import util class SQLiteImpl(DefaultImpl): __dialect__ = "sqlite" transactional_ddl = False """SQLite supports transactional DDL, but pysqlite does not: see: http://bugs.python.org/issue10740 """ def requires_recreate_in_batch(self, batch_op): """Return True if the given :class:`.BatchOperationsImpl` would need the table to be recreated and copied in order to proceed. Normally, only returns True on SQLite when operations other than add_column are present. 
""" for op in batch_op.batch: if op[0] not in ("add_column", "create_index", "drop_index"): return True else: return False def add_constraint(self, const): # attempt to distinguish between an # auto-gen constraint and an explicit one if const._create_rule is None: raise NotImplementedError( "No support for ALTER of constraints in SQLite dialect" ) elif const._create_rule(self): util.warn( "Skipping unsupported ALTER for " "creation of implicit constraint" ) def drop_constraint(self, const): if const._create_rule is None: raise NotImplementedError( "No support for ALTER of constraints in SQLite dialect" ) def compare_server_default( self, inspector_column, metadata_column, rendered_metadata_default, rendered_inspector_default, ): if rendered_metadata_default is not None: rendered_metadata_default = re.sub( r"^\((.+)\)$", r"\1", rendered_metadata_default ) rendered_metadata_default = re.sub( r"^\"?'(.+)'\"?$", r"\1", rendered_metadata_default ) if rendered_inspector_default is not None: rendered_inspector_default = re.sub( r"^\"?'(.+)'\"?$", r"\1", rendered_inspector_default ) return rendered_inspector_default != rendered_metadata_default def _guess_if_default_is_unparenthesized_sql_expr(self, expr): """Determine if a server default is a SQL expression or a constant. There are too many assertions that expect server defaults to round-trip identically without parenthesis added so we will add parens only in very specific cases. """ if not expr: return False elif re.match(r"^[0-9\.]$", expr): return False elif re.match(r"^'.+'$", expr): return False elif re.match(r"^\(.+\)$", expr): return False else: return True def autogen_column_reflect(self, inspector, table, column_info): # SQLite expression defaults require parenthesis when sent # as DDL if self._guess_if_default_is_unparenthesized_sql_expr( column_info.get("default", None) ): column_info["default"] = "(%s)" % (column_info["default"],) def render_ddl_sql_expr(self, expr, is_server_default=False, **kw): # SQLite expression defaults require parenthesis when sent # as DDL str_expr = super(SQLiteImpl, self).render_ddl_sql_expr( expr, is_server_default=is_server_default, **kw ) if ( is_server_default and self._guess_if_default_is_unparenthesized_sql_expr(str_expr) ): str_expr = "(%s)" % (str_expr,) return str_expr # @compiles(AddColumn, 'sqlite') # def visit_add_column(element, compiler, **kw): # return "%s %s" % ( # alter_table(compiler, element.table_name, element.schema), # add_column(compiler, element.column, **kw) # ) # def add_column(compiler, column, **kw): # text = "ADD COLUMN %s" % compiler.get_column_specification(column, **kw) # need to modify SQLAlchemy so that the CHECK associated with a Boolean # or Enum gets placed as part of the column constraints, not the Table # see ticket 98 # for const in column.constraints: # text += compiler.process(AddConstraint(const)) # return text zzzeek-alembic-bee044a1c187/alembic/op.py000066400000000000000000000002471353106760100201710ustar00rootroot00000000000000from .operations.base import Operations # create proxy functions for # each method on the Operations class. Operations.create_module_class_proxy(globals(), locals()) zzzeek-alembic-bee044a1c187/alembic/operations/000077500000000000000000000000001353106760100213615ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/operations/__init__.py000066400000000000000000000003001353106760100234630ustar00rootroot00000000000000from . 
import toimpl # noqa from .base import BatchOperations from .base import Operations from .ops import MigrateOperation __all__ = ["Operations", "BatchOperations", "MigrateOperation"] zzzeek-alembic-bee044a1c187/alembic/operations/base.py000066400000000000000000000407471353106760100226610ustar00rootroot00000000000000from contextlib import contextmanager import textwrap from . import batch from . import schemaobj from .. import util from ..util import sqla_compat from ..util.compat import exec_ from ..util.compat import inspect_formatargspec from ..util.compat import inspect_getargspec __all__ = ("Operations", "BatchOperations") try: from sqlalchemy.sql.naming import conv except: conv = None class Operations(util.ModuleClsProxy): """Define high level migration operations. Each operation corresponds to some schema migration operation, executed against a particular :class:`.MigrationContext` which in turn represents connectivity to a database, or a file output stream. While :class:`.Operations` is normally configured as part of the :meth:`.EnvironmentContext.run_migrations` method called from an ``env.py`` script, a standalone :class:`.Operations` instance can be made for use cases external to regular Alembic migrations by passing in a :class:`.MigrationContext`:: from alembic.migration import MigrationContext from alembic.operations import Operations conn = myengine.connect() ctx = MigrationContext.configure(conn) op = Operations(ctx) op.alter_column("t", "c", nullable=True) Note that as of 0.8, most of the methods on this class are produced dynamically using the :meth:`.Operations.register_operation` method. """ _to_impl = util.Dispatcher() def __init__(self, migration_context, impl=None): """Construct a new :class:`.Operations` :param migration_context: a :class:`.MigrationContext` instance. """ self.migration_context = migration_context if impl is None: self.impl = migration_context.impl else: self.impl = impl self.schema_obj = schemaobj.SchemaObjects(migration_context) @classmethod def register_operation(cls, name, sourcename=None): """Register a new operation for this class. This method is normally used to add new operations to the :class:`.Operations` class, and possibly the :class:`.BatchOperations` class as well. All Alembic migration operations are implemented via this system, however the system is also available as a public API to facilitate adding custom operations. .. versionadded:: 0.8.0 .. seealso:: :ref:`operation_plugins` """ def register(op_cls): if sourcename is None: fn = getattr(op_cls, name) source_name = fn.__name__ else: fn = getattr(op_cls, sourcename) source_name = fn.__name__ spec = inspect_getargspec(fn) name_args = spec[0] assert name_args[0:2] == ["cls", "operations"] name_args[0:2] = ["self"] args = inspect_formatargspec(*spec) num_defaults = len(spec[3]) if spec[3] else 0 if num_defaults: defaulted_vals = name_args[0 - num_defaults :] else: defaulted_vals = () apply_kw = inspect_formatargspec( name_args, spec[1], spec[2], defaulted_vals, formatvalue=lambda x: "=" + x, ) func_text = textwrap.dedent( """\ def %(name)s%(args)s: %(doc)r return op_cls.%(source_name)s%(apply_kw)s """ % { "name": name, "source_name": source_name, "args": args, "apply_kw": apply_kw, "doc": fn.__doc__, "meth": fn.__name__, } ) globals_ = {"op_cls": op_cls} lcl = {} exec_(func_text, globals_, lcl) setattr(cls, name, lcl[name]) fn.__func__.__doc__ = ( "This method is proxied on " "the :class:`.%s` class, via the :meth:`.%s.%s` method." 
% (cls.__name__, cls.__name__, name) ) if hasattr(fn, "_legacy_translations"): lcl[name]._legacy_translations = fn._legacy_translations return op_cls return register @classmethod def implementation_for(cls, op_cls): """Register an implementation for a given :class:`.MigrateOperation`. This is part of the operation extensibility API. .. seealso:: :ref:`operation_plugins` - example of use """ def decorate(fn): cls._to_impl.dispatch_for(op_cls)(fn) return fn return decorate @classmethod @contextmanager def context(cls, migration_context): op = Operations(migration_context) op._install_proxy() yield op op._remove_proxy() @contextmanager def batch_alter_table( self, table_name, schema=None, recreate="auto", copy_from=None, table_args=(), table_kwargs=util.immutabledict(), reflect_args=(), reflect_kwargs=util.immutabledict(), naming_convention=None, ): """Invoke a series of per-table migrations in batch. Batch mode allows a series of operations specific to a table to be syntactically grouped together, and allows for alternate modes of table migration, in particular the "recreate" style of migration required by SQLite. "recreate" style is as follows: 1. A new table is created with the new specification, based on the migration directives within the batch, using a temporary name. 2. the data copied from the existing table to the new table. 3. the existing table is dropped. 4. the new table is renamed to the existing table name. The directive by default will only use "recreate" style on the SQLite backend, and only if directives are present which require this form, e.g. anything other than ``add_column()``. The batch operation on other backends will proceed using standard ALTER TABLE operations. The method is used as a context manager, which returns an instance of :class:`.BatchOperations`; this object is the same as :class:`.Operations` except that table names and schema names are omitted. E.g.:: with op.batch_alter_table("some_table") as batch_op: batch_op.add_column(Column('foo', Integer)) batch_op.drop_column('bar') The operations within the context manager are invoked at once when the context is ended. When run against SQLite, if the migrations include operations not supported by SQLite's ALTER TABLE, the entire table will be copied to a new one with the new specification, moving all data across as well. The copy operation by default uses reflection to retrieve the current structure of the table, and therefore :meth:`.batch_alter_table` in this mode requires that the migration is run in "online" mode. The ``copy_from`` parameter may be passed which refers to an existing :class:`.Table` object, which will bypass this reflection step. .. note:: The table copy operation will currently not copy CHECK constraints, and may not copy UNIQUE constraints that are unnamed, as is possible on SQLite. See the section :ref:`sqlite_batch_constraints` for workarounds. :param table_name: name of table :param schema: optional schema name. :param recreate: under what circumstances the table should be recreated. At its default of ``"auto"``, the SQLite dialect will recreate the table if any operations other than ``add_column()``, ``create_index()``, or ``drop_index()`` are present. Other options include ``"always"`` and ``"never"``. :param copy_from: optional :class:`~sqlalchemy.schema.Table` object that will act as the structure of the table being copied. If omitted, table reflection is used to retrieve the structure of the table. .. 
versionadded:: 0.7.6 Fully implemented the :paramref:`~.Operations.batch_alter_table.copy_from` parameter. .. seealso:: :ref:`batch_offline_mode` :paramref:`~.Operations.batch_alter_table.reflect_args` :paramref:`~.Operations.batch_alter_table.reflect_kwargs` :param reflect_args: a sequence of additional positional arguments that will be applied to the table structure being reflected / copied; this may be used to pass column and constraint overrides to the table that will be reflected, in lieu of passing the whole :class:`~sqlalchemy.schema.Table` using :paramref:`~.Operations.batch_alter_table.copy_from`. .. versionadded:: 0.7.1 :param reflect_kwargs: a dictionary of additional keyword arguments that will be applied to the table structure being copied; this may be used to pass additional table and reflection options to the table that will be reflected, in lieu of passing the whole :class:`~sqlalchemy.schema.Table` using :paramref:`~.Operations.batch_alter_table.copy_from`. .. versionadded:: 0.7.1 :param table_args: a sequence of additional positional arguments that will be applied to the new :class:`~sqlalchemy.schema.Table` when created, in addition to those copied from the source table. This may be used to provide additional constraints such as CHECK constraints that may not be reflected. :param table_kwargs: a dictionary of additional keyword arguments that will be applied to the new :class:`~sqlalchemy.schema.Table` when created, in addition to those copied from the source table. This may be used to provide for additional table options that may not be reflected. .. versionadded:: 0.7.0 :param naming_convention: a naming convention dictionary of the form described at :ref:`autogen_naming_conventions` which will be applied to the :class:`~sqlalchemy.schema.MetaData` during the reflection process. This is typically required if one wants to drop SQLite constraints, as these constraints will not have names when reflected on this backend. Requires SQLAlchemy **0.9.4** or greater. .. seealso:: :ref:`dropping_sqlite_foreign_keys` .. versionadded:: 0.7.1 .. note:: batch mode requires SQLAlchemy 0.8 or above. .. seealso:: :ref:`batch_migrations` """ impl = batch.BatchOperationsImpl( self, table_name, schema, recreate, copy_from, table_args, table_kwargs, reflect_args, reflect_kwargs, naming_convention, ) batch_op = BatchOperations(self.migration_context, impl=impl) yield batch_op impl.flush() def get_context(self): """Return the :class:`.MigrationContext` object that's currently in use. """ return self.migration_context def invoke(self, operation): """Given a :class:`.MigrateOperation`, invoke it in terms of this :class:`.Operations` instance. .. versionadded:: 0.8.0 """ fn = self._to_impl.dispatch( operation, self.migration_context.impl.__dialect__ ) return fn(self, operation) def f(self, name): """Indicate a string name that has already had a naming convention applied to it. This feature combines with the SQLAlchemy ``naming_convention`` feature to disambiguate constraint names that have already had naming conventions applied to them, versus those that have not. This is necessary in the case that the ``"%(constraint_name)s"`` token is used within a naming convention, so that it can be identified that this particular name should remain fixed. 
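For example (an illustrative sketch; the convention, table and column
names here are hypothetical), autogenerate will render names that have
already been converted wrapped in ``op.f()``::

    # assumed convention on the target MetaData:
    # {"uq": "uq_%(table_name)s_%(constraint_name)s"}
    op.create_unique_constraint(op.f('uq_user_name'), 'user', ['name'])

Here the wrapper marks ``uq_user_name`` as the final name, to be emitted
as-is rather than run through the convention a second time.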
If the :meth:`.Operations.f` is used on a constraint, the naming convention will not take effect:: op.add_column('t', 'x', Boolean(name=op.f('ck_bool_t_x'))) Above, the CHECK constraint generated will have the name ``ck_bool_t_x`` regardless of whether or not a naming convention is in use. Alternatively, if a naming convention is in use, and 'f' is not used, names will be converted along conventions. If the ``target_metadata`` contains the naming convention ``{"ck": "ck_bool_%(table_name)s_%(constraint_name)s"}``, then the output of the following: op.add_column('t', 'x', Boolean(name='x')) will be:: CONSTRAINT ck_bool_t_x CHECK (x in (1, 0))) The function is rendered in the output of autogenerate when a particular constraint name is already converted, for SQLAlchemy version **0.9.4 and greater only**. Even though ``naming_convention`` was introduced in 0.9.2, the string disambiguation service is new as of 0.9.4. .. versionadded:: 0.6.4 """ if conv: return conv(name) else: raise NotImplementedError( "op.f() feature requires SQLAlchemy 0.9.4 or greater." ) def inline_literal(self, value, type_=None): r"""Produce an 'inline literal' expression, suitable for using in an INSERT, UPDATE, or DELETE statement. When using Alembic in "offline" mode, CRUD operations aren't compatible with SQLAlchemy's default behavior surrounding literal values, which is that they are converted into bound values and passed separately into the ``execute()`` method of the DBAPI cursor. An offline SQL script needs to have these rendered inline. While it should always be noted that inline literal values are an **enormous** security hole in an application that handles untrusted input, a schema migration is not run in this context, so literals are safe to render inline, with the caveat that advanced types like dates may not be supported directly by SQLAlchemy. See :meth:`.execute` for an example usage of :meth:`.inline_literal`. The environment can also be configured to attempt to render "literal" values inline automatically, for those simple types that are supported by the dialect; see :paramref:`.EnvironmentContext.configure.literal_binds` for this more recently added feature. :param value: The value to render. Strings, integers, and simple numerics should be supported. Other types like boolean, dates, etc. may or may not be supported yet by various backends. :param type\_: optional - a :class:`sqlalchemy.types.TypeEngine` subclass stating the type of this value. In SQLAlchemy expressions, this is usually derived automatically from the Python type of the value itself, as well as based on the context in which the value is used. .. seealso:: :paramref:`.EnvironmentContext.configure.literal_binds` """ return sqla_compat._literal_bindparam(None, value, type_=type_) def get_bind(self): """Return the current 'bind'. Under normal circumstances, this is the :class:`~sqlalchemy.engine.Connection` currently being used to emit SQL to the database. In a SQL script context, this value is ``None``. [TODO: verify this] """ return self.migration_context.impl.bind class BatchOperations(Operations): """Modifies the interface :class:`.Operations` for batch mode. This basically omits the ``table_name`` and ``schema`` parameters from associated methods, as these are a given when running under batch mode. .. seealso:: :meth:`.Operations.batch_alter_table` Note that as of 0.8, most of the methods on this class are produced dynamically using the :meth:`.Operations.register_operation` method. 
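A short sketch of the batch form, where the table name and schema are
implied by the context manager (the table and column names shown are
illustrative)::

    from sqlalchemy import Column, Text

    with op.batch_alter_table('account') as batch_op:
        batch_op.add_column(Column('notes', Text(), nullable=True))
        batch_op.create_index('ix_account_notes', ['notes'])
        batch_op.drop_column('legacy_flag')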
""" def _noop(self, operation): raise NotImplementedError( "The %s method does not apply to a batch table alter operation." % operation ) zzzeek-alembic-bee044a1c187/alembic/operations/batch.py000066400000000000000000000356761353106760100230350ustar00rootroot00000000000000from sqlalchemy import cast from sqlalchemy import CheckConstraint from sqlalchemy import Column from sqlalchemy import ForeignKeyConstraint from sqlalchemy import Index from sqlalchemy import MetaData from sqlalchemy import PrimaryKeyConstraint from sqlalchemy import schema as sql_schema from sqlalchemy import select from sqlalchemy import Table from sqlalchemy import types as sqltypes from sqlalchemy.events import SchemaEventTarget from sqlalchemy.util import OrderedDict from ..util.sqla_compat import _columns_for_constraint from ..util.sqla_compat import _fk_is_self_referential from ..util.sqla_compat import _is_type_bound from ..util.sqla_compat import _remove_column_from_collection class BatchOperationsImpl(object): def __init__( self, operations, table_name, schema, recreate, copy_from, table_args, table_kwargs, reflect_args, reflect_kwargs, naming_convention, ): self.operations = operations self.table_name = table_name self.schema = schema if recreate not in ("auto", "always", "never"): raise ValueError( "recreate may be one of 'auto', 'always', or 'never'." ) self.recreate = recreate self.copy_from = copy_from self.table_args = table_args self.table_kwargs = dict(table_kwargs) self.reflect_args = reflect_args self.reflect_kwargs = dict(reflect_kwargs) self.reflect_kwargs.setdefault( "listeners", list(self.reflect_kwargs.get("listeners", ())) ) self.reflect_kwargs["listeners"].append( ("column_reflect", operations.impl.autogen_column_reflect) ) self.naming_convention = naming_convention self.batch = [] @property def dialect(self): return self.operations.impl.dialect @property def impl(self): return self.operations.impl def _should_recreate(self): if self.recreate == "auto": return self.operations.impl.requires_recreate_in_batch(self) elif self.recreate == "always": return True else: return False def flush(self): should_recreate = self._should_recreate() if not should_recreate: for opname, arg, kw in self.batch: fn = getattr(self.operations.impl, opname) fn(*arg, **kw) else: if self.naming_convention: m1 = MetaData(naming_convention=self.naming_convention) else: m1 = MetaData() if self.copy_from is not None: existing_table = self.copy_from reflected = False else: existing_table = Table( self.table_name, m1, schema=self.schema, autoload=True, autoload_with=self.operations.get_bind(), *self.reflect_args, **self.reflect_kwargs ) reflected = True batch_impl = ApplyBatchImpl( existing_table, self.table_args, self.table_kwargs, reflected ) for opname, arg, kw in self.batch: fn = getattr(batch_impl, opname) fn(*arg, **kw) batch_impl._create(self.impl) def alter_column(self, *arg, **kw): self.batch.append(("alter_column", arg, kw)) def add_column(self, *arg, **kw): self.batch.append(("add_column", arg, kw)) def drop_column(self, *arg, **kw): self.batch.append(("drop_column", arg, kw)) def add_constraint(self, const): self.batch.append(("add_constraint", (const,), {})) def drop_constraint(self, const): self.batch.append(("drop_constraint", (const,), {})) def rename_table(self, *arg, **kw): self.batch.append(("rename_table", arg, kw)) def create_index(self, idx): self.batch.append(("create_index", (idx,), {})) def drop_index(self, idx): self.batch.append(("drop_index", (idx,), {})) def create_table(self, table): raise 
NotImplementedError("Can't create table in batch mode") def drop_table(self, table): raise NotImplementedError("Can't drop table in batch mode") class ApplyBatchImpl(object): def __init__(self, table, table_args, table_kwargs, reflected): self.table = table # this is a Table object self.table_args = table_args self.table_kwargs = table_kwargs self.temp_table_name = self._calc_temp_name(table.name) self.new_table = None self.column_transfers = OrderedDict( (c.name, {"expr": c}) for c in self.table.c ) self.reflected = reflected self._grab_table_elements() @classmethod def _calc_temp_name(cls, tablename): return ("_alembic_tmp_%s" % tablename)[0:50] def _grab_table_elements(self): schema = self.table.schema self.columns = OrderedDict() for c in self.table.c: c_copy = c.copy(schema=schema) c_copy.unique = c_copy.index = False # ensure that the type object was copied, # as we may need to modify it in-place if isinstance(c.type, SchemaEventTarget): assert c_copy.type is not c.type self.columns[c.name] = c_copy self.named_constraints = {} self.unnamed_constraints = [] self.indexes = {} self.new_indexes = {} for const in self.table.constraints: if _is_type_bound(const): continue elif self.reflected and isinstance(const, CheckConstraint): # TODO: we are skipping reflected CheckConstraint because # we have no way to determine _is_type_bound() for these. pass elif const.name: self.named_constraints[const.name] = const else: self.unnamed_constraints.append(const) for idx in self.table.indexes: self.indexes[idx.name] = idx for k in self.table.kwargs: self.table_kwargs.setdefault(k, self.table.kwargs[k]) def _transfer_elements_to_new_table(self): assert self.new_table is None, "Can only create new table once" m = MetaData() schema = self.table.schema self.new_table = new_table = Table( self.temp_table_name, m, *(list(self.columns.values()) + list(self.table_args)), schema=schema, **self.table_kwargs ) for const in ( list(self.named_constraints.values()) + self.unnamed_constraints ): const_columns = set( [c.key for c in _columns_for_constraint(const)] ) if not const_columns.issubset(self.column_transfers): continue if isinstance(const, ForeignKeyConstraint): if _fk_is_self_referential(const): # for self-referential constraint, refer to the # *original* table name, and not _alembic_batch_temp. # This is consistent with how we're handling # FK constraints from other tables; we assume SQLite # no foreign keys just keeps the names unchanged, so # when we rename back, they match again. const_copy = const.copy( schema=schema, target_table=self.table ) else: # "target_table" for ForeignKeyConstraint.copy() is # only used if the FK is detected as being # self-referential, which we are handling above. 
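# (added note, illustrative) the copy therefore keeps pointing at the
# referenced table's real name; _setup_referent() below adds a stub
# Table for that referent to the temporary MetaData, so the copied
# ForeignKeyConstraint can resolve its target columns.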
const_copy = const.copy(schema=schema) else: const_copy = const.copy(schema=schema, target_table=new_table) if isinstance(const, ForeignKeyConstraint): self._setup_referent(m, const) new_table.append_constraint(const_copy) def _gather_indexes_from_both_tables(self): idx = [] idx.extend(self.indexes.values()) for index in self.new_indexes.values(): idx.append( Index( index.name, unique=index.unique, *[self.new_table.c[col] for col in index.columns.keys()], **index.kwargs ) ) return idx def _setup_referent(self, metadata, constraint): spec = constraint.elements[0]._get_colspec() parts = spec.split(".") tname = parts[-2] if len(parts) == 3: referent_schema = parts[0] else: referent_schema = None if tname != self.temp_table_name: key = sql_schema._get_table_key(tname, referent_schema) if key in metadata.tables: t = metadata.tables[key] for elem in constraint.elements: colname = elem._get_colspec().split(".")[-1] if not t.c.contains_column(colname): t.append_column(Column(colname, sqltypes.NULLTYPE)) else: Table( tname, metadata, *[ Column(n, sqltypes.NULLTYPE) for n in [ elem._get_colspec().split(".")[-1] for elem in constraint.elements ] ], schema=referent_schema ) def _create(self, op_impl): self._transfer_elements_to_new_table() op_impl.prep_table_for_batch(self.table) op_impl.create_table(self.new_table) try: op_impl._exec( self.new_table.insert(inline=True).from_select( list( k for k, transfer in self.column_transfers.items() if "expr" in transfer ), select( [ transfer["expr"] for transfer in self.column_transfers.values() if "expr" in transfer ] ), ) ) op_impl.drop_table(self.table) except: op_impl.drop_table(self.new_table) raise else: op_impl.rename_table( self.temp_table_name, self.table.name, schema=self.table.schema ) self.new_table.name = self.table.name try: for idx in self._gather_indexes_from_both_tables(): op_impl.create_index(idx) finally: self.new_table.name = self.temp_table_name def alter_column( self, table_name, column_name, nullable=None, server_default=False, name=None, type_=None, autoincrement=None, **kw ): existing = self.columns[column_name] existing_transfer = self.column_transfers[column_name] if name is not None and name != column_name: # note that we don't change '.key' - we keep referring # to the renamed column by its old key in _create(). neat! existing.name = name existing_transfer["name"] = name if type_ is not None: type_ = sqltypes.to_instance(type_) # old type is being discarded so turn off eventing # rules. Alternatively we can # erase the events set up by this type, but this is simpler. 
# we also ignore the drop_constraint that will come here from # Operations.implementation_for(alter_column) if isinstance(existing.type, SchemaEventTarget): existing.type._create_events = ( existing.type.create_constraint ) = False if existing.type._type_affinity is not type_._type_affinity: existing_transfer["expr"] = cast( existing_transfer["expr"], type_ ) existing.type = type_ # we *dont* however set events for the new type, because # alter_column is invoked from # Operations.implementation_for(alter_column) which already # will emit an add_constraint() if nullable is not None: existing.nullable = nullable if server_default is not False: if server_default is None: existing.server_default = None else: sql_schema.DefaultClause(server_default)._set_parent(existing) if autoincrement is not None: existing.autoincrement = bool(autoincrement) def add_column(self, table_name, column, **kw): # we copy the column because operations.add_column() # gives us a Column that is part of a Table already. self.columns[column.name] = column.copy(schema=self.table.schema) self.column_transfers[column.name] = {} def drop_column(self, table_name, column, **kw): if column.name in self.table.primary_key.columns: _remove_column_from_collection( self.table.primary_key.columns, column ) del self.columns[column.name] del self.column_transfers[column.name] def add_constraint(self, const): if not const.name: raise ValueError("Constraint must have a name") if isinstance(const, sql_schema.PrimaryKeyConstraint): if self.table.primary_key in self.unnamed_constraints: self.unnamed_constraints.remove(self.table.primary_key) self.named_constraints[const.name] = const def drop_constraint(self, const): if not const.name: raise ValueError("Constraint must have a name") try: const = self.named_constraints.pop(const.name) except KeyError: if _is_type_bound(const): # type-bound constraints are only included in the new # table via their type object in any case, so ignore the # drop_constraint() that comes here via the # Operations.implementation_for(alter_column) return raise ValueError("No such constraint: '%s'" % const.name) else: if isinstance(const, PrimaryKeyConstraint): for col in const.columns: self.columns[col.name].primary_key = False def create_index(self, idx): self.new_indexes[idx.name] = idx def drop_index(self, idx): try: del self.indexes[idx.name] except KeyError: raise ValueError("No such index: '%s'" % idx.name) def rename_table(self, *arg, **kw): raise NotImplementedError("TODO") zzzeek-alembic-bee044a1c187/alembic/operations/ops.py000066400000000000000000002376311353106760100225500ustar00rootroot00000000000000import re from sqlalchemy.types import NULLTYPE from . import schemaobj from .base import BatchOperations from .base import Operations from .. import util from ..util import sqla_compat class MigrateOperation(object): """base class for migration command and organization objects. This system is part of the operation extensibility API. .. versionadded:: 0.8.0 .. seealso:: :ref:`operation_objects` :ref:`operation_plugins` :ref:`customizing_revision` """ @util.memoized_property def info(self): """A dictionary that may be used to store arbitrary information along with this :class:`.MigrateOperation` object. 
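For example (an illustrative sketch, assuming
``from alembic.operations import ops`` and ``import sqlalchemy as sa``)::

    op_obj = ops.AddColumnOp('account', sa.Column('notes', sa.Text()))
    op_obj.info['reason'] = 'added for ticket 123'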
""" return {} class AddConstraintOp(MigrateOperation): """Represent an add constraint operation.""" add_constraint_ops = util.Dispatcher() @property def constraint_type(self): raise NotImplementedError() @classmethod def register_add_constraint(cls, type_): def go(klass): cls.add_constraint_ops.dispatch_for(type_)(klass.from_constraint) return klass return go @classmethod def from_constraint(cls, constraint): return cls.add_constraint_ops.dispatch(constraint.__visit_name__)( constraint ) def reverse(self): return DropConstraintOp.from_constraint(self.to_constraint()) def to_diff_tuple(self): return ("add_constraint", self.to_constraint()) @Operations.register_operation("drop_constraint") @BatchOperations.register_operation("drop_constraint", "batch_drop_constraint") class DropConstraintOp(MigrateOperation): """Represent a drop constraint operation.""" def __init__( self, constraint_name, table_name, type_=None, schema=None, _orig_constraint=None, ): self.constraint_name = constraint_name self.table_name = table_name self.constraint_type = type_ self.schema = schema self._orig_constraint = _orig_constraint def reverse(self): if self._orig_constraint is None: raise ValueError( "operation is not reversible; " "original constraint is not present" ) return AddConstraintOp.from_constraint(self._orig_constraint) def to_diff_tuple(self): if self.constraint_type == "foreignkey": return ("remove_fk", self.to_constraint()) else: return ("remove_constraint", self.to_constraint()) @classmethod def from_constraint(cls, constraint): types = { "unique_constraint": "unique", "foreign_key_constraint": "foreignkey", "primary_key_constraint": "primary", "check_constraint": "check", "column_check_constraint": "check", } constraint_table = sqla_compat._table_for_constraint(constraint) return cls( constraint.name, constraint_table.name, schema=constraint_table.schema, type_=types[constraint.__visit_name__], _orig_constraint=constraint, ) def to_constraint(self): if self._orig_constraint is not None: return self._orig_constraint else: raise ValueError( "constraint cannot be produced; " "original constraint is not present" ) @classmethod @util._with_legacy_names([("type", "type_"), ("name", "constraint_name")]) def drop_constraint( cls, operations, constraint_name, table_name, type_=None, schema=None ): r"""Drop a constraint of the given name, typically via DROP CONSTRAINT. :param constraint_name: name of the constraint. :param table_name: table name. :param type\_: optional, required on MySQL. can be 'foreignkey', 'primary', 'unique', or 'check'. :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. .. versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> constraint_name """ op = cls(constraint_name, table_name, type_=type_, schema=schema) return operations.invoke(op) @classmethod def batch_drop_constraint(cls, operations, constraint_name, type_=None): """Issue a "drop constraint" instruction using the current batch migration context. The batch form of this call omits the ``table_name`` and ``schema`` arguments from the call. .. seealso:: :meth:`.Operations.drop_constraint` .. 
versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> constraint_name """ op = cls( constraint_name, operations.impl.table_name, type_=type_, schema=operations.impl.schema, ) return operations.invoke(op) @Operations.register_operation("create_primary_key") @BatchOperations.register_operation( "create_primary_key", "batch_create_primary_key" ) @AddConstraintOp.register_add_constraint("primary_key_constraint") class CreatePrimaryKeyOp(AddConstraintOp): """Represent a create primary key operation.""" constraint_type = "primarykey" def __init__( self, constraint_name, table_name, columns, schema=None, _orig_constraint=None, **kw ): self.constraint_name = constraint_name self.table_name = table_name self.columns = columns self.schema = schema self._orig_constraint = _orig_constraint self.kw = kw @classmethod def from_constraint(cls, constraint): constraint_table = sqla_compat._table_for_constraint(constraint) return cls( constraint.name, constraint_table.name, constraint.columns, schema=constraint_table.schema, _orig_constraint=constraint, ) def to_constraint(self, migration_context=None): if self._orig_constraint is not None: return self._orig_constraint schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.primary_key_constraint( self.constraint_name, self.table_name, self.columns, schema=self.schema, ) @classmethod @util._with_legacy_names( [("name", "constraint_name"), ("cols", "columns")] ) def create_primary_key( cls, operations, constraint_name, table_name, columns, schema=None ): """Issue a "create primary key" instruction using the current migration context. e.g.:: from alembic import op op.create_primary_key( "pk_my_table", "my_table", ["id", "version"] ) This internally generates a :class:`~sqlalchemy.schema.Table` object containing the necessary columns, then generates a new :class:`~sqlalchemy.schema.PrimaryKeyConstraint` object which it then associates with the :class:`~sqlalchemy.schema.Table`. Any event listeners associated with this action will be fired off normally. The :class:`~sqlalchemy.schema.AddConstraint` construct is ultimately used to generate the ALTER statement. :param name: Name of the primary key constraint. The name is necessary so that an ALTER statement can be emitted. For setups that use an automated naming scheme such as that described at :ref:`sqla:constraint_naming_conventions` ``name`` here can be ``None``, as the event listener will apply the name to the constraint object when it is associated with the table. :param table_name: String name of the target table. :param columns: a list of string column names to be applied to the primary key constraint. :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. .. versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> constraint_name * cols -> columns """ op = cls(constraint_name, table_name, columns, schema) return operations.invoke(op) @classmethod def batch_create_primary_key(cls, operations, constraint_name, columns): """Issue a "create primary key" instruction using the current batch migration context. The batch form of this call omits the ``table_name`` and ``schema`` arguments from the call. .. 
seealso:: :meth:`.Operations.create_primary_key` """ op = cls( constraint_name, operations.impl.table_name, columns, schema=operations.impl.schema, ) return operations.invoke(op) @Operations.register_operation("create_unique_constraint") @BatchOperations.register_operation( "create_unique_constraint", "batch_create_unique_constraint" ) @AddConstraintOp.register_add_constraint("unique_constraint") class CreateUniqueConstraintOp(AddConstraintOp): """Represent a create unique constraint operation.""" constraint_type = "unique" def __init__( self, constraint_name, table_name, columns, schema=None, _orig_constraint=None, **kw ): self.constraint_name = constraint_name self.table_name = table_name self.columns = columns self.schema = schema self._orig_constraint = _orig_constraint self.kw = kw @classmethod def from_constraint(cls, constraint): constraint_table = sqla_compat._table_for_constraint(constraint) kw = {} if constraint.deferrable: kw["deferrable"] = constraint.deferrable if constraint.initially: kw["initially"] = constraint.initially return cls( constraint.name, constraint_table.name, [c.name for c in constraint.columns], schema=constraint_table.schema, _orig_constraint=constraint, **kw ) def to_constraint(self, migration_context=None): if self._orig_constraint is not None: return self._orig_constraint schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.unique_constraint( self.constraint_name, self.table_name, self.columns, schema=self.schema, **self.kw ) @classmethod @util._with_legacy_names( [ ("name", "constraint_name"), ("source", "table_name"), ("local_cols", "columns"), ] ) def create_unique_constraint( cls, operations, constraint_name, table_name, columns, schema=None, **kw ): """Issue a "create unique constraint" instruction using the current migration context. e.g.:: from alembic import op op.create_unique_constraint("uq_user_name", "user", ["name"]) This internally generates a :class:`~sqlalchemy.schema.Table` object containing the necessary columns, then generates a new :class:`~sqlalchemy.schema.UniqueConstraint` object which it then associates with the :class:`~sqlalchemy.schema.Table`. Any event listeners associated with this action will be fired off normally. The :class:`~sqlalchemy.schema.AddConstraint` construct is ultimately used to generate the ALTER statement. :param name: Name of the unique constraint. The name is necessary so that an ALTER statement can be emitted. For setups that use an automated naming scheme such as that described at :ref:`sqla:constraint_naming_conventions`, ``name`` here can be ``None``, as the event listener will apply the name to the constraint object when it is associated with the table. :param table_name: String name of the source table. :param columns: a list of string column names in the source table. :param deferrable: optional bool. If set, emit DEFERRABLE or NOT DEFERRABLE when issuing DDL for this constraint. :param initially: optional string. If set, emit INITIALLY when issuing DDL for this constraint. :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. .. 
versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> constraint_name * source -> table_name * local_cols -> columns """ op = cls(constraint_name, table_name, columns, schema=schema, **kw) return operations.invoke(op) @classmethod @util._with_legacy_names([("name", "constraint_name")]) def batch_create_unique_constraint( cls, operations, constraint_name, columns, **kw ): """Issue a "create unique constraint" instruction using the current batch migration context. The batch form of this call omits the ``source`` and ``schema`` arguments from the call. .. seealso:: :meth:`.Operations.create_unique_constraint` .. versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> constraint_name """ kw["schema"] = operations.impl.schema op = cls(constraint_name, operations.impl.table_name, columns, **kw) return operations.invoke(op) @Operations.register_operation("create_foreign_key") @BatchOperations.register_operation( "create_foreign_key", "batch_create_foreign_key" ) @AddConstraintOp.register_add_constraint("foreign_key_constraint") class CreateForeignKeyOp(AddConstraintOp): """Represent a create foreign key constraint operation.""" constraint_type = "foreignkey" def __init__( self, constraint_name, source_table, referent_table, local_cols, remote_cols, _orig_constraint=None, **kw ): self.constraint_name = constraint_name self.source_table = source_table self.referent_table = referent_table self.local_cols = local_cols self.remote_cols = remote_cols self._orig_constraint = _orig_constraint self.kw = kw def to_diff_tuple(self): return ("add_fk", self.to_constraint()) @classmethod def from_constraint(cls, constraint): kw = {} if constraint.onupdate: kw["onupdate"] = constraint.onupdate if constraint.ondelete: kw["ondelete"] = constraint.ondelete if constraint.initially: kw["initially"] = constraint.initially if constraint.deferrable: kw["deferrable"] = constraint.deferrable if constraint.use_alter: kw["use_alter"] = constraint.use_alter ( source_schema, source_table, source_columns, target_schema, target_table, target_columns, onupdate, ondelete, deferrable, initially, ) = sqla_compat._fk_spec(constraint) kw["source_schema"] = source_schema kw["referent_schema"] = target_schema return cls( constraint.name, source_table, target_table, source_columns, target_columns, _orig_constraint=constraint, **kw ) def to_constraint(self, migration_context=None): if self._orig_constraint is not None: return self._orig_constraint schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.foreign_key_constraint( self.constraint_name, self.source_table, self.referent_table, self.local_cols, self.remote_cols, **self.kw ) @classmethod @util._with_legacy_names( [ ("name", "constraint_name"), ("source", "source_table"), ("referent", "referent_table"), ] ) def create_foreign_key( cls, operations, constraint_name, source_table, referent_table, local_cols, remote_cols, onupdate=None, ondelete=None, deferrable=None, initially=None, match=None, source_schema=None, referent_schema=None, **dialect_kw ): """Issue a "create foreign key" instruction using the current migration context. e.g.:: from alembic import op op.create_foreign_key( "fk_user_address", "address", "user", ["user_id"], ["id"]) This internally generates a :class:`~sqlalchemy.schema.Table` object containing the necessary columns, then generates a new :class:`~sqlalchemy.schema.ForeignKeyConstraint` object which it then associates with the :class:`~sqlalchemy.schema.Table`. 
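A variant of the same call making use of some of the optional keyword
arguments (the constraint, table and schema names are illustrative)::

    op.create_foreign_key(
        'fk_order_user', 'order', 'user',
        ['user_id'], ['id'],
        ondelete='CASCADE',
        referent_schema='auth',
    )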
Any event listeners associated with this action will be fired off normally. The :class:`~sqlalchemy.schema.AddConstraint` construct is ultimately used to generate the ALTER statement. :param name: Name of the foreign key constraint. The name is necessary so that an ALTER statement can be emitted. For setups that use an automated naming scheme such as that described at :ref:`sqla:constraint_naming_conventions`, ``name`` here can be ``None``, as the event listener will apply the name to the constraint object when it is associated with the table. :param source_table: String name of the source table. :param referent_table: String name of the destination table. :param local_cols: a list of string column names in the source table. :param remote_cols: a list of string column names in the remote table. :param onupdate: Optional string. If set, emit ON UPDATE when issuing DDL for this constraint. Typical values include CASCADE, DELETE and RESTRICT. :param ondelete: Optional string. If set, emit ON DELETE when issuing DDL for this constraint. Typical values include CASCADE, DELETE and RESTRICT. :param deferrable: optional bool. If set, emit DEFERRABLE or NOT DEFERRABLE when issuing DDL for this constraint. :param source_schema: Optional schema name of the source table. :param referent_schema: Optional schema name of the destination table. .. versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> constraint_name * source -> source_table * referent -> referent_table """ op = cls( constraint_name, source_table, referent_table, local_cols, remote_cols, onupdate=onupdate, ondelete=ondelete, deferrable=deferrable, source_schema=source_schema, referent_schema=referent_schema, initially=initially, match=match, **dialect_kw ) return operations.invoke(op) @classmethod @util._with_legacy_names( [("name", "constraint_name"), ("referent", "referent_table")] ) def batch_create_foreign_key( cls, operations, constraint_name, referent_table, local_cols, remote_cols, referent_schema=None, onupdate=None, ondelete=None, deferrable=None, initially=None, match=None, **dialect_kw ): """Issue a "create foreign key" instruction using the current batch migration context. The batch form of this call omits the ``source`` and ``source_schema`` arguments from the call. e.g.:: with batch_alter_table("address") as batch_op: batch_op.create_foreign_key( "fk_user_address", "user", ["user_id"], ["id"]) .. seealso:: :meth:`.Operations.create_foreign_key` .. 
versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> constraint_name * referent -> referent_table """ op = cls( constraint_name, operations.impl.table_name, referent_table, local_cols, remote_cols, onupdate=onupdate, ondelete=ondelete, deferrable=deferrable, source_schema=operations.impl.schema, referent_schema=referent_schema, initially=initially, match=match, **dialect_kw ) return operations.invoke(op) @Operations.register_operation("create_check_constraint") @BatchOperations.register_operation( "create_check_constraint", "batch_create_check_constraint" ) @AddConstraintOp.register_add_constraint("check_constraint") @AddConstraintOp.register_add_constraint("column_check_constraint") class CreateCheckConstraintOp(AddConstraintOp): """Represent a create check constraint operation.""" constraint_type = "check" def __init__( self, constraint_name, table_name, condition, schema=None, _orig_constraint=None, **kw ): self.constraint_name = constraint_name self.table_name = table_name self.condition = condition self.schema = schema self._orig_constraint = _orig_constraint self.kw = kw @classmethod def from_constraint(cls, constraint): constraint_table = sqla_compat._table_for_constraint(constraint) return cls( constraint.name, constraint_table.name, constraint.sqltext, schema=constraint_table.schema, _orig_constraint=constraint, ) def to_constraint(self, migration_context=None): if self._orig_constraint is not None: return self._orig_constraint schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.check_constraint( self.constraint_name, self.table_name, self.condition, schema=self.schema, **self.kw ) @classmethod @util._with_legacy_names( [("name", "constraint_name"), ("source", "table_name")] ) def create_check_constraint( cls, operations, constraint_name, table_name, condition, schema=None, **kw ): """Issue a "create check constraint" instruction using the current migration context. e.g.:: from alembic import op from sqlalchemy.sql import column, func op.create_check_constraint( "ck_user_name_len", "user", func.len(column('name')) > 5 ) CHECK constraints are usually against a SQL expression, so ad-hoc table metadata is usually needed. The function will convert the given arguments into a :class:`sqlalchemy.schema.CheckConstraint` bound to an anonymous table in order to emit the CREATE statement. :param name: Name of the check constraint. The name is necessary so that an ALTER statement can be emitted. For setups that use an automated naming scheme such as that described at :ref:`sqla:constraint_naming_conventions`, ``name`` here can be ``None``, as the event listener will apply the name to the constraint object when it is associated with the table. :param table_name: String name of the source table. :param condition: SQL expression that's the condition of the constraint. Can be a string or SQLAlchemy expression language structure. :param deferrable: optional bool. If set, emit DEFERRABLE or NOT DEFERRABLE when issuing DDL for this constraint. :param initially: optional string. If set, emit INITIALLY when issuing DDL for this constraint. :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. .. 
versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> constraint_name * source -> table_name """ op = cls(constraint_name, table_name, condition, schema=schema, **kw) return operations.invoke(op) @classmethod @util._with_legacy_names([("name", "constraint_name")]) def batch_create_check_constraint( cls, operations, constraint_name, condition, **kw ): """Issue a "create check constraint" instruction using the current batch migration context. The batch form of this call omits the ``source`` and ``schema`` arguments from the call. .. seealso:: :meth:`.Operations.create_check_constraint` .. versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> constraint_name """ op = cls( constraint_name, operations.impl.table_name, condition, schema=operations.impl.schema, **kw ) return operations.invoke(op) @Operations.register_operation("create_index") @BatchOperations.register_operation("create_index", "batch_create_index") class CreateIndexOp(MigrateOperation): """Represent a create index operation.""" def __init__( self, index_name, table_name, columns, schema=None, unique=False, _orig_index=None, **kw ): self.index_name = index_name self.table_name = table_name self.columns = columns self.schema = schema self.unique = unique self.kw = kw self._orig_index = _orig_index def reverse(self): return DropIndexOp.from_index(self.to_index()) def to_diff_tuple(self): return ("add_index", self.to_index()) @classmethod def from_index(cls, index): return cls( index.name, index.table.name, sqla_compat._get_index_expressions(index), schema=index.table.schema, unique=index.unique, _orig_index=index, **index.kwargs ) def to_index(self, migration_context=None): if self._orig_index: return self._orig_index schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.index( self.index_name, self.table_name, self.columns, schema=self.schema, unique=self.unique, **self.kw ) @classmethod @util._with_legacy_names([("name", "index_name")]) def create_index( cls, operations, index_name, table_name, columns, schema=None, unique=False, **kw ): r"""Issue a "create index" instruction using the current migration context. e.g.:: from alembic import op op.create_index('ik_test', 't1', ['foo', 'bar']) Functional indexes can be produced by using the :func:`sqlalchemy.sql.expression.text` construct:: from alembic import op from sqlalchemy import text op.create_index('ik_test', 't1', [text('lower(foo)')]) .. versionadded:: 0.6.7 support for making use of the :func:`~sqlalchemy.sql.expression.text` construct in conjunction with :meth:`.Operations.create_index` in order to produce functional expressions within CREATE INDEX. :param index_name: name of the index. :param table_name: name of the owning table. :param columns: a list consisting of string column names and/or :func:`~sqlalchemy.sql.expression.text` constructs. :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. :param unique: If True, create a unique index. :param quote: Force quoting of this column's name on or off, corresponding to ``True`` or ``False``. 
When left at its default of ``None``, the column identifier will be quoted according to whether the name is case sensitive (identifiers with at least one upper case character are treated as case sensitive), or if it's a reserved word. This flag is only needed to force quoting of a reserved word which is not known by the SQLAlchemy dialect. :param \**kw: Additional keyword arguments not mentioned above are dialect specific, and passed in the form ``_``. See the documentation regarding an individual dialect at :ref:`dialect_toplevel` for detail on documented arguments. .. versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> index_name """ op = cls( index_name, table_name, columns, schema=schema, unique=unique, **kw ) return operations.invoke(op) @classmethod def batch_create_index(cls, operations, index_name, columns, **kw): """Issue a "create index" instruction using the current batch migration context. .. seealso:: :meth:`.Operations.create_index` """ op = cls( index_name, operations.impl.table_name, columns, schema=operations.impl.schema, **kw ) return operations.invoke(op) @Operations.register_operation("drop_index") @BatchOperations.register_operation("drop_index", "batch_drop_index") class DropIndexOp(MigrateOperation): """Represent a drop index operation.""" def __init__( self, index_name, table_name=None, schema=None, _orig_index=None, **kw ): self.index_name = index_name self.table_name = table_name self.schema = schema self._orig_index = _orig_index self.kw = kw def to_diff_tuple(self): return ("remove_index", self.to_index()) def reverse(self): if self._orig_index is None: raise ValueError( "operation is not reversible; " "original index is not present" ) return CreateIndexOp.from_index(self._orig_index) @classmethod def from_index(cls, index): return cls( index.name, index.table.name, schema=index.table.schema, _orig_index=index, **index.kwargs ) def to_index(self, migration_context=None): if self._orig_index is not None: return self._orig_index schema_obj = schemaobj.SchemaObjects(migration_context) # need a dummy column name here since SQLAlchemy # 0.7.6 and further raises on Index with no columns return schema_obj.index( self.index_name, self.table_name, ["x"], schema=self.schema, **self.kw ) @classmethod @util._with_legacy_names( [("name", "index_name"), ("tablename", "table_name")] ) def drop_index( cls, operations, index_name, table_name=None, schema=None, **kw ): r"""Issue a "drop index" instruction using the current migration context. e.g.:: drop_index("accounts") :param index_name: name of the index. :param table_name: name of the owning table. Some backends such as Microsoft SQL Server require this. :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. :param \**kw: Additional keyword arguments not mentioned above are dialect specific, and passed in the form ``_``. See the documentation regarding an individual dialect at :ref:`dialect_toplevel` for detail on documented arguments. .. versionadded:: 0.9.5 Support for dialect-specific keyword arguments for DROP INDEX .. 
versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> index_name """ op = cls(index_name, table_name=table_name, schema=schema, **kw) return operations.invoke(op) @classmethod @util._with_legacy_names([("name", "index_name")]) def batch_drop_index(cls, operations, index_name, **kw): """Issue a "drop index" instruction using the current batch migration context. .. seealso:: :meth:`.Operations.drop_index` .. versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> index_name """ op = cls( index_name, table_name=operations.impl.table_name, schema=operations.impl.schema, **kw ) return operations.invoke(op) @Operations.register_operation("create_table") class CreateTableOp(MigrateOperation): """Represent a create table operation.""" def __init__( self, table_name, columns, schema=None, _orig_table=None, **kw ): self.table_name = table_name self.columns = columns self.schema = schema self.kw = kw self._orig_table = _orig_table def reverse(self): return DropTableOp.from_table(self.to_table()) def to_diff_tuple(self): return ("add_table", self.to_table()) @classmethod def from_table(cls, table): return cls( table.name, list(table.c) + list(table.constraints), schema=table.schema, _orig_table=table, **table.kwargs ) def to_table(self, migration_context=None): if self._orig_table is not None: return self._orig_table schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.table( self.table_name, *self.columns, schema=self.schema, **self.kw ) @classmethod @util._with_legacy_names([("name", "table_name")]) def create_table(cls, operations, table_name, *columns, **kw): r"""Issue a "create table" instruction using the current migration context. This directive receives an argument list similar to that of the traditional :class:`sqlalchemy.schema.Table` construct, but without the metadata:: from sqlalchemy import INTEGER, VARCHAR, NVARCHAR, Column from alembic import op op.create_table( 'account', Column('id', INTEGER, primary_key=True), Column('name', VARCHAR(50), nullable=False), Column('description', NVARCHAR(200)), Column('timestamp', TIMESTAMP, server_default=func.now()) ) Note that :meth:`.create_table` accepts :class:`~sqlalchemy.schema.Column` constructs directly from the SQLAlchemy library. In particular, default values to be created on the database side are specified using the ``server_default`` parameter, and not ``default`` which only specifies Python-side defaults:: from alembic import op from sqlalchemy import Column, TIMESTAMP, func # specify "DEFAULT NOW" along with the "timestamp" column op.create_table('account', Column('id', INTEGER, primary_key=True), Column('timestamp', TIMESTAMP, server_default=func.now()) ) The function also returns a newly created :class:`~sqlalchemy.schema.Table` object, corresponding to the table specification given, which is suitable for immediate SQL operations, in particular :meth:`.Operations.bulk_insert`:: from sqlalchemy import INTEGER, VARCHAR, NVARCHAR, Column from alembic import op account_table = op.create_table( 'account', Column('id', INTEGER, primary_key=True), Column('name', VARCHAR(50), nullable=False), Column('description', NVARCHAR(200)), Column('timestamp', TIMESTAMP, server_default=func.now()) ) op.bulk_insert( account_table, [ {"name": "A1", "description": "account 1"}, {"name": "A2", "description": "account 2"}, ] ) .. 
versionadded:: 0.7.0 :param table_name: Name of the table :param \*columns: collection of :class:`~sqlalchemy.schema.Column` objects within the table, as well as optional :class:`~sqlalchemy.schema.Constraint` objects and :class:`~.sqlalchemy.schema.Index` objects. :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. :param \**kw: Other keyword arguments are passed to the underlying :class:`sqlalchemy.schema.Table` object created for the command. :return: the :class:`~sqlalchemy.schema.Table` object corresponding to the parameters given. .. versionadded:: 0.7.0 - the :class:`~sqlalchemy.schema.Table` object is returned. .. versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> table_name """ op = cls(table_name, columns, **kw) return operations.invoke(op) @Operations.register_operation("drop_table") class DropTableOp(MigrateOperation): """Represent a drop table operation.""" def __init__( self, table_name, schema=None, table_kw=None, _orig_table=None ): self.table_name = table_name self.schema = schema self.table_kw = table_kw or {} self._orig_table = _orig_table def to_diff_tuple(self): return ("remove_table", self.to_table()) def reverse(self): if self._orig_table is None: raise ValueError( "operation is not reversible; " "original table is not present" ) return CreateTableOp.from_table(self._orig_table) @classmethod def from_table(cls, table): return cls(table.name, schema=table.schema, _orig_table=table) def to_table(self, migration_context=None): if self._orig_table is not None: return self._orig_table schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.table( self.table_name, schema=self.schema, **self.table_kw ) @classmethod @util._with_legacy_names([("name", "table_name")]) def drop_table(cls, operations, table_name, schema=None, **kw): r"""Issue a "drop table" instruction using the current migration context. e.g.:: drop_table("accounts") :param table_name: Name of the table :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. :param \**kw: Other keyword arguments are passed to the underlying :class:`sqlalchemy.schema.Table` object created for the command. .. versionchanged:: 0.8.0 The following positional argument names have been changed: * name -> table_name """ op = cls(table_name, schema=schema, table_kw=kw) operations.invoke(op) class AlterTableOp(MigrateOperation): """Represent an alter table operation.""" def __init__(self, table_name, schema=None): self.table_name = table_name self.schema = schema @Operations.register_operation("rename_table") class RenameTableOp(AlterTableOp): """Represent a rename table operation.""" def __init__(self, old_table_name, new_table_name, schema=None): super(RenameTableOp, self).__init__(old_table_name, schema=schema) self.new_table_name = new_table_name @classmethod def rename_table( cls, operations, old_table_name, new_table_name, schema=None ): """Emit an ALTER TABLE to rename a table. :param old_table_name: old name. :param new_table_name: new name. 
:param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. """ op = cls(old_table_name, new_table_name, schema=schema) return operations.invoke(op) @Operations.register_operation("create_table_comment") class CreateTableCommentOp(AlterTableOp): """Represent a COMMENT ON `table` operation. """ def __init__( self, table_name, comment, schema=None, existing_comment=None ): self.table_name = table_name self.comment = comment self.existing_comment = existing_comment self.schema = schema @classmethod def create_table_comment( cls, operations, table_name, comment, existing_comment=None, schema=None, ): """Emit a COMMENT ON operation to set the comment for a table. .. versionadded:: 1.0.6 :param table_name: string name of the target table. :param comment: string value of the comment being registered against the specified table. :param existing_comment: String value of a comment already registered on the specified table, used within autogenerate so that the operation is reversible, but not required for direct use. .. seealso:: :meth:`.Operations.drop_table_comment` :paramref:`.Operations.alter_column.comment` """ op = cls( table_name, comment, existing_comment=existing_comment, schema=schema, ) return operations.invoke(op) def reverse(self): """Reverses the COMMENT ON operation against a table. """ if self.existing_comment is None: return DropTableCommentOp( self.table_name, existing_comment=self.comment, schema=self.schema, ) else: return CreateTableCommentOp( self.table_name, self.existing_comment, existing_comment=self.comment, schema=self.schema, ) def to_table(self, migration_context=None): schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.table( self.table_name, schema=self.schema, comment=self.comment ) def to_diff_tuple(self): return ("add_table_comment", self.to_table(), self.existing_comment) @Operations.register_operation("drop_table_comment") class DropTableCommentOp(AlterTableOp): """Represent an operation to remove the comment from a table. """ def __init__(self, table_name, schema=None, existing_comment=None): self.table_name = table_name self.existing_comment = existing_comment self.schema = schema @classmethod def drop_table_comment( cls, operations, table_name, existing_comment=None, schema=None ): """Issue a "drop table comment" operation to remove an existing comment set on a table. .. versionadded:: 1.0.6 :param table_name: string name of the target table. :param existing_comment: An optional string value of a comment already registered on the specified table. .. seealso:: :meth:`.Operations.create_table_comment` :paramref:`.Operations.alter_column.comment` """ op = cls(table_name, existing_comment=existing_comment, schema=schema) return operations.invoke(op) def reverse(self): """Reverses the COMMENT ON operation against a table. 
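For example (an illustrative sketch), reversing a drop yields the
operation that restores the prior comment::

    op_obj = DropTableCommentOp('account', existing_comment='user accounts')
    op_obj.reverse().comment   # 'user accounts'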
""" return CreateTableCommentOp( self.table_name, self.existing_comment, schema=self.schema ) def to_table(self, migration_context=None): schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.table(self.table_name, schema=self.schema) def to_diff_tuple(self): return ("remove_table_comment", self.to_table()) @Operations.register_operation("alter_column") @BatchOperations.register_operation("alter_column", "batch_alter_column") class AlterColumnOp(AlterTableOp): """Represent an alter column operation.""" def __init__( self, table_name, column_name, schema=None, existing_type=None, existing_server_default=False, existing_nullable=None, existing_comment=None, modify_nullable=None, modify_comment=False, modify_server_default=False, modify_name=None, modify_type=None, **kw ): super(AlterColumnOp, self).__init__(table_name, schema=schema) self.column_name = column_name self.existing_type = existing_type self.existing_server_default = existing_server_default self.existing_nullable = existing_nullable self.existing_comment = existing_comment self.modify_nullable = modify_nullable self.modify_comment = modify_comment self.modify_server_default = modify_server_default self.modify_name = modify_name self.modify_type = modify_type self.kw = kw def to_diff_tuple(self): col_diff = [] schema, tname, cname = self.schema, self.table_name, self.column_name if self.modify_type is not None: col_diff.append( ( "modify_type", schema, tname, cname, { "existing_nullable": self.existing_nullable, "existing_server_default": ( self.existing_server_default ), "existing_comment": self.existing_comment, }, self.existing_type, self.modify_type, ) ) if self.modify_nullable is not None: col_diff.append( ( "modify_nullable", schema, tname, cname, { "existing_type": self.existing_type, "existing_server_default": ( self.existing_server_default ), "existing_comment": self.existing_comment, }, self.existing_nullable, self.modify_nullable, ) ) if self.modify_server_default is not False: col_diff.append( ( "modify_default", schema, tname, cname, { "existing_nullable": self.existing_nullable, "existing_type": self.existing_type, "existing_comment": self.existing_comment, }, self.existing_server_default, self.modify_server_default, ) ) if self.modify_comment is not False: col_diff.append( ( "modify_comment", schema, tname, cname, { "existing_nullable": self.existing_nullable, "existing_type": self.existing_type, "existing_server_default": ( self.existing_server_default ), }, self.existing_comment, self.modify_comment, ) ) return col_diff def has_changes(self): hc1 = ( self.modify_nullable is not None or self.modify_server_default is not False or self.modify_type is not None or self.modify_comment is not False ) if hc1: return True for kw in self.kw: if kw.startswith("modify_"): return True else: return False def reverse(self): kw = self.kw.copy() kw["existing_type"] = self.existing_type kw["existing_nullable"] = self.existing_nullable kw["existing_server_default"] = self.existing_server_default kw["existing_comment"] = self.existing_comment if self.modify_type is not None: kw["modify_type"] = self.modify_type if self.modify_nullable is not None: kw["modify_nullable"] = self.modify_nullable if self.modify_server_default is not False: kw["modify_server_default"] = self.modify_server_default if self.modify_comment is not False: kw["modify_comment"] = self.modify_comment # TODO: make this a little simpler all_keys = set( m.group(1) for m in [re.match(r"^(?:existing_|modify_)(.+)$", k) for k in kw] if m ) for k in 
all_keys: if "modify_%s" % k in kw: swap = kw["existing_%s" % k] kw["existing_%s" % k] = kw["modify_%s" % k] kw["modify_%s" % k] = swap return self.__class__( self.table_name, self.column_name, schema=self.schema, **kw ) @classmethod @util._with_legacy_names([("name", "new_column_name")]) def alter_column( cls, operations, table_name, column_name, nullable=None, comment=False, server_default=False, new_column_name=None, type_=None, existing_type=None, existing_server_default=False, existing_nullable=None, existing_comment=None, schema=None, **kw ): r"""Issue an "alter column" instruction using the current migration context. Generally, only that aspect of the column which is being changed, i.e. name, type, nullability, default, needs to be specified. Multiple changes can also be specified at once and the backend should "do the right thing", emitting each change either separately or together as the backend allows. MySQL has special requirements here, since MySQL cannot ALTER a column without a full specification. When producing MySQL-compatible migration files, it is recommended that the ``existing_type``, ``existing_server_default``, and ``existing_nullable`` parameters be present, if not being altered. Type changes which are against the SQLAlchemy "schema" types :class:`~sqlalchemy.types.Boolean` and :class:`~sqlalchemy.types.Enum` may also add or drop constraints which accompany those types on backends that don't support them natively. The ``existing_type`` argument is used in this case to identify and remove a previous constraint that was bound to the type object. :param table_name: string name of the target table. :param column_name: string name of the target column, as it exists before the operation begins. :param nullable: Optional; specify ``True`` or ``False`` to alter the column's nullability. :param server_default: Optional; specify a string SQL expression, :func:`~sqlalchemy.sql.expression.text`, or :class:`~sqlalchemy.schema.DefaultClause` to indicate an alteration to the column's default value. Set to ``None`` to have the default removed. :param comment: optional string text of a new comment to add to the column. .. versionadded:: 1.0.6 :param new_column_name: Optional; specify a string name here to indicate the new name within a column rename operation. :param type\_: Optional; a :class:`~sqlalchemy.types.TypeEngine` type object to specify a change to the column's type. For SQLAlchemy types that also indicate a constraint (i.e. :class:`~sqlalchemy.types.Boolean`, :class:`~sqlalchemy.types.Enum`), the constraint is also generated. :param autoincrement: set the ``AUTO_INCREMENT`` flag of the column; currently understood by the MySQL dialect. :param existing_type: Optional; a :class:`~sqlalchemy.types.TypeEngine` type object to specify the previous type. This is required for all MySQL column alter operations that don't otherwise specify a new type, as well as for when nullability is being changed on a SQL Server column. It is also used if the type is a so-called SQLlchemy "schema" type which may define a constraint (i.e. :class:`~sqlalchemy.types.Boolean`, :class:`~sqlalchemy.types.Enum`), so that the constraint can be dropped. :param existing_server_default: Optional; The existing default value of the column. Required on MySQL if an existing default is not being changed; else MySQL removes the default. :param existing_nullable: Optional; the existing nullability of the column. Required on MySQL if the existing nullability is not being changed; else MySQL sets this to NULL. 
:param existing_autoincrement: Optional; the existing autoincrement of the column. Used for MySQL's system of altering a column that specifies ``AUTO_INCREMENT``. :param existing_comment: string text of the existing comment on the column to be maintained. Required on MySQL if the existing comment on the column is not being changed. .. versionadded:: 1.0.6 :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. :param postgresql_using: String argument which will indicate a SQL expression to render within the Postgresql-specific USING clause within ALTER COLUMN. This string is taken directly as raw SQL which must explicitly include any necessary quoting or escaping of tokens within the expression. .. versionadded:: 0.8.8 """ alt = cls( table_name, column_name, schema=schema, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, existing_comment=existing_comment, modify_name=new_column_name, modify_type=type_, modify_server_default=server_default, modify_nullable=nullable, modify_comment=comment, **kw ) return operations.invoke(alt) @classmethod def batch_alter_column( cls, operations, column_name, nullable=None, comment=False, server_default=False, new_column_name=None, type_=None, existing_type=None, existing_server_default=False, existing_nullable=None, existing_comment=None, **kw ): """Issue an "alter column" instruction using the current batch migration context. .. seealso:: :meth:`.Operations.alter_column` """ alt = cls( operations.impl.table_name, column_name, schema=operations.impl.schema, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, existing_comment=existing_comment, modify_name=new_column_name, modify_type=type_, modify_server_default=server_default, modify_nullable=nullable, modify_comment=comment, **kw ) return operations.invoke(alt) @Operations.register_operation("add_column") @BatchOperations.register_operation("add_column", "batch_add_column") class AddColumnOp(AlterTableOp): """Represent an add column operation.""" def __init__(self, table_name, column, schema=None): super(AddColumnOp, self).__init__(table_name, schema=schema) self.column = column def reverse(self): return DropColumnOp.from_column_and_tablename( self.schema, self.table_name, self.column ) def to_diff_tuple(self): return ("add_column", self.schema, self.table_name, self.column) def to_column(self): return self.column @classmethod def from_column(cls, col): return cls(col.table.name, col, schema=col.table.schema) @classmethod def from_column_and_tablename(cls, schema, tname, col): return cls(tname, col, schema=schema) @classmethod def add_column(cls, operations, table_name, column, schema=None): """Issue an "add column" instruction using the current migration context. e.g.:: from alembic import op from sqlalchemy import Column, String op.add_column('organization', Column('name', String()) ) The provided :class:`~sqlalchemy.schema.Column` object can also specify a :class:`~sqlalchemy.schema.ForeignKey`, referencing a remote table name. 
Alembic will automatically generate a stub "referenced" table and emit a second ALTER statement in order to add the constraint separately:: from alembic import op from sqlalchemy import Column, INTEGER, ForeignKey op.add_column('organization', Column('account_id', INTEGER, ForeignKey('accounts.id')) ) Note that this statement uses the :class:`~sqlalchemy.schema.Column` construct as is from the SQLAlchemy library. In particular, default values to be created on the database side are specified using the ``server_default`` parameter, and not ``default`` which only specifies Python-side defaults:: from alembic import op from sqlalchemy import Column, TIMESTAMP, func # specify "DEFAULT NOW" along with the column add op.add_column('account', Column('timestamp', TIMESTAMP, server_default=func.now()) ) :param table_name: String name of the parent table. :param column: a :class:`sqlalchemy.schema.Column` object representing the new column. :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. """ op = cls(table_name, column, schema=schema) return operations.invoke(op) @classmethod def batch_add_column(cls, operations, column): """Issue an "add column" instruction using the current batch migration context. .. seealso:: :meth:`.Operations.add_column` """ op = cls( operations.impl.table_name, column, schema=operations.impl.schema ) return operations.invoke(op) @Operations.register_operation("drop_column") @BatchOperations.register_operation("drop_column", "batch_drop_column") class DropColumnOp(AlterTableOp): """Represent a drop column operation.""" def __init__( self, table_name, column_name, schema=None, _orig_column=None, **kw ): super(DropColumnOp, self).__init__(table_name, schema=schema) self.column_name = column_name self.kw = kw self._orig_column = _orig_column def to_diff_tuple(self): return ( "remove_column", self.schema, self.table_name, self.to_column(), ) def reverse(self): if self._orig_column is None: raise ValueError( "operation is not reversible; " "original column is not present" ) return AddColumnOp.from_column_and_tablename( self.schema, self.table_name, self._orig_column ) @classmethod def from_column_and_tablename(cls, schema, tname, col): return cls(tname, col.name, schema=schema, _orig_column=col) def to_column(self, migration_context=None): if self._orig_column is not None: return self._orig_column schema_obj = schemaobj.SchemaObjects(migration_context) return schema_obj.column(self.column_name, NULLTYPE) @classmethod def drop_column( cls, operations, table_name, column_name, schema=None, **kw ): """Issue a "drop column" instruction using the current migration context. e.g.:: drop_column('organization', 'account_id') :param table_name: name of table :param column_name: name of column :param schema: Optional schema name to operate within. To control quoting of the schema outside of the default behavior, use the SQLAlchemy construct :class:`~sqlalchemy.sql.elements.quoted_name`. .. versionadded:: 0.7.0 'schema' can now accept a :class:`~sqlalchemy.sql.elements.quoted_name` construct. :param mssql_drop_check: Optional boolean. 
When ``True``, on Microsoft SQL Server only, first drop the CHECK constraint on the column using a SQL-script-compatible block that selects into a @variable from sys.check_constraints, then exec's a separate DROP CONSTRAINT for that constraint. :param mssql_drop_default: Optional boolean. When ``True``, on Microsoft SQL Server only, first drop the DEFAULT constraint on the column using a SQL-script-compatible block that selects into a @variable from sys.default_constraints, then exec's a separate DROP CONSTRAINT for that default. :param mssql_drop_foreign_key: Optional boolean. When ``True``, on Microsoft SQL Server only, first drop a single FOREIGN KEY constraint on the column using a SQL-script-compatible block that selects into a @variable from sys.foreign_keys/sys.foreign_key_columns, then exec's a separate DROP CONSTRAINT for that default. Only works if the column has exactly one FK constraint which refers to it, at the moment. .. versionadded:: 0.6.2 """ op = cls(table_name, column_name, schema=schema, **kw) return operations.invoke(op) @classmethod def batch_drop_column(cls, operations, column_name, **kw): """Issue a "drop column" instruction using the current batch migration context. .. seealso:: :meth:`.Operations.drop_column` """ op = cls( operations.impl.table_name, column_name, schema=operations.impl.schema, **kw ) return operations.invoke(op) @Operations.register_operation("bulk_insert") class BulkInsertOp(MigrateOperation): """Represent a bulk insert operation.""" def __init__(self, table, rows, multiinsert=True): self.table = table self.rows = rows self.multiinsert = multiinsert @classmethod def bulk_insert(cls, operations, table, rows, multiinsert=True): """Issue a "bulk insert" operation using the current migration context. This provides a means of representing an INSERT of multiple rows which works equally well in the context of executing on a live connection as well as that of generating a SQL script. In the case of a SQL script, the values are rendered inline into the statement. e.g.:: from alembic import op from datetime import date from sqlalchemy.sql import table, column from sqlalchemy import String, Integer, Date # Create an ad-hoc table to use for the insert statement. accounts_table = table('account', column('id', Integer), column('name', String), column('create_date', Date) ) op.bulk_insert(accounts_table, [ {'id':1, 'name':'John Smith', 'create_date':date(2010, 10, 5)}, {'id':2, 'name':'Ed Williams', 'create_date':date(2007, 5, 27)}, {'id':3, 'name':'Wendy Jones', 'create_date':date(2008, 8, 15)}, ] ) When using --sql mode, some datatypes may not render inline automatically, such as dates and other special types. When this issue is present, :meth:`.Operations.inline_literal` may be used:: op.bulk_insert(accounts_table, [ {'id':1, 'name':'John Smith', 'create_date':op.inline_literal("2010-10-05")}, {'id':2, 'name':'Ed Williams', 'create_date':op.inline_literal("2007-05-27")}, {'id':3, 'name':'Wendy Jones', 'create_date':op.inline_literal("2008-08-15")}, ], multiinsert=False ) When using :meth:`.Operations.inline_literal` in conjunction with :meth:`.Operations.bulk_insert`, in order for the statement to work in "online" (e.g. non --sql) mode, the :paramref:`~.Operations.bulk_insert.multiinsert` flag should be set to ``False``, which will have the effect of individual INSERT statements being emitted to the database, each with a distinct VALUES clause, so that the "inline" values can still be rendered, rather than attempting to pass the values as bound parameters. 
.. versionadded:: 0.6.4 :meth:`.Operations.inline_literal` can now be used with :meth:`.Operations.bulk_insert`, and the :paramref:`~.Operations.bulk_insert.multiinsert` flag has been added to assist in this usage when running in "online" mode. :param table: a table object which represents the target of the INSERT. :param rows: a list of dictionaries indicating rows. :param multiinsert: when at its default of True and --sql mode is not enabled, the INSERT statement will be executed using "executemany()" style, where all elements in the list of dictionaries are passed as bound parameters in a single list. Setting this to False results in individual INSERT statements being emitted per parameter set, and is needed in those cases where non-literal values are present in the parameter sets. .. versionadded:: 0.6.4 """ op = cls(table, rows, multiinsert=multiinsert) operations.invoke(op) @Operations.register_operation("execute") class ExecuteSQLOp(MigrateOperation): """Represent an execute SQL operation.""" def __init__(self, sqltext, execution_options=None): self.sqltext = sqltext self.execution_options = execution_options @classmethod def execute(cls, operations, sqltext, execution_options=None): r"""Execute the given SQL using the current migration context. The given SQL can be a plain string, e.g.:: op.execute("INSERT INTO table (foo) VALUES ('some value')") Or it can be any kind of Core SQL Expression construct, such as below where we use an update construct:: from sqlalchemy.sql import table, column from sqlalchemy import String from alembic import op account = table('account', column('name', String) ) op.execute( account.update().\\ where(account.c.name==op.inline_literal('account 1')).\\ values({'name':op.inline_literal('account 2')}) ) Above, we made use of the SQLAlchemy :func:`sqlalchemy.sql.expression.table` and :func:`sqlalchemy.sql.expression.column` constructs to make a brief, ad-hoc table construct just for our UPDATE statement. A full :class:`~sqlalchemy.schema.Table` construct of course works perfectly fine as well, though note it's a recommended practice to at least ensure the definition of a table is self-contained within the migration script, rather than imported from a module that may break compatibility with older migrations. In a SQL script context, the statement is emitted directly to the output stream. There is *no* return result, however, as this function is oriented towards generating a change script that can run in "offline" mode. Additionally, parameterized statements are discouraged here, as they *will not work* in offline mode. Above, we use :meth:`.inline_literal` where parameters are to be used. For full interaction with a connected database where parameters can also be used normally, use the "bind" available from the context:: from alembic import op connection = op.get_bind() connection.execute( account.update().where(account.c.name=='account 1'). values({"name": "account 2"}) ) Additionally, when passing the statement as a plain string, it is first coerceed into a :func:`sqlalchemy.sql.expression.text` construct before being passed along. In the less likely case that the literal SQL string contains a colon, it must be escaped with a backslash, as:: op.execute("INSERT INTO table (foo) VALUES ('\:colon_value')") :param sql: Any legal SQLAlchemy expression, including: * a string * a :func:`sqlalchemy.sql.expression.text` construct. * a :func:`sqlalchemy.sql.expression.insert` construct. 
* a :func:`sqlalchemy.sql.expression.update`, :func:`sqlalchemy.sql.expression.insert`, or :func:`sqlalchemy.sql.expression.delete` construct. * Pretty much anything that's "executable" as described in :ref:`sqlexpression_toplevel`. .. note:: when passing a plain string, the statement is coerced into a :func:`sqlalchemy.sql.expression.text` construct. This construct considers symbols with colons, e.g. ``:foo`` to be bound parameters. To avoid this, ensure that colon symbols are escaped, e.g. ``\:foo``. :param execution_options: Optional dictionary of execution options, will be passed to :meth:`sqlalchemy.engine.Connection.execution_options`. """ op = cls(sqltext, execution_options=execution_options) return operations.invoke(op) class OpContainer(MigrateOperation): """Represent a sequence of operations operation.""" def __init__(self, ops=()): self.ops = ops def is_empty(self): return not self.ops def as_diffs(self): return list(OpContainer._ops_as_diffs(self)) @classmethod def _ops_as_diffs(cls, migrations): for op in migrations.ops: if hasattr(op, "ops"): for sub_op in cls._ops_as_diffs(op): yield sub_op else: yield op.to_diff_tuple() class ModifyTableOps(OpContainer): """Contains a sequence of operations that all apply to a single Table.""" def __init__(self, table_name, ops, schema=None): super(ModifyTableOps, self).__init__(ops) self.table_name = table_name self.schema = schema def reverse(self): return ModifyTableOps( self.table_name, ops=list(reversed([op.reverse() for op in self.ops])), schema=self.schema, ) class UpgradeOps(OpContainer): """contains a sequence of operations that would apply to the 'upgrade' stream of a script. .. seealso:: :ref:`customizing_revision` """ def __init__(self, ops=(), upgrade_token="upgrades"): super(UpgradeOps, self).__init__(ops=ops) self.upgrade_token = upgrade_token def reverse_into(self, downgrade_ops): downgrade_ops.ops[:] = list( reversed([op.reverse() for op in self.ops]) ) return downgrade_ops def reverse(self): return self.reverse_into(DowngradeOps(ops=[])) class DowngradeOps(OpContainer): """contains a sequence of operations that would apply to the 'downgrade' stream of a script. .. seealso:: :ref:`customizing_revision` """ def __init__(self, ops=(), downgrade_token="downgrades"): super(DowngradeOps, self).__init__(ops=ops) self.downgrade_token = downgrade_token def reverse(self): return UpgradeOps( ops=list(reversed([op.reverse() for op in self.ops])) ) class MigrationScript(MigrateOperation): """represents a migration script. E.g. when autogenerate encounters this object, this corresponds to the production of an actual script file. A normal :class:`.MigrationScript` object would contain a single :class:`.UpgradeOps` and a single :class:`.DowngradeOps` directive. These are accessible via the ``.upgrade_ops`` and ``.downgrade_ops`` attributes. In the case of an autogenerate operation that runs multiple times, such as the multiple database example in the "multidb" template, the ``.upgrade_ops`` and ``.downgrade_ops`` attributes are disabled, and instead these objects should be accessed via the ``.upgrade_ops_list`` and ``.downgrade_ops_list`` list-based attributes. These latter attributes are always available at the very least as single-element lists. .. versionchanged:: 0.8.1 the ``.upgrade_ops`` and ``.downgrade_ops`` attributes should be accessed via the ``.upgrade_ops_list`` and ``.downgrade_ops_list`` attributes if multiple autogenerate passes proceed on the same :class:`.MigrationScript` object. .. 
seealso:: :ref:`customizing_revision` """ def __init__( self, rev_id, upgrade_ops, downgrade_ops, message=None, imports=set(), head=None, splice=None, branch_label=None, version_path=None, depends_on=None, ): self.rev_id = rev_id self.message = message self.imports = imports self.head = head self.splice = splice self.branch_label = branch_label self.version_path = version_path self.depends_on = depends_on self.upgrade_ops = upgrade_ops self.downgrade_ops = downgrade_ops @property def upgrade_ops(self): """An instance of :class:`.UpgradeOps`. .. seealso:: :attr:`.MigrationScript.upgrade_ops_list` """ if len(self._upgrade_ops) > 1: raise ValueError( "This MigrationScript instance has a multiple-entry " "list for UpgradeOps; please use the " "upgrade_ops_list attribute." ) elif not self._upgrade_ops: return None else: return self._upgrade_ops[0] @upgrade_ops.setter def upgrade_ops(self, upgrade_ops): self._upgrade_ops = util.to_list(upgrade_ops) for elem in self._upgrade_ops: assert isinstance(elem, UpgradeOps) @property def downgrade_ops(self): """An instance of :class:`.DowngradeOps`. .. seealso:: :attr:`.MigrationScript.downgrade_ops_list` """ if len(self._downgrade_ops) > 1: raise ValueError( "This MigrationScript instance has a multiple-entry " "list for DowngradeOps; please use the " "downgrade_ops_list attribute." ) elif not self._downgrade_ops: return None else: return self._downgrade_ops[0] @downgrade_ops.setter def downgrade_ops(self, downgrade_ops): self._downgrade_ops = util.to_list(downgrade_ops) for elem in self._downgrade_ops: assert isinstance(elem, DowngradeOps) @property def upgrade_ops_list(self): """A list of :class:`.UpgradeOps` instances. This is used in place of the :attr:`.MigrationScript.upgrade_ops` attribute when dealing with a revision operation that does multiple autogenerate passes. .. versionadded:: 0.8.1 """ return self._upgrade_ops @property def downgrade_ops_list(self): """A list of :class:`.DowngradeOps` instances. This is used in place of the :attr:`.MigrationScript.downgrade_ops` attribute when dealing with a revision operation that does multiple autogenerate passes. .. versionadded:: 0.8.1 """ return self._downgrade_ops zzzeek-alembic-bee044a1c187/alembic/operations/schemaobj.py000066400000000000000000000131501353106760100236660ustar00rootroot00000000000000from sqlalchemy import schema as sa_schema from sqlalchemy.types import Integer from sqlalchemy.types import NULLTYPE from .. 
import util from ..util.compat import string_types class SchemaObjects(object): def __init__(self, migration_context=None): self.migration_context = migration_context def primary_key_constraint(self, name, table_name, cols, schema=None): m = self.metadata() columns = [sa_schema.Column(n, NULLTYPE) for n in cols] t = sa_schema.Table(table_name, m, *columns, schema=schema) p = sa_schema.PrimaryKeyConstraint(*[t.c[n] for n in cols], name=name) t.append_constraint(p) return p def foreign_key_constraint( self, name, source, referent, local_cols, remote_cols, onupdate=None, ondelete=None, deferrable=None, source_schema=None, referent_schema=None, initially=None, match=None, **dialect_kw ): m = self.metadata() if source == referent and source_schema == referent_schema: t1_cols = local_cols + remote_cols else: t1_cols = local_cols sa_schema.Table( referent, m, *[sa_schema.Column(n, NULLTYPE) for n in remote_cols], schema=referent_schema ) t1 = sa_schema.Table( source, m, *[sa_schema.Column(n, NULLTYPE) for n in t1_cols], schema=source_schema ) tname = ( "%s.%s" % (referent_schema, referent) if referent_schema else referent ) dialect_kw["match"] = match f = sa_schema.ForeignKeyConstraint( local_cols, ["%s.%s" % (tname, n) for n in remote_cols], name=name, onupdate=onupdate, ondelete=ondelete, deferrable=deferrable, initially=initially, **dialect_kw ) t1.append_constraint(f) return f def unique_constraint(self, name, source, local_cols, schema=None, **kw): t = sa_schema.Table( source, self.metadata(), *[sa_schema.Column(n, NULLTYPE) for n in local_cols], schema=schema ) kw["name"] = name uq = sa_schema.UniqueConstraint(*[t.c[n] for n in local_cols], **kw) # TODO: need event tests to ensure the event # is fired off here t.append_constraint(uq) return uq def check_constraint(self, name, source, condition, schema=None, **kw): t = sa_schema.Table( source, self.metadata(), sa_schema.Column("x", Integer), schema=schema, ) ck = sa_schema.CheckConstraint(condition, name=name, **kw) t.append_constraint(ck) return ck def generic_constraint(self, name, table_name, type_, schema=None, **kw): t = self.table(table_name, schema=schema) types = { "foreignkey": lambda name: sa_schema.ForeignKeyConstraint( [], [], name=name ), "primary": sa_schema.PrimaryKeyConstraint, "unique": sa_schema.UniqueConstraint, "check": lambda name: sa_schema.CheckConstraint("", name=name), None: sa_schema.Constraint, } try: const = types[type_] except KeyError: raise TypeError( "'type' can be one of %s" % ", ".join(sorted(repr(x) for x in types)) ) else: const = const(name=name) t.append_constraint(const) return const def metadata(self): kw = {} if ( self.migration_context is not None and "target_metadata" in self.migration_context.opts ): mt = self.migration_context.opts["target_metadata"] if hasattr(mt, "naming_convention"): kw["naming_convention"] = mt.naming_convention return sa_schema.MetaData(**kw) def table(self, name, *columns, **kw): m = self.metadata() t = sa_schema.Table(name, m, *columns, **kw) for f in t.foreign_keys: self._ensure_table_for_fk(m, f) return t def column(self, name, type_, **kw): return sa_schema.Column(name, type_, **kw) def index(self, name, tablename, columns, schema=None, **kw): t = sa_schema.Table( tablename or "no_table", self.metadata(), schema=schema ) idx = sa_schema.Index( name, *[util.sqla_compat._textual_index_column(t, n) for n in columns], **kw ) return idx def _parse_table_key(self, table_key): if "." 
in table_key: tokens = table_key.split(".") sname = ".".join(tokens[0:-1]) tname = tokens[-1] else: tname = table_key sname = None return (sname, tname) def _ensure_table_for_fk(self, metadata, fk): """create a placeholder Table object for the referent of a ForeignKey. """ if isinstance(fk._colspec, string_types): table_key, cname = fk._colspec.rsplit(".", 1) sname, tname = self._parse_table_key(table_key) if table_key not in metadata.tables: rel_t = sa_schema.Table(tname, metadata, schema=sname) else: rel_t = metadata.tables[table_key] if cname not in rel_t.c: rel_t.append_column(sa_schema.Column(cname, NULLTYPE)) zzzeek-alembic-bee044a1c187/alembic/operations/toimpl.py000066400000000000000000000130671353106760100232460ustar00rootroot00000000000000from sqlalchemy import schema as sa_schema from . import ops from .base import Operations from ..util import sqla_compat @Operations.implementation_for(ops.AlterColumnOp) def alter_column(operations, operation): compiler = operations.impl.dialect.statement_compiler( operations.impl.dialect, None ) existing_type = operation.existing_type existing_nullable = operation.existing_nullable existing_server_default = operation.existing_server_default type_ = operation.modify_type column_name = operation.column_name table_name = operation.table_name schema = operation.schema server_default = operation.modify_server_default new_column_name = operation.modify_name nullable = operation.modify_nullable comment = operation.modify_comment existing_comment = operation.existing_comment def _count_constraint(constraint): return not isinstance(constraint, sa_schema.PrimaryKeyConstraint) and ( not constraint._create_rule or constraint._create_rule(compiler) ) if existing_type and type_: t = operations.schema_obj.table( table_name, sa_schema.Column(column_name, existing_type), schema=schema, ) for constraint in t.constraints: if _count_constraint(constraint): operations.impl.drop_constraint(constraint) operations.impl.alter_column( table_name, column_name, nullable=nullable, server_default=server_default, name=new_column_name, type_=type_, schema=schema, existing_type=existing_type, existing_server_default=existing_server_default, existing_nullable=existing_nullable, comment=comment, existing_comment=existing_comment, **operation.kw ) if type_: t = operations.schema_obj.table( table_name, operations.schema_obj.column(column_name, type_), schema=schema, ) for constraint in t.constraints: if _count_constraint(constraint): operations.impl.add_constraint(constraint) @Operations.implementation_for(ops.DropTableOp) def drop_table(operations, operation): operations.impl.drop_table( operation.to_table(operations.migration_context) ) @Operations.implementation_for(ops.DropColumnOp) def drop_column(operations, operation): column = operation.to_column(operations.migration_context) operations.impl.drop_column( operation.table_name, column, schema=operation.schema, **operation.kw ) @Operations.implementation_for(ops.CreateIndexOp) def create_index(operations, operation): idx = operation.to_index(operations.migration_context) operations.impl.create_index(idx) @Operations.implementation_for(ops.DropIndexOp) def drop_index(operations, operation): operations.impl.drop_index( operation.to_index(operations.migration_context) ) @Operations.implementation_for(ops.CreateTableOp) def create_table(operations, operation): table = operation.to_table(operations.migration_context) operations.impl.create_table(table) return table @Operations.implementation_for(ops.RenameTableOp) def 
rename_table(operations, operation): operations.impl.rename_table( operation.table_name, operation.new_table_name, schema=operation.schema ) @Operations.implementation_for(ops.CreateTableCommentOp) def create_table_comment(operations, operation): table = operation.to_table(operations.migration_context) operations.impl.create_table_comment(table) @Operations.implementation_for(ops.DropTableCommentOp) def drop_table_comment(operations, operation): table = operation.to_table(operations.migration_context) operations.impl.drop_table_comment(table) @Operations.implementation_for(ops.AddColumnOp) def add_column(operations, operation): table_name = operation.table_name column = operation.column schema = operation.schema t = operations.schema_obj.table(table_name, column, schema=schema) operations.impl.add_column(table_name, column, schema=schema) for constraint in t.constraints: if not isinstance(constraint, sa_schema.PrimaryKeyConstraint): operations.impl.add_constraint(constraint) for index in t.indexes: operations.impl.create_index(index) with_comment = ( sqla_compat._dialect_supports_comments(operations.impl.dialect) and not operations.impl.dialect.inline_comments ) comment = sqla_compat._comment_attribute(column) if comment and with_comment: operations.impl.create_column_comment(column) @Operations.implementation_for(ops.AddConstraintOp) def create_constraint(operations, operation): operations.impl.add_constraint( operation.to_constraint(operations.migration_context) ) @Operations.implementation_for(ops.DropConstraintOp) def drop_constraint(operations, operation): operations.impl.drop_constraint( operations.schema_obj.generic_constraint( operation.constraint_name, operation.table_name, operation.constraint_type, schema=operation.schema, ) ) @Operations.implementation_for(ops.BulkInsertOp) def bulk_insert(operations, operation): operations.impl.bulk_insert( operation.table, operation.rows, multiinsert=operation.multiinsert ) @Operations.implementation_for(ops.ExecuteSQLOp) def execute_sql(operations, operation): operations.migration_context.impl.execute( operation.sqltext, execution_options=operation.execution_options ) zzzeek-alembic-bee044a1c187/alembic/runtime/000077500000000000000000000000001353106760100206615ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/runtime/__init__.py000066400000000000000000000000001353106760100227600ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/runtime/environment.py000066400000000000000000001117461353106760100236110ustar00rootroot00000000000000from .migration import MigrationContext from .. import util from ..operations import Operations class EnvironmentContext(util.ModuleClsProxy): """A configurational facade made available in an ``env.py`` script. The :class:`.EnvironmentContext` acts as a *facade* to the more nuts-and-bolts objects of :class:`.MigrationContext` as well as certain aspects of :class:`.Config`, within the context of the ``env.py`` script that is invoked by most Alembic commands. :class:`.EnvironmentContext` is normally instantiated when a command in :mod:`alembic.command` is run. It then makes itself available in the ``alembic.context`` module for the scope of the command. From within an ``env.py`` script, the current :class:`.EnvironmentContext` is available by importing this module. :class:`.EnvironmentContext` also supports programmatic usage. At this level, it acts as a Python context manager, that is, is intended to be used using the ``with:`` statement. 
A typical use of :class:`.EnvironmentContext`:: from alembic.config import Config from alembic.script import ScriptDirectory config = Config() config.set_main_option("script_location", "myapp:migrations") script = ScriptDirectory.from_config(config) def my_function(rev, context): '''do something with revision "rev", which will be the current database revision, and "context", which is the MigrationContext that the env.py will create''' with EnvironmentContext( config, script, fn = my_function, as_sql = False, starting_rev = 'base', destination_rev = 'head', tag = "sometag" ): script.run_env() The above script will invoke the ``env.py`` script within the migration environment. If and when ``env.py`` calls :meth:`.MigrationContext.run_migrations`, the ``my_function()`` function above will be called by the :class:`.MigrationContext`, given the context itself as well as the current revision in the database. .. note:: For most API usages other than full blown invocation of migration scripts, the :class:`.MigrationContext` and :class:`.ScriptDirectory` objects can be created and used directly. The :class:`.EnvironmentContext` object is *only* needed when you need to actually invoke the ``env.py`` module present in the migration environment. """ _migration_context = None config = None """An instance of :class:`.Config` representing the configuration file contents as well as other variables set programmatically within it.""" script = None """An instance of :class:`.ScriptDirectory` which provides programmatic access to version files within the ``versions/`` directory. """ def __init__(self, config, script, **kw): r"""Construct a new :class:`.EnvironmentContext`. :param config: a :class:`.Config` instance. :param script: a :class:`.ScriptDirectory` instance. :param \**kw: keyword options that will be ultimately passed along to the :class:`.MigrationContext` when :meth:`.EnvironmentContext.configure` is called. """ self.config = config self.script = script self.context_opts = kw def __enter__(self): """Establish a context which provides a :class:`.EnvironmentContext` object to env.py scripts. The :class:`.EnvironmentContext` will be made available as ``from alembic import context``. """ self._install_proxy() return self def __exit__(self, *arg, **kw): self._remove_proxy() def is_offline_mode(self): """Return True if the current migrations environment is running in "offline mode". This is ``True`` or ``False`` depending on the ``--sql`` flag passed. This function does not require that the :class:`.MigrationContext` has been configured. """ return self.context_opts.get("as_sql", False) def is_transactional_ddl(self): """Return True if the context is configured to expect a transactional DDL capable backend. This defaults to the type of database in use, and can be overridden by the ``transactional_ddl`` argument to :meth:`.configure` This function requires that a :class:`.MigrationContext` has first been made available via :meth:`.configure`. """ return self.get_context().impl.transactional_ddl def requires_connection(self): return not self.is_offline_mode() def get_head_revision(self): """Return the hex identifier of the 'head' script revision. If the script directory has multiple heads, this method raises a :class:`.CommandError`; :meth:`.EnvironmentContext.get_head_revisions` should be preferred. This function does not require that the :class:`.MigrationContext` has been configured. ..
seealso:: :meth:`.EnvironmentContext.get_head_revisions` """ return self.script.as_revision_number("head") def get_head_revisions(self): """Return the hex identifier of the 'heads' script revision(s). This returns a tuple containing the version number of all heads in the script directory. This function does not require that the :class:`.MigrationContext` has been configured. .. versionadded:: 0.7.0 """ return self.script.as_revision_number("heads") def get_starting_revision_argument(self): """Return the 'starting revision' argument, if the revision was passed using ``start:end``. This is only meaningful in "offline" mode. Returns ``None`` if no value is available or was configured. This function does not require that the :class:`.MigrationContext` has been configured. """ if self._migration_context is not None: return self.script.as_revision_number( self.get_context()._start_from_rev ) elif "starting_rev" in self.context_opts: return self.script.as_revision_number( self.context_opts["starting_rev"] ) else: # this should raise only in the case that a command # is being run where the "starting rev" is never applicable; # this is to catch scripts which rely upon this in # non-sql mode or similar raise util.CommandError( "No starting revision argument is available." ) def get_revision_argument(self): """Get the 'destination' revision argument. This is typically the argument passed to the ``upgrade`` or ``downgrade`` command. If it was specified as ``head``, the actual version number is returned; if specified as ``base``, ``None`` is returned. This function does not require that the :class:`.MigrationContext` has been configured. """ return self.script.as_revision_number( self.context_opts["destination_rev"] ) def get_tag_argument(self): """Return the value passed for the ``--tag`` argument, if any. The ``--tag`` argument is not used directly by Alembic, but is available for custom ``env.py`` configurations that wish to use it; particularly for offline generation scripts that wish to generate tagged filenames. This function does not require that the :class:`.MigrationContext` has been configured. .. seealso:: :meth:`.EnvironmentContext.get_x_argument` - a newer and more open ended system of extending ``env.py`` scripts via the command line. """ return self.context_opts.get("tag", None) def get_x_argument(self, as_dictionary=False): """Return the value(s) passed for the ``-x`` argument, if any. The ``-x`` argument is an open ended flag that allows any user-defined value or values to be passed on the command line, then available here for consumption by a custom ``env.py`` script. The return value is a list, returned directly from the ``argparse`` structure. If ``as_dictionary=True`` is passed, the ``x`` arguments are parsed using ``key=value`` format into a dictionary that is then returned. For example, to support passing a database URL on the command line, the standard ``env.py`` script can be modified like this:: cmd_line_url = context.get_x_argument( as_dictionary=True).get('dbname') if cmd_line_url: engine = create_engine(cmd_line_url) else: engine = engine_from_config( config.get_section(config.config_ini_section), prefix='sqlalchemy.', poolclass=pool.NullPool) This then takes effect by running the ``alembic`` script as:: alembic -x dbname=postgresql://user:pass@host/dbname upgrade head This function does not require that the :class:`.MigrationContext` has been configured. .. versionadded:: 0.6.0 .. 
seealso:: :meth:`.EnvironmentContext.get_tag_argument` :attr:`.Config.cmd_opts` """ if self.config.cmd_opts is not None: value = self.config.cmd_opts.x or [] else: value = [] if as_dictionary: value = dict(arg.split("=", 1) for arg in value) return value def configure( self, connection=None, url=None, dialect_name=None, dialect_opts=None, transactional_ddl=None, transaction_per_migration=False, output_buffer=None, starting_rev=None, tag=None, template_args=None, render_as_batch=False, target_metadata=None, include_symbol=None, include_object=None, include_schemas=False, process_revision_directives=None, compare_type=False, compare_server_default=False, render_item=None, literal_binds=False, upgrade_token="upgrades", downgrade_token="downgrades", alembic_module_prefix="op.", sqlalchemy_module_prefix="sa.", user_module_prefix=None, on_version_apply=None, **kw ): """Configure a :class:`.MigrationContext` within this :class:`.EnvironmentContext` which will provide database connectivity and other configuration to a series of migration scripts. Many methods on :class:`.EnvironmentContext` require that this method has been called in order to function, as they ultimately need to have database access or at least access to the dialect in use. Those which do are documented as such. The important thing needed by :meth:`.configure` is a means to determine what kind of database dialect is in use. An actual connection to that database is needed only if the :class:`.MigrationContext` is to be used in "online" mode. If the :meth:`.is_offline_mode` function returns ``True``, then no connection is needed here. Otherwise, the ``connection`` parameter should be present as an instance of :class:`sqlalchemy.engine.Connection`. This function is typically called from the ``env.py`` script within a migration environment. It can be called multiple times for an invocation. The most recent :class:`~sqlalchemy.engine.Connection` for which it was called is the one that will be operated upon by the next call to :meth:`.run_migrations`. General parameters: :param connection: a :class:`~sqlalchemy.engine.Connection` to use for SQL execution in "online" mode. When present, is also used to determine the type of dialect in use. :param url: a string database url, or a :class:`sqlalchemy.engine.url.URL` object. The type of dialect to be used will be derived from this if ``connection`` is not passed. :param dialect_name: string name of a dialect, such as "postgresql", "mssql", etc. The type of dialect to be used will be derived from this if ``connection`` and ``url`` are not passed. :param dialect_opts: dictionary of options to be passed to dialect constructor. .. versionadded:: 1.0.12 :param transactional_ddl: Force the usage of "transactional" DDL on or off; this otherwise defaults to whether or not the dialect in use supports it. :param transaction_per_migration: if True, nest each migration script in a transaction rather than the full series of migrations to run. .. versionadded:: 0.6.5 :param output_buffer: a file-like object that will be used for textual output when the ``--sql`` option is used to generate SQL scripts. Defaults to ``sys.stdout`` if not passed here and also not present on the :class:`.Config` object. The value here overrides that of the :class:`.Config` object. :param output_encoding: when using ``--sql`` to generate SQL scripts, apply this encoding to the string output. 
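For example, an ``env.py`` oriented towards offline SQL script generation might combine these output options (a minimal sketch; the URL is illustrative, ``target_metadata`` is assumed to be defined elsewhere in the script, and ``literal_binds`` is described below)::

    from alembic import context

    context.configure(
        url="postgresql://scott:tiger@localhost/mydb",
        target_metadata=target_metadata,
        literal_binds=True,
        output_encoding="utf-8",
    )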
:param literal_binds: when using ``--sql`` to generate SQL scripts, pass through the ``literal_binds`` flag to the compiler so that any literal values that would ordinarily be bound parameters are converted to plain strings. .. warning:: Dialects can typically only handle simple datatypes like strings and numbers for auto-literal generation. Datatypes like dates, intervals, and others may still require manual formatting, typically using :meth:`.Operations.inline_literal`. .. note:: the ``literal_binds`` flag is ignored on SQLAlchemy versions prior to 0.8 where this feature is not supported. .. versionadded:: 0.7.6 .. seealso:: :meth:`.Operations.inline_literal` :param starting_rev: Override the "starting revision" argument when using ``--sql`` mode. :param tag: a string tag for usage by custom ``env.py`` scripts. Set via the ``--tag`` option, can be overridden here. :param template_args: dictionary of template arguments which will be added to the template argument environment when running the "revision" command. Note that the script environment is only run within the "revision" command if the --autogenerate option is used, or if the option "revision_environment=true" is present in the alembic.ini file. :param version_table: The name of the Alembic version table. The default is ``'alembic_version'``. :param version_table_schema: Optional schema to place version table within. :param version_table_pk: boolean, whether the Alembic version table should use a primary key constraint for the "value" column; this only takes effect when the table is first created. Defaults to True; setting to False should not be necessary and is here for backwards compatibility reasons. .. versionadded:: 0.8.10 Added the :paramref:`.EnvironmentContext.configure.version_table_pk` flag and additionally established that the Alembic version table has a primary key constraint by default. :param on_version_apply: a callable or collection of callables to be run for each migration step. The callables will be run in the order they are given, once for each migration step, after the respective operation has been applied but before its transaction is finalized. Each callable accepts no positional arguments and the following keyword arguments: * ``ctx``: the :class:`.MigrationContext` running the migration, * ``step``: a :class:`.MigrationInfo` representing the step currently being applied, * ``heads``: a collection of version strings representing the current heads, * ``run_args``: the ``**kwargs`` passed to :meth:`.run_migrations`. .. versionadded:: 0.9.3 Parameters specific to the autogenerate feature, when ``alembic revision`` is run with the ``--autogenerate`` feature: :param target_metadata: a :class:`sqlalchemy.schema.MetaData` object, or a sequence of :class:`~sqlalchemy.schema.MetaData` objects, that will be consulted during autogeneration. The tables present in each :class:`~sqlalchemy.schema.MetaData` will be compared against what is locally available on the target :class:`~sqlalchemy.engine.Connection` to produce candidate upgrade/downgrade operations. .. versionchanged:: 0.9.0 the :paramref:`.EnvironmentContext.configure.target_metadata` parameter may now be passed a sequence of :class:`~sqlalchemy.schema.MetaData` objects to support autogeneration of multiple :class:`~sqlalchemy.schema.MetaData` collections. :param compare_type: Indicates type comparison behavior during an autogenerate operation. Defaults to ``False`` which disables type comparison. 
Set to ``True`` to turn on default type comparison, which has varied accuracy depending on backend. See :ref:`compare_types` for an example as well as information on other type comparison options. .. seealso:: :ref:`compare_types` :paramref:`.EnvironmentContext.configure.compare_server_default` :param compare_server_default: Indicates server default comparison behavior during an autogenerate operation. Defaults to ``False`` which disables server default comparison. Set to ``True`` to turn on server default comparison, which has varied accuracy depending on backend. To customize server default comparison behavior, a callable may be specified which can filter server default comparisons during an autogenerate operation. The format of this callable is:: def my_compare_server_default(context, inspected_column, metadata_column, inspected_default, metadata_default, rendered_metadata_default): # return True if the defaults are different, # False if not, or None to allow the default implementation # to compare these defaults return None context.configure( # ... compare_server_default = my_compare_server_default ) ``inspected_column`` is a dictionary structure as returned by :meth:`sqlalchemy.engine.reflection.Inspector.get_columns`, whereas ``metadata_column`` is a :class:`sqlalchemy.schema.Column` from the local model environment. A return value of ``None`` indicates to allow default server default comparison to proceed. Note that some backends such as Postgresql actually execute the two defaults on the database side to compare for equivalence. .. seealso:: :paramref:`.EnvironmentContext.configure.compare_type` :param include_object: A callable function which is given the chance to return ``True`` or ``False`` for any object, indicating if the given object should be considered in the autogenerate sweep. The function accepts the following positional arguments: * ``object``: a :class:`~sqlalchemy.schema.SchemaItem` object such as a :class:`~sqlalchemy.schema.Table`, :class:`~sqlalchemy.schema.Column`, :class:`~sqlalchemy.schema.Index` :class:`~sqlalchemy.schema.UniqueConstraint`, or :class:`~sqlalchemy.schema.ForeignKeyConstraint` object * ``name``: the name of the object. This is typically available via ``object.name``. * ``type``: a string describing the type of object; currently ``"table"``, ``"column"``, ``"index"``, ``"unique_constraint"``, or ``"foreign_key_constraint"`` .. versionadded:: 0.7.0 Support for indexes and unique constraints within the :paramref:`~.EnvironmentContext.configure.include_object` hook. .. versionadded:: 0.7.1 Support for foreign keys within the :paramref:`~.EnvironmentContext.configure.include_object` hook. * ``reflected``: ``True`` if the given object was produced based on table reflection, ``False`` if it's from a local :class:`.MetaData` object. * ``compare_to``: the object being compared against, if available, else ``None``. E.g.:: def include_object(object, name, type_, reflected, compare_to): if (type_ == "column" and not reflected and object.info.get("skip_autogenerate", False)): return False else: return True context.configure( # ... include_object = include_object ) :paramref:`.EnvironmentContext.configure.include_object` can also be used to filter on specific schemas to include or omit, when the :paramref:`.EnvironmentContext.configure.include_schemas` flag is set to ``True``.
The :attr:`.Table.schema` attribute on each :class:`.Table` object reflected will indicate the name of the schema from which the :class:`.Table` originates. .. versionadded:: 0.6.0 .. seealso:: :paramref:`.EnvironmentContext.configure.include_schemas` :param include_symbol: A callable function which, given a table name and schema name (may be ``None``), returns ``True`` or ``False``, indicating if the given table should be considered in the autogenerate sweep. .. deprecated:: 0.6.0 :paramref:`.EnvironmentContext.configure.include_symbol` is superseded by the more generic :paramref:`.EnvironmentContext.configure.include_object` parameter. E.g.:: def include_symbol(tablename, schema): return tablename not in ("skip_table_one", "skip_table_two") context.configure( # ... include_symbol = include_symbol ) .. seealso:: :paramref:`.EnvironmentContext.configure.include_schemas` :paramref:`.EnvironmentContext.configure.include_object` :param render_as_batch: if True, commands which alter elements within a table will be placed under a ``with batch_alter_table():`` directive, so that batch migrations will take place. .. versionadded:: 0.7.0 .. seealso:: :ref:`batch_migrations` :param include_schemas: If True, autogenerate will scan across all schemas located by the SQLAlchemy :meth:`~sqlalchemy.engine.reflection.Inspector.get_schema_names` method, and include all differences in tables found across all those schemas. When using this option, you may want to also use the :paramref:`.EnvironmentContext.configure.include_object` option to specify a callable which can filter the tables/schemas that get included. .. seealso:: :paramref:`.EnvironmentContext.configure.include_object` :param render_item: Callable that can be used to override how any schema item, i.e. column, constraint, type, etc., is rendered for autogenerate. The callable receives a string describing the type of object, the object, and the autogen context. If it returns False, the default rendering method will be used. If it returns None, the item will not be rendered in the context of a Table construct, that is, can be used to skip columns or constraints within op.create_table():: def my_render_column(type_, col, autogen_context): if type_ == "column" and isinstance(col, MySpecialCol): return repr(col) else: return False context.configure( # ... render_item = my_render_column ) Available values for the type string include: ``"column"``, ``"primary_key"``, ``"foreign_key"``, ``"unique"``, ``"check"``, ``"type"``, ``"server_default"``. .. seealso:: :ref:`autogen_render_types` :param upgrade_token: When autogenerate completes, the text of the candidate upgrade operations will be present in this template variable when ``script.py.mako`` is rendered. Defaults to ``upgrades``. :param downgrade_token: When autogenerate completes, the text of the candidate downgrade operations will be present in this template variable when ``script.py.mako`` is rendered. Defaults to ``downgrades``. :param alembic_module_prefix: When autogenerate refers to Alembic :mod:`alembic.operations` constructs, this prefix will be used (i.e. ``op.create_table``) Defaults to "``op.``". Can be ``None`` to indicate no prefix. :param sqlalchemy_module_prefix: When autogenerate refers to SQLAlchemy :class:`~sqlalchemy.schema.Column` or type classes, this prefix will be used (i.e. ``sa.Column("somename", sa.Integer)``) Defaults to "``sa.``". Can be ``None`` to indicate no prefix.
Note that when dialect-specific types are rendered, autogenerate will render them using the dialect module name, i.e. ``mssql.BIT()``, ``postgresql.UUID()``. :param user_module_prefix: When autogenerate refers to a SQLAlchemy type (e.g. :class:`.TypeEngine`) where the module name is not under the ``sqlalchemy`` namespace, this prefix will be used within autogenerate. If left at its default of ``None``, the ``__module__`` attribute of the type is used to render the import module. It's a good practice to set this and to have all custom types be available from a fixed module space, in order to future-proof migration files against reorganizations in modules. .. versionchanged:: 0.7.0 :paramref:`.EnvironmentContext.configure.user_module_prefix` no longer defaults to the value of :paramref:`.EnvironmentContext.configure.sqlalchemy_module_prefix` when left at ``None``; the ``__module__`` attribute is now used. .. versionadded:: 0.6.3 added :paramref:`.EnvironmentContext.configure.user_module_prefix` .. seealso:: :ref:`autogen_module_prefix` :param process_revision_directives: a callable function that will be passed a structure representing the end result of an autogenerate or plain "revision" operation, which can be manipulated to affect how the ``alembic revision`` command ultimately outputs new revision scripts. The structure of the callable is:: def process_revision_directives(context, revision, directives): pass The ``directives`` parameter is a Python list containing a single :class:`.MigrationScript` directive, which represents the revision file to be generated. This list as well as its contents may be freely modified to produce any set of commands. The section :ref:`customizing_revision` shows an example of doing this. The ``context`` parameter is the :class:`.MigrationContext` in use, and ``revision`` is a tuple of revision identifiers representing the current revision of the database. The callable is invoked at all times when the ``--autogenerate`` option is passed to ``alembic revision``. If ``--autogenerate`` is not passed, the callable is invoked only if the ``revision_environment`` variable is set to True in the Alembic configuration, in which case the given ``directives`` collection will contain empty :class:`.UpgradeOps` and :class:`.DowngradeOps` collections for ``.upgrade_ops`` and ``.downgrade_ops``. The ``--autogenerate`` option itself can be inferred by inspecting ``context.config.cmd_opts.autogenerate``. The callable function may optionally be an instance of a :class:`.Rewriter` object. This is a helper object that assists in the production of autogenerate-stream rewriter functions. .. versionadded:: 0.8.0 .. versionchanged:: 0.8.1 - The :paramref:`.EnvironmentContext.configure.process_revision_directives` hook can append op directives into :class:`.UpgradeOps` and :class:`.DowngradeOps` which will be rendered in Python regardless of whether the ``--autogenerate`` option is in use or not; the ``revision_environment`` configuration variable should be set to "true" in the config to enable this. .. seealso:: :ref:`customizing_revision` :ref:`autogen_rewriter` :paramref:`.command.revision.process_revision_directives` Parameters specific to individual backends: :param mssql_batch_separator: The "batch separator" which will be placed between each statement when generating offline SQL Server migrations. Defaults to ``GO``. 
Note this is in addition to the customary semicolon ``;`` at the end of each statement; SQL Server considers the "batch separator" to denote the end of an individual statement execution, and cannot group certain dependent operations in one step. :param oracle_batch_separator: The "batch separator" which will be placed between each statement when generating offline Oracle migrations. Defaults to ``/``. Oracle doesn't add a semicolon between statements like most other backends. """ opts = self.context_opts if transactional_ddl is not None: opts["transactional_ddl"] = transactional_ddl if output_buffer is not None: opts["output_buffer"] = output_buffer elif self.config.output_buffer is not None: opts["output_buffer"] = self.config.output_buffer if starting_rev: opts["starting_rev"] = starting_rev if tag: opts["tag"] = tag if template_args and "template_args" in opts: opts["template_args"].update(template_args) opts["transaction_per_migration"] = transaction_per_migration opts["target_metadata"] = target_metadata opts["include_symbol"] = include_symbol opts["include_object"] = include_object opts["include_schemas"] = include_schemas opts["render_as_batch"] = render_as_batch opts["upgrade_token"] = upgrade_token opts["downgrade_token"] = downgrade_token opts["sqlalchemy_module_prefix"] = sqlalchemy_module_prefix opts["alembic_module_prefix"] = alembic_module_prefix opts["user_module_prefix"] = user_module_prefix opts["literal_binds"] = literal_binds opts["process_revision_directives"] = process_revision_directives opts["on_version_apply"] = util.to_tuple(on_version_apply, default=()) if render_item is not None: opts["render_item"] = render_item if compare_type is not None: opts["compare_type"] = compare_type if compare_server_default is not None: opts["compare_server_default"] = compare_server_default opts["script"] = self.script opts.update(kw) self._migration_context = MigrationContext.configure( connection=connection, url=url, dialect_name=dialect_name, environment_context=self, dialect_opts=dialect_opts, opts=opts, ) def run_migrations(self, **kw): """Run migrations as determined by the current command line configuration as well as versioning information present (or not) in the current database connection (if one is present). The function accepts optional ``**kw`` arguments. If these are passed, they are sent directly to the ``upgrade()`` and ``downgrade()`` functions within each target revision file. By modifying the ``script.py.mako`` file so that the ``upgrade()`` and ``downgrade()`` functions accept arguments, parameters can be passed here so that contextual information, usually information to identify a particular database in use, can be passed from a custom ``env.py`` script to the migration functions. This function requires that a :class:`.MigrationContext` has first been made available via :meth:`.configure`. """ with Operations.context(self._migration_context): self.get_context().run_migrations(**kw) def execute(self, sql, execution_options=None): """Execute the given SQL using the current change context. The behavior of :meth:`.execute` is the same as that of :meth:`.Operations.execute`. Please see that function's documentation for full detail including caveats and limitations. This function requires that a :class:`.MigrationContext` has first been made available via :meth:`.configure`. """ self.get_context().execute(sql, execution_options=execution_options) def static_output(self, text): """Emit text directly to the "offline" SQL stream. 
Typically this is for emitting comments that start with --. The statement is not treated as a SQL execution, no ; or batch separator is added, etc. """ self.get_context().impl.static_output(text) def begin_transaction(self): """Return a context manager that will enclose an operation within a "transaction", as defined by the environment's offline and transactional DDL settings. e.g.:: with context.begin_transaction(): context.run_migrations() :meth:`.begin_transaction` is intended to "do the right thing" regardless of calling context: * If :meth:`.is_transactional_ddl` is ``False``, returns a "do nothing" context manager which otherwise produces no transactional state or directives. * If :meth:`.is_offline_mode` is ``True``, returns a context manager that will invoke the :meth:`.DefaultImpl.emit_begin` and :meth:`.DefaultImpl.emit_commit` methods, which will produce the string directives ``BEGIN`` and ``COMMIT`` on the output stream, as rendered by the target backend (e.g. SQL Server would emit ``BEGIN TRANSACTION``). * Otherwise, calls :meth:`sqlalchemy.engine.Connection.begin` on the current online connection, which returns a :class:`sqlalchemy.engine.Transaction` object. This object demarcates a real transaction and is itself a context manager, which will roll back if an exception is raised. Note that a custom ``env.py`` script which has more specific transactional needs can of course manipulate the :class:`~sqlalchemy.engine.Connection` directly to produce transactional state in "online" mode. """ return self.get_context().begin_transaction() def get_context(self): """Return the current :class:`.MigrationContext` object. If :meth:`.EnvironmentContext.configure` has not been called yet, raises an exception. """ if self._migration_context is None: raise Exception("No context has been configured yet.") return self._migration_context def get_bind(self): """Return the current 'bind'. In "online" mode, this is the :class:`sqlalchemy.engine.Connection` currently being used to emit SQL to the database. This function requires that a :class:`.MigrationContext` has first been made available via :meth:`.configure`. """ return self.get_context().bind def get_impl(self): return self.get_context().impl zzzeek-alembic-bee044a1c187/alembic/runtime/migration.py000066400000000000000000001056601353106760100232340ustar00rootroot00000000000000from contextlib import contextmanager import logging import sys from sqlalchemy import Column from sqlalchemy import literal_column from sqlalchemy import MetaData from sqlalchemy import PrimaryKeyConstraint from sqlalchemy import String from sqlalchemy import Table from sqlalchemy.engine import Connection from sqlalchemy.engine import url as sqla_url from sqlalchemy.engine.strategies import MockEngineStrategy from .. import ddl from .. import util from ..util.compat import callable from ..util.compat import EncodedIO log = logging.getLogger(__name__) class MigrationContext(object): """Represent the database state made available to a migration script. :class:`.MigrationContext` is the front end to an actual database connection, or alternatively a string output stream given a particular database dialect, from an Alembic perspective. 
When inside the ``env.py`` script, the :class:`.MigrationContext` is available via the :meth:`.EnvironmentContext.get_context` method, which is available at ``alembic.context``:: # from within env.py script from alembic import context migration_context = context.get_context() For usage outside of an ``env.py`` script, such as for utility routines that want to check the current version in the database, the :meth:`.MigrationContext.configure` method to create new :class:`.MigrationContext` objects. For example, to get at the current revision in the database using :meth:`.MigrationContext.get_current_revision`:: # in any application, outside of an env.py script from alembic.migration import MigrationContext from sqlalchemy import create_engine engine = create_engine("postgresql://mydatabase") conn = engine.connect() context = MigrationContext.configure(conn) current_rev = context.get_current_revision() The above context can also be used to produce Alembic migration operations with an :class:`.Operations` instance:: # in any application, outside of the normal Alembic environment from alembic.operations import Operations op = Operations(context) op.alter_column("mytable", "somecolumn", nullable=True) """ def __init__(self, dialect, connection, opts, environment_context=None): self.environment_context = environment_context self.opts = opts self.dialect = dialect self.script = opts.get("script") as_sql = opts.get("as_sql", False) transactional_ddl = opts.get("transactional_ddl") self._transaction_per_migration = opts.get( "transaction_per_migration", False ) self.on_version_apply_callbacks = opts.get("on_version_apply", ()) if as_sql: self.connection = self._stdout_connection(connection) assert self.connection is not None else: self.connection = connection self._migrations_fn = opts.get("fn") self.as_sql = as_sql if "output_encoding" in opts: self.output_buffer = EncodedIO( opts.get("output_buffer") or sys.stdout, opts["output_encoding"], ) else: self.output_buffer = opts.get("output_buffer", sys.stdout) self._user_compare_type = opts.get("compare_type", False) self._user_compare_server_default = opts.get( "compare_server_default", False ) self.version_table = version_table = opts.get( "version_table", "alembic_version" ) self.version_table_schema = version_table_schema = opts.get( "version_table_schema", None ) self._version = Table( version_table, MetaData(), Column("version_num", String(32), nullable=False), schema=version_table_schema, ) if opts.get("version_table_pk", True): self._version.append_constraint( PrimaryKeyConstraint( "version_num", name="%s_pkc" % version_table ) ) self._start_from_rev = opts.get("starting_rev") self.impl = ddl.DefaultImpl.get_by_dialect(dialect)( dialect, self.connection, self.as_sql, transactional_ddl, self.output_buffer, opts, ) log.info("Context impl %s.", self.impl.__class__.__name__) if self.as_sql: log.info("Generating static SQL") log.info( "Will assume %s DDL.", "transactional" if self.impl.transactional_ddl else "non-transactional", ) @classmethod def configure( cls, connection=None, url=None, dialect_name=None, dialect=None, environment_context=None, dialect_opts=None, opts=None, ): """Create a new :class:`.MigrationContext`. This is a factory method usually called by :meth:`.EnvironmentContext.configure`. :param connection: a :class:`~sqlalchemy.engine.Connection` to use for SQL execution in "online" mode. When present, is also used to determine the type of dialect in use. 
:param url: a string database url, or a :class:`sqlalchemy.engine.url.URL` object. The type of dialect to be used will be derived from this if ``connection`` is not passed. :param dialect_name: string name of a dialect, such as "postgresql", "mssql", etc. The type of dialect to be used will be derived from this if ``connection`` and ``url`` are not passed. :param opts: dictionary of options. Most other options accepted by :meth:`.EnvironmentContext.configure` are passed via this dictionary. """ if opts is None: opts = {} if dialect_opts is None: dialect_opts = {} if connection: if not isinstance(connection, Connection): util.warn( "'connection' argument to configure() is expected " "to be a sqlalchemy.engine.Connection instance, " "got %r" % connection, stacklevel=3, ) dialect = connection.dialect elif url: url = sqla_url.make_url(url) dialect = url.get_dialect()(**dialect_opts) elif dialect_name: url = sqla_url.make_url("%s://" % dialect_name) dialect = url.get_dialect()(**dialect_opts) elif not dialect: raise Exception("Connection, url, or dialect_name is required.") return MigrationContext(dialect, connection, opts, environment_context) def begin_transaction(self, _per_migration=False): transaction_now = _per_migration == self._transaction_per_migration if not transaction_now: @contextmanager def do_nothing(): yield return do_nothing() elif not self.impl.transactional_ddl: @contextmanager def do_nothing(): yield return do_nothing() elif self.as_sql: @contextmanager def begin_commit(): self.impl.emit_begin() yield self.impl.emit_commit() return begin_commit() else: return self.bind.begin() def get_current_revision(self): """Return the current revision, usually that which is present in the ``alembic_version`` table in the database. This method intends to be used only for a migration stream that does not contain unmerged branches in the target database; if there are multiple branches present, an exception is raised. The :meth:`.MigrationContext.get_current_heads` should be preferred over this method going forward in order to be compatible with branch migration support. If this :class:`.MigrationContext` was configured in "offline" mode, that is with ``as_sql=True``, the ``starting_rev`` parameter is returned instead, if any. """ heads = self.get_current_heads() if len(heads) == 0: return None elif len(heads) > 1: raise util.CommandError( "Version table '%s' has more than one head present; " "please use get_current_heads()" % self.version_table ) else: return heads[0] def get_current_heads(self): """Return a tuple of the current 'head versions' that are represented in the target database. For a migration stream without branches, this will be a single value, synonymous with that of :meth:`.MigrationContext.get_current_revision`. However when multiple unmerged branches exist within the target database, the returned tuple will contain a value for each head. If this :class:`.MigrationContext` was configured in "offline" mode, that is with ``as_sql=True``, the ``starting_rev`` parameter is returned in a one-length tuple. If no version table is present, or if there are no revisions present, an empty tuple is returned. .. 
versionadded:: 0.7.0 """ if self.as_sql: start_from_rev = self._start_from_rev if start_from_rev == "base": start_from_rev = None elif start_from_rev is not None and self.script: start_from_rev = self.script.get_revision( start_from_rev ).revision return util.to_tuple(start_from_rev, default=()) else: if self._start_from_rev: raise util.CommandError( "Can't specify current_rev to context " "when using a database connection" ) if not self._has_version_table(): return () return tuple( row[0] for row in self.connection.execute(self._version.select()) ) def _ensure_version_table(self): self._version.create(self.connection, checkfirst=True) def _has_version_table(self): return self.connection.dialect.has_table( self.connection, self.version_table, self.version_table_schema ) def stamp(self, script_directory, revision): """Stamp the version table with a specific revision. This method calculates those branches to which the given revision can apply, and updates those branches as though they were migrated towards that revision (either up or down). If no current branches include the revision, it is added as a new branch head. .. versionadded:: 0.7.0 """ heads = self.get_current_heads() if not self.as_sql and not heads: self._ensure_version_table() head_maintainer = HeadMaintainer(self, heads) for step in script_directory._stamp_revs(revision, heads): head_maintainer.update_to_step(step) def run_migrations(self, **kw): r"""Run the migration scripts established for this :class:`.MigrationContext`, if any. The commands in :mod:`alembic.command` will set up a function that is ultimately passed to the :class:`.MigrationContext` as the ``fn`` argument. This function represents the "work" that will be done when :meth:`.MigrationContext.run_migrations` is called, typically from within the ``env.py`` script of the migration environment. The "work function" then provides an iterable of version callables and other version information which in the case of the ``upgrade`` or ``downgrade`` commands are the list of version scripts to invoke. Other commands yield nothing, in the case that a command wants to run some other operation against the database such as the ``current`` or ``stamp`` commands. :param \**kw: keyword arguments here will be passed to each migration callable, that is the ``upgrade()`` or ``downgrade()`` method within revision scripts. 
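For example, a sketch assuming ``script.py.mako`` has been altered so that ``upgrade()`` and ``downgrade()`` accept a hypothetical ``engine_name`` argument::

    # within env.py; forwarded through the context's run_migrations()
    context.run_migrations(engine_name="engine1")

    # within each revision script generated from the customized template
    def upgrade(engine_name):
        ...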
""" self.impl.start_migrations() heads = self.get_current_heads() if not self.as_sql and not heads: self._ensure_version_table() head_maintainer = HeadMaintainer(self, heads) starting_in_transaction = ( not self.as_sql and self._in_connection_transaction() ) for step in self._migrations_fn(heads, self): with self.begin_transaction(_per_migration=True): if self.as_sql and not head_maintainer.heads: # for offline mode, include a CREATE TABLE from # the base self._version.create(self.connection) log.info("Running %s", step) if self.as_sql: self.impl.static_output( "-- Running %s" % (step.short_log,) ) step.migration_fn(**kw) # previously, we wouldn't stamp per migration # if we were in a transaction, however given the more # complex model that involves any number of inserts # and row-targeted updates and deletes, it's simpler for now # just to run the operations on every version head_maintainer.update_to_step(step) for callback in self.on_version_apply_callbacks: callback( ctx=self, step=step.info, heads=set(head_maintainer.heads), run_args=kw, ) if ( not starting_in_transaction and not self.as_sql and not self.impl.transactional_ddl and self._in_connection_transaction() ): raise util.CommandError( 'Migration "%s" has left an uncommitted ' "transaction opened; transactional_ddl is False so " "Alembic is not committing transactions" % step ) if self.as_sql and not head_maintainer.heads: self._version.drop(self.connection) def _in_connection_transaction(self): try: meth = self.connection.in_transaction except AttributeError: return False else: return meth() def execute(self, sql, execution_options=None): """Execute a SQL construct or string statement. The underlying execution mechanics are used, that is if this is "offline mode" the SQL is written to the output buffer, otherwise the SQL is emitted on the current SQLAlchemy connection. """ self.impl._exec(sql, execution_options) def _stdout_connection(self, connection): def dump(construct, *multiparams, **params): self.impl._exec(construct) return MockEngineStrategy.MockConnection(self.dialect, dump) @property def bind(self): """Return the current "bind". In online mode, this is an instance of :class:`sqlalchemy.engine.Connection`, and is suitable for ad-hoc execution of any kind of usage described in :ref:`sqlexpression_toplevel` as well as for usage with the :meth:`sqlalchemy.schema.Table.create` and :meth:`sqlalchemy.schema.MetaData.create_all` methods of :class:`~sqlalchemy.schema.Table`, :class:`~sqlalchemy.schema.MetaData`. Note that when "standard output" mode is enabled, this bind will be a "mock" connection handler that cannot return results and is only appropriate for a very limited subset of commands. """ return self.connection @property def config(self): """Return the :class:`.Config` used by the current environment, if any. .. 
versionadded:: 0.6.6 """ if self.environment_context: return self.environment_context.config else: return None def _compare_type(self, inspector_column, metadata_column): if self._user_compare_type is False: return False if callable(self._user_compare_type): user_value = self._user_compare_type( self, inspector_column, metadata_column, inspector_column.type, metadata_column.type, ) if user_value is not None: return user_value return self.impl.compare_type(inspector_column, metadata_column) def _compare_server_default( self, inspector_column, metadata_column, rendered_metadata_default, rendered_column_default, ): if self._user_compare_server_default is False: return False if callable(self._user_compare_server_default): user_value = self._user_compare_server_default( self, inspector_column, metadata_column, rendered_column_default, metadata_column.server_default, rendered_metadata_default, ) if user_value is not None: return user_value return self.impl.compare_server_default( inspector_column, metadata_column, rendered_metadata_default, rendered_column_default, ) class HeadMaintainer(object): def __init__(self, context, heads): self.context = context self.heads = set(heads) def _insert_version(self, version): assert version not in self.heads self.heads.add(version) self.context.impl._exec( self.context._version.insert().values( version_num=literal_column("'%s'" % version) ) ) def _delete_version(self, version): self.heads.remove(version) ret = self.context.impl._exec( self.context._version.delete().where( self.context._version.c.version_num == literal_column("'%s'" % version) ) ) if not self.context.as_sql and ret.rowcount != 1: raise util.CommandError( "Online migration expected to match one " "row when deleting '%s' in '%s'; " "%d found" % (version, self.context.version_table, ret.rowcount) ) def _update_version(self, from_, to_): assert to_ not in self.heads self.heads.remove(from_) self.heads.add(to_) ret = self.context.impl._exec( self.context._version.update() .values(version_num=literal_column("'%s'" % to_)) .where( self.context._version.c.version_num == literal_column("'%s'" % from_) ) ) if not self.context.as_sql and ret.rowcount != 1: raise util.CommandError( "Online migration expected to match one " "row when updating '%s' to '%s' in '%s'; " "%d found" % (from_, to_, self.context.version_table, ret.rowcount) ) def update_to_step(self, step): if step.should_delete_branch(self.heads): vers = step.delete_version_num log.debug("branch delete %s", vers) self._delete_version(vers) elif step.should_create_branch(self.heads): vers = step.insert_version_num log.debug("new branch insert %s", vers) self._insert_version(vers) elif step.should_merge_branches(self.heads): # delete revs, update from rev, update to rev ( delete_revs, update_from_rev, update_to_rev, ) = step.merge_branch_idents(self.heads) log.debug( "merge, delete %s, update %s to %s", delete_revs, update_from_rev, update_to_rev, ) for delrev in delete_revs: self._delete_version(delrev) self._update_version(update_from_rev, update_to_rev) elif step.should_unmerge_branches(self.heads): ( update_from_rev, update_to_rev, insert_revs, ) = step.unmerge_branch_idents(self.heads) log.debug( "unmerge, insert %s, update %s to %s", insert_revs, update_from_rev, update_to_rev, ) for insrev in insert_revs: self._insert_version(insrev) self._update_version(update_from_rev, update_to_rev) else: from_, to_ = step.update_version_num(self.heads) log.debug("update %s to %s", from_, to_) self._update_version(from_, to_) class MigrationInfo(object): 
"""Exposes information about a migration step to a callback listener. The :class:`.MigrationInfo` object is available exclusively for the benefit of the :paramref:`.EnvironmentContext.on_version_apply` callback hook. .. versionadded:: 0.9.3 """ is_upgrade = None """True/False: indicates whether this operation ascends or descends the version tree.""" is_stamp = None """True/False: indicates whether this operation is a stamp (i.e. whether it results in any actual database operations).""" up_revision_id = None """Version string corresponding to :attr:`.Revision.revision`. In the case of a stamp operation, it is advised to use the :attr:`.MigrationInfo.up_revision_ids` tuple as a stamp operation can make a single movement from one or more branches down to a single branchpoint, in which case there will be multiple "up" revisions. .. seealso:: :attr:`.MigrationInfo.up_revision_ids` """ up_revision_ids = None """Tuple of version strings corresponding to :attr:`.Revision.revision`. In the majority of cases, this tuple will be a single value, synonomous with the scalar value of :attr:`.MigrationInfo.up_revision_id`. It can be multiple revision identifiers only in the case of an ``alembic stamp`` operation which is moving downwards from multiple branches down to their common branch point. .. versionadded:: 0.9.4 """ down_revision_ids = None """Tuple of strings representing the base revisions of this migration step. If empty, this represents a root revision; otherwise, the first item corresponds to :attr:`.Revision.down_revision`, and the rest are inferred from dependencies. """ revision_map = None """The revision map inside of which this operation occurs.""" def __init__( self, revision_map, is_upgrade, is_stamp, up_revisions, down_revisions ): self.revision_map = revision_map self.is_upgrade = is_upgrade self.is_stamp = is_stamp self.up_revision_ids = util.to_tuple(up_revisions, default=()) if self.up_revision_ids: self.up_revision_id = self.up_revision_ids[0] else: # this should never be the case with # "upgrade", "downgrade", or "stamp" as we are always # measuring movement in terms of at least one upgrade version self.up_revision_id = None self.down_revision_ids = util.to_tuple(down_revisions, default=()) @property def is_migration(self): """True/False: indicates whether this operation is a migration. At present this is true if and only the migration is not a stamp. If other operation types are added in the future, both this attribute and :attr:`~.MigrationInfo.is_stamp` will be false. """ return not self.is_stamp @property def source_revision_ids(self): """Active revisions before this migration step is applied.""" return ( self.down_revision_ids if self.is_upgrade else self.up_revision_ids ) @property def destination_revision_ids(self): """Active revisions after this migration step is applied.""" return ( self.up_revision_ids if self.is_upgrade else self.down_revision_ids ) @property def up_revision(self): """Get :attr:`~.MigrationInfo.up_revision_id` as a :class:`.Revision`. """ return self.revision_map.get_revision(self.up_revision_id) @property def up_revisions(self): """Get :attr:`~.MigrationInfo.up_revision_ids` as a :class:`.Revision`. .. 
versionadded:: 0.9.4 """ return self.revision_map.get_revisions(self.up_revision_ids) @property def down_revisions(self): """Get :attr:`~.MigrationInfo.down_revision_ids` as a tuple of :class:`Revisions <.Revision>`.""" return self.revision_map.get_revisions(self.down_revision_ids) @property def source_revisions(self): """Get :attr:`~MigrationInfo.source_revision_ids` as a tuple of :class:`Revisions <.Revision>`.""" return self.revision_map.get_revisions(self.source_revision_ids) @property def destination_revisions(self): """Get :attr:`~MigrationInfo.destination_revision_ids` as a tuple of :class:`Revisions <.Revision>`.""" return self.revision_map.get_revisions(self.destination_revision_ids) class MigrationStep(object): @property def name(self): return self.migration_fn.__name__ @classmethod def upgrade_from_script(cls, revision_map, script): return RevisionStep(revision_map, script, True) @classmethod def downgrade_from_script(cls, revision_map, script): return RevisionStep(revision_map, script, False) @property def is_downgrade(self): return not self.is_upgrade @property def short_log(self): return "%s %s -> %s" % ( self.name, util.format_as_comma(self.from_revisions_no_deps), util.format_as_comma(self.to_revisions_no_deps), ) def __str__(self): if self.doc: return "%s %s -> %s, %s" % ( self.name, util.format_as_comma(self.from_revisions_no_deps), util.format_as_comma(self.to_revisions_no_deps), self.doc, ) else: return self.short_log class RevisionStep(MigrationStep): def __init__(self, revision_map, revision, is_upgrade): self.revision_map = revision_map self.revision = revision self.is_upgrade = is_upgrade if is_upgrade: self.migration_fn = revision.module.upgrade else: self.migration_fn = revision.module.downgrade def __repr__(self): return "RevisionStep(%r, is_upgrade=%r)" % ( self.revision.revision, self.is_upgrade, ) def __eq__(self, other): return ( isinstance(other, RevisionStep) and other.revision == self.revision and self.is_upgrade == other.is_upgrade ) @property def doc(self): return self.revision.doc @property def from_revisions(self): if self.is_upgrade: return self.revision._all_down_revisions else: return (self.revision.revision,) @property def from_revisions_no_deps(self): if self.is_upgrade: return self.revision._versioned_down_revisions else: return (self.revision.revision,) @property def to_revisions(self): if self.is_upgrade: return (self.revision.revision,) else: return self.revision._all_down_revisions @property def to_revisions_no_deps(self): if self.is_upgrade: return (self.revision.revision,) else: return self.revision._versioned_down_revisions @property def _has_scalar_down_revision(self): return len(self.revision._all_down_revisions) == 1 def should_delete_branch(self, heads): """A delete is when we are a. in a downgrade and b. we are going to the "base" or we are going to a version that is implied as a dependency on another version that is remaining. """ if not self.is_downgrade: return False if self.revision.revision not in heads: return False downrevs = self.revision._all_down_revisions if not downrevs: # is a base return True else: # determine what the ultimate "to_revisions" for an # unmerge would be. If there are none, then we're a delete. 
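            # (that is: if every down revision of this script is already an
            # ancestor of some other head that remains in place, there is
            # nothing left to UPDATE to, so the version row for this branch
            # is deleted outright)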
to_revisions = self._unmerge_to_revisions(heads) return not to_revisions def merge_branch_idents(self, heads): other_heads = set(heads).difference(self.from_revisions) if other_heads: ancestors = set( r.revision for r in self.revision_map._get_ancestor_nodes( self.revision_map.get_revisions(other_heads), check=False ) ) from_revisions = list( set(self.from_revisions).difference(ancestors) ) else: from_revisions = list(self.from_revisions) return ( # delete revs, update from rev, update to rev list(from_revisions[0:-1]), from_revisions[-1], self.to_revisions[0], ) def _unmerge_to_revisions(self, heads): other_heads = set(heads).difference([self.revision.revision]) if other_heads: ancestors = set( r.revision for r in self.revision_map._get_ancestor_nodes( self.revision_map.get_revisions(other_heads), check=False ) ) return list(set(self.to_revisions).difference(ancestors)) else: return self.to_revisions def unmerge_branch_idents(self, heads): to_revisions = self._unmerge_to_revisions(heads) return ( # update from rev, update to rev, insert revs self.from_revisions[0], to_revisions[-1], to_revisions[0:-1], ) def should_create_branch(self, heads): if not self.is_upgrade: return False downrevs = self.revision._all_down_revisions if not downrevs: # is a base return True else: # none of our downrevs are present, so... # we have to insert our version. This is true whether # or not there is only one downrev, or multiple (in the latter # case, we're a merge point.) if not heads.intersection(downrevs): return True else: return False def should_merge_branches(self, heads): if not self.is_upgrade: return False downrevs = self.revision._all_down_revisions if len(downrevs) > 1 and len(heads.intersection(downrevs)) > 1: return True return False def should_unmerge_branches(self, heads): if not self.is_downgrade: return False downrevs = self.revision._all_down_revisions if self.revision.revision in heads and len(downrevs) > 1: return True return False def update_version_num(self, heads): if not self._has_scalar_down_revision: downrev = heads.intersection(self.revision._all_down_revisions) assert ( len(downrev) == 1 ), "Can't do an UPDATE because downrevision is ambiguous" down_revision = list(downrev)[0] else: down_revision = self.revision._all_down_revisions[0] if self.is_upgrade: return down_revision, self.revision.revision else: return self.revision.revision, down_revision @property def delete_version_num(self): return self.revision.revision @property def insert_version_num(self): return self.revision.revision @property def info(self): return MigrationInfo( revision_map=self.revision_map, up_revisions=self.revision.revision, down_revisions=self.revision._all_down_revisions, is_upgrade=self.is_upgrade, is_stamp=False, ) class StampStep(MigrationStep): def __init__(self, from_, to_, is_upgrade, branch_move, revision_map=None): self.from_ = util.to_tuple(from_, default=()) self.to_ = util.to_tuple(to_, default=()) self.is_upgrade = is_upgrade self.branch_move = branch_move self.migration_fn = self.stamp_revision self.revision_map = revision_map doc = None def stamp_revision(self, **kw): return None def __eq__(self, other): return ( isinstance(other, StampStep) and other.from_revisions == self.revisions and other.to_revisions == self.to_revisions and other.branch_move == self.branch_move and self.is_upgrade == other.is_upgrade ) @property def from_revisions(self): return self.from_ @property def to_revisions(self): return self.to_ @property def from_revisions_no_deps(self): return self.from_ @property def 
to_revisions_no_deps(self): return self.to_ @property def delete_version_num(self): assert len(self.from_) == 1 return self.from_[0] @property def insert_version_num(self): assert len(self.to_) == 1 return self.to_[0] def update_version_num(self, heads): assert len(self.from_) == 1 assert len(self.to_) == 1 return self.from_[0], self.to_[0] def merge_branch_idents(self, heads): return ( # delete revs, update from rev, update to rev list(self.from_[0:-1]), self.from_[-1], self.to_[0], ) def unmerge_branch_idents(self, heads): return ( # update from rev, update to rev, insert revs self.from_[0], self.to_[-1], list(self.to_[0:-1]), ) def should_delete_branch(self, heads): return self.is_downgrade and self.branch_move def should_create_branch(self, heads): return self.is_upgrade and self.branch_move def should_merge_branches(self, heads): return len(self.from_) > 1 def should_unmerge_branches(self, heads): return len(self.to_) > 1 @property def info(self): up, down = ( (self.to_, self.from_) if self.is_upgrade else (self.from_, self.to_) ) return MigrationInfo( revision_map=self.revision_map, up_revisions=up, down_revisions=down, is_upgrade=self.is_upgrade, is_stamp=True, ) zzzeek-alembic-bee044a1c187/alembic/script/000077500000000000000000000000001353106760100205025ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/script/__init__.py000066400000000000000000000001641353106760100226140ustar00rootroot00000000000000from .base import Script # noqa from .base import ScriptDirectory # noqa __all__ = ["ScriptDirectory", "Script"] zzzeek-alembic-bee044a1c187/alembic/script/base.py000066400000000000000000000750321353106760100217750ustar00rootroot00000000000000from contextlib import contextmanager import datetime import os import re import shutil from dateutil import tz from . import revision from .. import util from ..runtime import migration from ..util import compat _sourceless_rev_file = re.compile(r"(?!\.\#|__init__)(.*\.py)(c|o)?$") _only_source_rev_file = re.compile(r"(?!\.\#|__init__)(.*\.py)$") _legacy_rev = re.compile(r"([a-f0-9]+)\.py$") _mod_def_re = re.compile(r"(upgrade|downgrade)_([a-z0-9]+)") _slug_re = re.compile(r"\w+") _default_file_template = "%(rev)s_%(slug)s" _split_on_space_comma = re.compile(r",|(?: +)") class ScriptDirectory(object): """Provides operations upon an Alembic script directory. This object is useful to get information as to current revisions, most notably being able to get at the "head" revision, for schemes that want to test if the current revision in the database is the most recent:: from alembic.script import ScriptDirectory from alembic.config import Config config = Config() config.set_main_option("script_location", "myapp:migrations") script = ScriptDirectory.from_config(config) head_revision = script.get_current_head() """ def __init__( self, dir, # noqa file_template=_default_file_template, truncate_slug_length=40, version_locations=None, sourceless=False, output_encoding="utf-8", timezone=None, ): self.dir = dir self.file_template = file_template self.version_locations = version_locations self.truncate_slug_length = truncate_slug_length or 40 self.sourceless = sourceless self.output_encoding = output_encoding self.revision_map = revision.RevisionMap(self._load_revisions) self.timezone = timezone if not os.access(dir, os.F_OK): raise util.CommandError( "Path doesn't exist: %r. Please use " "the 'init' command to create a new " "scripts folder." 
% dir ) @property def versions(self): loc = self._version_locations if len(loc) > 1: raise util.CommandError("Multiple version_locations present") else: return loc[0] @util.memoized_property def _version_locations(self): if self.version_locations: return [ os.path.abspath(util.coerce_resource_to_filename(location)) for location in self.version_locations ] else: return (os.path.abspath(os.path.join(self.dir, "versions")),) def _load_revisions(self): if self.version_locations: paths = [ vers for vers in self._version_locations if os.path.exists(vers) ] else: paths = [self.versions] dupes = set() for vers in paths: for file_ in Script._list_py_dir(self, vers): path = os.path.realpath(os.path.join(vers, file_)) if path in dupes: util.warn( "File %s loaded twice! ignoring. Please ensure " "version_locations is unique." % path ) continue dupes.add(path) script = Script._from_filename(self, vers, file_) if script is None: continue yield script @classmethod def from_config(cls, config): """Produce a new :class:`.ScriptDirectory` given a :class:`.Config` instance. The :class:`.Config` need only have the ``script_location`` key present. """ script_location = config.get_main_option("script_location") if script_location is None: raise util.CommandError( "No 'script_location' key " "found in configuration." ) truncate_slug_length = config.get_main_option("truncate_slug_length") if truncate_slug_length is not None: truncate_slug_length = int(truncate_slug_length) version_locations = config.get_main_option("version_locations") if version_locations: version_locations = _split_on_space_comma.split(version_locations) return ScriptDirectory( util.coerce_resource_to_filename(script_location), file_template=config.get_main_option( "file_template", _default_file_template ), truncate_slug_length=truncate_slug_length, sourceless=config.get_main_option("sourceless") == "true", output_encoding=config.get_main_option("output_encoding", "utf-8"), version_locations=version_locations, timezone=config.get_main_option("timezone"), ) @contextmanager def _catch_revision_errors( self, ancestor=None, multiple_heads=None, start=None, end=None, resolution=None, ): try: yield except revision.RangeNotAncestorError as rna: if start is None: start = rna.lower if end is None: end = rna.upper if not ancestor: ancestor = ( "Requested range %(start)s:%(end)s does not refer to " "ancestor/descendant revisions along the same branch" ) ancestor = ancestor % {"start": start, "end": end} compat.raise_from_cause(util.CommandError(ancestor)) except revision.MultipleHeads as mh: if not multiple_heads: multiple_heads = ( "Multiple head revisions are present for given " "argument '%(head_arg)s'; please " "specify a specific target revision, " "'@%(head_arg)s' to " "narrow to a specific head, or 'heads' for all heads" ) multiple_heads = multiple_heads % { "head_arg": end or mh.argument, "heads": util.format_as_comma(mh.heads), } compat.raise_from_cause(util.CommandError(multiple_heads)) except revision.ResolutionError as re: if resolution is None: resolution = "Can't locate revision identified by '%s'" % ( re.argument ) compat.raise_from_cause(util.CommandError(resolution)) except revision.RevisionError as err: compat.raise_from_cause(util.CommandError(err.args[0])) def walk_revisions(self, base="base", head="heads"): """Iterate through all revisions. :param base: the base revision, or "base" to start from the empty revision. :param head: the head revision; defaults to "heads" to indicate all head revisions. 
May also be "head" to indicate a single head revision. .. versionchanged:: 0.7.0 the "head" identifier now refers to the head of a non-branched repository only; use "heads" to refer to the set of all head branches simultaneously. """ with self._catch_revision_errors(start=base, end=head): for rev in self.revision_map.iterate_revisions( head, base, inclusive=True, assert_relative_length=False ): yield rev def get_revisions(self, id_): """Return the :class:`.Script` instance with the given rev identifier, symbolic name, or sequence of identifiers. .. versionadded:: 0.7.0 """ with self._catch_revision_errors(): return self.revision_map.get_revisions(id_) def get_all_current(self, id_): with self._catch_revision_errors(): top_revs = set(self.revision_map.get_revisions(id_)) top_revs.update( self.revision_map._get_ancestor_nodes( list(top_revs), include_dependencies=True ) ) top_revs = self.revision_map._filter_into_branch_heads(top_revs) return top_revs def get_revision(self, id_): """Return the :class:`.Script` instance with the given rev id. .. seealso:: :meth:`.ScriptDirectory.get_revisions` """ with self._catch_revision_errors(): return self.revision_map.get_revision(id_) def as_revision_number(self, id_): """Convert a symbolic revision, i.e. 'head' or 'base', into an actual revision number.""" with self._catch_revision_errors(): rev, branch_name = self.revision_map._resolve_revision_number(id_) if not rev: # convert () to None return None elif id_ == "heads": return rev else: return rev[0] def iterate_revisions(self, upper, lower): """Iterate through script revisions, starting at the given upper revision identifier and ending at the lower. The traversal uses strictly the `down_revision` marker inside each migration script, so it is a requirement that upper >= lower, else you'll get nothing back. The iterator yields :class:`.Script` objects. .. seealso:: :meth:`.RevisionMap.iterate_revisions` """ return self.revision_map.iterate_revisions(upper, lower) def get_current_head(self): """Return the current head revision. If the script directory has multiple heads due to branching, an error is raised; :meth:`.ScriptDirectory.get_heads` should be preferred. :return: a string revision number. .. seealso:: :meth:`.ScriptDirectory.get_heads` """ with self._catch_revision_errors( multiple_heads=( "The script directory has multiple heads (due to branching)." "Please use get_heads(), or merge the branches using " "alembic merge." ) ): return self.revision_map.get_current_head() def get_heads(self): """Return all "versioned head" revisions as strings. This is normally a list of length one, unless branches are present. The :meth:`.ScriptDirectory.get_current_head()` method can be used normally when a script directory has only one head. :return: a tuple of string revision numbers. """ return list(self.revision_map.heads) def get_base(self): """Return the "base" revision as a string. This is the revision number of the script that has a ``down_revision`` of None. If the script directory has multiple bases, an error is raised; :meth:`.ScriptDirectory.get_bases` should be preferred. """ bases = self.get_bases() if len(bases) > 1: raise util.CommandError( "The script directory has multiple bases. " "Please use get_bases()." ) elif bases: return bases[0] else: return None def get_bases(self): """return all "base" revisions as strings. This is the revision number of all scripts that have a ``down_revision`` of None. .. 
versionadded:: 0.7.0 """ return list(self.revision_map.bases) def _upgrade_revs(self, destination, current_rev): with self._catch_revision_errors( ancestor="Destination %(end)s is not a valid upgrade " "target from current head(s)", end=destination, ): revs = self.revision_map.iterate_revisions( destination, current_rev, implicit_base=True ) revs = list(revs) return [ migration.MigrationStep.upgrade_from_script( self.revision_map, script ) for script in reversed(list(revs)) ] def _downgrade_revs(self, destination, current_rev): with self._catch_revision_errors( ancestor="Destination %(end)s is not a valid downgrade " "target from current head(s)", end=destination, ): revs = self.revision_map.iterate_revisions( current_rev, destination, select_for_downgrade=True ) return [ migration.MigrationStep.downgrade_from_script( self.revision_map, script ) for script in revs ] def _stamp_revs(self, revision, heads): with self._catch_revision_errors( multiple_heads="Multiple heads are present; please specify a " "single target revision" ): heads = self.get_revisions(heads) # filter for lineage will resolve things like # branchname@base, version@base, etc. filtered_heads = self.revision_map.filter_for_lineage( heads, revision, include_dependencies=True ) steps = [] dests = self.get_revisions(revision) or [None] for dest in dests: if dest is None: # dest is 'base'. Return a "delete branch" migration # for all applicable heads. steps.extend( [ migration.StampStep( head.revision, None, False, True, self.revision_map, ) for head in filtered_heads ] ) continue elif dest in filtered_heads: # the dest is already in the version table, do nothing. continue # figure out if the dest is a descendant or an # ancestor of the selected nodes descendants = set( self.revision_map._get_descendant_nodes([dest]) ) ancestors = set(self.revision_map._get_ancestor_nodes([dest])) if descendants.intersection(filtered_heads): # heads are above the target, so this is a downgrade. # we can treat them as a "merge", single step. assert not ancestors.intersection(filtered_heads) todo_heads = [head.revision for head in filtered_heads] step = migration.StampStep( todo_heads, dest.revision, False, False, self.revision_map, ) steps.append(step) continue elif ancestors.intersection(filtered_heads): # heads are below the target, so this is an upgrade. # we can treat them as a "merge", single step. todo_heads = [head.revision for head in filtered_heads] step = migration.StampStep( todo_heads, dest.revision, True, False, self.revision_map, ) steps.append(step) continue else: # destination is in a branch not represented, # treat it as new branch step = migration.StampStep( (), dest.revision, True, True, self.revision_map ) steps.append(step) continue return steps def run_env(self): """Run the script environment. This basically runs the ``env.py`` script present in the migration environment. It is called exclusively by the command functions in :mod:`alembic.command`. 
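The command functions invoke it approximately as follows (a simplified sketch; the real commands also pass a work function and other options to :class:`.EnvironmentContext`, and ``config`` / ``script_directory`` are assumed to be an existing :class:`.Config` and :class:`.ScriptDirectory`)::

    from alembic.runtime.environment import EnvironmentContext

    with EnvironmentContext(config, script_directory):  # plus fn=..., etc.
        script_directory.run_env()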
""" util.load_python_file(self.dir, "env.py") @property def env_py_location(self): return os.path.abspath(os.path.join(self.dir, "env.py")) def _generate_template(self, src, dest, **kw): util.status( "Generating %s" % os.path.abspath(dest), util.template_to_file, src, dest, self.output_encoding, **kw ) def _copy_file(self, src, dest): util.status( "Generating %s" % os.path.abspath(dest), shutil.copy, src, dest ) def _ensure_directory(self, path): path = os.path.abspath(path) if not os.path.exists(path): util.status("Creating directory %s" % path, os.makedirs, path) def _generate_create_date(self): if self.timezone is not None: # First, assume correct capitalization tzinfo = tz.gettz(self.timezone) if tzinfo is None: # Fall back to uppercase tzinfo = tz.gettz(self.timezone.upper()) if tzinfo is None: raise util.CommandError( "Can't locate timezone: %s" % self.timezone ) create_date = ( datetime.datetime.utcnow() .replace(tzinfo=tz.tzutc()) .astimezone(tzinfo) ) else: create_date = datetime.datetime.now() return create_date def generate_revision( self, revid, message, head=None, refresh=False, splice=False, branch_labels=None, version_path=None, depends_on=None, **kw ): """Generate a new revision file. This runs the ``script.py.mako`` template, given template arguments, and creates a new file. :param revid: String revision id. Typically this comes from ``alembic.util.rev_id()``. :param message: the revision message, the one passed by the -m argument to the ``revision`` command. :param head: the head revision to generate against. Defaults to the current "head" if no branches are present, else raises an exception. .. versionadded:: 0.7.0 :param splice: if True, allow the "head" version to not be an actual head; otherwise, the selected head must be a head (e.g. endpoint) revision. :param refresh: deprecated. """ if head is None: head = "head" try: Script.verify_rev_id(revid) except revision.RevisionError as err: compat.raise_from_cause(util.CommandError(err.args[0])) with self._catch_revision_errors( multiple_heads=( "Multiple heads are present; please specify the head " "revision on which the new revision should be based, " "or perform a merge." 
) ): heads = self.revision_map.get_revisions(head) if len(set(heads)) != len(heads): raise util.CommandError("Duplicate head revisions specified") create_date = self._generate_create_date() if version_path is None: if len(self._version_locations) > 1: for head in heads: if head is not None: version_path = os.path.dirname(head.path) break else: raise util.CommandError( "Multiple version locations present, " "please specify --version-path" ) else: version_path = self.versions norm_path = os.path.normpath(os.path.abspath(version_path)) for vers_path in self._version_locations: if os.path.normpath(vers_path) == norm_path: break else: raise util.CommandError( "Path %s is not represented in current " "version locations" % version_path ) if self.version_locations: self._ensure_directory(version_path) path = self._rev_path(version_path, revid, message, create_date) if not splice: for head in heads: if head is not None and not head.is_head: raise util.CommandError( "Revision %s is not a head revision; please specify " "--splice to create a new branch from this revision" % head.revision ) if depends_on: with self._catch_revision_errors(): depends_on = [ dep if dep in rev.branch_labels # maintain branch labels else rev.revision # resolve partial revision identifiers for rev, dep in [ (self.revision_map.get_revision(dep), dep) for dep in util.to_list(depends_on) ] ] self._generate_template( os.path.join(self.dir, "script.py.mako"), path, up_revision=str(revid), down_revision=revision.tuple_rev_as_scalar( tuple(h.revision if h is not None else None for h in heads) ), branch_labels=util.to_tuple(branch_labels), depends_on=revision.tuple_rev_as_scalar(depends_on), create_date=create_date, comma=util.format_as_comma, message=message if message is not None else ("empty message"), **kw ) try: script = Script._from_path(self, path) except revision.RevisionError as err: compat.raise_from_cause(util.CommandError(err.args[0])) if branch_labels and not script.branch_labels: raise util.CommandError( "Version %s specified branch_labels %s, however the " "migration file %s does not have them; have you upgraded " "your script.py.mako to include the " "'branch_labels' section?" % (script.revision, branch_labels, script.path) ) self.revision_map.add_revision(script) return script def _rev_path(self, path, rev_id, message, create_date): slug = "_".join(_slug_re.findall(message or "")).lower() if len(slug) > self.truncate_slug_length: slug = slug[: self.truncate_slug_length].rsplit("_", 1)[0] + "_" filename = "%s.py" % ( self.file_template % { "rev": rev_id, "slug": slug, "year": create_date.year, "month": create_date.month, "day": create_date.day, "hour": create_date.hour, "minute": create_date.minute, "second": create_date.second, } ) return os.path.join(path, filename) class Script(revision.Revision): """Represent a single revision file in a ``versions/`` directory. The :class:`.Script` instance is returned by methods such as :meth:`.ScriptDirectory.iterate_revisions`. 
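A minimal sketch of obtaining :class:`.Script` objects, assuming an ``alembic.ini`` in the current directory with ``script_location`` set::

    from alembic.config import Config
    from alembic.script import ScriptDirectory

    script_directory = ScriptDirectory.from_config(Config("alembic.ini"))
    for script in script_directory.walk_revisions():
        print(script.revision, script.doc)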
""" def __init__(self, module, rev_id, path): self.module = module self.path = path super(Script, self).__init__( rev_id, module.down_revision, branch_labels=util.to_tuple( getattr(module, "branch_labels", None), default=() ), dependencies=util.to_tuple( getattr(module, "depends_on", None), default=() ), ) module = None """The Python module representing the actual script itself.""" path = None """Filesystem path of the script.""" _db_current_indicator = None """Utility variable which when set will cause string output to indicate this is a "current" version in some database""" @property def doc(self): """Return the docstring given in the script.""" return re.split("\n\n", self.longdoc)[0] @property def longdoc(self): """Return the docstring given in the script.""" doc = self.module.__doc__ if doc: if hasattr(self.module, "_alembic_source_encoding"): doc = doc.decode(self.module._alembic_source_encoding) return doc.strip() else: return "" @property def log_entry(self): entry = "Rev: %s%s%s%s%s\n" % ( self.revision, " (head)" if self.is_head else "", " (branchpoint)" if self.is_branch_point else "", " (mergepoint)" if self.is_merge_point else "", " (current)" if self._db_current_indicator else "", ) if self.is_merge_point: entry += "Merges: %s\n" % (self._format_down_revision(),) else: entry += "Parent: %s\n" % (self._format_down_revision(),) if self.dependencies: entry += "Also depends on: %s\n" % ( util.format_as_comma(self.dependencies) ) if self.is_branch_point: entry += "Branches into: %s\n" % ( util.format_as_comma(self.nextrev) ) if self.branch_labels: entry += "Branch names: %s\n" % ( util.format_as_comma(self.branch_labels), ) entry += "Path: %s\n" % (self.path,) entry += "\n%s\n" % ( "\n".join(" %s" % para for para in self.longdoc.splitlines()) ) return entry def __str__(self): return "%s -> %s%s%s%s, %s" % ( self._format_down_revision(), self.revision, " (head)" if self.is_head else "", " (branchpoint)" if self.is_branch_point else "", " (mergepoint)" if self.is_merge_point else "", self.doc, ) def _head_only( self, include_branches=False, include_doc=False, include_parents=False, tree_indicators=True, head_indicators=True, ): text = self.revision if include_parents: if self.dependencies: text = "%s (%s) -> %s" % ( self._format_down_revision(), util.format_as_comma(self.dependencies), text, ) else: text = "%s -> %s" % (self._format_down_revision(), text) if include_branches and self.branch_labels: text += " (%s)" % util.format_as_comma(self.branch_labels) if head_indicators or tree_indicators: text += "%s%s%s" % ( " (head)" if self._is_real_head else "", " (effective head)" if self.is_head and not self._is_real_head else "", " (current)" if self._db_current_indicator else "", ) if tree_indicators: text += "%s%s" % ( " (branchpoint)" if self.is_branch_point else "", " (mergepoint)" if self.is_merge_point else "", ) if include_doc: text += ", %s" % self.doc return text def cmd_format( self, verbose, include_branches=False, include_doc=False, include_parents=False, tree_indicators=True, ): if verbose: return self.log_entry else: return self._head_only( include_branches, include_doc, include_parents, tree_indicators ) def _format_down_revision(self): if not self.down_revision: return "" else: return util.format_as_comma(self._versioned_down_revisions) @classmethod def _from_path(cls, scriptdir, path): dir_, filename = os.path.split(path) return cls._from_filename(scriptdir, dir_, filename) @classmethod def _list_py_dir(cls, scriptdir, path): if scriptdir.sourceless: # read files in 
version path, e.g. pyc or pyo files # in the immediate path paths = os.listdir(path) names = set(fname.split(".")[0] for fname in paths) # look for __pycache__ if os.path.exists(os.path.join(path, "__pycache__")): # add all files from __pycache__ whose filename is not # already in the names we got from the version directory. # add as relative paths including __pycache__ token paths.extend( os.path.join("__pycache__", pyc) for pyc in os.listdir(os.path.join(path, "__pycache__")) if pyc.split(".")[0] not in names ) return paths else: return os.listdir(path) @classmethod def _from_filename(cls, scriptdir, dir_, filename): if scriptdir.sourceless: py_match = _sourceless_rev_file.match(filename) else: py_match = _only_source_rev_file.match(filename) if not py_match: return None py_filename = py_match.group(1) if scriptdir.sourceless: is_c = py_match.group(2) == "c" is_o = py_match.group(2) == "o" else: is_c = is_o = False if is_o or is_c: py_exists = os.path.exists(os.path.join(dir_, py_filename)) pyc_exists = os.path.exists(os.path.join(dir_, py_filename + "c")) # prefer .py over .pyc because we'd like to get the # source encoding; prefer .pyc over .pyo because we'd like to # have the docstrings which a -OO file would not have if py_exists or is_o and pyc_exists: return None module = util.load_python_file(dir_, filename) if not hasattr(module, "revision"): # attempt to get the revision id from the script name, # this for legacy only m = _legacy_rev.match(filename) if not m: raise util.CommandError( "Could not determine revision id from filename %s. " "Be sure the 'revision' variable is " "declared inside the script (please see 'Upgrading " "from Alembic 0.1 to 0.2' in the documentation)." % filename ) else: revision = m.group(1) else: revision = module.revision return Script(module, revision, os.path.join(dir_, filename)) zzzeek-alembic-bee044a1c187/alembic/script/revision.py000066400000000000000000001024301353106760100227120ustar00rootroot00000000000000import collections import re from sqlalchemy import util as sqlautil from .. import util from ..util import compat _relative_destination = re.compile(r"(?:(.+?)@)?(\w+)?((?:\+|-)\d+)") _revision_illegal_chars = ["@", "-", "+"] class RevisionError(Exception): pass class RangeNotAncestorError(RevisionError): def __init__(self, lower, upper): self.lower = lower self.upper = upper super(RangeNotAncestorError, self).__init__( "Revision %s is not an ancestor of revision %s" % (lower or "base", upper or "base") ) class MultipleHeads(RevisionError): def __init__(self, heads, argument): self.heads = heads self.argument = argument super(MultipleHeads, self).__init__( "Multiple heads are present for given argument '%s'; " "%s" % (argument, ", ".join(heads)) ) class ResolutionError(RevisionError): def __init__(self, message, argument): super(ResolutionError, self).__init__(message) self.argument = argument class RevisionMap(object): """Maintains a map of :class:`.Revision` objects. :class:`.RevisionMap` is used by :class:`.ScriptDirectory` to maintain and traverse the collection of :class:`.Script` objects, which are themselves instances of :class:`.Revision`. """ def __init__(self, generator): """Construct a new :class:`.RevisionMap`. :param generator: a zero-arg callable that will generate an iterable of :class:`.Revision` instances to be used. These are typically :class:`.Script` subclasses within regular Alembic use. """ self._generator = generator @util.memoized_property def heads(self): """All "head" revisions as strings. 
This is normally a tuple of length one, unless unmerged branches are present. :return: a tuple of string revision numbers. """ self._revision_map return self.heads @util.memoized_property def bases(self): """All "base" revisions as strings. These are revisions that have a ``down_revision`` of None, or empty tuple. :return: a tuple of string revision numbers. """ self._revision_map return self.bases @util.memoized_property def _real_heads(self): """All "real" head revisions as strings. :return: a tuple of string revision numbers. """ self._revision_map return self._real_heads @util.memoized_property def _real_bases(self): """All "real" base revisions as strings. :return: a tuple of string revision numbers. """ self._revision_map return self._real_bases @util.memoized_property def _revision_map(self): """memoized attribute, initializes the revision map from the initial collection. """ map_ = {} heads = sqlautil.OrderedSet() _real_heads = sqlautil.OrderedSet() self.bases = () self._real_bases = () has_branch_labels = set() has_depends_on = set() for revision in self._generator(): if revision.revision in map_: util.warn( "Revision %s is present more than once" % revision.revision ) map_[revision.revision] = revision if revision.branch_labels: has_branch_labels.add(revision) if revision.dependencies: has_depends_on.add(revision) heads.add(revision.revision) _real_heads.add(revision.revision) if revision.is_base: self.bases += (revision.revision,) if revision._is_real_base: self._real_bases += (revision.revision,) # add the branch_labels to the map_. We'll need these # to resolve the dependencies. for revision in has_branch_labels: self._map_branch_labels(revision, map_) for revision in has_depends_on: self._add_depends_on(revision, map_) for rev in map_.values(): for downrev in rev._all_down_revisions: if downrev not in map_: util.warn( "Revision %s referenced from %s is not present" % (downrev, rev) ) down_revision = map_[downrev] down_revision.add_nextrev(rev) if downrev in rev._versioned_down_revisions: heads.discard(downrev) _real_heads.discard(downrev) map_[None] = map_[()] = None self.heads = tuple(heads) self._real_heads = tuple(_real_heads) for revision in has_branch_labels: self._add_branches(revision, map_, map_branch_labels=False) return map_ def _map_branch_labels(self, revision, map_): if revision.branch_labels: for branch_label in revision._orig_branch_labels: if branch_label in map_: raise RevisionError( "Branch name '%s' in revision %s already " "used by revision %s" % ( branch_label, revision.revision, map_[branch_label].revision, ) ) map_[branch_label] = revision def _add_branches(self, revision, map_, map_branch_labels=True): if map_branch_labels: self._map_branch_labels(revision, map_) if revision.branch_labels: revision.branch_labels.update(revision.branch_labels) for node in self._get_descendant_nodes( [revision], map_, include_dependencies=False ): node.branch_labels.update(revision.branch_labels) parent = node while ( parent and not parent._is_real_branch_point and not parent.is_merge_point ): parent.branch_labels.update(revision.branch_labels) if parent.down_revision: parent = map_[parent.down_revision] else: break def _add_depends_on(self, revision, map_): if revision.dependencies: deps = [map_[dep] for dep in util.to_tuple(revision.dependencies)] revision._resolved_dependencies = tuple([d.revision for d in deps]) def add_revision(self, revision, _replace=False): """add a single revision to an existing map. 
This method is for single-revision use cases, it's not appropriate for fully populating an entire revision map. """ map_ = self._revision_map if not _replace and revision.revision in map_: util.warn( "Revision %s is present more than once" % revision.revision ) elif _replace and revision.revision not in map_: raise Exception("revision %s not in map" % revision.revision) map_[revision.revision] = revision self._add_branches(revision, map_) self._add_depends_on(revision, map_) if revision.is_base: self.bases += (revision.revision,) if revision._is_real_base: self._real_bases += (revision.revision,) for downrev in revision._all_down_revisions: if downrev not in map_: util.warn( "Revision %s referenced from %s is not present" % (downrev, revision) ) map_[downrev].add_nextrev(revision) if revision._is_real_head: self._real_heads = tuple( head for head in self._real_heads if head not in set(revision._all_down_revisions).union( [revision.revision] ) ) + (revision.revision,) if revision.is_head: self.heads = tuple( head for head in self.heads if head not in set(revision._versioned_down_revisions).union( [revision.revision] ) ) + (revision.revision,) def get_current_head(self, branch_label=None): """Return the current head revision. If the script directory has multiple heads due to branching, an error is raised; :meth:`.ScriptDirectory.get_heads` should be preferred. :param branch_label: optional branch name which will limit the heads considered to those which include that branch_label. :return: a string revision number. .. seealso:: :meth:`.ScriptDirectory.get_heads` """ current_heads = self.heads if branch_label: current_heads = self.filter_for_lineage( current_heads, branch_label ) if len(current_heads) > 1: raise MultipleHeads( current_heads, "%s@head" % branch_label if branch_label else "head", ) if current_heads: return current_heads[0] else: return None def _get_base_revisions(self, identifier): return self.filter_for_lineage(self.bases, identifier) def get_revisions(self, id_): """Return the :class:`.Revision` instances with the given rev id or identifiers. May be given a single identifier, a sequence of identifiers, or the special symbols "head" or "base". The result is a tuple of one or more identifiers, or an empty tuple in the case of "base". In the cases where 'head', 'heads' is requested and the revision map is empty, returns an empty tuple. Supports partial identifiers, where the given identifier is matched against all identifiers that start with the given characters; if there is exactly one match, that determines the full revision. """ if isinstance(id_, (list, tuple, set, frozenset)): return sum([self.get_revisions(id_elem) for id_elem in id_], ()) else: resolved_id, branch_label = self._resolve_revision_number(id_) return tuple( self._revision_for_ident(rev_id, branch_label) for rev_id in resolved_id ) def get_revision(self, id_): """Return the :class:`.Revision` instance with the given rev id. If a symbolic name such as "head" or "base" is given, resolves the identifier into the current head or base revision. If the symbolic name refers to multiples, :class:`.MultipleHeads` is raised. Supports partial identifiers, where the given identifier is matched against all identifiers that start with the given characters; if there is exactly one match, that determines the full revision. 
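        For example, with a map containing a hypothetical revision
        ``"ae1027a6acf"``, a unique prefix resolves to the full revision
        (a brief illustration only; ``revision_map`` is assumed to be a
        populated :class:`.RevisionMap`)::

            script = revision_map.get_revision("ae10")
            assert script.revision == "ae1027a6acf"
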
""" resolved_id, branch_label = self._resolve_revision_number(id_) if len(resolved_id) > 1: raise MultipleHeads(resolved_id, id_) elif resolved_id: resolved_id = resolved_id[0] return self._revision_for_ident(resolved_id, branch_label) def _resolve_branch(self, branch_label): try: branch_rev = self._revision_map[branch_label] except KeyError: try: nonbranch_rev = self._revision_for_ident(branch_label) except ResolutionError: raise ResolutionError( "No such branch: '%s'" % branch_label, branch_label ) else: return nonbranch_rev else: return branch_rev def _revision_for_ident(self, resolved_id, check_branch=None): if check_branch: branch_rev = self._resolve_branch(check_branch) else: branch_rev = None try: revision = self._revision_map[resolved_id] except KeyError: # break out to avoid misleading py3k stack traces revision = False if revision is False: # do a partial lookup revs = [ x for x in self._revision_map if x and x.startswith(resolved_id) ] if branch_rev: revs = self.filter_for_lineage(revs, check_branch) if not revs: raise ResolutionError( "No such revision or branch '%s'" % resolved_id, resolved_id, ) elif len(revs) > 1: raise ResolutionError( "Multiple revisions start " "with '%s': %s..." % (resolved_id, ", ".join("'%s'" % r for r in revs[0:3])), resolved_id, ) else: revision = self._revision_map[revs[0]] if check_branch and revision is not None: if not self._shares_lineage( revision.revision, branch_rev.revision ): raise ResolutionError( "Revision %s is not a member of branch '%s'" % (revision.revision, check_branch), resolved_id, ) return revision def _filter_into_branch_heads(self, targets): targets = set(targets) for rev in list(targets): if targets.intersection( self._get_descendant_nodes([rev], include_dependencies=False) ).difference([rev]): targets.discard(rev) return targets def filter_for_lineage( self, targets, check_against, include_dependencies=False ): id_, branch_label = self._resolve_revision_number(check_against) shares = [] if branch_label: shares.append(branch_label) if id_: shares.extend(id_) return [ tg for tg in targets if self._shares_lineage( tg, shares, include_dependencies=include_dependencies ) ] def _shares_lineage( self, target, test_against_revs, include_dependencies=False ): if not test_against_revs: return True if not isinstance(target, Revision): target = self._revision_for_ident(target) test_against_revs = [ self._revision_for_ident(test_against_rev) if not isinstance(test_against_rev, Revision) else test_against_rev for test_against_rev in util.to_tuple( test_against_revs, default=() ) ] return bool( set( self._get_descendant_nodes( [target], include_dependencies=include_dependencies ) ) .union( self._get_ancestor_nodes( [target], include_dependencies=include_dependencies ) ) .intersection(test_against_revs) ) def _resolve_revision_number(self, id_): if isinstance(id_, compat.string_types) and "@" in id_: branch_label, id_ = id_.split("@", 1) elif id_ is not None and ( ( isinstance(id_, tuple) and id_ and not isinstance(id_[0], compat.string_types) ) or not isinstance(id_, compat.string_types + (tuple, )) ): raise RevisionError( "revision identifier %r is not a string; ensure database " "driver settings are correct" % (id_,) ) else: branch_label = None # ensure map is loaded self._revision_map if id_ == "heads": if branch_label: return ( self.filter_for_lineage(self.heads, branch_label), branch_label, ) else: return self._real_heads, branch_label elif id_ == "head": current_head = self.get_current_head(branch_label) if current_head: return 
(current_head,), branch_label else: return (), branch_label elif id_ == "base" or id_ is None: return (), branch_label else: return util.to_tuple(id_, default=None), branch_label def _relative_iterate( self, destination, source, is_upwards, implicit_base, inclusive, assert_relative_length, ): if isinstance(destination, compat.string_types): match = _relative_destination.match(destination) if not match: return None else: return None relative = int(match.group(3)) symbol = match.group(2) branch_label = match.group(1) reldelta = 1 if inclusive and not symbol else 0 if is_upwards: if branch_label: from_ = "%s@head" % branch_label elif symbol: if symbol.startswith("head"): from_ = symbol else: from_ = "%s@head" % symbol else: from_ = "head" to_ = source else: if branch_label: to_ = "%s@base" % branch_label elif symbol: to_ = "%s@base" % symbol else: to_ = "base" from_ = source revs = list( self._iterate_revisions( from_, to_, inclusive=inclusive, implicit_base=implicit_base ) ) if symbol: if branch_label: symbol_rev = self.get_revision( "%s@%s" % (branch_label, symbol) ) else: symbol_rev = self.get_revision(symbol) if symbol.startswith("head"): index = 0 elif symbol == "base": index = len(revs) - 1 else: range_ = compat.range(len(revs) - 1, 0, -1) for index in range_: if symbol_rev.revision == revs[index].revision: break else: index = 0 else: index = 0 if is_upwards: revs = revs[index - relative - reldelta :] if ( not index and assert_relative_length and len(revs) < abs(relative - reldelta) ): raise RevisionError( "Relative revision %s didn't " "produce %d migrations" % (destination, abs(relative)) ) else: revs = revs[0 : index - relative + reldelta] if ( not index and assert_relative_length and len(revs) != abs(relative) + reldelta ): raise RevisionError( "Relative revision %s didn't " "produce %d migrations" % (destination, abs(relative)) ) return iter(revs) def iterate_revisions( self, upper, lower, implicit_base=False, inclusive=False, assert_relative_length=True, select_for_downgrade=False, ): """Iterate through script revisions, starting at the given upper revision identifier and ending at the lower. The traversal uses strictly the `down_revision` marker inside each migration script, so it is a requirement that upper >= lower, else you'll get nothing back. The iterator yields :class:`.Revision` objects. 
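        A brief illustration, assuming ``revision_map`` is a populated
        :class:`.RevisionMap`; the symbolic "heads" / "base" bounds are
        resolved internally via :meth:`.get_revisions`::

            for script in revision_map.iterate_revisions("heads", "base"):
                print(script.revision)
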
""" relative_upper = self._relative_iterate( upper, lower, True, implicit_base, inclusive, assert_relative_length, ) if relative_upper: return relative_upper relative_lower = self._relative_iterate( lower, upper, False, implicit_base, inclusive, assert_relative_length, ) if relative_lower: return relative_lower return self._iterate_revisions( upper, lower, inclusive=inclusive, implicit_base=implicit_base, select_for_downgrade=select_for_downgrade, ) def _get_descendant_nodes( self, targets, map_=None, check=False, omit_immediate_dependencies=False, include_dependencies=True, ): if omit_immediate_dependencies: def fn(rev): if rev not in targets: return rev._all_nextrev else: return rev.nextrev elif include_dependencies: def fn(rev): return rev._all_nextrev else: def fn(rev): return rev.nextrev return self._iterate_related_revisions( fn, targets, map_=map_, check=check ) def _get_ancestor_nodes( self, targets, map_=None, check=False, include_dependencies=True ): if include_dependencies: def fn(rev): return rev._all_down_revisions else: def fn(rev): return rev._versioned_down_revisions return self._iterate_related_revisions( fn, targets, map_=map_, check=check ) def _iterate_related_revisions(self, fn, targets, map_, check=False): if map_ is None: map_ = self._revision_map seen = set() todo = collections.deque() for target in targets: todo.append(target) if check: per_target = set() while todo: rev = todo.pop() if check: per_target.add(rev) if rev in seen: continue seen.add(rev) todo.extend(map_[rev_id] for rev_id in fn(rev)) yield rev if check: overlaps = per_target.intersection(targets).difference( [target] ) if overlaps: raise RevisionError( "Requested revision %s overlaps with " "other requested revisions %s" % ( target.revision, ", ".join(r.revision for r in overlaps), ) ) def _iterate_revisions( self, upper, lower, inclusive=True, implicit_base=False, select_for_downgrade=False, ): """iterate revisions from upper to lower. The traversal is depth-first within branches, and breadth-first across branches as a whole. """ requested_lowers = self.get_revisions(lower) # some complexity to accommodate an iteration where some # branches are starting from nothing, and others are starting # from a given point. Additionally, if the bottom branch # is specified using a branch identifier, then we limit operations # to just that branch. 
limit_to_lower_branch = isinstance( lower, compat.string_types ) and lower.endswith("@base") uppers = util.dedupe_tuple(self.get_revisions(upper)) if not uppers and not requested_lowers: return upper_ancestors = set(self._get_ancestor_nodes(uppers, check=True)) if limit_to_lower_branch: lowers = self.get_revisions(self._get_base_revisions(lower)) elif implicit_base and requested_lowers: lower_ancestors = set(self._get_ancestor_nodes(requested_lowers)) lower_descendants = set( self._get_descendant_nodes(requested_lowers) ) base_lowers = set() candidate_lowers = upper_ancestors.difference( lower_ancestors ).difference(lower_descendants) for rev in candidate_lowers: for downrev in rev._all_down_revisions: if self._revision_map[downrev] in candidate_lowers: break else: base_lowers.add(rev) lowers = base_lowers.union(requested_lowers) elif implicit_base: base_lowers = set(self.get_revisions(self._real_bases)) lowers = base_lowers.union(requested_lowers) elif not requested_lowers: lowers = set(self.get_revisions(self._real_bases)) else: lowers = requested_lowers # represents all nodes we will produce total_space = set( rev.revision for rev in upper_ancestors ).intersection( rev.revision for rev in self._get_descendant_nodes( lowers, check=True, omit_immediate_dependencies=( select_for_downgrade and requested_lowers ), ) ) if not total_space: # no nodes. determine if this is an invalid range # or not. start_from = set(requested_lowers) start_from.update( self._get_ancestor_nodes( list(start_from), include_dependencies=True ) ) # determine all the current branch points represented # by requested_lowers start_from = self._filter_into_branch_heads(start_from) # if the requested start is one of those branch points, # then just return empty set if start_from.intersection(upper_ancestors): return else: # otherwise, they requested nodes out of # order raise RangeNotAncestorError(lower, upper) # organize branch points to be consumed separately from # member nodes branch_todo = set( rev for rev in (self._revision_map[rev] for rev in total_space) if rev._is_real_branch_point and len(total_space.intersection(rev._all_nextrev)) > 1 ) # it's not possible for any "uppers" to be in branch_todo, # because the ._all_nextrev of those nodes is not in total_space # assert not branch_todo.intersection(uppers) todo = collections.deque( r for r in uppers if r.revision in total_space ) # iterate for total_space being emptied out total_space_modified = True while total_space: if not total_space_modified: raise RevisionError( "Dependency resolution failed; iteration can't proceed" ) total_space_modified = False # when everything non-branch pending is consumed, # add to the todo any branch nodes that have no # descendants left in the queue if not todo: todo.extendleft( sorted( ( rev for rev in branch_todo if not rev._all_nextrev.intersection(total_space) ), # favor "revisioned" branch points before # dependent ones key=lambda rev: 0 if rev.is_branch_point else 1, ) ) branch_todo.difference_update(todo) # iterate nodes that are in the immediate todo while todo: rev = todo.popleft() total_space.remove(rev.revision) total_space_modified = True # do depth first for elements within branches, # don't consume any actual branch nodes todo.extendleft( [ self._revision_map[downrev] for downrev in reversed(rev._all_down_revisions) if self._revision_map[downrev] not in branch_todo and downrev in total_space ] ) if not inclusive and rev in requested_lowers: continue yield rev assert not branch_todo class Revision(object): """Base class 
for revisioned objects. The :class:`.Revision` class is the base of the more public-facing :class:`.Script` object, which represents a migration script. The mechanics of revision management and traversal are encapsulated within :class:`.Revision`, while :class:`.Script` applies this logic to Python files in a version directory. """ nextrev = frozenset() """following revisions, based on down_revision only.""" _all_nextrev = frozenset() revision = None """The string revision number.""" down_revision = None """The ``down_revision`` identifier(s) within the migration script. Note that the total set of "down" revisions is down_revision + dependencies. """ dependencies = None """Additional revisions which this revision is dependent on. From a migration standpoint, these dependencies are added to the down_revision to form the full iteration. However, the separation of down_revision from "dependencies" is to assist in navigating a history that contains many branches, typically a multi-root scenario. """ branch_labels = None """Optional string/tuple of symbolic names to apply to this revision's branch""" @classmethod def verify_rev_id(cls, revision): illegal_chars = set(revision).intersection(_revision_illegal_chars) if illegal_chars: raise RevisionError( "Character(s) '%s' not allowed in revision identifier '%s'" % (", ".join(sorted(illegal_chars)), revision) ) def __init__( self, revision, down_revision, dependencies=None, branch_labels=None ): self.verify_rev_id(revision) self.revision = revision self.down_revision = tuple_rev_as_scalar(down_revision) self.dependencies = tuple_rev_as_scalar(dependencies) self._resolved_dependencies = () self._orig_branch_labels = util.to_tuple(branch_labels, default=()) self.branch_labels = set(self._orig_branch_labels) def __repr__(self): args = [repr(self.revision), repr(self.down_revision)] if self.dependencies: args.append("dependencies=%r" % (self.dependencies,)) if self.branch_labels: args.append("branch_labels=%r" % (self.branch_labels,)) return "%s(%s)" % (self.__class__.__name__, ", ".join(args)) def add_nextrev(self, revision): self._all_nextrev = self._all_nextrev.union([revision.revision]) if self.revision in revision._versioned_down_revisions: self.nextrev = self.nextrev.union([revision.revision]) @property def _all_down_revisions(self): return ( util.to_tuple(self.down_revision, default=()) + self._resolved_dependencies ) @property def _versioned_down_revisions(self): return util.to_tuple(self.down_revision, default=()) @property def is_head(self): """Return True if this :class:`.Revision` is a 'head' revision. This is determined based on whether any other :class:`.Script` within the :class:`.ScriptDirectory` refers to this :class:`.Script`. Multiple heads can be present. """ return not bool(self.nextrev) @property def _is_real_head(self): return not bool(self._all_nextrev) @property def is_base(self): """Return True if this :class:`.Revision` is a 'base' revision.""" return self.down_revision is None @property def _is_real_base(self): """Return True if this :class:`.Revision` is a "real" base revision, e.g. that it has no dependencies either.""" # we use self.dependencies here because this is called up # in initialization where _real_dependencies isn't set up # yet return self.down_revision is None and self.dependencies is None @property def is_branch_point(self): """Return True if this :class:`.Script` is a branch point. 
A branchpoint is defined as a :class:`.Script` which is referred to by more than one succeeding :class:`.Script`, that is more than one :class:`.Script` has a `down_revision` identifier pointing here. """ return len(self.nextrev) > 1 @property def _is_real_branch_point(self): """Return True if this :class:`.Script` is a 'real' branch point, taking into account dependencies as well. """ return len(self._all_nextrev) > 1 @property def is_merge_point(self): """Return True if this :class:`.Script` is a merge point.""" return len(self._versioned_down_revisions) > 1 def tuple_rev_as_scalar(rev): if not rev: return None elif len(rev) == 1: return rev[0] else: return rev zzzeek-alembic-bee044a1c187/alembic/templates/000077500000000000000000000000001353106760100211745ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/templates/generic/000077500000000000000000000000001353106760100226105ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/templates/generic/README000066400000000000000000000000461353106760100234700ustar00rootroot00000000000000Generic single-database configuration.zzzeek-alembic-bee044a1c187/alembic/templates/generic/alembic.ini.mako000066400000000000000000000032211353106760100256310ustar00rootroot00000000000000# A generic, single database configuration. [alembic] # path to migration scripts script_location = ${script_location} # template used to generate migration files # file_template = %%(rev)s_%%(slug)s # timezone to use when rendering the date # within the migration file as well as the filename. # string value is passed to dateutil.tz.gettz() # leave blank for localtime # timezone = # max length of characters to apply to the # "slug" field # truncate_slug_length = 40 # set to 'true' to run the environment during # the 'revision' command, regardless of autogenerate # revision_environment = false # set to 'true' to allow .pyc and .pyo files without # a source .py file to be detected as revisions in the # versions/ directory # sourceless = false # version location specification; this defaults # to ${script_location}/versions. When using multiple version # directories, initial revisions must be specified with --version-path # version_locations = %(here)s/bar %(here)s/bat ${script_location}/versions # the output encoding used when revision files # are written from script.py.mako # output_encoding = utf-8 sqlalchemy.url = driver://user:pass@localhost/dbname # Logging configuration [loggers] keys = root,sqlalchemy,alembic [handlers] keys = console [formatters] keys = generic [logger_root] level = WARN handlers = console qualname = [logger_sqlalchemy] level = WARN handlers = qualname = sqlalchemy.engine [logger_alembic] level = INFO handlers = qualname = alembic [handler_console] class = StreamHandler args = (sys.stderr,) level = NOTSET formatter = generic [formatter_generic] format = %(levelname)-5.5s [%(name)s] %(message)s datefmt = %H:%M:%S zzzeek-alembic-bee044a1c187/alembic/templates/generic/env.py000066400000000000000000000037671353106760100237670ustar00rootroot00000000000000from logging.config import fileConfig from sqlalchemy import engine_from_config from sqlalchemy import pool from alembic import context # this is the Alembic Config object, which provides # access to the values within the .ini file in use. config = context.config # Interpret the config file for Python logging. # This line sets up loggers basically. 
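# (note: if the Config object is built programmatically without an .ini file,
# config.config_file_name will be None and this call would fail; guarding the
# call with an "is not None" check is an optional local adjustment, not part
# of the template itself)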
fileConfig(config.config_file_name) # add your model's MetaData object here # for 'autogenerate' support # from myapp import mymodel # target_metadata = mymodel.Base.metadata target_metadata = None # other values from the config, defined by the needs of env.py, # can be acquired: # my_important_option = config.get_main_option("my_important_option") # ... etc. def run_migrations_offline(): """Run migrations in 'offline' mode. This configures the context with just a URL and not an Engine, though an Engine is acceptable here as well. By skipping the Engine creation we don't even need a DBAPI to be available. Calls to context.execute() here emit the given string to the script output. """ url = config.get_main_option("sqlalchemy.url") context.configure( url=url, target_metadata=target_metadata, literal_binds=True, dialect_opts={"paramstyle": "named"}, ) with context.begin_transaction(): context.run_migrations() def run_migrations_online(): """Run migrations in 'online' mode. In this scenario we need to create an Engine and associate a connection with the context. """ connectable = engine_from_config( config.get_section(config.config_ini_section), prefix="sqlalchemy.", poolclass=pool.NullPool, ) with connectable.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata ) with context.begin_transaction(): context.run_migrations() if context.is_offline_mode(): run_migrations_offline() else: run_migrations_online() zzzeek-alembic-bee044a1c187/alembic/templates/generic/script.py.mako000066400000000000000000000007561353106760100254240ustar00rootroot00000000000000"""${message} Revision ID: ${up_revision} Revises: ${down_revision | comma,n} Create Date: ${create_date} """ from alembic import op import sqlalchemy as sa ${imports if imports else ""} # revision identifiers, used by Alembic. revision = ${repr(up_revision)} down_revision = ${repr(down_revision)} branch_labels = ${repr(branch_labels)} depends_on = ${repr(depends_on)} def upgrade(): ${upgrades if upgrades else "pass"} def downgrade(): ${downgrades if downgrades else "pass"} zzzeek-alembic-bee044a1c187/alembic/templates/multidb/000077500000000000000000000000001353106760100226345ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/templates/multidb/README000066400000000000000000000000511353106760100235100ustar00rootroot00000000000000Rudimentary multi-database configuration.zzzeek-alembic-bee044a1c187/alembic/templates/multidb/alembic.ini.mako000066400000000000000000000033601353106760100256610ustar00rootroot00000000000000# a multi-database configuration. [alembic] # path to migration scripts script_location = ${script_location} # template used to generate migration files # file_template = %%(rev)s_%%(slug)s # timezone to use when rendering the date # within the migration file as well as the filename. # string value is passed to dateutil.tz.gettz() # leave blank for localtime # timezone = # max length of characters to apply to the # "slug" field # truncate_slug_length = 40 # set to 'true' to run the environment during # the 'revision' command, regardless of autogenerate # revision_environment = false # set to 'true' to allow .pyc and .pyo files without # a source .py file to be detected as revisions in the # versions/ directory # sourceless = false # version location specification; this defaults # to ${script_location}/versions. 
When using multiple version # directories, initial revisions must be specified with --version-path # version_locations = %(here)s/bar %(here)s/bat ${script_location}/versions # the output encoding used when revision files # are written from script.py.mako # output_encoding = utf-8 databases = engine1, engine2 [engine1] sqlalchemy.url = driver://user:pass@localhost/dbname [engine2] sqlalchemy.url = driver://user:pass@localhost/dbname2 # Logging configuration [loggers] keys = root,sqlalchemy,alembic [handlers] keys = console [formatters] keys = generic [logger_root] level = WARN handlers = console qualname = [logger_sqlalchemy] level = WARN handlers = qualname = sqlalchemy.engine [logger_alembic] level = INFO handlers = qualname = alembic [handler_console] class = StreamHandler args = (sys.stderr,) level = NOTSET formatter = generic [formatter_generic] format = %(levelname)-5.5s [%(name)s] %(message)s datefmt = %H:%M:%S zzzeek-alembic-bee044a1c187/alembic/templates/multidb/env.py000066400000000000000000000101021353106760100237700ustar00rootroot00000000000000import logging from logging.config import fileConfig import re from sqlalchemy import engine_from_config from sqlalchemy import pool from alembic import context USE_TWOPHASE = False # this is the Alembic Config object, which provides # access to the values within the .ini file in use. config = context.config # Interpret the config file for Python logging. # This line sets up loggers basically. fileConfig(config.config_file_name) logger = logging.getLogger("alembic.env") # gather section names referring to different # databases. These are named "engine1", "engine2" # in the sample .ini file. db_names = config.get_main_option("databases") # add your model's MetaData objects here # for 'autogenerate' support. These must be set # up to hold just those tables targeting a # particular database. table.tometadata() may be # helpful here in case a "copy" of # a MetaData is needed. # from myapp import mymodel # target_metadata = { # 'engine1':mymodel.metadata1, # 'engine2':mymodel.metadata2 # } target_metadata = {} # other values from the config, defined by the needs of env.py, # can be acquired: # my_important_option = config.get_main_option("my_important_option") # ... etc. def run_migrations_offline(): """Run migrations in 'offline' mode. This configures the context with just a URL and not an Engine, though an Engine is acceptable here as well. By skipping the Engine creation we don't even need a DBAPI to be available. Calls to context.execute() here emit the given string to the script output. """ # for the --sql use case, run migrations for each URL into # individual files. engines = {} for name in re.split(r",\s*", db_names): engines[name] = rec = {} rec["url"] = context.config.get_section_option(name, "sqlalchemy.url") for name, rec in engines.items(): logger.info("Migrating database %s" % name) file_ = "%s.sql" % name logger.info("Writing output to %s" % file_) with open(file_, "w") as buffer: context.configure( url=rec["url"], output_buffer=buffer, target_metadata=target_metadata.get(name), literal_binds=True, dialect_opts={"paramstyle": "named"}, ) with context.begin_transaction(): context.run_migrations(engine_name=name) def run_migrations_online(): """Run migrations in 'online' mode. In this scenario we need to create an Engine and associate a connection with the context. """ # for the direct-to-DB use case, start a transaction on all # engines, then run all migrations, then commit all transactions. 
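    # (the USE_TWOPHASE flag defined at the top of this file switches these
    # per-engine transactions to two-phase commit; appropriate only where
    # every configured backend supports prepared transactions)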
engines = {} for name in re.split(r",\s*", db_names): engines[name] = rec = {} rec["engine"] = engine_from_config( context.config.get_section(name), prefix="sqlalchemy.", poolclass=pool.NullPool, ) for name, rec in engines.items(): engine = rec["engine"] rec["connection"] = conn = engine.connect() if USE_TWOPHASE: rec["transaction"] = conn.begin_twophase() else: rec["transaction"] = conn.begin() try: for name, rec in engines.items(): logger.info("Migrating database %s" % name) context.configure( connection=rec["connection"], upgrade_token="%s_upgrades" % name, downgrade_token="%s_downgrades" % name, target_metadata=target_metadata.get(name), ) context.run_migrations(engine_name=name) if USE_TWOPHASE: for rec in engines.values(): rec["transaction"].prepare() for rec in engines.values(): rec["transaction"].commit() except: for rec in engines.values(): rec["transaction"].rollback() raise finally: for rec in engines.values(): rec["connection"].close() if context.is_offline_mode(): run_migrations_offline() else: run_migrations_online() zzzeek-alembic-bee044a1c187/alembic/templates/multidb/script.py.mako000066400000000000000000000016331353106760100254430ustar00rootroot00000000000000<%! import re %>"""${message} Revision ID: ${up_revision} Revises: ${down_revision | comma,n} Create Date: ${create_date} """ from alembic import op import sqlalchemy as sa ${imports if imports else ""} # revision identifiers, used by Alembic. revision = ${repr(up_revision)} down_revision = ${repr(down_revision)} branch_labels = ${repr(branch_labels)} depends_on = ${repr(depends_on)} def upgrade(engine_name): globals()["upgrade_%s" % engine_name]() def downgrade(engine_name): globals()["downgrade_%s" % engine_name]() <% db_names = config.get_main_option("databases") %> ## generate an "upgrade_() / downgrade_()" function ## for each database name in the ini file. % for db_name in re.split(r',\s*', db_names): def upgrade_${db_name}(): ${context.get("%s_upgrades" % db_name, "pass")} def downgrade_${db_name}(): ${context.get("%s_downgrades" % db_name, "pass")} % endfor zzzeek-alembic-bee044a1c187/alembic/templates/pylons/000077500000000000000000000000001353106760100225205ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/templates/pylons/README000066400000000000000000000000731353106760100234000ustar00rootroot00000000000000Configuration that reads from a Pylons project environment.zzzeek-alembic-bee044a1c187/alembic/templates/pylons/alembic.ini.mako000066400000000000000000000022101353106760100255360ustar00rootroot00000000000000# a Pylons configuration. [alembic] # path to migration scripts script_location = ${script_location} # template used to generate migration files # file_template = %%(rev)s_%%(slug)s # timezone to use when rendering the date # within the migration file as well as the filename. # string value is passed to dateutil.tz.gettz() # leave blank for localtime # timezone = # max length of characters to apply to the # "slug" field # truncate_slug_length = 40 # set to 'true' to run the environment during # the 'revision' command, regardless of autogenerate # revision_environment = false # set to 'true' to allow .pyc and .pyo files without # a source .py file to be detected as revisions in the # versions/ directory # sourceless = false # version location specification; this defaults # to ${script_location}/versions. 
When using multiple version # directories, initial revisions must be specified with --version-path # version_locations = %(here)s/bar %(here)s/bat ${script_location}/versions # the output encoding used when revision files # are written from script.py.mako # output_encoding = utf-8 pylons_config_file = ./development.ini # that's it !zzzeek-alembic-bee044a1c187/alembic/templates/pylons/env.py000066400000000000000000000043051353106760100236640ustar00rootroot00000000000000"""Pylons bootstrap environment. Place 'pylons_config_file' into alembic.ini, and the application will be loaded from there. """ from logging.config import fileConfig from paste.deploy import loadapp from alembic import context try: # if pylons app already in, don't create a new app from pylons import config as pylons_config pylons_config["__file__"] except: config = context.config # can use config['__file__'] here, i.e. the Pylons # ini file, instead of alembic.ini config_file = config.get_main_option("pylons_config_file") fileConfig(config_file) wsgi_app = loadapp("config:%s" % config_file, relative_to=".") # customize this section for non-standard engine configurations. meta = __import__( "%s.model.meta" % wsgi_app.config["pylons.package"] ).model.meta # add your model's MetaData object here # for 'autogenerate' support # from myapp import mymodel # target_metadata = mymodel.Base.metadata target_metadata = None def run_migrations_offline(): """Run migrations in 'offline' mode. This configures the context with just a URL and not an Engine, though an Engine is acceptable here as well. By skipping the Engine creation we don't even need a DBAPI to be available. Calls to context.execute() here emit the given string to the script output. """ context.configure( url=meta.engine.url, target_metadata=target_metadata, literal_binds=True, dialect_opts={"paramstyle": "named"}, ) with context.begin_transaction(): context.run_migrations() def run_migrations_online(): """Run migrations in 'online' mode. In this scenario we need to create an Engine and associate a connection with the context. """ # specify here how the engine is acquired # engine = meta.engine raise NotImplementedError("Please specify engine connectivity here") with engine.connect() as connection: # noqa context.configure( connection=connection, target_metadata=target_metadata ) with context.begin_transaction(): context.run_migrations() if context.is_offline_mode(): run_migrations_offline() else: run_migrations_online() zzzeek-alembic-bee044a1c187/alembic/templates/pylons/script.py.mako000066400000000000000000000007561353106760100253340ustar00rootroot00000000000000"""${message} Revision ID: ${up_revision} Revises: ${down_revision | comma,n} Create Date: ${create_date} """ from alembic import op import sqlalchemy as sa ${imports if imports else ""} # revision identifiers, used by Alembic. 
revision = ${repr(up_revision)} down_revision = ${repr(down_revision)} branch_labels = ${repr(branch_labels)} depends_on = ${repr(depends_on)} def upgrade(): ${upgrades if upgrades else "pass"} def downgrade(): ${downgrades if downgrades else "pass"} zzzeek-alembic-bee044a1c187/alembic/testing/000077500000000000000000000000001353106760100206535ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/testing/__init__.py000066400000000000000000000013611353106760100227650ustar00rootroot00000000000000from sqlalchemy.testing import config # noqa from sqlalchemy.testing import engines # noqa from sqlalchemy.testing import exclusions # noqa from sqlalchemy.testing import mock # noqa from sqlalchemy.testing import provide_metadata # noqa from sqlalchemy.testing.config import requirements as requires # noqa from alembic import util # noqa from .assertions import assert_raises # noqa from .assertions import assert_raises_message # noqa from .assertions import emits_python_deprecation_warning # noqa from .assertions import eq_ # noqa from .assertions import eq_ignore_whitespace # noqa from .assertions import is_ # noqa from .assertions import is_not_ # noqa from .assertions import ne_ # noqa from .fixtures import TestBase # noqa zzzeek-alembic-bee044a1c187/alembic/testing/assertions.py000066400000000000000000000050671353106760100234270ustar00rootroot00000000000000from __future__ import absolute_import import re from sqlalchemy.engine import default from sqlalchemy.testing.assertions import _expect_warnings from sqlalchemy.testing.assertions import assert_raises # noqa from sqlalchemy.testing.assertions import assert_raises_message # noqa from sqlalchemy.testing.assertions import eq_ # noqa from sqlalchemy.testing.assertions import is_ # noqa from sqlalchemy.testing.assertions import is_not_ # noqa from sqlalchemy.testing.assertions import ne_ # noqa from sqlalchemy.util import decorator from ..util.compat import py3k def eq_ignore_whitespace(a, b, msg=None): # sqlalchemy.testing.assertion has this function # but not with the special "!U" detection part a = re.sub(r"^\s+?|\n", "", a) a = re.sub(r" {2,}", " ", a) b = re.sub(r"^\s+?|\n", "", b) b = re.sub(r" {2,}", " ", b) # convert for unicode string rendering, # using special escape character "!U" if py3k: b = re.sub(r"!U", "", b) else: b = re.sub(r"!U", "u", b) assert a == b, msg or "%r != %r" % (a, b) _dialect_mods = {} def _get_dialect(name): if name is None or name == "default": return default.DefaultDialect() else: try: dialect_mod = _dialect_mods[name] except KeyError: dialect_mod = getattr( __import__("sqlalchemy.dialects.%s" % name).dialects, name ) _dialect_mods[name] = dialect_mod d = dialect_mod.dialect() if name == "postgresql": d.implicit_returning = True elif name == "mssql": d.legacy_schema_aliasing = False return d def expect_warnings(*messages, **kw): """Context manager which expects one or more warnings. With no arguments, squelches all SAWarnings emitted via sqlalchemy.util.warn and sqlalchemy.util.warn_limited. Otherwise pass string expressions that will match selected warnings via regex; all non-matching warnings are sent through. The expect version **asserts** that the warnings were in fact seen. Note that the test suite sets SAWarning warnings to raise exceptions. """ return _expect_warnings(Warning, messages, **kw) def emits_python_deprecation_warning(*messages): """Decorator form of expect_warnings(). Note that emits_warning does **not** assert that the warnings were in fact seen. 
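    A short illustration; the decorated test and the warning message below
    are hypothetical::

        @emits_python_deprecation_warning("somefunc is deprecated")
        def test_calls_deprecated_fn(self):
            somefunc()
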
""" @decorator def decorate(fn, *args, **kw): with _expect_warnings(DeprecationWarning, assert_=False, *messages): return fn(*args, **kw) return decorate zzzeek-alembic-bee044a1c187/alembic/testing/env.py000066400000000000000000000234531353106760100220240ustar00rootroot00000000000000#!coding: utf-8 import os import shutil import textwrap from sqlalchemy.testing import engines from sqlalchemy.testing import provision from .. import util from ..script import Script from ..script import ScriptDirectory from ..util.compat import get_current_bytecode_suffixes from ..util.compat import has_pep3147 from ..util.compat import u def _get_staging_directory(): if provision.FOLLOWER_IDENT: return "scratch_%s" % provision.FOLLOWER_IDENT else: return "scratch" def staging_env(create=True, template="generic", sourceless=False): from alembic import command, script cfg = _testing_config() if create: path = os.path.join(_get_staging_directory(), "scripts") if os.path.exists(path): shutil.rmtree(path) command.init(cfg, path, template=template) if sourceless: try: # do an import so that a .pyc/.pyo is generated. util.load_python_file(path, "env.py") except AttributeError: # we don't have the migration context set up yet # so running the .env py throws this exception. # theoretically we could be using py_compiler here to # generate .pyc/.pyo without importing but not really # worth it. pass assert sourceless in ( "pep3147_envonly", "simple", "pep3147_everything", ), sourceless make_sourceless( os.path.join(path, "env.py"), "pep3147" if "pep3147" in sourceless else "simple", ) sc = script.ScriptDirectory.from_config(cfg) return sc def clear_staging_env(): shutil.rmtree(_get_staging_directory(), True) def script_file_fixture(txt): dir_ = os.path.join(_get_staging_directory(), "scripts") path = os.path.join(dir_, "script.py.mako") with open(path, "w") as f: f.write(txt) def env_file_fixture(txt): dir_ = os.path.join(_get_staging_directory(), "scripts") txt = ( """ from alembic import context config = context.config """ + txt ) path = os.path.join(dir_, "env.py") pyc_path = util.pyc_file_from_path(path) if pyc_path: os.unlink(pyc_path) with open(path, "w") as f: f.write(txt) def _sqlite_file_db(tempname="foo.db"): dir_ = os.path.join(_get_staging_directory(), "scripts") url = "sqlite:///%s/%s" % (dir_, tempname) return engines.testing_engine(url=url) def _sqlite_testing_config(sourceless=False): dir_ = os.path.join(_get_staging_directory(), "scripts") url = "sqlite:///%s/foo.db" % dir_ return _write_config_file( """ [alembic] script_location = %s sqlalchemy.url = %s sourceless = %s [loggers] keys = root [handlers] keys = console [logger_root] level = WARN handlers = console qualname = [handler_console] class = StreamHandler args = (sys.stderr,) level = NOTSET formatter = generic [formatters] keys = generic [formatter_generic] format = %%(levelname)-5.5s [%%(name)s] %%(message)s datefmt = %%H:%%M:%%S """ % (dir_, url, "true" if sourceless else "false") ) def _multi_dir_testing_config(sourceless=False, extra_version_location=""): dir_ = os.path.join(_get_staging_directory(), "scripts") url = "sqlite:///%s/foo.db" % dir_ return _write_config_file( """ [alembic] script_location = %s sqlalchemy.url = %s sourceless = %s version_locations = %%(here)s/model1/ %%(here)s/model2/ %%(here)s/model3/ %s [loggers] keys = root [handlers] keys = console [logger_root] level = WARN handlers = console qualname = [handler_console] class = StreamHandler args = (sys.stderr,) level = NOTSET formatter = generic [formatters] keys = generic 
[formatter_generic] format = %%(levelname)-5.5s [%%(name)s] %%(message)s datefmt = %%H:%%M:%%S """ % ( dir_, url, "true" if sourceless else "false", extra_version_location, ) ) def _no_sql_testing_config(dialect="postgresql", directives=""): """use a postgresql url with no host so that connections guaranteed to fail""" dir_ = os.path.join(_get_staging_directory(), "scripts") return _write_config_file( """ [alembic] script_location = %s sqlalchemy.url = %s:// %s [loggers] keys = root [handlers] keys = console [logger_root] level = WARN handlers = console qualname = [handler_console] class = StreamHandler args = (sys.stderr,) level = NOTSET formatter = generic [formatters] keys = generic [formatter_generic] format = %%(levelname)-5.5s [%%(name)s] %%(message)s datefmt = %%H:%%M:%%S """ % (dir_, dialect, directives) ) def _write_config_file(text): cfg = _testing_config() with open(cfg.config_file_name, "w") as f: f.write(text) return cfg def _testing_config(): from alembic.config import Config if not os.access(_get_staging_directory(), os.F_OK): os.mkdir(_get_staging_directory()) return Config(os.path.join(_get_staging_directory(), "test_alembic.ini")) def write_script( scriptdir, rev_id, content, encoding="ascii", sourceless=False ): old = scriptdir.revision_map.get_revision(rev_id) path = old.path content = textwrap.dedent(content) if encoding: content = content.encode(encoding) with open(path, "wb") as fp: fp.write(content) pyc_path = util.pyc_file_from_path(path) if pyc_path: os.unlink(pyc_path) script = Script._from_path(scriptdir, path) old = scriptdir.revision_map.get_revision(script.revision) if old.down_revision != script.down_revision: raise Exception( "Can't change down_revision " "on a refresh operation." ) scriptdir.revision_map.add_revision(script, _replace=True) if sourceless: make_sourceless( path, "pep3147" if sourceless == "pep3147_everything" else "simple" ) def make_sourceless(path, style): import py_compile py_compile.compile(path) if style == "simple" and has_pep3147(): pyc_path = util.pyc_file_from_path(path) suffix = get_current_bytecode_suffixes()[0] filepath, ext = os.path.splitext(path) simple_pyc_path = filepath + suffix shutil.move(pyc_path, simple_pyc_path) pyc_path = simple_pyc_path elif style == "pep3147" and not has_pep3147(): raise NotImplementedError() else: assert style in ("pep3147", "simple") pyc_path = util.pyc_file_from_path(path) assert os.access(pyc_path, os.F_OK) os.unlink(path) def three_rev_fixture(cfg): a = util.rev_id() b = util.rev_id() c = util.rev_id() script = ScriptDirectory.from_config(cfg) script.generate_revision(a, "revision a", refresh=True) write_script( script, a, """\ "Rev A" revision = '%s' down_revision = None from alembic import op def upgrade(): op.execute("CREATE STEP 1") def downgrade(): op.execute("DROP STEP 1") """ % a, ) script.generate_revision(b, "revision b", refresh=True) write_script( script, b, u( """# coding: utf-8 "Rev B, méil, %3" revision = '{}' down_revision = '{}' from alembic import op def upgrade(): op.execute("CREATE STEP 2") def downgrade(): op.execute("DROP STEP 2") """ ).format(b, a), encoding="utf-8", ) script.generate_revision(c, "revision c", refresh=True) write_script( script, c, """\ "Rev C" revision = '%s' down_revision = '%s' from alembic import op def upgrade(): op.execute("CREATE STEP 3") def downgrade(): op.execute("DROP STEP 3") """ % (c, b), ) return a, b, c def multi_heads_fixture(cfg, a, b, c): """Create a multiple head fixture from the three-revs fixture""" d = util.rev_id() e = util.rev_id() 
f = util.rev_id() script = ScriptDirectory.from_config(cfg) script.generate_revision( d, "revision d from b", head=b, splice=True, refresh=True ) write_script( script, d, """\ "Rev D" revision = '%s' down_revision = '%s' from alembic import op def upgrade(): op.execute("CREATE STEP 4") def downgrade(): op.execute("DROP STEP 4") """ % (d, b), ) script.generate_revision( e, "revision e from d", head=d, splice=True, refresh=True ) write_script( script, e, """\ "Rev E" revision = '%s' down_revision = '%s' from alembic import op def upgrade(): op.execute("CREATE STEP 5") def downgrade(): op.execute("DROP STEP 5") """ % (e, d), ) script.generate_revision( f, "revision f from b", head=b, splice=True, refresh=True ) write_script( script, f, """\ "Rev F" revision = '%s' down_revision = '%s' from alembic import op def upgrade(): op.execute("CREATE STEP 6") def downgrade(): op.execute("DROP STEP 6") """ % (f, b), ) return d, e, f def _multidb_testing_config(engines): """alembic.ini fixture to work exactly with the 'multidb' template""" dir_ = os.path.join(_get_staging_directory(), "scripts") databases = ", ".join(engines.keys()) engines = "\n\n".join( "[%s]\n" "sqlalchemy.url = %s" % (key, value.url) for key, value in engines.items() ) return _write_config_file( """ [alembic] script_location = %s sourceless = false databases = %s %s [loggers] keys = root [handlers] keys = console [logger_root] level = WARN handlers = console qualname = [handler_console] class = StreamHandler args = (sys.stderr,) level = NOTSET formatter = generic [formatters] keys = generic [formatter_generic] format = %%(levelname)-5.5s [%%(name)s] %%(message)s datefmt = %%H:%%M:%%S """ % (dir_, databases, engines) ) zzzeek-alembic-bee044a1c187/alembic/testing/fixtures.py000066400000000000000000000156611353106760100231070ustar00rootroot00000000000000# coding: utf-8 from contextlib import contextmanager import io import re from sqlalchemy import Column from sqlalchemy import inspect from sqlalchemy import MetaData from sqlalchemy import String from sqlalchemy import Table from sqlalchemy import text from sqlalchemy.testing import config from sqlalchemy.testing import mock from sqlalchemy.testing.assertions import eq_ from sqlalchemy.testing.fixtures import TestBase # noqa import alembic from .assertions import _get_dialect from ..environment import EnvironmentContext from ..migration import MigrationContext from ..operations import Operations from ..util.compat import configparser from ..util.compat import string_types from ..util.compat import text_type from ..util.sqla_compat import create_mock_engine testing_config = configparser.ConfigParser() testing_config.read(["test.cfg"]) def capture_db(): buf = [] def dump(sql, *multiparams, **params): buf.append(str(sql.compile(dialect=engine.dialect))) engine = create_mock_engine("postgresql://", dump) return engine, buf _engs = {} @contextmanager def capture_context_buffer(**kw): if kw.pop("bytes_io", False): buf = io.BytesIO() else: buf = io.StringIO() kw.update({"dialect_name": "sqlite", "output_buffer": buf}) conf = EnvironmentContext.configure def configure(*arg, **opt): opt.update(**kw) return conf(*arg, **opt) with mock.patch.object(EnvironmentContext, "configure", configure): yield buf def op_fixture( dialect="default", as_sql=False, naming_convention=None, literal_binds=False, native_boolean=None, ): opts = {} if naming_convention: opts["target_metadata"] = MetaData(naming_convention=naming_convention) class buffer_(object): def __init__(self): self.lines = [] def write(self, 
msg): msg = msg.strip() msg = re.sub(r"[\n\t]", "", msg) if as_sql: # the impl produces soft tabs, # so search for blocks of 4 spaces msg = re.sub(r" ", "", msg) msg = re.sub(r"\;\n*$", "", msg) self.lines.append(msg) def flush(self): pass buf = buffer_() class ctx(MigrationContext): def clear_assertions(self): buf.lines[:] = [] def assert_(self, *sql): # TODO: make this more flexible about # whitespace and such eq_(buf.lines, list(sql)) def assert_contains(self, sql): for stmt in buf.lines: if sql in stmt: return else: assert False, "Could not locate fragment %r in %r" % ( sql, buf.lines, ) if as_sql: opts["as_sql"] = as_sql if literal_binds: opts["literal_binds"] = literal_binds if dialect == "mariadb": ctx_dialect = _get_dialect("mysql") ctx_dialect.server_version_info = (10, 0, 0, "MariaDB") else: ctx_dialect = _get_dialect(dialect) if native_boolean is not None: ctx_dialect.supports_native_boolean = native_boolean # this is new as of SQLAlchemy 1.2.7 and is used by SQL Server, # which breaks assumptions in the alembic test suite ctx_dialect.non_native_boolean_check_constraint = True if not as_sql: def execute(stmt, *multiparam, **param): if isinstance(stmt, string_types): stmt = text(stmt) assert stmt.supports_execution sql = text_type(stmt.compile(dialect=ctx_dialect)) buf.write(sql) connection = mock.Mock(dialect=ctx_dialect, execute=execute) else: opts["output_buffer"] = buf connection = None context = ctx(ctx_dialect, connection, opts) alembic.op._proxy = Operations(context) return context class AlterColRoundTripFixture(object): # since these tests are about syntax, use more recent SQLAlchemy as some of # the type / server default compare logic might not work on older # SQLAlchemy versions as seems to be the case for SQLAlchemy 1.1 on Oracle __requires__ = ("alter_column", "sqlalchemy_12") def setUp(self): self.conn = config.db.connect() self.ctx = MigrationContext.configure(self.conn) self.op = Operations(self.ctx) self.metadata = MetaData() def _compare_type(self, t1, t2): c1 = Column("q", t1) c2 = Column("q", t2) assert not self.ctx.impl.compare_type( c1, c2 ), "Type objects %r and %r didn't compare as equivalent" % (t1, t2) def _compare_server_default(self, t1, s1, t2, s2): c1 = Column("q", t1, server_default=s1) c2 = Column("q", t2, server_default=s2) assert not self.ctx.impl.compare_server_default( c1, c2, s2, s1 ), "server defaults %r and %r didn't compare as equivalent" % (s1, s2) def tearDown(self): self.metadata.drop_all(self.conn) self.conn.close() def _run_alter_col(self, from_, to_, compare=None): column = Column( from_.get("name", "colname"), from_.get("type", String(10)), nullable=from_.get("nullable", True), server_default=from_.get("server_default", None), # comment=from_.get("comment", None) ) t = Table("x", self.metadata, column) t.create(self.conn) insp = inspect(self.conn) old_col = insp.get_columns("x")[0] # TODO: conditional comment support self.op.alter_column( "x", column.name, existing_type=column.type, existing_server_default=column.server_default if column.server_default is not None else False, existing_nullable=True if column.nullable else False, # existing_comment=column.comment, nullable=to_.get("nullable", None), # modify_comment=False, server_default=to_.get("server_default", False), new_column_name=to_.get("name", None), type_=to_.get("type", None), ) insp = inspect(self.conn) new_col = insp.get_columns("x")[0] if compare is None: compare = to_ eq_( new_col["name"], compare["name"] if "name" in compare else column.name, ) self._compare_type( 
new_col["type"], compare.get("type", old_col["type"]) ) eq_(new_col["nullable"], compare.get("nullable", column.nullable)) self._compare_server_default( new_col["type"], new_col.get("default", None), compare.get("type", old_col["type"]), compare["server_default"].text if "server_default" in compare else column.server_default.arg.text if column.server_default is not None else None, ) zzzeek-alembic-bee044a1c187/alembic/testing/requirements.py000066400000000000000000000064461353106760100237620ustar00rootroot00000000000000import sys from sqlalchemy.testing import exclusions from sqlalchemy.testing.requirements import Requirements from alembic import util from alembic.util import sqla_compat class SuiteRequirements(Requirements): @property def schemas(self): """Target database must support external schemas, and have one named 'test_schema'.""" return exclusions.open() @property def unique_constraint_reflection(self): def doesnt_have_check_uq_constraints(config): from sqlalchemy import inspect # temporary if config.db.name == "oracle": return True insp = inspect(config.db) try: insp.get_unique_constraints("x") except NotImplementedError: return True except TypeError: return True except Exception: pass return False return exclusions.skip_if(doesnt_have_check_uq_constraints) @property def foreign_key_match(self): return exclusions.open() @property def check_constraints_w_enforcement(self): """Target database must support check constraints and also enforce them.""" return exclusions.open() @property def reflects_pk_names(self): return exclusions.closed() @property def reflects_fk_options(self): return exclusions.closed() @property def sqlalchemy_issue_3740(self): """Fixes percent sign escaping for paramstyles that don't require it""" return exclusions.skip_if( lambda config: not util.sqla_120, "SQLAlchemy 1.2 or greater required", ) @property def sqlalchemy_12(self): return exclusions.skip_if( lambda config: not util.sqla_1216, "SQLAlchemy 1.2.16 or greater required", ) @property def sqlalchemy_1115(self): return exclusions.skip_if( lambda config: not util.sqla_1115, "SQLAlchemy 1.1.15 or greater required", ) @property def sqlalchemy_110(self): return exclusions.skip_if( lambda config: not util.sqla_110, "SQLAlchemy 1.1.0 or greater required", ) @property def sqlalchemy_issue_4436(self): def check(config): vers = sqla_compat._vers if vers == (1, 3, 0, "b1"): return True elif vers >= (1, 2, 16): return False else: return True return exclusions.skip_if( check, "SQLAlchemy 1.2.16, 1.3.0b2 or greater required" ) @property def python3(self): return exclusions.skip_if( lambda: sys.version_info < (3,), "Python version 3.xx is required." 
) @property def pep3147(self): return exclusions.only_if(lambda config: util.compat.has_pep3147()) @property def comments(self): return exclusions.only_if( lambda config: sqla_compat._dialect_supports_comments( config.db.dialect ) ) @property def comments_api(self): return exclusions.only_if(lambda config: util.sqla_120) @property def alter_column(self): return exclusions.open() zzzeek-alembic-bee044a1c187/alembic/util/000077500000000000000000000000001353106760100201535ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/alembic/util/__init__.py000066400000000000000000000023471353106760100222720ustar00rootroot00000000000000from .exc import CommandError from .langhelpers import _with_legacy_names # noqa from .langhelpers import asbool # noqa from .langhelpers import dedupe_tuple # noqa from .langhelpers import Dispatcher # noqa from .langhelpers import immutabledict # noqa from .langhelpers import memoized_property # noqa from .langhelpers import ModuleClsProxy # noqa from .langhelpers import rev_id # noqa from .langhelpers import to_list # noqa from .langhelpers import to_tuple # noqa from .messaging import err # noqa from .messaging import format_as_comma # noqa from .messaging import msg # noqa from .messaging import obfuscate_url_pw # noqa from .messaging import status # noqa from .messaging import warn # noqa from .messaging import write_outstream # noqa from .pyfiles import coerce_resource_to_filename # noqa from .pyfiles import edit # noqa from .pyfiles import load_python_file # noqa from .pyfiles import pyc_file_from_path # noqa from .pyfiles import template_to_file # noqa from .sqla_compat import sqla_110 # noqa from .sqla_compat import sqla_1115 # noqa from .sqla_compat import sqla_120 # noqa from .sqla_compat import sqla_1216 # noqa if not sqla_110: raise CommandError("SQLAlchemy 1.1.0 or greater is required. 
") zzzeek-alembic-bee044a1c187/alembic/util/compat.py000066400000000000000000000236431353106760100220200ustar00rootroot00000000000000import collections import inspect import io import sys py27 = sys.version_info >= (2, 7) py2k = sys.version_info.major < 3 py3k = sys.version_info.major >= 3 py33 = sys.version_info >= (3, 3) py35 = sys.version_info >= (3, 5) py36 = sys.version_info >= (3, 6) ArgSpec = collections.namedtuple( "ArgSpec", ["args", "varargs", "keywords", "defaults"] ) def inspect_getargspec(func): """getargspec based on fully vendored getfullargspec from Python 3.3.""" if inspect.ismethod(func): func = func.__func__ if not inspect.isfunction(func): raise TypeError("{!r} is not a Python function".format(func)) co = func.__code__ if not inspect.iscode(co): raise TypeError("{!r} is not a code object".format(co)) nargs = co.co_argcount names = co.co_varnames nkwargs = co.co_kwonlyargcount if py3k else 0 args = list(names[:nargs]) nargs += nkwargs varargs = None if co.co_flags & inspect.CO_VARARGS: varargs = co.co_varnames[nargs] nargs = nargs + 1 varkw = None if co.co_flags & inspect.CO_VARKEYWORDS: varkw = co.co_varnames[nargs] return ArgSpec(args, varargs, varkw, func.__defaults__) if py3k: from io import StringIO else: # accepts strings from StringIO import StringIO # noqa if py3k: import builtins as compat_builtins string_types = (str,) binary_type = bytes text_type = str def callable(fn): # noqa return hasattr(fn, "__call__") def u(s): return s def ue(s): return s range = range # noqa else: import __builtin__ as compat_builtins string_types = (basestring,) # noqa binary_type = str text_type = unicode # noqa callable = callable # noqa def u(s): return unicode(s, "utf-8") # noqa def ue(s): return unicode(s, "unicode_escape") # noqa range = xrange # noqa if py33: import collections.abc as collections_abc else: import collections as collections_abc # noqa if py35: from inspect import formatannotation def inspect_formatargspec( args, varargs=None, varkw=None, defaults=None, kwonlyargs=(), kwonlydefaults={}, annotations={}, formatarg=str, formatvarargs=lambda name: "*" + name, formatvarkw=lambda name: "**" + name, formatvalue=lambda value: "=" + repr(value), formatreturns=lambda text: " -> " + text, formatannotation=formatannotation, ): """Copy formatargspec from python 3.7 standard library. Python 3 has deprecated formatargspec and requested that Signature be used instead, however this requires a full reimplementation of formatargspec() in terms of creating Parameter objects and such. Instead of introducing all the object-creation overhead and having to reinvent from scratch, just copy their compatibility routine. 
""" def formatargandannotation(arg): result = formatarg(arg) if arg in annotations: result += ": " + formatannotation(annotations[arg]) return result specs = [] if defaults: firstdefault = len(args) - len(defaults) for i, arg in enumerate(args): spec = formatargandannotation(arg) if defaults and i >= firstdefault: spec = spec + formatvalue(defaults[i - firstdefault]) specs.append(spec) if varargs is not None: specs.append(formatvarargs(formatargandannotation(varargs))) else: if kwonlyargs: specs.append("*") if kwonlyargs: for kwonlyarg in kwonlyargs: spec = formatargandannotation(kwonlyarg) if kwonlydefaults and kwonlyarg in kwonlydefaults: spec += formatvalue(kwonlydefaults[kwonlyarg]) specs.append(spec) if varkw is not None: specs.append(formatvarkw(formatargandannotation(varkw))) result = "(" + ", ".join(specs) + ")" if "return" in annotations: result += formatreturns(formatannotation(annotations["return"])) return result else: from inspect import formatargspec as inspect_formatargspec # noqa if py3k: from configparser import ConfigParser as SafeConfigParser import configparser else: from ConfigParser import SafeConfigParser # noqa import ConfigParser as configparser # noqa if py2k: from mako.util import parse_encoding if py35: import importlib.util import importlib.machinery def load_module_py(module_id, path): spec = importlib.util.spec_from_file_location(module_id, path) module = importlib.util.module_from_spec(spec) spec.loader.exec_module(module) return module def load_module_pyc(module_id, path): spec = importlib.util.spec_from_file_location(module_id, path) module = importlib.util.module_from_spec(spec) spec.loader.exec_module(module) return module elif py3k: import importlib.machinery def load_module_py(module_id, path): module = importlib.machinery.SourceFileLoader( module_id, path ).load_module(module_id) del sys.modules[module_id] return module def load_module_pyc(module_id, path): module = importlib.machinery.SourcelessFileLoader( module_id, path ).load_module(module_id) del sys.modules[module_id] return module if py3k: def get_bytecode_suffixes(): try: return importlib.machinery.BYTECODE_SUFFIXES except AttributeError: return importlib.machinery.DEBUG_BYTECODE_SUFFIXES def get_current_bytecode_suffixes(): if py35: suffixes = importlib.machinery.BYTECODE_SUFFIXES else: if sys.flags.optimize: suffixes = importlib.machinery.OPTIMIZED_BYTECODE_SUFFIXES else: suffixes = importlib.machinery.BYTECODE_SUFFIXES return suffixes def has_pep3147(): if py35: return True else: # TODO: not sure if we are supporting old versions of Python # the import here emits a deprecation warning which the test # suite only catches if imp wasn't imported alreadt # http://www.python.org/dev/peps/pep-3147/#detecting-pep-3147-availability import imp return hasattr(imp, "get_tag") else: import imp def load_module_py(module_id, path): # noqa with open(path, "rb") as fp: mod = imp.load_source(module_id, path, fp) if py2k: source_encoding = parse_encoding(fp) if source_encoding: mod._alembic_source_encoding = source_encoding del sys.modules[module_id] return mod def load_module_pyc(module_id, path): # noqa with open(path, "rb") as fp: mod = imp.load_compiled(module_id, path, fp) # no source encoding here del sys.modules[module_id] return mod def get_current_bytecode_suffixes(): if sys.flags.optimize: return [".pyo"] # e.g. .pyo else: return [".pyc"] # e.g. 
.pyc def has_pep3147(): return False try: exec_ = getattr(compat_builtins, "exec") except AttributeError: # Python 2 def exec_(func_text, globals_, lcl): exec("exec func_text in globals_, lcl") ################################################ # cross-compatible metaclass implementation # Copyright (c) 2010-2012 Benjamin Peterson def with_metaclass(meta, base=object): """Create a base class with a metaclass.""" return meta("%sBase" % meta.__name__, (base,), {}) ################################################ if py3k: def reraise(tp, value, tb=None, cause=None): if cause is not None: value.__cause__ = cause if value.__traceback__ is not tb: raise value.with_traceback(tb) raise value def raise_from_cause(exception, exc_info=None): if exc_info is None: exc_info = sys.exc_info() exc_type, exc_value, exc_tb = exc_info reraise(type(exception), exception, tb=exc_tb, cause=exc_value) else: exec( "def reraise(tp, value, tb=None, cause=None):\n" " raise tp, value, tb\n" ) def raise_from_cause(exception, exc_info=None): # not as nice as that of Py3K, but at least preserves # the code line where the issue occurred if exc_info is None: exc_info = sys.exc_info() exc_type, exc_value, exc_tb = exc_info reraise(type(exception), exception, tb=exc_tb) # produce a wrapper that allows encoded text to stream # into a given buffer, but doesn't close it. # not sure of a more idiomatic approach to this. class EncodedIO(io.TextIOWrapper): def close(self): pass if py2k: # in Py2K, the io.* package is awkward because it does not # easily wrap the file type (e.g. sys.stdout) and I can't # figure out at all how to wrap StringIO.StringIO # and also might be user specified too. So create a full # adapter. class ActLikePy3kIO(object): """Produce an object capable of wrapping either sys.stdout (e.g. file) *or* StringIO.StringIO(). """ def _false(self): return False def _true(self): return True readable = seekable = _false writable = _true closed = False def __init__(self, file_): self.file_ = file_ def write(self, text): return self.file_.write(text) def flush(self): return self.file_.flush() class EncodedIO(EncodedIO): def __init__(self, file_, encoding): super(EncodedIO, self).__init__( ActLikePy3kIO(file_), encoding=encoding ) zzzeek-alembic-bee044a1c187/alembic/util/exc.py000066400000000000000000000000501353106760100212770ustar00rootroot00000000000000class CommandError(Exception): pass zzzeek-alembic-bee044a1c187/alembic/util/langhelpers.py000066400000000000000000000220361353106760100230340ustar00rootroot00000000000000import collections import textwrap import uuid import warnings from .compat import callable from .compat import collections_abc from .compat import exec_ from .compat import inspect_getargspec from .compat import string_types from .compat import with_metaclass class _ModuleClsMeta(type): def __setattr__(cls, key, value): super(_ModuleClsMeta, cls).__setattr__(key, value) cls._update_module_proxies(key) class ModuleClsProxy(with_metaclass(_ModuleClsMeta)): """Create module level proxy functions for the methods on a given class. The functions will have a compatible signature as the methods. 
""" _setups = collections.defaultdict(lambda: (set(), [])) @classmethod def _update_module_proxies(cls, name): attr_names, modules = cls._setups[cls] for globals_, locals_ in modules: cls._add_proxied_attribute(name, globals_, locals_, attr_names) def _install_proxy(self): attr_names, modules = self._setups[self.__class__] for globals_, locals_ in modules: globals_["_proxy"] = self for attr_name in attr_names: globals_[attr_name] = getattr(self, attr_name) def _remove_proxy(self): attr_names, modules = self._setups[self.__class__] for globals_, locals_ in modules: globals_["_proxy"] = None for attr_name in attr_names: del globals_[attr_name] @classmethod def create_module_class_proxy(cls, globals_, locals_): attr_names, modules = cls._setups[cls] modules.append((globals_, locals_)) cls._setup_proxy(globals_, locals_, attr_names) @classmethod def _setup_proxy(cls, globals_, locals_, attr_names): for methname in dir(cls): cls._add_proxied_attribute(methname, globals_, locals_, attr_names) @classmethod def _add_proxied_attribute(cls, methname, globals_, locals_, attr_names): if not methname.startswith("_"): meth = getattr(cls, methname) if callable(meth): locals_[methname] = cls._create_method_proxy( methname, globals_, locals_ ) else: attr_names.add(methname) @classmethod def _create_method_proxy(cls, name, globals_, locals_): fn = getattr(cls, name) def _name_error(name): raise NameError( "Can't invoke function '%s', as the proxy object has " "not yet been " "established for the Alembic '%s' class. " "Try placing this code inside a callable." % (name, cls.__name__) ) globals_["_name_error"] = _name_error translations = getattr(fn, "_legacy_translations", []) if translations: spec = inspect_getargspec(fn) if spec[0] and spec[0][0] == "self": spec[0].pop(0) outer_args = inner_args = "*args, **kw" translate_str = "args, kw = _translate(%r, %r, %r, args, kw)" % ( fn.__name__, tuple(spec), translations, ) def translate(fn_name, spec, translations, args, kw): return_kw = {} return_args = [] for oldname, newname in translations: if oldname in kw: warnings.warn( "Argument %r is now named %r " "for method %s()." 
% (oldname, newname, fn_name) ) return_kw[newname] = kw.pop(oldname) return_kw.update(kw) args = list(args) if spec[3]: pos_only = spec[0][: -len(spec[3])] else: pos_only = spec[0] for arg in pos_only: if arg not in return_kw: try: return_args.append(args.pop(0)) except IndexError: raise TypeError( "missing required positional argument: %s" % arg ) return_args.extend(args) return return_args, return_kw globals_["_translate"] = translate else: outer_args = "*args, **kw" inner_args = "*args, **kw" translate_str = "" func_text = textwrap.dedent( """\ def %(name)s(%(args)s): %(doc)r %(translate)s try: p = _proxy except NameError: _name_error('%(name)s') return _proxy.%(name)s(%(apply_kw)s) e """ % { "name": name, "translate": translate_str, "args": outer_args, "apply_kw": inner_args, "doc": fn.__doc__, } ) lcl = {} exec_(func_text, globals_, lcl) return lcl[name] def _with_legacy_names(translations): def decorate(fn): fn._legacy_translations = translations return fn return decorate def asbool(value): return value is not None and value.lower() == "true" def rev_id(): return uuid.uuid4().hex[-12:] def to_list(x, default=None): if x is None: return default elif isinstance(x, string_types): return [x] elif isinstance(x, collections_abc.Iterable): return list(x) else: return [x] def to_tuple(x, default=None): if x is None: return default elif isinstance(x, string_types): return (x,) elif isinstance(x, collections_abc.Iterable): return tuple(x) else: return (x,) def unique_list(seq, hashfunc=None): seen = set() seen_add = seen.add if not hashfunc: return [x for x in seq if x not in seen and not seen_add(x)] else: return [ x for x in seq if hashfunc(x) not in seen and not seen_add(hashfunc(x)) ] def dedupe_tuple(tup): return tuple(unique_list(tup)) class memoized_property(object): """A read-only @property that is only evaluated once.""" def __init__(self, fget, doc=None): self.fget = fget self.__doc__ = doc or fget.__doc__ self.__name__ = fget.__name__ def __get__(self, obj, cls): if obj is None: return self obj.__dict__[self.__name__] = result = self.fget(obj) return result class immutabledict(dict): def _immutable(self, *arg, **kw): raise TypeError("%s object is immutable" % self.__class__.__name__) __delitem__ = ( __setitem__ ) = __setattr__ = clear = pop = popitem = setdefault = update = _immutable def __new__(cls, *args): new = dict.__new__(cls) dict.__init__(new, *args) return new def __init__(self, *args): pass def __reduce__(self): return immutabledict, (dict(self),) def union(self, d): if not self: return immutabledict(d) else: d2 = immutabledict(self) dict.update(d2, d) return d2 def __repr__(self): return "immutabledict(%s)" % dict.__repr__(self) class Dispatcher(object): def __init__(self, uselist=False): self._registry = {} self.uselist = uselist def dispatch_for(self, target, qualifier="default"): def decorate(fn): if self.uselist: self._registry.setdefault((target, qualifier), []).append(fn) else: assert (target, qualifier) not in self._registry self._registry[(target, qualifier)] = fn return fn return decorate def dispatch(self, obj, qualifier="default"): if isinstance(obj, string_types): targets = [obj] elif isinstance(obj, type): targets = obj.__mro__ else: targets = type(obj).__mro__ for spcls in targets: if qualifier != "default" and (spcls, qualifier) in self._registry: return self._fn_or_list(self._registry[(spcls, qualifier)]) elif (spcls, "default") in self._registry: return self._fn_or_list(self._registry[(spcls, "default")]) else: raise ValueError("no dispatch function for 
object: %s" % obj) def _fn_or_list(self, fn_or_list): if self.uselist: def go(*arg, **kw): for fn in fn_or_list: fn(*arg, **kw) return go else: return fn_or_list def branch(self): """Return a copy of this dispatcher that is independently writable.""" d = Dispatcher() if self.uselist: d._registry.update( (k, [fn for fn in self._registry[k]]) for k in self._registry ) else: d._registry.update(self._registry) return d zzzeek-alembic-bee044a1c187/alembic/util/messaging.py000066400000000000000000000047421353106760100225110ustar00rootroot00000000000000import logging import sys import textwrap import warnings from sqlalchemy.engine import url from .compat import binary_type from .compat import collections_abc from .compat import py27 from .compat import string_types log = logging.getLogger(__name__) if py27: # disable "no handler found" errors logging.getLogger("alembic").addHandler(logging.NullHandler()) try: import fcntl import termios import struct ioctl = fcntl.ioctl(0, termios.TIOCGWINSZ, struct.pack("HHHH", 0, 0, 0, 0)) _h, TERMWIDTH, _hp, _wp = struct.unpack("HHHH", ioctl) if TERMWIDTH <= 0: # can occur if running in emacs pseudo-tty TERMWIDTH = None except (ImportError, IOError): TERMWIDTH = None def write_outstream(stream, *text): encoding = getattr(stream, "encoding", "ascii") or "ascii" for t in text: if not isinstance(t, binary_type): t = t.encode(encoding, "replace") t = t.decode(encoding) try: stream.write(t) except IOError: # suppress "broken pipe" errors. # no known way to handle this on Python 3 however # as the exception is "ignored" (noisily) in TextIOWrapper. break def status(_statmsg, fn, *arg, **kw): msg(_statmsg + " ...", False) try: ret = fn(*arg, **kw) write_outstream(sys.stdout, " done\n") return ret except: write_outstream(sys.stdout, " FAILED\n") raise def err(message): log.error(message) msg("FAILED: %s" % message) sys.exit(-1) def obfuscate_url_pw(u): u = url.make_url(u) if u.password: u.password = "XXXXX" return str(u) def warn(msg, stacklevel=2): warnings.warn(msg, UserWarning, stacklevel=stacklevel) def msg(msg, newline=True): if TERMWIDTH is None: write_outstream(sys.stdout, msg) if newline: write_outstream(sys.stdout, "\n") else: # left indent output lines lines = textwrap.wrap(msg, TERMWIDTH) if len(lines) > 1: for line in lines[0:-1]: write_outstream(sys.stdout, " ", line, "\n") write_outstream(sys.stdout, " ", lines[-1], ("\n" if newline else "")) def format_as_comma(value): if value is None: return "" elif isinstance(value, string_types): return value elif isinstance(value, collections_abc.Iterable): return ", ".join(value) else: raise ValueError("Don't know how to comma-format %r" % value) zzzeek-alembic-bee044a1c187/alembic/util/pyfiles.py000066400000000000000000000057241353106760100222100ustar00rootroot00000000000000import os import re import tempfile from mako import exceptions from mako.template import Template from .compat import get_current_bytecode_suffixes from .compat import has_pep3147 from .compat import load_module_py from .compat import load_module_pyc from .compat import py35 from .exc import CommandError def template_to_file(template_file, dest, output_encoding, **kw): template = Template(filename=template_file) try: output = template.render_unicode(**kw).encode(output_encoding) except: with tempfile.NamedTemporaryFile(suffix=".txt", delete=False) as ntf: ntf.write( exceptions.text_error_template() .render_unicode() .encode(output_encoding) ) fname = ntf.name raise CommandError( "Template rendering failed; see %s for a " "template-oriented 
traceback." % fname ) else: with open(dest, "wb") as f: f.write(output) def coerce_resource_to_filename(fname): """Interpret a filename as either a filesystem location or as a package resource. Names that are non absolute paths and contain a colon are interpreted as resources and coerced to a file location. """ if not os.path.isabs(fname) and ":" in fname: import pkg_resources fname = pkg_resources.resource_filename(*fname.split(":")) return fname def pyc_file_from_path(path): """Given a python source path, locate the .pyc. """ if has_pep3147(): if py35: import importlib candidate = importlib.util.cache_from_source(path) else: import imp candidate = imp.cache_from_source(path) if os.path.exists(candidate): return candidate # even for pep3147, fall back to the old way of finding .pyc files, # to support sourceless operation filepath, ext = os.path.splitext(path) for ext in get_current_bytecode_suffixes(): if os.path.exists(filepath + ext): return filepath + ext else: return None def edit(path): """Given a source path, run the EDITOR for it""" import editor try: editor.edit(path) except Exception as exc: raise CommandError("Error executing editor (%s)" % (exc,)) def load_python_file(dir_, filename): """Load a file from the given path as a Python module.""" module_id = re.sub(r"\W", "_", filename) path = os.path.join(dir_, filename) _, ext = os.path.splitext(filename) if ext == ".py": if os.path.exists(path): module = load_module_py(module_id, path) else: pyc_path = pyc_file_from_path(path) if pyc_path is None: raise ImportError("Can't find Python file %s" % path) else: module = load_module_pyc(module_id, pyc_path) elif ext in (".pyc", ".pyo"): module = load_module_pyc(module_id, path) return module zzzeek-alembic-bee044a1c187/alembic/util/sqla_compat.py000066400000000000000000000151421353106760100230330ustar00rootroot00000000000000import re from sqlalchemy import __version__ from sqlalchemy import schema from sqlalchemy import sql from sqlalchemy import types as sqltypes from sqlalchemy.ext.compiler import compiles from sqlalchemy.schema import CheckConstraint from sqlalchemy.schema import Column from sqlalchemy.schema import ForeignKeyConstraint from sqlalchemy.sql.expression import _BindParamClause from sqlalchemy.sql.expression import _TextClause as TextClause from sqlalchemy.sql.visitors import traverse from . 
import compat def _safe_int(value): try: return int(value) except: return value _vers = tuple( [_safe_int(x) for x in re.findall(r"(\d+|[abc]\d)", __version__)] ) sqla_110 = _vers >= (1, 1, 0) sqla_1115 = _vers >= (1, 1, 15) sqla_120 = _vers >= (1, 2, 0) sqla_1216 = _vers >= (1, 2, 16) sqla_14 = _vers >= (1, 4) AUTOINCREMENT_DEFAULT = "auto" def _table_for_constraint(constraint): if isinstance(constraint, ForeignKeyConstraint): return constraint.parent else: return constraint.table def _columns_for_constraint(constraint): if isinstance(constraint, ForeignKeyConstraint): return [fk.parent for fk in constraint.elements] elif isinstance(constraint, CheckConstraint): return _find_columns(constraint.sqltext) else: return list(constraint.columns) def _fk_spec(constraint): source_columns = [ constraint.columns[key].name for key in constraint.column_keys ] source_table = constraint.parent.name source_schema = constraint.parent.schema target_schema = constraint.elements[0].column.table.schema target_table = constraint.elements[0].column.table.name target_columns = [element.column.name for element in constraint.elements] ondelete = constraint.ondelete onupdate = constraint.onupdate deferrable = constraint.deferrable initially = constraint.initially return ( source_schema, source_table, source_columns, target_schema, target_table, target_columns, onupdate, ondelete, deferrable, initially, ) def _fk_is_self_referential(constraint): spec = constraint.elements[0]._get_colspec() tokens = spec.split(".") tokens.pop(-1) # colname tablekey = ".".join(tokens) return tablekey == constraint.parent.key def _is_type_bound(constraint): # this deals with SQLAlchemy #3260, don't copy CHECK constraints # that will be generated by the type. # new feature added for #3260 return constraint._type_bound def _find_columns(clause): """locate Column objects within the given expression.""" cols = set() traverse(clause, {}, {"column": cols.add}) return cols def _remove_column_from_collection(collection, column): """remove a column from a ColumnCollection.""" # workaround for older SQLAlchemy, remove the # same object that's present to_remove = collection[column.key] collection.remove(to_remove) def _textual_index_column(table, text_): """a workaround for the Index construct's severe lack of flexibility""" if isinstance(text_, compat.string_types): c = Column(text_, sqltypes.NULLTYPE) table.append_column(c) return c elif isinstance(text_, TextClause): return _textual_index_element(table, text_) else: raise ValueError("String or text() construct expected") class _textual_index_element(sql.ColumnElement): """Wrap around a sqlalchemy text() construct in such a way that we appear like a column-oriented SQL expression to an Index construct. The issue here is that currently the Postgresql dialect, the biggest recipient of functional indexes, keys all the index expressions to the corresponding column expressions when rendering CREATE INDEX, so the Index we create here needs to have a .columns collection that is the same length as the .expressions collection. Ultimately SQLAlchemy should support text() expressions in indexes. See SQLAlchemy issue 3174. 
""" __visit_name__ = "_textual_idx_element" def __init__(self, table, text): self.table = table self.text = text self.key = text.text self.fake_column = schema.Column(self.text.text, sqltypes.NULLTYPE) table.append_column(self.fake_column) def get_children(self): return [self.fake_column] @compiles(_textual_index_element) def _render_textual_index_column(element, compiler, **kw): return compiler.process(element.text, **kw) class _literal_bindparam(_BindParamClause): pass @compiles(_literal_bindparam) def _render_literal_bindparam(element, compiler, **kw): return compiler.render_literal_bindparam(element, **kw) def _get_index_expressions(idx): return list(idx.expressions) def _get_index_column_names(idx): return [getattr(exp, "name", None) for exp in _get_index_expressions(idx)] def _get_index_final_name(dialect, idx): # trying to keep the truncation rules totally localized on the # SQLA side while also stepping around the quoting issue. Ideally # the _prepared_index_name() method on the SQLA side would have # a quoting option or the truncation routine would be broken out. # # test for SQLA quoted_name construct, introduced in # 0.9 or thereabouts. # this doesn't work in 0.8 and the "quote" option on Index doesn't # seem to work in 0.8 either. if hasattr(idx.name, "quote"): # might be quoted_name, might be truncated_name, keep it the # same quoted_name_cls = type(idx.name) new_name = quoted_name_cls(str(idx.name), quote=False) idx = schema.Index(name=new_name) return dialect.ddl_compiler(dialect, None)._prepared_index_name(idx) def _dialect_supports_comments(dialect): if sqla_120: return dialect.supports_comments else: return False def _comment_attribute(obj): """return the .comment attribute from a Table or Column""" if sqla_120: return obj.comment else: return None def _is_mariadb(mysql_dialect): return ( mysql_dialect.server_version_info and "MariaDB" in mysql_dialect.server_version_info ) def _mariadb_normalized_version_info(mysql_dialect): if len(mysql_dialect.server_version_info) > 5: return mysql_dialect.server_version_info[3:] else: return mysql_dialect.server_version_info if sqla_14: from sqlalchemy import create_mock_engine else: from sqlalchemy import create_engine def create_mock_engine(url, executor): return create_engine( "postgresql://", strategy="mock", executor=executor ) zzzeek-alembic-bee044a1c187/docs/000077500000000000000000000000001353106760100165325ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/docs/build/000077500000000000000000000000001353106760100176315ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/docs/build/Makefile000066400000000000000000000064551353106760100213030ustar00rootroot00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = output # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 
.PHONY: help clean html dirhtml pickle json htmlhelp qthelp latex changes linkcheck doctest help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dist-html same as html, but places files in /doc" @echo " dirhtml to make HTML files named index.html in directories" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: -rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dist-html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html cp -R $(BUILDDIR)/html/* ../ rm -fr $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in ../." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Alembic.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Alembic.qhc" latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \ "run these through (pdf)latex." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." 
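A short usage sketch for the Sphinx Makefile above (assumptions: Sphinx is installed so that sphinx-build is on PATH, and the commands are run from docs/build/, where this Makefile lives):

    make html                # rendered pages end up in output/html
    make dist-html           # same build, then the pages are copied up into docs/
                             # and the temporary output/html tree is removed
    make latex PAPER=a4      # expands to:
                             #   sphinx-build -b latex -d output/doctrees \
                             #     -D latex_paper_size=a4 . output/latex

Every build target funnels its options through ALLSPHINXOPTS, so additional sphinx-build flags can be passed the same way, e.g. SPHINXOPTS="-W" to turn warnings into errors.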
zzzeek-alembic-bee044a1c187/docs/build/_static/000077500000000000000000000000001353106760100212575ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/docs/build/_static/nature_override.css000066400000000000000000000005271353106760100251720ustar00rootroot00000000000000@import url("nature.css");
@import url("site_custom_css.css");

.versionadded, .versionchanged, .deprecated {
    background-color: #FFFFCC;
    border: 1px solid #FFFF66;
    margin-bottom: 10px;
    margin-top: 10px;
    padding: 7px;
}

.versionadded > p > span, .versionchanged > p > span, .deprecated > p > span{
    font-style: italic;
}
zzzeek-alembic-bee044a1c187/docs/build/_static/site_custom_css.css000066400000000000000000000000001353106760100251650ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/docs/build/_templates/000077500000000000000000000000001353106760100217665ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/docs/build/_templates/site_custom_sidebars.html000066400000000000000000000000001353106760100270540ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/docs/build/api/000077500000000000000000000000001353106760100204025ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/docs/build/api/api_overview.png000066400000000000000000003620751353106760100236140ustar00rootroot00000000000000[binary PNG image data omitted]
µ¡çHL`Á+¾ÔžO£&¡‚T…M }pÿ&÷>.va±i–ϦãÈKõ Œ4£|ø¸g%îÿü³oU×Y³óJ|õòÎ+”jÜK?sƒÃ q‘áÇšš·ÅoмRÍ÷\ Fhš1 øì‹I07,ðº åÃB¤(‡ˆ<ëYÏJ> m/ŸŸ#e[Ƚ„ u!ˆ²w€×¼æ5IcFC ©ƒgŠýi‚s„ˆŸ& `6~¬ã™Ç…90õ6]çòëåxÎ?ù°ã¶0ê7窣cŒÖ­[—²5ÆÚ £ÖQÊ->.ýåùÕ8­C\­iÿGkaC,æÞüm®ô^Š7±¦²$F*ZBR¸WçqªžfÞ§0d}’˜'bÆV!k¤Š‚X£{´Ïy¨Qž¡ !°CYXô¸[ry£9÷¾‘@Å5Š?€¦j P¡§hWK oŒ" VÊ~‚é f/…¿²±PP„ùØ¢¾ðãÿ8ö¹ Ú8ê[è:à.çM·Â¹Ô1­÷Nò}ô9s˜¹KŸË¯z—©†¸°ñ‡di1˜,XEC¸“,€™Ð ^ÜkòFظÕžFAé‰Úi<š;BÀÙgŸ‡)ð”åˆæ±éF×ôµö‹-@†’ÈRÒ,«L®dÒZEñ€òä“ONÇå§iÂB·ÌIÓµ²ûMÆwE#`ƒXÀWöK ß?ñàjçÃÅ'Œ'nˆ5ƒO~>ßÍ#7ßëéÕê^w…ÀÁú!¨cn5A7†ÅšŠS† Ö™<”…G‚–y{Ø)]u7ãP• ÅZ ÐÍÌ@îºÇ¤~"#z› Bq?:ÆjT`nØ|õ–uN“kÌíæ%z—>›q¦_Ú(pñqú“Mþñm„¤31À*;×陵-Ÿѯ̣\Íe\ A6.‹úœr¡pôöS) i°ÆçDÄ ç“8"mk¿X¤+¿.u®hÑ<7´ƒÍ²ü¾»2›äeiâš‚P~}\Ç´“\ C4›`4ãÂîdë1éK¯LŒ§vYhš€`Ð| *\JC‚ؘ ôå&¸f¦ÉuÚqý'X¥&•ǾÍÄ!ÍúÊÿéÂ@3Ñ[?¶F±Ìs'Üä¾Ýóð8Ÿ“×Å5G°4Á'¬ç1ÏkJ¹òXÓ®8Õµµkö0`ùñÜ/&ÎG–8™D1¹[Zs”¸©)Ì´)KÚÆzÜßÜÇ<Ñ<™à¾dnÊëÓæ<±Pó¾ò0îR'OØâZ[.´ ¯êCè– O„/ ¦•pãÀõØ'aZªèˆ>›³ÆÈf©â$ÚmÃÇDß²N§9,ú˜þFnh D]óßÔ CñÂãÜ‹ßYN°çž{¦ǘjÚƒˆ­ZNï¸ßEü¿ýLĵ´ùöˤH°aYDX ùý«oK#*ÖlðeI•\ƒÅ“‡èè+bW®ˆV@YÀ70QöËú¾„çC !Tc A±¦ƒâT‡á&Ü®¹\s±–ØÀøµh8—=Ê,®©²æ)Q7%×[Þò–äê&ào|c\랢Ïü#Î`MåÄÚ/\ Ì +MÂl„lÁ¬ÚcN1«‘Œ#„$V!ŠÀŠ0´±Ÿí§P) ‰xq¤ÿèc„!B¾e ¡(!וÕ/»!õ­(aÈ /7°°"h.¬¸ÜÞsœï³ùªk,šÛ‚@j‚×#ƒöu¯{]Z×CF'ÖBÉÿøÇôºRub Ä2Ðvˆ?;î¸ã’Û'‚"öà”SNIÁ†åSç„‹båÁЈ‡“ØÃ * "—=‹€%c#A,L#æ¬@ÁÀØ1ÀEu³M­–cNV!ÆØ»CÙÀ‚þÔ§>5%4X[[xôõ®XSnÛ]qªÙ#Z¹ŸZŽ—9 nŽ9æ˜Ä$z¦Ìm”!¬OÜ:¸ò5a¯½öJ1JÆ*K2wmLÀ¸ µ¦‚¸ËÙóEMÄ<¥“å§` ÁpÆ>´ò!Â+a'!t-\äÂ2ÔQõŠ<-ŽHÎ…c_`ô±†Â:Qìõ;}-§è—öM(ÂP#åÿ²ÇÀÎGÔ 2¶ºî¢¼'BÀíM–)Úb‹üv9ºŽLåÜWLб€¯ÿÜnÄ6XW²+RWJ‚Àô‰#4a°0kîË*ð5Gòëåx:1°ùªÕµ ?^‚ÙF8æóöç½rŸêïžûîu2&Î8ãŒêgu 1㊉û‰d7ƒbMÅ©ÂÁ‹€óÛßþ612¬BƒÖH1'ļ ­’ðHº g®üÊ5鄪ü¿gELãæz'æ[¥ åK¬%f-½ý÷ß?.—ý0pñyÿUmw—û× €M°9T7õ·“ŠÕ1ªhWlÎëÃ6J=V!{×›}wê_v‚ ¤” Èæ/–l Ì0 Ÿ… V Â5ˆ¢Ù>ú™>Hp²uѳ" •žµâ0°X‚DsÇá–Æ"D"Œp›cõiÓ8¦ç»ßýnÒ,#yœ\îCl²\SksíŸkÚ<7k©üþ÷¿¯,‰ÉjN‡&˜àV„¡sÓ}üwÏoµêÚèq´”;Y³oÌ·Þ«þòÇêê+/_Pa(ÚlŒä¿âÙkåÛâTYy»R×r¿‹å1m€HKæ0 Ę¥ì\ËȆ*NP ÚU®å–¯(0w üôGW[ßê.ÕêmÚcœç^óôÞŒ*Ǹ !'΀0®öʆ”q\`¾úÕ¯¦¤F–t‘ÜÈ<Ù" Ž6äˆþ¶ùÔ¦o…`D ¡iÞ–Œ0d})A›ÄlÐË-Ö5ăßš &˜FRð]¾XßB·¥Ç»š$¸&€*¤©kn Ü—|ÜÂÒƒ‡[«À_A÷6bà§ï?ªÚzÇ'â¶0h°i`h‰¸¸I—yÉ%—¤oä»s—‘’]4KÍ‹^ô¢jß}÷Mñ ꥅævC10[ð¬H’ o½éMoJÙ¬ÔC`Î-NΕ|aaü0¬Ì÷‰ÿû½³ªëÝñæ[Mÿ~©–¹T(((Á“`@œcý¡ðÃÏ9Î…!åŠU(°U¥8CÞ#²L‚¤ ²ÈJŽT`‚ž†Õ'„¡Ø‡×£o¶áo*…!åÙÏ~vZÜÀHîླcõ椓Nª,„*K–¸‹Ç=îq)\9L¥ókÖ¬Ik¥XäNŠâw¾óÉúQzTõŽw¼£/x´!Æ9ŠŽçYÌm2IkJZǬv=3¯ïè£N錢®k„*Áp„9~å2ÂI¯Jè:ôÐC“„+å*áuéøãO÷òåæ3* ÑÏê@`Á²‚swÛm·tÝ;IÙ 3áó±$q­ì燄Íý°Z NæïŽÔWc€ê'`û¶Üâ|‚¯>ŒåLŽ9 ZÀW9}×B„©zsBkléïÑOʾ9v'sœ÷q=áÇï:²²®Éêmn0–*­Åc®§kÆ­û’zQØñ´o>/ÉBj^—Ñm)eKñB¯%g>’™Z·n]rëÍ×G[ ¸[Ì6®®cø6_½ÑMl1Û²PÏ6·nÐÇÁŒúOBgìs(?^¨vNãsÌ·â…Þþö·§pmDï¹Êñü‚1Êz—Óøn“hSÐQuëgyËÿçåÚÚ1þüœmO™å9A»„ ™ rV !0` ˜þ÷¾÷½ÉÕç€H -»µ[¬ßðÄ'>1•Ç੃àAËÉÝHGÅZ"ã—6psA9äCR½ƒž™ üí‡pÆâíï]¤Oä¦DèBt¤:&ìÈöEƒ/ШÇ{K,öQD 9F7àÃD&l¹ ÑöT師 ÖvÜ)x¤Û AÈ9.MA(ÊÛ›<·Ù>i´­+ôÁ~0U1É|c y®Tð½c‚Žãñãb|~÷æ'Û8aZ2kuÖYÉÆù¾ãªK îHµ;®:G©Gzðˆ'¤¤¨,Œlaº@IDAT0:îòâ.J²Ñ[8™’1ÏÙ£96BëPÄsPF”m&ÖÔJ|nô¦À €;MÂ/BgpF1 ÑŸ¢oég6x‹~ØÆkå=*-CR{rñŠ rAœ¬:9X«a÷ÝwOZËÏ}îsieq/mãN„Ñ̃Æ×Õ-ˆ’=HañV$Ü‘ö¯Ø#‹:($÷Qž9¨îüÚË_þòÊJå‘)D¢šXV–1Ú|×k´ü€ÙT›"ÉÕ*ÏŽdðpÇúþ÷¿Ÿp• –Ÿ±cÀ Æ6ö"-|ÀÄЕõJûŒ·SN9¥â›L(zÒ“ž4°ü\߉6^¿$|­Dˆ9*&h86I¯D.×!¸p‰ÈÍZ\ÁuJ×h’Ñ-í&а†±,Sxl”å)aNR–'÷\éþ… º½sE/†" FÊ~b~‹ý°òåúF p§lÉÇàÆ«åh\˜JaH¼„ ?€PB[×: ¢’»ˆwÞy©(Íš5Z0`¹6ð+_ùJÒ€·Õ—Ÿ#@q±  ÉãÂgËAÏŒòö.Ìhgr¦¨y­ù¿k2!±BÐ8æõisüf]+õÿ]k·…j³ùi\áηÈ7x7aˆ0˜šXnž6}P¿Áp„¿ôb F‹ñíeÇÃpyg‰$º4í‹Ñ¶q<Ó÷×h×)K0£OA°„ æ7feãžq<_[l¹Uµùª ®ã¨3úô8êŠ:¾u̾ÕÎ‡× ×J‰…cOŒ§$„‚Ñ“Ÿü䊋(0.)àÖ­[—„ó&a‡Ð@«*%¾*šâZ⛲ØsÍ–IÔwí^,ú„-Ò§vZ J@nÙ|dÚ¨¬E‘YlŒíÍ uÐó –^›…T=KÌ‘óâ`¹c†”¥øPÎûPR¸Î5Îøä²‰WâypF,0~sæ{«ïþ˜šäl(G»³”*¨ªw¿ûÝ €†Šx_)~MÌRi#>âfº€…å!?ƒñ0Q›è)€ñ¤a£ãr†Èp•ÖØ í#™³6q׳A»8ì™Q·öy¶çîX[ $`˜ÐfŠ‹âÒ±¦ö9þ#ê£h+Ó cø‰ïû1T9þ*Æ E£¼g0´Áäb„ù²bX0!ð¸Ži±…0õ¬„}Ä®Á‡˜·A \.5|Dá@¤D² ýÀqJ!4G߉ûçûÞw}ÉǫͶX=ßjú÷s­Ì÷í_˜ÇÁÕWýµê­ŸÉÌÏ£º‘n•Z=`õÜ@­§“Ãcó˜>]pÍB©!ÀªG˜¢ # ô$S%d¡æßðLH…?è%á°FFô p`á) éôºŸùÌgúíjT×ùw—]vIï¨åA Í3yæ'>ñ‰þ½’2 
2„!ôJr‚X“VR®ñº£™Kû••ƒM0ðËÏœ\]ÿnµËû Ztu$”Sމ CAäcøhþóöGyde=kª°ü˜Ì Em`r'üÈ$À”U‰['€Ï4¿gÿnZ¸Q‚@iÇ®r\ ´G¬Ò(ÏŒgÓ€#¬2Þ!rtÐ ËM”›ïž Eâ.g$~ÀDÌS:9áŸø®±ŸðãµzŒ,ð®ú— ” C´µÁøFºtÚØÐâ.ê ,àÃiæYYg‚ ãµk×.` &ÿ(ý „!ý!úù‡0” D®+ÂÖcÌŒSҦ咵ˆ0ÐôÃoºyI?À¡yݼYA•ËiÑ·”ìf0¤ÞÜõZ=‘Íε¼àNõA]áîÝ?Ùrд嬾¦íö@?ËçŸÜõšPæ=Ô'»)KVÓý:è kw†Z>F9U0P0°ä00Qahqt;‰l´SÜLbò 솦=þÓlq-r0D0ÈëG¸ç˜ØYŽFÀ11„*î=ˆD¾xé gÒÚaäk_ûZz6-"ée/{Y4½o5pBÛ›ï1A® ÈÏå*_!ž;„˜X4áÞÿýó[ü8¾Cì¼-üíß_Ýx·}ë<Œów[†–ÀZ̆oézœg"Ù\k~ë–¦.›SgžyfŠ«0¦ÄìQNŠòX¾åð²úonÎÉ㈸ÎÙô e¢ÿLÓøXß¡ùM×k×¹´­íÈõK)ðsà~-3 OÐsÎ9ç¤þœ»³E¹|¯^ 'æ FˆÈ5 røæ7¿™<$òsÑWr†â+ŸO¢L~_×qWYýWF7ó˜\.ƒè(«U‚‚Éb`=öH|9¬Àä00Qa¨Ùl“n¾5¯çÿ• îü|×1æ‚Y4…ªý×í̘eý® \!šuò̸§ùì8?Îýa‡–´™‚}i¡ßÿþ÷§•ŒÇùŒAuaì@ì•]Ìk¿øôI•ô¾ãr[ÀÔbLôûÜÎ5ÂMkCq=g^ ñlAÚ\ŽÂíÊXØ _ˆ¶Lòú¾¹+„\ ëaÈy}³©<ˆý|ÚwÑ—?ZÝè^¨+Û0çS—{1ùã¶¥÷SûF}?™:_ùÊW&—4™ ç”K]Â$’|ò“ŸLY>Í©*Mð\K/p妰’œ€‚l˜¢Ð£^J6Ç’)ðxÚÓž–Ƈ9›«÷jîq¬/Êå@Ù§ïp¹Û¿Vvqå&°Œ<ƒD@ãº@ðâ%1Žþu–}Á@Á@;$ì3_„¡vüŒëì‚C&Í|ÃØ&égµ{@“1lþŸë ›¸wÚi§”®Y¿j[è„ó¼7„e!qŒlDˆ>†3OF1ß64 `þ?´ÜñŒøæñ¹í½Ÿ~L¸±Ç‡ ã]#a–B¥¦°´ÜpÒö>,¶â ò‡¤ñÎUL÷VÖÌå¾{lÆ=á'ö¢„b¼„5<üüo¨¶½Ó?ŒMЗÊUb–Üíj¾í¼ÝSßP/ººq-­ùÖçþ˜‡b߬“ òªW½ªD¹å;ˆê‚»ßýî)É6®j¬@Ê‹5 øp!3æYŒd¢âä(áОêྭˆIe “ÕN¼§%%r76õ›SdŸ³ÐòóŸÿüJ[§IÀýîw¿äz×Íc#w)̯-…cãm!a«ío]m^'6)P0P00½ج:z“jžª161|eá‘e‹¦íYÏzVʶ…˜͉ý¤Ú¶PõzoÄ«‹H/T;ÆõïïLœ=æB&"®Ü¹Qp òî„[Ü7®¶Ìµž¯±¶ºËÿ>†Q¿åö{Ǭ@6Ç!P”‰¾®íË¥û4áÜ]Ÿ÷¼ç¥ä#´Ì{î¹gbà$™xسûz0Z„!ý“V"{‚PŒ×BhR6îÏ;Œ³ok‡v±:pmœ0nbl¡)–@WÐi³ ]m&ÔˆÁaù‡ûa ä~ô£”Í-_ Ù]à~½æoñùõÿùŸÿIÊAº h‹ÑÖIºšŠAµÞ,¨ú+çä“ONïÞÕ¶i:¯K.éá“2&w[õ^ã{ÓôÎ¥-ËÓ8/ìn|“‰›+|H`²qloüúÕ¯NÂBfC d@ÃH†ݽK™iD0¹h˜|—*Ä7Ä< :DÚ;Ù±Q̸ñ}ã{{縩¾ÿ°vS7ñ®ÞæŸç‚¾L Z)à}Åóëú æÕ9ÚyL–Ô½´öË bž³a'ÆŽþÇ1V”‹~³œð°ï2 xØFóÚ 7AJ[7P–Ÿ.àZgž¤¢¯äåÒ… ýˆã»ÅC®6[5Þq4¡æ–j V$&. ™pl~;[BqE(w‘Ã8N‚ˆ-ä—æ» ùÜq?+¾!Bb#ø„@Ë僅ÈŒSL¾k|ÿq·gšê‹wÔWOŽ£ïÚ¯t!hš¾×B´E?ˆþ‘ï]‹ÿQnû¿{þð þÙ~,1ª“nk©¿``¥b`âÂÄštšKL&ÑyBF:ˆ!)g&—òÇ9öØc—róg´=D%Û°…uÈ5[þÍgT²ŒÿÀ-‚º^7Êt]/ç—.ô6ˆó±o+3mç¦)qB7ùc> fל„–„² ¾ C®Ý)㯉ͥûß÷ þ"ܵÑ#J:›þJºè3ùÌ—.JË 6Å€˜\I`î{ßûnz±œ™‰ C&™Øhb,Z9‚O€ó&*„Ë™·«¢Á ,MÇ>‰of ÁÇ÷´…`ä;۔Ѧã-ªêŸ<¾ºù^O›˜ÛBרOË{OC;Ö®]›R¿se*P00W ˜‹@>'åÂá'®;O82?¡)…®$Ô,«Ÿèx !ˆ@J:ß= Úå—ÊË 40`áhÙ0‹0Ô@LË߉ Cž‡!4ñn‚Žv¸,\çC…`†¦kï{Å÷ Ââû!0ö"{›ëApÜ3MðÛ³?\ÝdÇ·…Eø(Rôr_*0 \ø™“«›>ð€j³-Våµèç8Çðf[ÔY7Ÿ;鉶Äü n0ÁhIÐóF˜p–¢" ¥kLU%¾³~¡-òÝC eëA›”ׇ  ŒŠ.·øÀR šþ9ü¬^/ómo{[Z;ï#ùHZ”yÇ:ÑÊ®»îZYdÜšeèŸ%e,d,‹ê›Þô¦”Ÿ’ÐÎþë·o}ë[Ó˜æ­#Ž8¢úÄ'>‘Ü•-­¬5Ó„&Xpù¸ãŽK¼× _øÂê׿þuJ!oÉ )ÿ_ûÚצ ÍI–ø¤­.óásžóœ´§%Z3çåtýY‚?:­ò¼ãRß ñ‚âûÅwD„òÿÊÆ}KùÝKÛLJ ©~R`2øõ™ï­ntŸ}Æ&èï³Ï>)+›uuÆw~ö©Õªël;¯êÌ+æ—˜{0¹èGÐ×3ïês¡lCøsú2¯F”›§Ac‚ö˜_šSPƶ´éz·¿WµÅµ¶ž \•FÌ çœsNPN;í´êa{XuØa‡¥Š+F„ë—8(ûN©³/žqÆ•5Ã,6þÒ—¾4•?ôÐC«G=êQi;ꨣª8 úä'?Y{î¹Õî»ïž–šxÉK^’‹8¿ï}ïKó¯çŸyæ™iÑh‚”uÔN?ýô$<\¤½wŸ:^ö²—¥ôþG}tõ‚¼ Y†:è êÔSOm­ë¬³ÎJ <ËÀ©ëÖ­KÂ×Ü0µt0d¢B|‚‰6!slÒB ‚ˆAårÒܽûÝïNgÜAÈ‹Ùåâ[3û\@Šo¬A¬³ÍåÙÓ" MÇw˜M+Æ­Úòúã¬Ì=èšt#ÎaTBÑæÈ~Üï3\–²“Á€o4'„!ýÂ÷·q›³ËZ¥|l“iÕ†Zwzò«'Y}©{‚xûÛߞܺ%Á"¸¼ñoLýˆÀCÉMyÀP±ØÄzd,5„¢XÓìÁ~pj¡Ø)éY”PÖzó›ßÜo½úôÙG>ò‘U¤çgù±È1ËÀG~éK_JÇx`z¶?ÊŽÞð†7TŸÿüçSÝ©PöÓU×íoûtÏᇞ½üãÙ]+çpâÂTš¨!{`ŠÿMBæ|Àr!Z´R&ã¥¾È ˆoš¢üXÙøæŽ  a¨ôq`Àdn EØqÌQ芾fîÍ-FÊ,º2<.‡:â›GŸk¡>¡Ø„ìãš¾´j9à ¼Ãd0ÀD©ÂšC¨ÑÁ†+ÜÝï~÷tÍ>à{ßû^_0rî7¸AºÄ…MŸ»ì²Ë¢hõº×½®?qéúhðÁ,K\ì,V,MAxùÿø~½Îßç>÷©¸¸Ýÿþ÷¯öÞ{ïJšÐU×N;íT}ãßH•LœÆI¤ÑoÖ±œÿ/ˆ0:A@þ›¸òî\t€åF°  DÞs)ƒïÄÇ7 ˆãØ;ŸG¹iØ_÷v_mqÍ«ÆOC›VJ^ýêW·j­VÊû—÷¬ª?ýèkÕ6·ÙÈ<Ì'Áübx"nα…ЦØÂó BsÅöôßç»GŸÀ[„à‘ÿ!M+mš~,¯¬Š’€@,Ð]ïz×䆶ÿþû§…ÃÅÿN:é¤êàƒ®÷¸ÇUO~ò“«wÞ¹úô§?]=õ©OMñ:—C9¤ºÃî,;„ª5kÖT_ýêW“0dM´.°à±Ê•¹üòËSü&7½,þlaÖ³Ï>;õÿ¼>ãÁܺêââGiÄ2ÅýKôÿýßÿU¿ûÝïªë^÷ºÕõ¯ýüqËöxÁ„!Œ AŠãØ;g² XNDËÌ<¿ ¡x—¶ÿÍsQvZö·Ùÿ5ÓÒ”×D£Àä0°õŽw« o±Êq[µôŽçW;þïu\Óé\°aŽÉéÚázbA(öÊÇ6—g–{¦¾ylÑìõûŒ¢8üŸ4üø]/®nõØ—T›¯¾æ¤UêŸns›ÛT”xbp¸ÅIJ@óÀ>0m‡w¾óÉÍŒõÃþ0%!Ðÿnr“›$!\ÓÄ‹Ã$dbÞõ®w lñ?ÿó?'Aˆ[Ý\,CÍånyË[V÷¾÷½“€¦²¿ÿû¿¯>ûÙÏVßüæ7“åŠ Ÿ$]u=èAJmÒ~J¤§<å))A„Ä9,bÏxÆ3¶q¹\ܬ&2LÑMa“挽öÚ+™I÷Ë‚€,\•÷(XÊ’zÜVí¯±¶ºË„¡Î5€f„ÐCÿC+Šòñ¿ì—B úDà±ùOÊÿ{ëˆ&qö÷I·µÔ?Ì+¹ò>¿ëguV9Vîg„!YäÄEzksÓùçŸ_íXgšun•0Ý 
+Í…^˜â”(Þ-:-Ûœçk‹¶Rʃ®ºgâœBùõ¹Ï}. Gó”¿ãr<^PËШŒÉlÔòÓ\î3ŸùÌ47¯´­` ` ` £ëÎ &|Cʳ`vós!üD™ 7§T?h Eþ‡àÓ¼6Í-MX‚脼 aDRàd™ËcŠôAq?³[ÜâC‹ç?Yž¸ÀåÐU Sb¥X¾V L¥0´R_Þsq0ð“w¿¤Úñ1/.n ‹ƒþòÔ‚±b ÜxBò¿É´D™±6 T6ÐÚ !׺ʴÝWÎ Ì7½éM“IJ"nHRƒHŽ0—úë^M+ Š0´’¾vyׄK¾ûÅê–—ý_†JXv¸àGW;ü?ϯûöxÖ5“úU@nS¨˜â´m³Uã¡ÌÝü8ok†rl,¯ã®o¾¼Þ²¼ÍRÀ+ôÚ–6¦[:m^R-å/jÕâÓ†þÁ‘²s6mûÖ·¾•V«žÍ=QVZQ™y†µ¡Y>î+ûv üá¼ÿªÖ_ö§ö‹s8kµt)aÇ w}ñG«U[mtãgÝÃêÂ0—myâ`Ø·_¬ë7úû‡/Z_¬w.Ï-Xj(ÂЄ¿ØG?úÑê’K.™ðSJõ³Ç€Œ8,9›»¿øÅ/¦5fsO”•²ÓJÙÃÚÐ,÷•ýÒÅÀæ[nXOcé¾AiyÁÀè¸å#Ÿ3Kèè-(%  †a CÃ0T® ,1 p:ãŒ3ª—½ìe›´œ•ÒªÚV£–jS¶0ëX_À ·ºÕ­Ò Ù,F‚ÏeÁÙe—]ªýöÛ¯úú׿žs“Zô}ï{_e}©8÷¯×]°JöÚµk+‹Ô®:=ã÷¿ÿ}õØÇ>6•‹6tÕÓ,ÿḺÛÝîVÝéNwªŽ9æ˜~Ö°f;­áý-4×ÂØ¾ð…æéò¿` ` ` ` ``bžû_þå_&V©xî(ÂÐÜqWî\¢¸á=÷^–n ²ÖXÁJÔÇwÜŒU°Ï9çœêiO{Z"5sŽÀó§?ý©:ùä“ÓÂq,5ëX,NºNåŸýìgWëÖ­Ku[_ÁstPª¸ºYÁúøã¯=ôÐÔ+ºê<õÔSÓÊÙïÿûS¹hᬭž¼¼U¿Ÿ÷¼çUï}ï{+Wk>¼ûÝïnmço|ã”Zô¡}hZ9üCúPÜ­oð¦7½©²\ÿðÿPÝë^÷J‚ŠõXV¤ÿT×á‡^‰ÝùøÇ?žß’ŽÿøÇ§Œ9ïxÇ;ªN8!­a…ÚêŒkÍý z¢¬ç³\ þû¿ÿ;­èmð¶vZcÐöï|§zùË_žV gMzþóŸ_í¹çži…p÷Z/^¬Î}ÞyçÅ£–ìþúw{`µÅµ¶[ûá{Ük¤ýáÜÿ¯úÍ6Ãckh©¨`` 1pÅïYÿ– ¢)l^iÒ  5¼¤É>òÈ#+馟ûÜç&ú¹Ûn»% UtyB¸öº×½.)!¥©þþ÷¿ïT4ö!yH¢i<%xL€ûÝï~Õ)§œRÝç>÷Iÿ-âÊëb=öH‹´òz dtï\bÓÃVðO†&üñiÏ-”5áÇ—êW¾÷½ïU_þò—«‡?üái·í·ß¾ÿÖÜÖLàV’¶xZ׺1yêÏHML`bña±yñ‹_ܺæ“ÿŸøÄꪫ®J®tý×muæ×óãAõD9+k8˜èm÷¸Ç=*BÔ°v*çýï|ç;W,D¬V@ TB¼‰ƒúîw¿Z²û-eüx2ÉAÂO~ò“êz×»ÞXñq“û>®úÝ—?V¬CcÅj©l1ð«ÏŸZ­Þz»ilZiÓøóŸÿ\ɨɓ‚Ä=ü—¿üeRÈQ¶=æ1I1á]žl'tRE á­A¹À‹]RÆÚ?p@ºôío»úÏÿüÏŠ‚‘’7/}ìc©‹¨Röµy=p!çÝÁ;äàƒ®(J Œ†" †§9—ú·û·9©Ïù寇ÂÈøÃêæ7¿yr_£A¢2‘›„?ò‘¤ë&tñ=â‰FI1,&Hl "`Í“´„²q±¨€~ðƒ)vÇÄ+þhä÷æe»êÉË\¬¬MK'Ý3M7¾®vÒ ½ô¥/­nw»ÛUÆ¡6:gA:³Ýwß½zžPYlîÇ?þq:ÎÛTŽ'ƒ­n~»j›ÛÞ£úáÛŸW­¿t|™ï&ÓÚRkÁÀÜ0ð‡o|¶ºä;_¨nú€ LîÜj)w-6îr—»TÏ|æ3«m·Ý6¹…³–âŸH´èF7ºQõ¥/})5±Í‚‡Æ£ý褈SKàòÍ»Á’„œ[ßú։Ʀ‹õØ"tË‚©”vbg¹®£ç.¯eyhüà?HÖ$JLnáÜÍ Æ@Ygh0~ÊÕeˆï¿ùéÕíz]µœ²Zm³Í6)&H\Máï|gÒD±œˆ‘1Áг‘öh<èAJîwê‘à€²õÖ[§Éõ•¯|er¿sޱãŽ;&a ¡ I3¹·+"或- «D(/àÐ~qA=Ù÷ÙgŸô¾ÑÎ_üâ‰xàÉjvƒÜ •ÂÄ]оÀÂcàV9²ºàï­~òž—U·}Êëû øë/ª®üÓw‘8¹å¶ÛW«®³mü­®øÃ…ÕU—þ±ÿßÁ5opËÚ=ð:ýs—_ô³jý—öÿ;¸ÖMn5cm±Ë~ýãêê«þš•Ù¬Úꦷ©6Ûâod±NDré…?¬7Û|‹ºÌm­Þ™îë­¿ªºôW?ª{鿟Íë5”®µýNýÿW_yyuÙo~Úÿï`‹-¯]]óF;ôÏ­¯×=»ü÷¿èÿw ýø–׿YÿÜU¾¸ºâ’ßôÿ;¸Æ67¬V_÷†ýs}T,*þ~ù骋þ»vë}ìKf|ç­+GK9íà€f„úÄM›2­ÍâW¿úUrçŽw¥ˆ–+àÁòÀ.””ñLÊÀÓO?=¹¨¬9ädíaI¼((/ºè¢äõÀ¥&DY4•Åêõ¯}JlD *ÐÍjäoœÅ»Ë•+Ë_;bmu—#þ½Z]3ËXoºËt­‰“ŸÿüçÉU.OÃÍT¯îÕ«W' ™ú¶Ûn»ä’ÇBdBîSެvÍXVŸ¶zšåcòo \y;½ß k]m^*çÏ?ù°ê6û¿ºÚü×K“Å‹½â¯Hî‰c©°QÉ•üÝ fþ‚?¦ú¿Ÿ|cF©ëïò êf:¨î‡o{^uùï~Þÿïà&{<®ºÑ½÷Ùp®î_ß{ã?WW]63IÃ-÷~Vu½;Ý7•!||ÿ„ƒ««¯Ü( aâHëæ‰M=ñÄûqµ,7”u²¬r(!Ñ1™Y¹Ãñ¶à¶MÀY³fMRN*'æ‡ tÁ$Ц‰U¢î½ÿýïŸâ‰xP(ªS8†ØW Í\ìÄ1í»ï¾)kl3F8,?30P„¡èÿŸ{Þóž)x›æ¾Àt``¥CÓí­½@fº“ÁÀ¸û6• f‚‚‚•„¦0d.ä¢MqG`azÑ‹^”„¥6aÈ’„B kÍmo{ÛJ‚"ÂZ(±k‘ëïz×»*Ira®¹¶óö 4y®Ø#V^OzÒ“Rùø&<6ÄŽö°‡U«V­ŠKe?E‚ ù^.ÌÄ|18þûÇÍ0Ž¿…˳FÉZ´“ÁÀ¸ûv™¿&óJ­K„"nq£$Çb¢Ï“»Òy{×dxåfÞ•ÜH9în¼)$GäÙ0èšz tc $PèÆM¹R0P00F ˜¨Møc|T©ª` ` ` ` ``"³3Š äáJ;í´Ó&‚P\s4ˆ.Š÷_Yb»\ßÕ7èšëº1P„¡nÜ”+Ë7{àu@vIwºÐŸ7ÖOXèç–ç     ,E p³kÆ×.Å÷˜ö6‡ÂiÿB¥}cÇÀö{>iìu– ‡c@¥&‡í÷x|µêÚ3¯Í÷IüÛ#Ò|ë*÷    Ì2Ƙ<JÌЄq,ÓG‘²nMøQ¥ú‚‚‚‚‚‚‚‚‚‚Y` C³@V)Z0P00w ýÍo~“2ê̽–rgÁ@Á@Á@Á@ÁÀÊÀÀ9眓ÖZo»xoYb†÷åÉ‹„oóOÕúÆ$‹Ô”õØ~ðƒ)›ÜŠzéò²sÄÀ½îu¯´Hëo/·ˆ" ˆ¨Rlù`à¯úC½ØâÆ•Ÿ—Ï›M÷›XŒUVœ“ÃÀ·}|µþò™‹Îçiûí·_µõùÔSî-(((˜=®¾úê”N{öw–;fƒÂ™Ì[s(+·ü%—\2‡;Ë-Ë W^yeuk\cy½Ô”½Íÿªºú¯ãôßóž÷T_|ñ”½eiNÁ@Á@Á@Á@ÁÀø0P„¡ñárFM¤yðë_ÿ:-–å8Î9.P0°Ò0@Z½zõJ{íò¾SŒ" Màã<üá¯>𤚥E|ÿûߟŽ=öØêC™ÀK•Ó‹Ï5WàžþV—     ,xSŠÉã¾CÀñn»íVrÊ)©fëªüå/IÇιV`q1°ã£^X­Þú‹Ûˆøôg<ãÕk_ûÚøæå•     fŸüä'Õnt£ÙßX04+tVø‰O|buÆgT^xaÿ†¯|å+)­ð#ùÈþ¹r°8¸þ®{UUm¥(°°`Ún»íö¡+ìi;<âÿ­ýëí­Ÿð„'”o66l–Š   f‡›ßüæ³»¡”žÊ:CsBÛð›öÚk¯j=ö¨~ô£U·¼å-“ Ôëõª“N:iøÍ¥DÁ@Á@Á@Á@Á@Á@Á@Á@Á@ÁÀÄ1P,CBñþûïßw•»üòËSÜsã̶µøo³tZð½ï}¯:ÿüó—NƒKK     ûØÇ*Šô“Å@± M¿ í·ß>m!“Ü~ðƒ =­T; |ý¥ÿXí|ø«U[]w6·•²óÄ€ä!×¹ÎuªcŽ9fž5•Û»0Ð[eµÙ%c_~Êù‚‚‚¥„‰‡~ùË_V7»ÙÍ–R³—\[W-¹/‘_óš×¬ûØÇöÝâ 8=îê+¯¨zWýub -Nì'ö %V±EW¯{Ýë®ÈóZ@~çÆµ?ï¨GT;¿àýÕ[m3–*¹ú~êSŸª¶Þzë±ÔW*)(((˜ /1;|Í¥t†æ‚µïá'FhóÍ7¯$U(°¼0`‚Ê׎ÊãMË$˜¨*±pÕUWm<¹BŽÚ 
¸pÜV&®º_Å¥ÕÕW]Qm1ê CÊ}á _¨N?ýô”ÍèÆ7¾qµãŽ;öï æ¿øEÿ¿ñ‘š¸à‚ ªßþö·ñ7íwÚi§ê7ؘ͑ëdsaê;ÞñŽÕ6Ûlè¾õ­oU—^zéŒzîz×»V”NÀ8ûú׿^YË*NwÙe—jÕª dN¿S&§RÖÞínwëãžEÿ¼ó΋*ÒþÚ×¾v%+hÀŸþô§ŠËgÛn»mu»ÛÝ®ê÷¿ÿ}õãÿ¸ÿßÁ8ðDzzç;ßyF½md0µ¼!vß}÷Qäg?ûYõ¶·½-½Ó—¿üåêõ¯}º~ÔQGUpº÷Þ{§{O<ñĄ׃:¨O¿>üáWG}tõ׿þµÚo¿ýªÃ?<•yÎsžSé+¾›ãë]ïzÕ©§žZxàÕ;ì0ãù/yÉKª>ðÕ}ï{ßç»þ˜7¾ñoŒž±ú_.ë|ÐÁ؃q¬MùñbµqšŸk¼Ù lŠœn:ºçc¿é3Ïah&>þë°]ç­1tûÛß>1\ær"ô¡]‹ûË~á1à{Çdn™ûÃþÐoˆëÑ'bß¿¸‚0ÏÓ/~ñ‹+ 1–íócÌ*F[?q~\BѸ¼ë®»VÏ|æ3Su÷¼ç=+Ìpk÷Ç?þñø›ö²eüñýsÏ{Þóª¯~õ«ýÿ>øàêE/zQÿܰ‰PõŠW¼¢r`†ýèG÷—(p®N>ùäêÁ~°¿Õ¯~õ«T&g¶Øb‹ê£ýhv”ùö·¿Ê¬_¿Þß„!L|¸ ÈúÔ§>µ?~" }ó›ß¬¶ÜrËt¼ô¥/MÇñs‹[Ü¢:ûì³ãoja"‡qàïîw¿{z§¼ÞüøœsΩÞúÖ·V§vZõ°‡=¬:ì°ÃÒex{àÚo~ó›êIOzRÂáOúÓþí[‚Ëe—]VùnܺÝG‘·ûî»'ÇùOúÓÜî_+ûdº"øÛÍsÏ=·Z·n]õ¾÷½/áí¡}h*C zÄ#‘]ö¯~õ««§?ýéék A± ~÷»ß%nøóŒ•Ðÿ¦Å¢›ÓBÊ´òâ‹/NŸ:hcìÌS¡ò“0`\Sb؈ £ÎÄqì)vÞyç4OéSÎǵ5ÌçÜ{νçÜ[ïZûì}öP»êÝUõÎo•çËãò>Ç}×,´ŠÀ æmÐ÷âZs_–Ów­¼¯߃Û^÷ä3ô‘ÁN÷¿G;}ÏÕî£è R¢Á¦éÆ ôõ…î’7Æ•þçÎÌær£CŒ[L$"÷Ó?ýÓ¹ÿ8o íÖ¤xºõ„̓]Oз<é£õþÆÀùçŸ?8çœs²PAˆ´ð7áë”SN|âŸìµ×^Yø ¤—]vY¶…@KüÍßüÍ,ü<îqðX•UÆ,KÏ…^8xÑ‹^”Ÿ¿þú볕÷ÄO¸ÿå/y¾×ZzÑ¿ÝhY‰ .¸`ð©O}* DIJ¬òÞÿþ÷ö°‡ XŒX¦*¬ðO¸]Ë´Ìh$ ¡‹þ—´ÒµJ/a¡þñÿ1ëåÒ„þÒ××Õ˜wÐÕø‡Èxs.¼úp¸î-C1ØbïÓÇ -Ï9.ÏÇ5û8ŽnSþï:Ž{ A&Ú¶€ø`mÿËkql_{.>jy¾<.ËŽóqn£ïúâ3'„àÌ·ÖGb£a¶™ìinÖ’ð,Â7¥eÞi§–úò"ÔyÖu46¹4éGúÚØv¼üq{O|Ò¬ë_ËŸ °_Ï~ö³³µ,¬X—^zé€5Æ5û.½~Ç5’$^LâÌKZ¸+ª¸M>ô¡ÍnlùÈG¯ýëó¿æšk–^ã¾ï~÷»ƒï|ç;ƒK.¹dðªW½* CÜ)Ÿõ¬gåküÇ<¸ãŽ;ª0´„µÅ>@#Íaæµ —èd¥•“W–ñÏ“?½1Ÿ(é*zªßˆúè꺆4Ôâ¸üoP‚œùOú‰{ã<³<ŸoZµù—É{D%Žã<¢ÀG¢íËã8g06¤Ü—ÇL{4f‚Ì Pö\âØˆ=èAKî4¥P¤Øœ‹cïŽcûòþ³îÿÐn†  ÑGìõß5¾q×3õüÖÀHUØúF6´Wî(Çê¶OtŸùåž¶,A¿»Äze0 >ÈšvbXW¸µqcã.ÅŽp"~•‹ØïüÎïdëZÂz ÿQÖÝ|óÍ9^èßøÆ€µ‡Õƽ,M¬@—+¯¼2[€íb‡Œi®wb·Ž9æ˜ì–ÇåðƒŽ¾ï}ïË.sb·¼›uŠðÏ’ÅZ„Ñ{éK_šã«Ð¹ ‹à¹‚Ÿ …¡9®Â䨂Ðä8ó~<°þëÁûß„…†‚AÕ( Ahðж:ŽI A4h¹LúˆM¯äÞûÜç>Y!Œ„P{­‡?üáù„è!yHÞÄ"±H\æÖ@S÷K¿ôKY€ !(:€½ ¸Vó•/ÖŸm0}MŸò½‚õ=úƒ€ø·˜ ×+“O>yðÕ¯~uðîw¿{Åe­¤Ó|ÖøÑ?Ã?æì/þâ/òg#[Ï´ ¾qùÞ•”­1ûÊÁˆ–ý(žQIÇåO=ü }¯ª×Ö1XhÄâR¸Å½õ­oÍñZÇ{lNX iÅ8âúF€ÒÇ$~ Àˆoh‰ó„ªM›6e%à0Øyçóx“pˆóau’übÿý÷Ï嚹ˡ©¼"$P¤¡L˜±ß~û vÝu×ÁÙgŸ÷¹°ú³n0ó¨ù/!‰Iô•‹¥~û¼ç=oðä'?y¥Åmõp‰°®dÜ%?øÁæ2hiÕæ—U“Óxä#™ËqM&„ˆ&Ø$Àò¨<ã‚_?&Ñ ”`Š”©}K¹ú‰ó ÆV™n¼_;ÜíÕ^ÏÁ7@@áŒëAHhï(0>üág·i|­çšg¸”5L{à1,o¾¼4Œ©‘YxY~kPÇQgnFÆ”ò¸±8¢y‚k †(ß5ß%,=ÊQ^S¢êë}6uY®ôƒýŸ5y‚P!c úp:âZ› äþ 5Ígå5¥€(;þ—û¾kå}õxù`!\+þÌÜü™ù ­Ð¬‘}¼ŽòºhÍrèVÔáA„®™ãÅ‹¯œÐLÊhÙËú\ÚàÂR,©hIÜk¾Gçýz…&£ h >¬‹6I€–©ew“Π]ÊbåEÿÔ¡ÌÜÈÛϦ–ÊÜQNŒý0ùæçÛ“›I>—\JòßC™i‚&†|øøÇ?>_O–¡aJòqÊ•¿o$ò3i¢ÎcC=cKnCߤ”µùüïþîïÓ¤‘Ód=L!_ß²eK>—Ràæ÷*#Yo†ú§:Lx¾–‹a¶òýÆebŒòy÷&Íò0¥ÝÍ×Fµ7i›—žó¬ql\;Në¡ä2â'¥.&%Ç0)@ò©$ô“À¶ô¼ö§ôËù(cóæÍôNK>NÚ¢aÒôåëI+¸ôœû~ÿ÷˜”ÃäÒ³tn¢ÉU#ŸO“ú0M¸ù¼ç’ûÑ0›/=ã\Ê´UÞfïzÊê5LÖëܧ’E,÷1ýÅø›tìÝvâ^Ã|ïoóžz¢b b b`50ŸÌ§¡•I‰4¼ë®»òüfž6ßuÑʨ[²Ìo5£wɺž/'‹P.ÃôÑšIéÖqÇ—ËNîC´A]Ñ“ä:—ßeîw.h€ùùÐCÍçœO‚Û]Eš2'.Ñ1÷&¡exÐAå9>­k–ËH‚ÖÒcêà>üEmòîä±ôn÷&Eÿ0­Ç–Ë@?Ð"çmx‚$ð “‚céhq´Ó=i™‚̺aOÐG»ødÑ[ª÷¡ÃI±ÚI/ûê·ÔˆŽå'ê0 z¹ÿ5éjÛc´°s ˜>›Eˆbþ4 c˜¤ðaÒ–1hÉo93.sÙ9®”É*­™1L‹çe\&íÁ0¥gÍ8ÆÃûzŠ–+ Á !Üdž´F™Á¿îºëòïšàCúÅ_üÅ!F?6“(áÀ})¨x‰ \{íµ¹üä¶•ÿ‡ðeÒm¦òüç??ß³Ï>û,Mˆ)^`˜Ü·òÄÄÄ—|°‡)}n¤L&îATL¦§Ÿ~ú0iêòó„·³4LVÆüßä¬Ïh«Éž@‘\ ò1aÉ>ª½Ÿþô§‡IË•Ÿùä'?™‰jÒBçòÓ‹¹>m?ˆM²LeR}½‡P DCÚ…è©BO¾© <ˆ/á%i¼†É"ÿOAäÔþw˜,\ùrkËUˆû«¤ÑËcH= íö®=÷Üsxã7¶U9ŸsOÊÂ5Ô6„ÁóêCP5ö&†&íÛ«***&Ä€ù Ïfþ2™ÏÌk~ó¦ù®‹VÆ«(· \pÁ0Åè “6˜<òåREk&¡[ aÈ\ž,kô6V®ïoüÆoäw7…¡”8$_Ož6Ô qh¯}¡¸Ìýø'ÅñåkiͶaŠ=Ê |÷¦5Ø2nw Cm´)Þ9Î|CÔ•ߌ'V&\¾ýío¦&ù?Z Ð$×=O©I0ó_ýÀ(ž 6†0Tò!¥ÊW.å_Zc,ÿík_ÛJ/GÕ/W²çÇ»Eô;ý[$GèŸmtõž%ÐÓÓ󩢄´ì.’¨leîâ$xSzNkp]±öàM&ý “a€Û“ÀXAp)ƒ™TÓÉ&FøOƒ+›W}ÛFƒ²Íú¦- ¨¼%†u,tp+ ·û0½Çö”µIŠY®aI(È—’%#7'¡+AœÀBˆ]«lŽ·~³´´»ÜÁ,„è9ï©Kà‡mÄ$àäÓÜNK "Yó‚ŒÎ ŽN'lû/†Íâ”Lç²GéGÜìQÜj±Õ¾ör`ÎhÍE€Ë°º}äN8m«¿Å!¹ÆúVøÀ–ãöÀ…„{ róŠo®ƒIˆË÷q#·¤L>Æ2jq%Š8"¼@Ò ¸V^šhób£â+|_k­®\Rû ú’½ñ}¬ï™z­b b b  ⹸:­Ÿ`Ã;˜ÓÌmãÐJÏr‹NÌpžƒ“à”yüIÆ¥5ãЭ²l¼eŒ2÷iÑè¶÷K%Î:ë¬E…ñ§o{ÛÛr(FYžc®ÜÂ7ÐxîjܬA[¹ùBñÓF›ðl )õ–ø! 
\Ä“ã^]—µQœS¹X¶óhxúÓŸžc%¤@sƒ¶å‹é§‹'E=_ò!ÜØÅY(p-ÚßF/Ç­_.¤ã'úœ}ÉÃvÜ>˜»˜!È€Py›”49h.ix3C÷Ä'>±«=õü21 ÀLêTæÐêàW\qENǪ³†¯e’öó0©‹ Ûm¿ÃÀ6 è“Á¤:Ž ^ߤ­%ätAøë¯û0éÀÿdáÉý}±=ÉÚÓUÔ ™Ÿ—‚ž«1;v“/éB Q‡ˆýÑÞÄÆ€¸®_€ðS†—(Óă{´‰`ïj¶7®ÇžµØ¾ÑÚ‚Èðuvž¯t¼72]¹'ŽÎI”@èæ™&h‹²á@V¬íˆõXøi#"„"“½u\–æ¸Ròî ËÅ…%RÄD.·œ•Ú¢Žèu”¸ƒgÂP'}¦¤‡A§ƒæ>Š6*¯äCü§€¼á†rì1šÝ£êüh_¥¤MÚ[Ûsws¶mWÖà\TTÅm˜ƒà´ÓNËše ¿UAhö&¹^emˆÎ$‹IÙ©b0̾³{Ã#|OJžðÀe½@ûõÍr¿¬‚õ —É1ßü¦óâ‡pÂL”BWrmË·%7¬%‚Çq-Êé«CÜ3j“§”×Éo9o´@ˆA™¡g’w…eE&¡e³I,ó\´á ˆã¸çÇÙ«Ÿ¶À³€Oï“€D;²&K¥5Vt`eoZÄøŠùOßZ.lGY±Ýd‚þrßUŸ«¨¨hb œÇãálæ¹Q€™·ä&ÞœšÜÄ3­m{n\ZÓölß9ÂU@ŠuÍÊ>%b3‚Pr‹Ë–1øèƒqë×WFÐÕà×ôÅ>˜;a(*‚(ùŽfs`hrûT¯M4ò:0#¼G(@IDATÌ`t¬èP±ŸÎÛV·”Ÿø/»,ë…e›ë«m«¶Â¹d±æ”×·q€UÇäé[xoGã<ËT--µt¸,5ÌÞ/JYq€Å§ &Iuån¦mˆZJr009ÓkE£¡r/WBˆuSœ£ÁÒ–£Ž:*[xR`nÎîC{È*ƒ¨È2'í)—Oý—0ÃõomÔÚú<áM¿O‰%2î­krôÝáÑ<¥ž,CÒ²š¯@«ˆP¸/ÛH®rù¿ 9ðšˆÿP0«g#7ߨøqýÜsÏ&+'Q¸!ùQçd1IÐj ôlQÿV T T l…óJÒôoun5þàš´Rr˜¤ÊíãÔ˼l®MÌú01ÝÃäþ•çÑ´TðL  =}´fRº Ð=ÉÔ5-FÈ8|‰gýZ“Qo.ç]²í53î)ÇDZNúͲõÍQ‹ô5Ÿ)ÿ{žË[ þ7Ï•×ûŽMÄã€~ÔÖ—Úˆâ8åÕ{**6.dëœ0§Q&‚¶ù­«ŽãAñìJiM”Sîñãò™ê:N}¹Šu BÞíz²D•ÕØê¸6aîË$ ñ‰ŸLtÜî¸1¶%P¸q«[.LJGñMz¹’úEß‹þ7ªs# ©¸-¡Ð.Ѓ䳔Z[:@© ë·~«U‚Õèz}[ À»Ì‚Ãe"yÃÞ08ì°Ã²f ä{„L:W|¯EdÖÔ}0œ¦V®æöIŸ%¬Ö{fÙ†õX¶ [_Ò·¢ŸM»VRo‹Ãõ‹ýZÿc9Âd#Vt' éߣüÓWò®Qí¨×+6 Œ»1Xí6›¿b F4æµÕ®K}_7¤øf=<à€ºoZæÉÄ÷óæÚ²eËß± ß‘!‚gë«ÃÜ¥ †Û^#Ä«ŒϬû‘ÒéeÆŸ&3¥„ÎV‹%:/Öˆ¦„0”RžóœçäUq­ÂkÕc‚ SJ8Î+×ä™YdÿÀ4°ŠÉöÆ2¶ÿþûçÕše cÓ¶”9ãÐ"Üà<+½\øÎî[㓜ý¢Â£OøÈ`ÇŸlÏîÕÖ¦r •“:° M®2³L$á/0«÷Dù³Ü³\rë\ 4ñ±’²¦õlô©ÒBãlœw¸W† “O>y›ÛÍe)%úà#ùHHXÚSšÖË1¥Ò[ßúÖ¼6†µ4(E9älQÞ¼ysŽ3ôÒ®2½C¿¶f7 e¶ršïrŸyY€°ooÝ*mm«§Ì…ÚÏòÕÂKx…ŠŠÕÁ€q¡Rѳ:oß8oY MŽŽ“¾‡A¼-ƒ›L‚ög6ÎkfrºŠÆÆ¦o¶ÁÜX†T®¬$‚Ç"±\JO‘­BañD8¹Ž±(}ë[ߤë9µ!!LÌŒT·RâÊpQîã˜à!s\ßæýˆs‹ÄÎcR+7ulÛëT´=>®¬`R!á~!·ó ƒÊÌlZ:á´À·°ù>ñb?­w¬f9;î<¾ õ2€>L+Ah\<ÐÔ铬t¥…ò½ï}o\ZÚûÖ}š|ý¤Ì¾B£Ì¨«‚ô-çÚÞãºïÛ•Gl™þ=©: ¥±ÒŒåƒ3ãΦ]ík{'œ5•#Ê2.J\F[ÛöM|´Ý³ç´Cß ÆÁ¾üŽmuJë Þ÷¾÷ þäOþ$gîKkG-ÝÆ–ð") 7X ç|gó åÁ’2%­•”]Øôço~ó›Y²x,žÅ¹$³ IZÃåM?Ik7dË3ÿ™‹ºÊT·´NHVL]yå•ù^•$œµ•S¾Kÿ~Ík^3¸üò˳ÒÆUŸoÖ“–Óœmaa÷¨«…ûô[.Ói®œjX¬©@âÕʪ¸ôAêAÅÀÀ@ÌYæ/ó™ÿöãÒɕЌQ4¶D¿ÖmôÎühÞé£'x3t LJÊ:8·Î]ïi£‰£ÚÐÇWt=;É{,8Ë%Ú‚ëàÕ¯~õRlÿ´”ǹà?ú^ÐU{mÎG?-‹˜+aHÅTº¬x4 ¬ôJŽRV![xŸbëJt þ¸¶¦€Óöâû„¥¸†Y6Èa›Rúð¦æ?’u«íc¶µgçÊoÇÓ(wÑʈA}r.¸¼Éë¡ä¶èÿk_ûÚÌ”é_âµÒ§yRæ‚dky>Õi1·¬µÑ¬Ípýõ×ç,¬z¬‚Yb^Ì. <"¤ «Rs´ò´\ýEÂ>Ƹ|OZT4[R)¸P²&z'×SnUÚˆEÐŽ?þøÁ©§ž:ò³Q2¨³÷›üöÙgŸÌ´«Ã'?ùɆT=# ÕÕ}}ïÄP#”V¾6ÉbŒƒ¦EPód ÅàsýT&Ÿ%ÓK`pŸ¶¥…à¶ÂǬ×[‰¬tƒqÇ4AåMozSVèøÖiÁÖ¥5"XàE»á9-æ×*ÄRY¿ è{”C,çÊòõ“üãùzù£_rM¾à‚ ²Ð¤¯´•¹ûî»Çå­ö}åÄÞÏrEˆiѾ¬¸J‹ýnSO¸‹XR1‘ê÷º×½. 
D\5F^^xá`¯½öÊé×YŒô« ëãÎ'³l»:˜ëm}°\š1 µæ +øE]”×Ý¡D¦”A‡k ¡!æA°9YÁÑ1Ê ó_̳mô'?Ðó3IÓEçÚhb_̧f”Ÿ»í¶[æ;à$ ëÙIßCo!ø°*á™HèïjAÐU}/xµ8×Z‡tqÍ!1byÕâ¤]Ì+àZE7üa ê¦ÎŠSX“Õ”×1sP¸OLä01$ä&†b˜ðaÒ.䕃砊WáŸn¿f8L+8 i /­ª´ñCý3¹ ÓàZYº«Z}:­+°´rb؇IÍÏÂaZÐm˜&º\ ÷&«M>Nï01¹Ã$Œ“ëæ0忦x·|Í*ØIx¦Im˜âÀò¹äÖ4L.¡©I?&y^UÚøÉ=h˜þ|ì§|OòJØÆ_âóqbhó½§Ÿ~znW$òwN®¤y…íäNµTV×ú[™; a¹>VžNþÃÔz|˜,AÃä²™U×]vÙexÖYgõNøKÖ…|ïW\‘Û™¬º¹Ý¾C²Ò}Ÿ$¸ £?|Ö³ž•ߟ”¹{°‰\Øþè;Iˆ&!dxûí·“Õ:ÏIÑ’û\[ÕCŸWOBÁ0 Ù[Ý’„ÐaŠÌ}É|ZB²­w)çÒ¥$ “OwþŸ˜’|œ¤¡þÎ>ûìa¶ó±û1Z<¹YäïéBW™I€& v~Ö=É Õ[Nù®SN9eøÔ§>u˜„ߥ-¾[=sÁéGNJ†ar&Á>÷»¸¿IHÊ+³§¬™qºî+Ö’[Rž×¢QÉËe˜¬'y.F+“9ór}´r¹4c›3&¯™\/´)Å1“0ãɼ€%¥PFZœâË—p˜â¬óœ^§Ló|—”×Ã6ú3 ç“Ô¹ÎyOIûÚ€n£½ÉÕ8óI`Éô=YÞsuûžä=î5ÇãJH‚äðˆ#Ž(OÍô¸IWõC¼A²ÄuÒÕ~q=•¸Z0³¤!-ßZøò\=^] ”߯ñ¢Ãß}ðƒ|wüøœf›ý×/Ã]®ÜuhVÒD›ãh¨Ód˜­~Ox²ûQù`ÜwbrC ‹gí¹qsO±$%îòÅ9ûÑŸBñÝô«¶Éº¬6÷¾C=4oÉ¢”.`ˆž>&F†+Ä)’py\*%bQw_B×]17„s®Î2÷„1nlú^×Êâ§oJh#Xt•Ã%ÞEãz}CÐãÚ¬§¾@!ð’—¼dpÓM7m•FL”q‘¬NyuèÚƒÿ®gëùÅÁ@ÌçjÜu¼8­™ŸšÂ¥-è¤ccª VJ3&¡±å<å94’…pÄmÖüBY‚~4¡FÑŸæ³ÍÿãÖy’÷ }]màŽOùZBI‹ûž-ŸqÜw/Ť¹˜¢³8‡•ϬôXÿ‹~7ŠžzW†VŠñúü†À@K{h .LlÄs}÷”×døÃˆÊjÚÇ] ã— P\ELÖå„ßõ&•`c²  S B›´îžÏ7ÊãX œÓ6Ú·¼ÓD0î;1Ø›“·˜˜V+íb')œ’¤Œ[v”·Z{õ*û”ÿ“Ô•£ß¬Yû 6×Xë@r=Ë[´/2¼ù/)4¥‘Ü‚µN$¬ŠùÝ+_½>ÄBDhRn@Y¦µƒdµ“Ô%î¡í*'Þ¥,åèßÉ5eIàò®f=ÕGŒZH‚fÛõæ94ˆhóZý¿þ0c­wq×­ÕÉ%yÍÖ½ƒ3ó}‰»r~kârµh†÷†×ƒãH(C¡cž‘i•åYl+ºZ*éÜ_Â(úSÞÛv\â¦ízœ›ä=Éݼ³ æLí+A4ô=›o(~úî¥`5׈B‰åQñš¬C« ú\ƒï* ­öשï[H „ðS2IØ$LÖ8 'È‚¤{—§ŸܱI¦ žJ¸ € õõ†•6§ÌêF¢ýJ1H™`T–bV™Ó— êA2a @åºÅ"pÉ%—äL_0°VÖ—@ABî ãF\Ö2b^òÚ‡~xnÍ—s„+Z>nܽ\gI#4I2Ñ…qÞ?«{ô©ø^1iǾÉTôÕA_üÿ>ûáÁƒ›‚Wï½óV·NÒOKá4 )ûNdRr >»úf<«-mÙ-»Êñý@ŒµwÞ9»v °M(™›æµåþ—hD¿­°þ1 oƦµ˜FõΕŒ”ÿ‹èˆñ‚™^m(ç,84Žc,·Õe–4£ù>î³,È8ñPô Ÿè¨ïN¢àDsAÌËÍrúèò¦£ÞSÒľ6 ·,îè0÷c ™|! õ=«-ã¾Ç}„ núêÀ-™;÷jCô»ÆûÆq†VûëÔ÷­9¬1´ýN?1q=LŒ%ô¿©Ÿ¸ÐÆ´ùÜÉÜ‚L¬&¯ëRÜG˜ä¬9&r êÊ'k‰Ž{æV¶µ’™ôÂ×¼€9mË ÖöÞ®s˜nYÛø¬ËÔã=b3"sÏ;¹äad¹O\Æiœá…pÇw›f‹;ž¶c^7„U”z°(`¬6.eb‡šøÐ˜ ƒ‰°·-þÇ'ß=xУŸ¼0´œ²Öâ™h»=Bf|Ùs+aYœD¨[‹ú×w.6X¸‡“ù> ÆiüŸç½9Ò¶V`ü—}ÐLÇm0KšÑ|zÊ…Ëœ‚.Š·b‰ÐYôCœ¤Œ•âgÅš šÐGš÷®äÿ¨÷”4qTRR¤¼¾$¾‚Õí õì$ïyò“Ÿ¼´„B”/þ*%Pˆ¿«¶1[öÇ®—o—njï¡]OÌ༎É5ˆ´ÎçcC,ˆ?§À5Œõ «‹I<+f™†—¦–¦˜F¸$«[³¼m˜ü—·»‡È*Iÿ¤-b]±™,õOV›YôO– ¾wš´Œsõ!dtÁ])V‡E'¬:ÌÒzLÒirGh‚q'°‘P $îoÞ7é8ÃTzD%Dp*÷„¶:•÷6=«F™´ø\BE“6]øh¾oÖÿá[¬ñ…PcÆWŒ1íœä›üå–ý>öƒƒv¾ÿ¬«>“òõ$Éž d3ôëJf‚òZhc»eÌ[þÇt<ÉX,Š]õCõ\‹ñbì¢)A+ƒ—ã`Îe•éª×¬hF ßšŒïK‰f( ¹Ý¢?%x¿5ÓxMøæ„Iô¦Ïº¶\úS¾wœã¾÷”4qT( Å?âØF±4êÙq߇{ï½wöPGîyÖãÆîÿj€þ®nÑU¼Að®¤1¦Ëº¬NÍÊ7Ö㊵ÆÀ‚PTµÔ8Ž-®OsOƒÂUŽuǤ͟™¦ª6¥D%4»„fmÖ’IàÀì¼=¥+Í!!¹+v c+P~9Â^>Øn‡š§â¿vD™„©"Ù¯Prˆ­2þ,ø:MÀJl!!…5 š€adÕ$§´âÍËõó¡Å/­»6mÀëwú£ùQ_d½ˆq8í÷­§òbîÒ¦8Žq=ª³¢Í÷bŠmMðþˆsu²·OrOý‡>zvh{O<_Ò„QmSqµñ|ìG=;î{ „‰‹XÖ¬#ÈEoµ¡h½¾‡ž”}1Ž›ã¹ C%æêqÅÀ Œ;±(¦÷2·.™À¤fÒ&ëJÖ[ð”.J‘]a6hö/‰ÄÑhJD@CˆAë‚E„´'ˆâ… ˆ–yµ€g¹瞛ÇÞ´…!ZÙ´>SvOi†¬ olIúQ…¡þ/.u>ë÷,„!Â7/d “ñ}³É@õ×rí®bl«KéÖø—Õ´ŒSÜúêôþUú8ÈJwù¦KÞô0=Ý’ª04]|ÖÒßþLZïæ ÏIy©w˜ËÚ2Ó#ö6&0^i±ÝÌËhÄÂvHJ3Èö Bë[ [X„³ˆ} ôpÛàŽÈ3 ¸®r/dî˜êÉšÃeع;î¸#Ç!8&ÐùÏÙ´Òz—ëܱ(;lÀ^àqdñË'Óë°gÙ"#¹D\kîûîU_.,,T´·¥5™Ë¯ëⲂHµ/ŽF;­kÂÀ¤÷«'a0²dzO‹§÷ÑX«?ë\dÒŒ{ì¹ÝRHˆ2K |GŸÓ^‚ˆý,ß=­²o¾ùæeY̧õþy,Gâž «ƒBgd-3¾®ÎÛ—÷–ñ'–W~}ªb`î0ðß/ÿ£Á¾÷¿æ®^µB¡é–lâ/xAf°eý±0àÑG]60³rÃ<|ç¦?|é¯nßêº{Kë aKjê² k@@;WÞc1½YBhß1å„ © ÷T‰M¸Ã‚oûÛ9ö-›áᕯ|e’d½’Êœ N0êe™£îõNëŠ$dŒ¨,B!°ž“˜\+{‚R¬å¿¶HÖâ-ç~™áJ¼¡‡PKHlâAî@pé}ñÝõ]f1žÛc=²å&Wd?Þ‘qã"úç ^9õ"—ë†<õŠÔ+UZU«Y1P1°>1@Ë~ÜqÇ žö´§åÌ€WŒ—l| Å0Ûø³cV­ÏSnåz_¿ôƒç¤¬{åuAËe6BB—sÍ{JaÇqóA±Á<ÏòK`ì1 ¬ÿR›Æ{_ÿú×ç² -2*²LlÙ²¥µè“O>ypùå—g×6 L|‚é 7ܰt¿cide£Rg ¿bæ ÚÒŒ÷cõÛk¯½²+ áêmo{[^Ûi©Àpä²'ÆHJ`AÏ2?:n¨{e˜$ûæÕÅ,+ëÌ3Ï\*Šk+úFâÂø›ßüæ,À"eØ ÷~8ÚaÝe—]–¸×ŽÛ¾|ÛMƒîÿàÎÛdU ­|×MR W Ââ@xÁˆ.0¦Óïyï{ß›ã)àîsŸË‚'¸6À üe­‘ºž•ˆûA‚%°¸XSKySc âP,>%#ÍòwÞyçåg}S–!.<¬%†Ôù∤¬ýØÇ>¶´Hò¸÷®ºêªlÕºøâ‹sÛ kúÁˆÀ@8ä®Ç"hí.õd5" K]/f­„qîgºõÖ[3•Ã4qN`b]ãbŸúÔ§|ð’PdÎGÉ2糂RÒmÑ?gõÎi—kAÑjº«„q–_1,ó²ÓŠFWG¹Ä®¤-!G,91k è3ÇQjÌ3P-C³î µüŠ)b€²çž{f7qäïÇl­œp K«LcðhÌ+,âLXb$K`ÅÁt ¶·¾†ÜõQ‚ÐòÞ«Àhn]€é* 0æÌÙ¦04ÒÖá*¡¯¼ò¾®cîï*¿8/ÐûꫯÎL¢sâd;»å–[â–º_&ı°ú 
Ú·Vb‚‘¥Ÿ¦`°ÌêÍô±èëѰMÓ2—\­´°×^{m'ãAp hQ+cƒu£ÍU­1‘xÀ=¬$Vm«³GZßÜ{ä‘G¶.ŒÜwo$B°H1œ‚8Žkùä?e;Ƹ='\Ї¹ÒSæt)MºÊksŒ9ßnVýÎ^ á<úæ¬Þ;Ír1Ó꽈ÐçÉ¥5À÷裟+¡ƒ]4Û»Õa¹ÂŠu—Ê~Ô,‹Õþ3ŸùÌ6ITFµ¥¤ÙêÖF³‹xC1©ÜsK0ÏÂg_ÛšüH<¯]eœ×Ncž"Ì;P…¡ü™êÏFÂÀ#:°Ã½ïÖâβÝ&î1+üõ#(˜;w×¹…D|ú8ÇTŽ)P,àXœ†ì‰O|b®²òü¼}Î9çäs˜& 3!I²Ø..@yb†3î7;L¢å`V”˅˰˜GŒc·1.¡ivžæãSa:€[1CÖk Ýd—¹®7Üû?3Øþ^÷麼¢óÇü1]QA#FDc ¡hÄ#c_&Ø`ºá‘û qP%ƒRÆ]‹j%x«ÕsU3ö¸pM%£/ñÂ3ŸùÌÁÞð†ì'K™$M‡Ì ’0ˆû‘JÚ˜¤ŒoBTß½’:xŸÅ¢ÃºÅXlÖV%ßÒ<Ætá¼YŸ˜·ÄËyÖ˜˜%¨§ Sh‹¾8n}gY·õZ6Wï“N:)[A èþG £>Ï2˪(ñ裟¬,¿è eÆ%—\²„6qiè^©`AÇ"ãYͦŒ`Q•|ŘRßqú„¸<ã­f!Q'n¶h®ÿÆñ•W^™ë(c¢ÿ„ Æà÷Pñˆ‘¥­fóÖÀWàlþçžËaU%à™ÏÄZû‡×Ð×-#à\+)x(tñ#®æ$VÁoh ·n1°€ÂÝ Ĺyæª0ä U¨˜¤CæC›‚yk@À €ŒÎ?ÿül®›!f „Ì—¸.<&˜L€VŒͨÿ„«˜Ø­E´)¹Ô¸FX1á†Õ„WZLâ´D1™;~Ç;Þ18ì°Ã2Ãar6±r)21›Ð1!á’Ç d,Áä“zy¾¯àfLI[©ÿÏk?0Øá'îÛviÅç0¢ï±âÂz ˆ>‰¥ïÓTöÓz 3õ²—½,[1y.[cèÎ;ïÜæ™ÓO?=Ç à?è ƒ²å“’!„mã7W믱…‘hÌ¢…¬­2»a. ›ã$“˜”ÂPôÅY½«–{7ôcJ}3&üÀ¿ïrGe5E?Y"Ba`ÜrÕ P^XGâºè ‹f£„&‚9šªïr½ÇBl»âŠ+òümþÔ6™ GÚm^ AŒku¡Ütï8q,ÁHJMšÍêæËʺFÑÚ|ûí·g! µÑ|ŠYŠ®Ø1õÓ&–#ž  ‹qͽ܌ͣ6õÌœ‹Æ#Ô ¾h…Š)cÀD&öÃ$DE›$˜Zp5íɸ&@›6† œ«acˆ¶¤}ï£Ùbi¢ó3.p·ÃT̓ɚŸ¯:c`_4M´É´vqo”O›Ç…ÅÄU¨˜7‚jÛ8 ó$ï‘ ‚©ÛVc÷Þ„žEˆæ˜Æ™ bZ0ä彬Nåe„åÕ±EËëÆ,!(Öâ"S^7o`¸¤òæšg|u½»ï^ï¦íë͢vD|ë5%”šaçÕ!RŠû?éý„/–lZqu%d4]kÅr5ñ`ƒKç•1K(ûß,ß3˲YãÂ"7Ë÷L³ìW½êU™.*“Õ”…H¿Æ)e!+*á ‹~RæøÑ)ô `ì›éíó…ÆOÐÐ6šM¨"|ÌõA–!4›Áb; ŒýX82„€†~š3ÐV &H ÂSÞC¡PBI³)I ‹pC°aµ! Æú< X” áŽÊŠl.¢Ø ßÅø’ÐöXª$j‰±‹Gˆ˜È¨ó<óÕ2_©î7 þû'þ`0üŸi{Y{hQ þ"­k´Ò%Ðv²D €ÿ˜¥°þÄõæÑ0áqi h¾#ÎÛ7µcΉK  BÔ?6Z,±O„#„ƒWB¼;4Nåµz\10.Œ›I-ã–÷a®Ú¡¸ÞÜZXBi^_Îã%¡¾çÕs·ÝvëÝ£îÅ()«„úÞ=Ík˜S‚Ðrƒ6kAHÝ0sÁÐM[_nÛ'}ŽàYÒIŸ_‹ûCxñn®aíðlüï?裟²è¡?%í+sŸ …Êí¢Ùè!‹ á$è¡LŒãÎSÚ@Bïí2Â2÷ØSl–8qŽr²„’f£Í\÷ÌUBÊ8ІO‚h™í±ìK%?rÄGd¡ &*‹y$Haý^$¡Z†Æé-õžu…o.¥¿}Ò½é‡WÚ`š#Õ$17Ι¬hpJ ²ŽFÀ¸“lܯLælD0´1Žšn˜Àãzsoò£é _j×ÅX`¶hųˆ¤`0 reÀ´TX |óÊó?³÷‹Ûí°ÓÚT`Jo &T¿ Æ”Ï~3ûÙ”^W‹©Øú^Ùÿ¶¹aŽO¬†Ð8íæ—‰5æñ¶MZÙG?Ñ–‘p¥ä^à{‚Rø@'Í1}4=ÄìS:F\ÊÆUüÅ3Qqöhm¹ž™gÔµäÊr¤àÅ‚Jic¾,× +ï-á“ ÜÅitÕ_}Xñ¸ñ[;L\«äNBMaÙ‹ræ™G¨ÂP|¥º¯˜"ÄÏ‚LPÌú´J|çs "<õÔSsœàC1EBM*Ë$BWþÇ\Õ‡«÷‹hÄM”BNÛ;ixd›9üðÃ{ØÃ²Yß9B8B ´Œ01 0íš8ÛÞSÏMÿxýÙ}ÿ™úbU¸`®`Plú@ðp…ŠYb€Å £¬Ï• ³wÖym–˜¿;þКVR¨·¹[÷ÑO:(þÝåÍ Ž&íc‘áv)v†+¥è£ÙÒÙsãã¶*ÆÇ3ÜÈʼn­x§5À¸üi3×UÖ±mÀ’ÄrE"†[k(“xŽHœÔîú\÷¹Êá¸ÇòX—FÁgœ‘­?bÑ~Þ/aQ’Œ!ÖJ‹ræ™G¨ÂP|¥º¯˜"XNLœ‚ÞM&`“h˜¹ ,&Sæh1þÇáS¼œj(ߤOàâ ƒˆ˜¨Å-}÷Ý7î|ƒ‘‡ÛA¾¡ñÃW˜FKwÂ…`23x ­Òl_^«Ç‹ë¯¿~UÄó ( #d5Þ«‚´ú’©`@‹ø sjã ¾8•͸ò˜ÙEª3šÅ­ŠEˆòC¬^<æ1É ¿6úé›YÝ¥x$ÐRèEYÖc½ X‘öYY’z€Q4[R#î`¯{Ýë2Þ+kV@¤´ŒDÜt H]®¦„54&ZÈû³ŸýlŽ9‚ Ùãðáõ–†%­§ÀeYâXºÆÅ½Í½wª“¸#&‚˜c€GˆØÌxnžy„íÒ`¿'Z4j¼Ê{Ÿ‹€xŒ VŸQAb•®òGùñëL¦ÿYh…™¡›TXú굩ñè·ÞzÂæÁ®'|dlíyÙ?e2Ñ蟬/ÖÍè럞ÌHè1Q—Ž\˜èW \ß.´BLÔ‚F-²F( 0ŽKa1Šó]{qL\“L”þ I£éÑ.Z)Áé²ßUÆ—dÆ—Ô¦4ÓJŒ1¸dŒMÚ·Ç«åìïB†ôÛ ƘÍ1Î0>•Ìþ;lô7æ0ÖôΘ4c<”AÀPGóöj»,7ieðrA+Y~ÚÆ±z¾/¥|F;¸Ÿ¡‰£ ~šG¸Œñ˜(éV”)#žß„>šÍb¢\ ÅY+fàNáF{ð\É º¬6êŽ6³ð e„`ã4»l;Ë’ïD ê¸ÊûãNx‰à+š8áAÂ;&–Y-¡®ïªmmtukî,ZW÷ëø•ߘYúá&Ú0²Í ȸ'òúÇÿ•î3"â4^ ?^Ú¡"ÉAy®ïXàgü÷Ò>1Ýs7íFp„Cu? 
JA/5ò]åP”Y?‹•uñæòi ®®`jJ åpUݲeË6 vÕ¡ž_ýpôI{[œ_”ßvÛmYËÏu)˜_ ±$ÆXŦ5Ú+Ÿ;t 4ÙÊÀf*z–4 À˜rf™b£¹(gÜWî1¬¶q ~šXRº ÏÝ·fÃkR ÜÑn¾ùæòÔÒ1·½åº¿ÃO )‰—x˜ðÖèšÃ¼TÝ#FÙß6!ÿᬠ(lm“œHÊÒ\¹,†æG¨ÂPÛ¬çÖ5zÈÝ‹þ­·Fš”1‡˜>ZÂÑJ\ïFᇿ038a覛nZòQõ\½>; Üo—]g&èKQkmš`¬0?á6©E„kÁ” òÓà‚A3\—î1Ápr‰³al4¤m@[Šá+»¶ûœ£Í &› „ð©wß?·Œí(ÿüý÷ßߥ  ¥ðCŠ-„…E@æ”ë—¾‰n¤;f)ã¡0ÌõÊt]r­æV ^®T¡t˜ßdò)Ú¸eð(á2e|Æך{[Ÿ€Ò¼žþs›˜;­ãÃRiÉ Â¨ø¡H=‹wN»L ÅòN¡œw¡ºÉMû믳ò€õæ&7é'2‘‡ç¤nr“¾«Þ¿±0`|MÓMn–Ø£Ew½óS¦oµvŒ¸²ø½—øïKÙ^ÿ~«ÌÓZÓJs; í5† ã&ÉH›{MYN»Ÿ&ºK б ÅÚ#\<Ä îÅ1ÐFW«›Ü˜ßškMʦM›Æ|bôm\ˆsĽBÅ@Å@ÅÀ"càÖ[oí­>‹‹­Î>ûì­ BÍ-„°q£üÜÛ¬4bñh¼ÅFÒÈh¬¹äpÙ¼üòË3CËMƒ¶[Œ[l—̬˜.£R>’Œ`ê¸uHYo£YÂj½GŽ;î¸ì¢eUïÒ 1ËöÕ²§‹`:í %ö¶>&”ÀRÆ!°Y“„é–[nÉ.{±²; µ`b‚’UêeU Õ—¾ô¥­b´ŒûŸUÕ+lL èw,Œb0¢ÚW¨¨XŸØÐ· îO]À_ÛF îØ„¶{Ë{ä^ïz—ò‚(ŸqŒˆ—¿¼Ž¡õÞ “aàßñ™Áÿïw'{hÜ=IœÅ¤MfÿéY¾§¬È{² 2üÎ¥j£ÂßèMIП­ »¸ Æ3„¢qß)iH9ï¾ûîÛ,&ØVŽ8 ‰GJ 5ÄËëõxýb@Ÿ#h‹³¥P´~[^[V1°q1°!…!~ãü1QÖTq.€ð±yóæœ&P&-LË »Rrß@pí»î²¸eìºë®9K‡÷ñ¸êª«rÊb„W°Ü«_ý긔Ó!ZSvu”ÆX`/ð^ëÒ8ôÜ>ûì“Ô–®s‡.¶­¢ú"× å_\S@S`$K{.õv å3åù8nÖ¿M€'¸Û@Û{0Òwö½Ë8kS ´)Üǽ5Æ6Yg«ÔF…ºýš$èo»°Þ"â#¢ØÓ†2s’þÐ\S£Yw<–$ JÔ¡*šJŒlœc>gOæš¶yiã`euZÊݵ©€ü¢,Ã%ô)ź¼€|ó¾ç”_ÒB}bÔýeêñüb`C C²«p§¹ôÒK³+¢Wvh>åR "’¬IR³Za—9¡Æ3ö]÷Æ'ÿä'?9°¨•¤ šsn€&ý”SNÉïÊR@¯Àa ‹ˆµ ¾õ­oåÍ åÊŽ>úèõ_dHRo›Œ^æ ê7¿ùÍ9 Œo&Óf=€P+; [‚·õ'ñ~ðƒóºC„p÷ŒÏ*‡V\¦küñ¹ß黄f™å€8 Á!´p#Òœ“ùÅbtÊÍú;×%ÀŸzê©y,è£Ö@h¾Ç8 ¸K ª>GuÔ’Â9¸¡Pí÷RBa—R@šQãžÄ~ˆ9!VX< `BK¦3˜Òq[Ìë¸÷ˤÅ-Ï\_‚yW¬°ñ0 ?PH5 ²_n<¬Ì¶Åt@…Æsžóœ%eÄ(Úqì±Çf¥2Z÷„'—¥Q÷–ïññXL©‰V捻¥ ±æEd3r‡+œ H|àsªWÚJŒ$F“k͆MjZnI}õTf…»1ðà_ú`ǰ*èIз"tˉyˆrY¹T²¤Â3™‹%À]"aíVEVH}­© /ë?J€/Ë.*7VÀ%IßFÄÔÕ:0úºúsÚ¥ð\[¬ëë«i…ÅÄ$¶Y0£”`,üQÀúBVm/sT…¡#ç¸ìsÁ oœÖ¯~K)å¬mÄÂC!Œ>XÎCÌß8´ƒðÃd=3™ y&Ppðv`)2–xæ EÞCgí$kš¡ÏÜÇYyx5à¥iøÍ›7çëõgca ú^ìµ¾<ÞXؘ}kÑô‰b7À"¬ÉãÐʯJ¿>ºWÒ;Ï¡i”ÓÆ;~Œ2‘0U®Ÿe—´Š2;Òö,OmñNñ|ÝÏ76¤0dA=æRDC.q>•Î,ˆÏb ­ :­8àcî÷º×ýÜÛh°M¦â)akf`DËÜiÉ °˜tÏ8ãŒÁi§– ´X%“ƒk4ç& š ƒO0:×%„¾M`S‡ ó¦@â{7AŸ[.Lú,mXÓ‚B°î‚Q|×sLXzâ1r¥ðÞÕw»”Ê!,ŸMàZQÖæõõü—ç¾~°ýN÷‹ÒVs[s‹ºÇœÿ§±ç¶Ê'ÖPžhxûÛßžçÞP*Lã}µŒÅÂ@³/΢.FfW[1®ð[ÒCñ7×\sMN–K;ÚjñC®}á _È|8¥³Î:+ÓB´ƒû5žª(ï¸Ø…BÚ^kìvÖæûÚò9¯ùnWoíN<ñÄ`GÈÀÑP‡Æƒe ˆq`råÊ´zà 7daFf8ÏÐÂÓ$öÝ«{î¹g.ƒâÐCͱ\Ú¸2aØX‰˜w \,R²cÞ)¡ÂÏýÜÏåÄ b‚\b/˜w]SÖ§?ýélVÎëÏÜb€ÆŠÀ]MœÌ»Þõ®5­«Ìˆ&~¦}ÄHB‘6XTr”_ÆbÄ3öM ÖKÀ/\P¬¥QÐ¥ðÜcûØmÜ›((Œî*ÆÅ€~flÊ.‡pã¬P1P10{ PFp‘&HLEm0ߕЎ¶š‹AE{yH~EéM)¶¡$ø:a  K¦PçÍ®ç,ËxÃï|ç;m¯­ç;.@§^Eq´:²X I 2q zÇ´T€D ´‰6î9‚ó¸ÛõÝk@Ê»Q- uä'Ë Ì·ÕN>ùäó@ÈqV„+Á'S‰uÍÀ5ˆ»4ëñLÝ߃¿9畃_yÙÛÛß{ç{N®Â‘Õì%+ d¦ù:OjÍ™f5Y‘ò»& •®Fåûð–ñ‚hHt<—"±”b1dÑ xÌc“:ð±‡G;‡Ù|á _·tî)Ô‰K “÷†{“XÖÓX’Œ¥æ˜.ï©ÇM ˆ¤k­¤hÖ­þ¯XÏ+Í¥Uì3¥0Þ¿.Û’-‡v´áŒàõš§~L+€ÎHªE0ãqCa.D‚¥ç’K.Ù¦(qÀBB¡ÂU3æv›뉹ÅÀv‰Q®uítLÒº8š9K¶Alü9®1C–ÁZ×y£¼Ÿ%˜1’LÐÖ …\×’™_î7¸õ„̓]OøÈ`§û—Ô ìŸÜM–ú§Œ4„âIû§¾LÀ…¿µãÌ#x{¾)ÁC2~Ó]Ðà%ö`ùÔWXfÄ&… RYF¥rKà1.ж)1'Å·÷puZ€3.¯Æ!Ž¢îbŒa &c_9ï¨ÁC9c°ý½¶^GjÞqaŒQìè?h€o4À#dW0ï_qñëg<²”ã‘Ò3¡¬ß“ŒÇÅÇÈd-hÒÊàå‚VR”ucÏŠŸ¦0kº¨.—v”µGçÞ—2ñRâ±à C%x?E7e³÷ ‡ðÞ2žµ¼ß1¯"qÝ„"ý¤Â|` ®ïJÙÛFW7¤eh>>W­ÅFÅÀ¤V‹›o¾yÉl߆3¦ÿåLÄNA¡ÒZ‹C£•£íj&;h¾sSJ~PB™ÄÓИ}®m“á@Šú6`i“¾œ0DP•Xäúë¯o»uCœûîßÿÕà‡ÿö½…†ÖúãXN¢œ¾¾O1g=,šê.ëé$í ä)K¢œ-[¶äuè&y¾Þ[1°ž0@ÈŒp…f»–K;šåø)¶5Áû¹ëPVö BUX| Tahñ¿amÁ:ÇwLþ´uŠ uÞyç Ž<òÈìn)~(²ÒMû}³(ÏB±bæhú¬³dͰqӘϢ>µÌÅÄ€D"%#ÔÖ ÖÓÛn»m)™NÛ=“œ³È#WOŒ͹$:’ëT¨¨˜>$¡*—P™þj‰‹Œ* -ò׫u¯X!øIÛÞøÆ7æêŒ*lL p¯a™$¸p»ks×lÃëhØb ¸Fn6Ö")—¹Œ†4ÈZ^w,öT9\7¸}ZëˆK°À£ l[¹Þ\Ü_÷+Ã@$ðYY)õéõŠ ™Mn½~ÌÚ®ñ0ð³{¿d°ã}8ÞÍõ®ŠŠ¹Ä€X5îjjT®K iÄIÎAÌòi½8B U#÷8¶T¸n–b#Ì´Èx†kÏÀ]Çý@FDïàZç}GuÔÒZ^ÎyN@57ÖÊo¼1?çÇ2 !ÁÇ»Ä,*T T T T ¬.ª0´ºø®o› ü×=^8Ønûæ &µ ÓÅÀ}Òó;þäšn¡sZ«Œ gVY¡d4<úè£sP3·7‚ˆŒP„¸¸€‹/¾8 $ÜÔ8ÊÇwÜ€ ›¬‡BJ“@IDATŸÿüçsfDAÒ„•Xœ[úvI,s`í8IEhžÏ9çœ\†„@ê%ý® kAÛ‘­Î ²–‹C:GpºâŠ+V¨¨¨¨XE Tah‘]_µ±0€©úú׿>õF ö¾å–[¦^î"(5þ¹çž;µ*ûVä­ñÑ~tjå®vA?ûÔW ¶Ûacy?KãÎd‹"q[#qÄÙÒCˆ‘Ê]\¡† ïüâ¿x›OÄZÃJdÝ!k‡ˆç)“„x@ ‹u¸Î"¥üYÅ!±>ȸÆXwë—ù—ãoÞ†î¼óέÎÕ?+ÀeMÄ’V¨èÃ@†ú°S¯U ¬*Zái­7Æ}¥`Á`ë+, p%’,ÐÆík_›JÕ @Ê²Ž‘¬z2{±*TX Ä»¬3k€Huk³v :)rË haa´ Œ¬IMàv'“Ôî»ïž“‰XK®™NÙ»š™çdY,Ç{™‰J<‘8& u¸ôÂM˺'êW¡b b`z0&­cT¡b 
UêÃN½¶.1ð¥7?kðÃÿ{·ßÿj4к;%“T¾“«Oã-]t¸ßX™» Ê™dÙ0®AÿP–%=ÒÇ •ÿË{+#½æ5ñ4ôm µpW}Åx4ß©}áÊô‚¼`›TÄê`%q qø˜â¼ÿµ¯}m`wÓ*¡Äé§Ÿ^ÞV!|°Ú6láå¾ ^sm³ˆ¢{,Š]&5ÐDãM?áN§¯ˆí±]yå•[a€à\Zz\dIòž€fÙqž Åú¤~%ˆª‹6–©Ç‹€ÖQVÑË.»li~^I½ÃU£ècÛœïn£qÍÞøî¢®—õð¿ÂúÆ@†Ö÷÷­­kÁÀ¿ÿë? ~ôƒm…Œ–[W|êÃþpv“±·˜/~ñ‹KeŠ/eʺCÖW¸öÚk—®!,âlË’ö’—¼d)æ  ö¨½T`ãÀâ¤2Yaô,°ÚŒŽМ[ìX]XMBp;óÌ3sÌijŸýìì’DËn½”€® tׯºêª¬µÇøq5’I+€0£MÄÃÜb^ c¿ú«¿šƒài÷¹BYTÕ»‚øš×¼&ãÊ:FRƒ‡»fØB¦gŸ}vb·Ø«v´ÖU"¨:€¸8¹R-üõ™ÏÏë -Z½§Q_îh¾£El-ŒŒbý4vB0á*GúøÇ?ž…¡æ{Ý÷¤'=)»¡ZöÚk¯ÜošLeW·ÈPe\H¨pà6‹Üæ¿ÿ,ܦ¸ÔZ÷«BÅÀ"aàŒ3Îì²Ë.Ù=ÕÜ{衇f ¾yy\@g¸»²òrs]ôqÔœ.¾U×¼vï% ÁzÔ£²e=8餓–”rmõ(Ÿ­ÇëUZ¿ß¶¶l0@(Ž“/þààƒÎÚ(šfAÚâXVûí·_føiŒ1U¯à˜ëNXqìÃB2*P» —Ã;,3„C¸f0:a‰§>‚É nâ•A‘yúÓŸ¾TÿÓN;-3 ®w¡»†A=å”Sr™´éïyÏ{–„ÄC9$ ˜KÌâwÜ‘‰¢ûNÒÝà .Ež¿ôÒKs¶/–(k'±ýÝßý×eÆØ7бV˜NÂ&h d·ú<­áM7Ý”ïY¤Ÿïÿ¯o ~ôï«#èÏ#^$4ø«¿ú«ÌqãVYâÔת„Â5® Rs—˜)Œ™$—}÷Ýw«[ó˜ÇäS8à€Ü_•õÔ§>5Ç%mucÇë Yí¾®šU*1R –tBŠ c¯|å+3½3Ï]Ñ>´ÇâÙh’…ˆûè£g(;ºætÊ5´ ý@Û¸A ”]¬ÿ\ÍÕ ššõˆçê~ýc`cEÚ®ÿïY[8g ‰‡€‰Â”Ó^c¤,¨*†@8ëEOYd0b‚®Á+^ñŠl éjZj»î1/ËÚ9 #8è ƒ²µÅ1ÂB;^xÿú¯ÿú’v]œ"$X•ÐAè˜ÉB§W†Ì_–û¥GV¾ôÃ×¥tÇçÄT`dá§Š,H±X,KFAÀz™¼¤I†Œ®÷¶@vE–0±Y‹¾÷×kkƒ߸Édéo\ß—Àn…UH-‡\Öº´Ø²2¦Å»;ï¼s¾Õø,ßç¾—½ìe9ÞŒ%“–9 ,Ï9M[ÀK_úÒÁÞ{ïëh\èkÜBi¦+¬? PÜ|èCÊ‚‚¹ªýŒ%Z_¥€!ë§Ú)œw©?GXFôEã•wy?¡‚$Ïï ¨ân,ýE]”7ó)‹Q<ïyÏËñ~îé£1FÚætÊAééÑQãËf€p¤ž¬OpiÌrñ£LCgAY|¢þl TahC|æÚȵÂ@ÉLcFÀŠJ×8uDLX‘b"zšº b%\/µ»îï;¯Ž,('Ÿ|r¶ÎÈŠ…xXÊ÷Z˜R9B(¬OšB‚×6€ qpNÎc 0«ß¬÷2 ™£Y‡X¯XeÄÝ‰í´¾á„»·„9ãB}Œ¸Õ¶9Ýša¾—ÓPˆˆ P®ØBs~Ð/Â[¸\Ç}u¿ñ0°ýÆkrmñFÇÀƒvKþÿ‰8¬ÜG@Å ¼óïˆ@ ŸñŒgdÍ 71À¯™•ÄÄLhàꃨ 2LøÒOXs"‘@³lu–Ñ+¢ƒxȰ…AˆLW:«îÈš%ÑáG;#Ñ`"NÞP#p «¾¾ì`‘j›{œçøQ€)<÷k;a¨©=UÖ<\ÿoÏ|Uôïqך‡:Õ:´càx@NŠÂ ¯TŠ´ß]Ï®ÄnRqA·Yfî$˜˜ÃÌÓÜ™Ï;ï¼Öj±QIò"–Œ Ðó¹¹Þ5V™.¦ß|*öKš >’M=ÌÍ܈mÜ¥ B\E k„.pêÏ2¤,ϨgËy´I<w¸q@ïdQò~®Çbaµ›»ôr >ö•G’¨ˆ«!Úf'< ­3–+4]ó½Ñ Õ2´±¿mýŒ1@ÆÈȬCºüòËóÅØ XˆKM#­Z ˜#î&lÚ9“xÔiU±¢idA ?õ(›¦ZŒ“ëb3dkq}Cüú  BGœ´›" „m¦Ý¤9ÄxHo½9YÊ)šr‚— ¥pÄ<Ž0% ø,ApI˜Bi8iÚöá”v¶V8îOê¸hðàÇÝ—²hu¯õ­˜ TÌÛæñƒM78–…Q€§œ dþæÖÆŠDIDùÊ".FÂAó 7!‡eƒuýæ›oÎñBöO<1ÏiîeiâGp¡(âÞFCùC1Ä%ÌÌÅ-ÏüM0¸õÖ[³‹bͼÎBDˆ1?s×dáï#ž‹´¶îcåô~¸å¬úè£ù¹ Ð qJhˆ?è š8U@ôMÐô‹"W66¶KÌÖp­Q@Ûˆ¶Âà4xiQ˜rùš$¤I¬°ºÀ bGh2M¾4™&[Y™ÚLÕ«[ÃÉß&ÛÖö÷ºÏØ–ý‘Â`ëŸaúÕ?Y)0è|– ­3¿oŒ|]ý_ßGĸ& Ñücí(ËuŒˆ"“6 ý¤¡DD€cõ‰ ò¶gâBsWŠçá×¼ßy̆ HíÍ÷~à›¿9ùf ¼èƒÍóÊ )õMD“޲…Ð"€XÚqï‰'eÏ ´ST2_¾EŒ1í\Ä16)¾|sL þˆ3A¸×`Þ* ˜«õþI1`ŽS?tSR.ØAWËçÌ/®‹ ¬|e‰™õqÜFWƒwÅ´ÑÕjZß¾¶b Üþ¦ß<úøvÜù<µü[1·ˆkl%˜¨ 4XÖMa%aE*ƒBËû»ŽûÖ<‰lV|€mãhEãÞ¾ tD½Ö³ÌÄŹ؇ðÿË=áÉ6)p…aÃv¹Lм." 
øƒÁv;ì´ˆU_Wu¾øâ‹³@.cä8€±•R˜åuË–-K)ìÇy¶Þ3ßÀxNlO%pec•°¹ÆzC‰ÔLÞ!þ†rmzAaB("\8Žl†bK%N )Ÿ•_Ù].k\𺮕u]‹ã6ú8N=0¼mñNñ,šÜç¢÷ÕýÆÁ@†6η®-ý1~ôƒï†ÿ±µ«ÕÀzã^INl¬µ‚ѹôš$áºÀ½Ià»kž·0+d€„$ê‚Ï#ùÈ7G@ªP1P1P1P10{ Tahö8®o˜3 ®K¹µg^«·½ímƒ[n¹e¦ïáÞ%þGÁç2£aì®Êþf/ ‰ã’iœi¥fP¸lsw¥¤ ã‚€e‹JY.Ík…õ‡¹š™Yz".C¢Y²ŒA®gâï"Cã¹çž›û…ö$㇋$Ë P®82Ù¾ÄòÈ &ÀýóŸÿü€•é¸ãŽËu` "ìÈÜåšgô¿ÈMD˜NÐ’€„Ðd¡Õu\¥ÂÖW\QžªÇ3Â@†f„ØZìübà¡/>s°Óýg¿è*æ‡ 2+t½çž{æÔÝb 0k2“Yƒ&CšÌ\„ µ\|hÏ1šã³*-X@z¬Ù1Îó‹ÏìýEÁ%k ’äR° ,—Á‹ðB!œÈÂÕY"¥øå^g³öŠø ™»¸ÊO¸”ØÃ"—¬?Ö[‰…)(¿ˆ¤z'¬YèÒ3ÜïaÌÚ(%†î¼óÎòT=®¨XC HunÁÙ ëUZŸßu¦­ºÛ²2ÓWÌ´ðû?´{¡»i¾X6¸rÁ>esÁM Ÿ¦0 ]7íó)§œ’Ý}¬ !#&®iZ1rÍXtL] d.Z.4×÷éFÚÚïÄhöáª«Ž„@ ZÜU›Å^~úéQìºÞÿò O[A^‘Ø´•Y =âƒ((D%cî™þ=m%Ôp¹³Œôöå"ȬÒ—™Y3)!UÜôlÖ@1&=Ê .ue6Eå~_Éø+ëR+”x]€ÖuÑϬ„ž5é ±gÌ—€¡WMè¢CÍûêÿùÆ@†æûûÌEí?‹.­" &Öh4Á¿ök¿6 €úMozÓXUÂ(±n`ܬõƒéÂ`YxNù€Æ™f[Šh ͱˆ”@“MsÍbâ9qgŸ}vÖvKǪ|q `T|CYnóh AD]¹ªÉŽD³.Õit]íR&¢&‹žzb>1’áúäú¨:Z<°Ìâ%ƒ`.PÊX¯ðSßš)_¯íÔ®˜“ʾ¡‡{š{¢¿9þ¾cIŒø פ3&˜tÛýÜÚJK’uLʲ”!>06ÌŽú¼M¹Ñx¬%#VÈù¸í±U¨ØÐÿ-LkÎg­µ6ºpÎ9çdšÂ2KÁqíµ×æK+¡gðç:ëè±ÖŠ´öå‰:À¸ÐZëíA¿íGÑ¡¨{Ý/ª0´ßinj ÈÜThùþ?}sOMþá#4Õܶ`cÎÄζØê(°Ú8â æöÊzE&oá†nÈ‹Òt³|ˆ²8esšépÉñ>V–"H<‹Òe—]–«2*¾¡¯¾4gRcü¸YiÝB~bhÔÔßšI «]®)CœÏ¥—^šëFKWjûêçÞ[Æ`À™w[t°ÂúÁÆ.â‚€¾Õ*¤¨×/õ ÙÚ¸Ô%1àVŠ™¢ÙÅ|õ}Y0B„êw¾óùv‚×ÓŸþôìÒFáÿu×]7øâ¿˜¯Ë:‡¡â¾jÜÑ|¤^ò’—daNý-ú+£\ Ædi™*¯ÕãŠõ†®¥ e\¢Íp1¿òÊ+3íDŒã3ϵ×^{eƩɰ7ßF£Ì*ÂJV Zn®@"LFÅ7Lë}£ÚõŒg<#3†4ä„AÉFÕ‘K!mc€ÿ^Œj…ƒ Kº4´bD³îÆh¸Ð­†X¸(:$qhºÙ­E}ê;+V \Þ¤–‡ÇbK™1yꀈíùêW¿š«Äs€«y¹pq[]GÑŠæ3è„8$V`ü'BtíFÜ7iÙÍwÕÿ󇹆=!ÑŒÙ,“f—Ö¬/ýb”Q÷ËÀÅï÷~/3Ü|ä1ñp@䨶¨°ÝöI°KÛr`9í‡ y9 9a€¿ï¾ûöVAß'üHnÀ=nS 0½õÖ[³{Oïƒ+¸(¾ÁZ,â¸ÇYÔµŒoXAÑKŽj—x èÄspoÓ'Ì€¾:ŠÇ€§2ƒ–‘ð¸RkZ¼–ûåô¯YÖg‘Ë–± cÓæ4q>è+² ²H®ÚÄ=HïÍ]¯BÅÀ¢c`ܹ =0Aâ1wüã³ñ{\]Ymñèà˜ià8ÙûhE¿”’›HBÄ}N’±‚ñä €¡Gh“¸ÁIÊn¾«þŸ? l—LƒÃµ®–*0“ÚHÝ´6.EÍe’²G´¤(¦å5 øZÓ;èÖºóú~ÚŽk®¹fpÞyçå½xæ‘ĆQ°q[4I 66‰-"îÿíÛ?ø‰ÿ²ËØŸƒfHß”uJÿ4Š1Ð'eI{îsŸ›Í÷ˆ>OÀå‹«LsM“¾ç¼['ücögsííŠoè«ç¤×úÚeN  „§6!¦¯Ž˜\”Ô©àÔSOÍsɨ­Ië?û}ËsÏ=7 È6›6ÛŒ3 Œû¸pÛë÷H1C— vºßÇ}d.îó=ÅÌ_¬#æc 0Æ0(\dF±¹hL­ÄÂbÀxKÆ"o,J¾bÞ5mb[& ‹ˆeV¼I+y4”´²ks7Ï›%ïyÏ{²"îýïÿRMÌé¬F„"ßd\è£mepÉÆs¢µÀ±o4Û|$tLZv~¨þÌAWËqÌ]Ú7dáû6Çñ\$P€•·©`0Ù&ÇxÛH嬄 [²øxFjSZvV š… £1`ò¡õ—‘åꫯαA´ï‡zhö‡+çÑq|‹`Ìà|‘aA(Ú96Ó(ÙJ ÔwAVÃ=­Ôj1(¾¡ ãÔµùLßÿ¾vÁwŸæ¯«ŽÞwÔQG à"b+¿îfºñ¾z­Õµ¶>Öv®¯~?ìNêkϤ×.¾øâTÏîg¥ÀœÓ&€ÜǽB—" ?Í0—&Œ>PÚuñ6V€÷¬àÂ%àOjH—B±p˜B!r·ÝvËÖxeùG‚Plˆi˜C ‚§° ùFeG[4ü}ï®/ î»éÑW;ú¨ÃGôŒ‹¬8Ü»ìsà…UYó}¤^­Š`й€øGeî›uʾÓ?®…"H=W˜ âHÍgm@dL®T Žy´ü–Þ‡öøfæƒ.àzÚfuáþ×Ô8dz}m1ŸtšÏrkeáçâdÇ£ä²õÕ+ÞY÷ãc@? ò ú¬åã—¸þîŒ1RîGÑJks•ës­õœ½þ¾ÊúkQô/-kÒÕ˜“ÝÓ¼Ö†‰¹’¢ò˜sŒFœT BÑ0÷Ø„&Lbˆàt<‡è ,Ò$" jÝχqRF‰0%0Ïž–ðD;Ax“iøò·!qVçøÛª'Â~¬âN|ßúÖ·²UÌÚÜMÐð~ÎpÂßYL•InB  &ž AÈ=H–4{›sqç½#:Ö¬Ú=ër¿rÞQƒGŸ ®ân¿ßIÞ}4ú¡ÿ›6m¼ò•¯\„0,!Å~’w¬×{Ecî6D›í£ïØ›kŒÙ[qÝ~9ðþÇWÉ:ôñ<=%'R¨óÀ\ýõ_ÿu¦”ab(zÌõ²;Šß¼M mm!e˜ößzX¼ ¸aKdrWŠ÷37z'Ï4æ5¯yÍÀ¢æÊ½èEY ‘"[ îÍ›7ç4½²Ô™‡\³Þˆ‡æȩ=íiù_[Ü`qHVN—~#ÆS`9à­œ|(ÿÐ?Ò¨D-¹€úÓ‹ßZ:vÉd¸–ó:ñí+tc xs[ð›*­ìFX½21‚NÝ ÞÔüˆw7ÆœèZÌ0¤11xTŽ‘ aH#œwŸk˜}ŒW9‰@d“g)B Âà¿}l„"º„§,´åYåŦ|Ä'ÞVˆFÕ#„ÂYÇÞuíPžú•›rý·÷õ±±f©—zei«My!ú¯>&f‚O<ðW%>{ÎySYWezÖ®rÊ a(ÞÏFÌ•[ÀŸ,  ï±ÞŒÓ„èŸÑöø¾€}ªï¦/ùvñ£ïT¡h0°ð,Mµ~µQ@±•ãMû/óGÌ#1¶ŒÓècËÁÑßüáa ™@a9m%\v,f|Øa‡e!…@±f/~ñ‹³áÆ\úò—¿<¯Wò™Ï|&guGFâNiN¼.­«F’Lf=öÈßÁø•iîòË/Ï 2å‹g ¸H|ñÑ~4» ûž¬1Ö-BC€9@Zü-[¶ä_ûÚײàeÙ‚]Rp6áÈ3–0ÿ÷µ…–œ ÅõMÝì%¹á @qÇ Zªðà_…¡+“ûf”¨¾³¹žüæ7¿9÷«ÉJÚ8w›¿‚÷ˆc㫤•è¤7OðZÆMð<[µ¥ËÅši þÿjÆ ‡Q î±s# ©h ŒƒAŒ}y sNÐx ! 
¡‚Àã8„¨€B@Š2• qQ~y^]ü·Bh±÷ç½Û€VN”Çñ_ÁäD;ÚþÓjЇòqmÚiïCšTbS^y®,3Ê5ù8.÷ž‰òàíC² QG¥¦¬¨ƒýF8…G[0¸pýNÜãôç¢/9ÞÈBÀF±bì”cMŸ‰-ÆzÙ·bœMЧaš‡Òª«“>¶÷Ë,Šèüñ¹þU–À*daFq’ðj“&W¦Lž B’Ø„ë’tøá‡çý1Ç“3–º7ÀR,=ÀB[’p.¼ðœ¬ƒÛ5–·W÷¼ç=o 50ðý b„V'Ì6:Ò×ϲ>‰ƒb©BoöÙgŸìÁJ!Ñ ëW¼Çý€0DÁWa2 PfY׆Pl­3}‡j=*ãÎü6Ý)^Cwµ¹­—àû†dÇÕ_K,êg~æg–Né/µ¼õ…ã2xÚà])ñ°þÛô=÷Ľú[Ì0¤’‚J$ÿc:oÀã@ !ˆ@нfop…PƒOÙŽ•ƒ/öÎ…Ó<†@ç€{š×â|ó¿óÍÑõßù¸¦Ýñ¿Ü;›óŽËÿ£3ØÛà/öð&öðí|Ù¡ü÷Lù¾¨_´w=g¸€ø†7‚£>œs]Ÿt®ì[ëGã¶ a.‰ó¸Ï-ò}1ËñX}ǘ‹1é^à…~ HšÃ]­YæÑ\g=¯®Ð€«1aH|cÖxÂ5ÀXx­=´÷Þ{ÇcK«Í; 8Ǭ2ð¼ïZËSzuÒI' .½ôÒœUÚ×€¾¶¸GöÏðâûPƹV2Ï®a¨Y¡*L†.èbmá<€b .Ï Í«®º*nÉ{î—Ü-¸2FWœ³öKSKKc ¿ýÛ¿½´4€ó\òšß’5”ÀÚª,+à<@“,AØ Ðu´7 >ó™Ï\jcÜ7é>heÐÃ’V¢‰À¹àçðkøªà¿&}_½ãa@ÿ‰þ|Zj3Oë[Á»º/údÐØ&ÆæŠâFe1Iq8g‹¤Á¥šyÏ…@̨}lŽcà˜1íƒyu âá&œ¸–oH?åµ87jír_ÛÇÏk/ˆI8öåÇuì¼Í±-pÇþ—Cç‰ÿ!¹Ç1¼ÇÏ•ïˆúFm¿ýŽ©_íp¯‰ª­Íp ïØÛôAýÅ9} Îëa*ûÏD/\‡7‹åãË ob<›Æ›ÍØ2YcœímÎë?Æ›{A<»Qp5i;¹ÒÜ—ð—ù—ƒÍ›7çØ!cC–›Ûo¿=ߊa%³\ÝÜïÜCòÁYg•Ó²s_ èšïXò½?2¶¬šZ÷(Ã^Ùîg  Á$tSÜ×Ϫ›z– ²Z‡Kô'}#\ ˜úÒJ¡œ £1 ¿p]4%Æ! HÈäë#A¾ÇÂJ)°´=söÙgl}ý¤ïžRk»m—h£ôe1ÈÏ´|ñ¯ÚßJåb¿ñOY 9k1×Ïs¦‡ÑÏBFÕÇôñ°Åÿø¶*cy¬knÒ8vƒ͸ÿmøCÐt³‘r5òRžV­c›€jQ΋ÇlDZXkýqÌ9ûü¯·ãx{ÝíAÄ}*ëxü·‹}1[Çǃ?öÙvÜÛÑ!ü·øÇt Øg,ê²ױ޶´ïiœþ?pŸS›5Ö\-ì̧íîá)žè3ú€u<óúUôGÇ“þÊÖÆé„‘ñ£ß½(#ÖîÐC-,÷Å’d€Ç5 "èÚÄ=ËÜ{I‹ã€>Ã’Â €•ƒ+¢g÷ìg?»Ù}÷ÝgÜ'w…Ñ<+¾gKm]ÈÆa5Ôão o¢o ÷ß,ö‡ÌÖ–±&Œ%y;Cà€~f ù6ú•18ÆaßT‹2¾ ¾¯½èFN:[jïUrö{‘P/L˜hì«_(Çc_¬ãœøï–,Å1kû•‰íXÛoñ?ÚS6þÖ¶Ø^Ì:„žx°õ:œµý±ö`Ûÿ‹ýÑ9”‰mke¢CÔû»í‹úƒb½˜{œ„s<{‹þ¡ïXl×®š¬A–èGÊÙ—4}ðÕïY Ðpwüu¼¯õùQÏ$®½[Þ—ˆ× –G’i³wÝu×âÆÓ& ¸¨ Ç·š¼â?€¡ ;Aëƒä§€vÏ‹¥Ç³Õ^q;€N»Ñ&÷ ´°\9‡Å=rƒCýîÅq‹scŠÜS:Å9q T/w ïb?žzù¯wÿ“þÊ×½îu€ÞÐN_¢¿Ä»h­YBÀÒì·DùiâÑbïUÜŒ¥á­ù…†AÒV‹õ€îú¬¬KÆ‚^äÙFf:e<óBþ÷»Ç̓gi ðÕ‹¥‰»Bý+)90_~KŸ]¨ý“öé“@!Õr×|Û™å’8ßIý*¾©ÑÇâÛÇâ»Ú‹s#?RÆÍº‚¥ÅMÅ6⦭-ö×Ûõ‹Ö>ÿëºêí¸¦:‚вõ¾8¶uýb q~Ü»ãñ߃í8O9Ûñ®í(uÅÿ¨7®[.4???ùÍÍ­7|`sÓµ» PÝnOÙ§/é'Öý+ÊOûÚ|²iÕé`§…'ñŽÆZŸñþùoü²ÔÇl/†~÷Ão6·Ùx›Åœšç,‚éY^öØc’¡k›m¶iL°ºRô¶·½­ ™Èj—Á•jO^wü8À2 ¨,…Œ]¾•ÖmøGAV¡Z†Ší¥\;Ïè_íE‹o«íú]¶—F ÕŽ›±/^œD㥠Ôñz;ÊÇ>uP²û£\½Ïv½ßvì+;ÞÞÇÚk÷Úë(÷YwŸqN¯÷ÅýÇ1kÇ£L{»®«½].”?7à@<¼ >ÛÖwê%N¬ûLì›ö5![Ñ´Qô÷ïbô£øï¬2uyÿçK—|t¿f³}>Ñú·Ÿï)Yn‘ðÜÄùXFdÀ“,ÕYçF¡mÙ†ñàÀRPÜe=~…@ê} ™‹ò§þ>ú~&%Ã}­^BŽ~W«ûeûZc†êÆÇ Úç%‹«ÞŽòq,ÖñâÅåºísY;ÿËÎ.?õñz;жDû].î¯.Ûîµ×ö)¡.çÖëØV.inà—ųµÄv¿>7w­ÓUÂÇ–»Ä´Q¼k±vÿ¶c‰w9þ/?«­×K«'ÏN$’ ç@{œ‹ï%Ù$d­XÇ÷táWÉ3’³¿£ÑïÚßÓù|WÇ µ;A·›m¿díÿêhï‹4Öõu”sêý±/Ž·uûß­½ÊÅÃŒuœÿ­ãAGù8ÿëu½]—³?iqÀGK»ïÔÏ¿Þ^ÜU&ó,.–i¥xc±ëiåMÞwr 9°²8ꨣš=÷Üs–Bu±-2žY| c­®çBi›ßÊÅr8Ï DŸò¿½]ÿòÝÖ†ºÝ&tcD¼|±Žsýï÷‚v+çb]·µÞV÷|ÿ·Ë ¢]“VÇ·Z§¹ñM—nÀë~ün÷—IããBïG`·ñxÇzþ8—ï×OÆù¾V²íâÄ IbÀí¦MæåÙvÛmË$ªícËùï½÷n$ÑN‚¦‰‡“’£È½öÚ«yÒ“ž´¤lríûб/ÖõñüFÖÜÈí¥p [ÿZH}7ü‚,äì1-L‹u¯Ûèö¢vÛ×ëü¥ìo·­ý)uOû¹›¿á_;èru"„añ#ŸÙlΞx≳wä¿sà&7_³¹ñK €x£†T¡TÖïÿûËÄ©ÝÀôÒuæ¶^ÍxÀP€Š¹zM&G5ߤ![o½uóªW½ª9餓}™¬/90–ÈoäX>¶‰lôð%Â1f›µ½pQ[Ž¥}Ý1fãè5}€ÐèÝt¶h8°ùþŸin²æÚÓp«³îQvªßüæ7³ö}øÃ.é¦ëW]uUI}_ïsžùOj2Øe—]6ßP³mÎK?¢8czõ«_]Ší¸ãŽÍ…^Ø|ç;ßéwZK$’Éeæ@‚¡efx^.9H ‹7ºÉôÅcÉæv§;Ý©X_ùÈG–IOñ÷xDs 'Võ«_mô 5¬@²|ðÁeÿæ›oÞ˜8£óÎ;¯yë[ßÚ\qÅÍ~ðƒ¤>ýéO7»í¶[ó¡}¨ùÔ§>Õüâ¿hX‘>øÁ6Ï~ö³›Ÿÿüç%œ‰ÅÉ|B;í´Só˜Ç<¦9õÔSgnɤ§{Üã àbº÷½ï=s̆ÉQkð4ë`þI¬0îyÏ{6k­µÖ ·"/ŸX~$Z~žçW˜ÿyÆûš¿\7Û¥f…›4—ÿÆ7¾Ñ\{íµSq¯+u“WÓ±b¬š®9;Þô¦7@³é¦›6[n¹e#V¨M’v°?›m¶YóÛßþ¶¹Ë]îÒ.Ö°úìºë®3I“³Õzìc[RÄJ€7= +’}袋.j6Úh£Y׸ßýî×üú׿.ןu ÿ$F€ ·¹ÍmF %Ù„äÀòr ÁÐòò;¯6¸ÑM:¾¥ehžD6aøß¿ü¹¹ü´w7ÿó§?²Ú‘¯ëÎw¾óLÍaÕŽÿqȹÙÍnÖ<ìa+@çŸÿùŸgMQ\zé¥7È>w{Ü£˜(t±!nxguVÉj·Ã;”}?þño¸Ö_ýrìøCYçOr 9H¬< ­ü3È,3n¶Î›ÿûÛ+—ùªy¹äÀp9pý•¿hn¶öí››ÜrºÜ\êy׺qøúë¯/±@ÜÙ~÷»ß5Çw\YjkOœ'ëÛ~ô£ø[ÖÜïÄÌ:Ðùcά§?ýéÍ'>ñ‰â.÷Ìg>s&Í÷-oyËæòË/ŸuЏ#nH¬MIÉä@r 90H04Ï![±Œ¸õ=7o~sÑ×–ñŠy©äÀð9ðÛ‹¾ÞÜêîÿBcv™9Å«ÉâÆ:ô÷ÿ÷e¾«˜X ‘¦=á OhN>ùä’Û€‰‡»\/zÖ³žUÜã¸È=ç9Ï™)&ëPMÜ6Ùd“¾ó“Õås;9H$†ÏCÃçq^aÄ8°Þ6»6ºò²kÕä7GûïxÇɿѸÃ?]}ysõ¹Ÿnîú˜=VàêK¿$ÀÓ Ôÿc{1k®s’'p‘ãâ&©ÁßýÝß•DêÛe—]šç=ïyy°Þð†7«Ø …¥çøãïÛoþð‡7·¾õ­‹KœŒuAæ-úÉO~ËÚÿ>ð³öåŸÑã@ôÁX^ ³EÉäÀ 9p£Î\«YaÖ•H$’ËËë~qAsý•—6wغ·cy[´°«ù 
I@`îž?þñeáÒf1ÀòŸÿùŸÍÝîv·…U\•–¸@L0²æškVGšrîk1y«9†$Y0iëÍo>÷$¶,OÏxÆ3š¿øÅ³êºÞýîw—4ßcøÀš‡>ô¡³ÊåŸÑàð#õù}îsŸ’H`íµ×.@ ÖXçrËÔ|éK_jŽ>úè’ cPu®·Þz³âßUoÖ“w¬1î7íO,•×]vQówUsóÛ­ľi'öBlQÐ_®½ºù¿¿¿:þ–õÍo»^³ÆZ·›Ù÷çßü²ùøÝÌ·¸ýúŽ[ÏìûÓU—5ÿóçÙÁÓ·¼Ó†ÍovË™2×ÿê’æÿ_= äš5ïrïFâ‡BÁñ¿üɬ$7ºÑMš5ïzŸ¦ãó×"ü/ZéÕºŽ¯q³æ–w^ÝJÀýõ¿þÙ_ëüÛïMn¾fs‹;n0³ïþt]ÃêPÓk®ÓÜ|ÝÕ1ÿïºß6þí¯ê"Mòï¯ì ÿþöHov›;>»ÒZnV–YaLÿ„6>„ÎX/õv‚–nÔΠ%i‚e.’rûßþíßJjïÚE.Î;ì°ÃšÃ?¼€!Â-«T¡àÎh®ëþgÅz9[¼í¶ÛÜŠ8×DÁËyy­äÀ(q ÁÐ(=lËŠpà/×^Ó\õ­OtÀÎÏ\_\цOÛwæÿ}å˜æ÷?ýîÌ·Ûlûæn{ÑÌ>™¼®ÿõ/fþÛXoÛ§5w|ø.3û~~ÊÁÀ4;½ôú;½¸¹íUÊüïŸÿØüìc6ÿû— u¾Ç÷|Úf„] ëg'8 Ýø&ͽŸÿöæw¸{©çWü¨ùÙIÕX¨¹ñ7mîÿ’÷67YsíRæ÷?9¯¹ü³G̴͆àû¼üÃ3 ê7?øJó«³>6«Ì-î°~sŸ6³ïªs>Õ\sþgþÛHþuZ‡†É¿u7ß¾¹ëŽ«û_¹à˜ÿ„ ê6€ –šD¿÷½ï•¹Fé?úÑ6&W•Þûâ‹/¾AÓX -´š¿è¹Ï}nsî¹çÞ ÜríiOÜTRoèwýnP@¼}5qh§œrJ±:Iå^륹¬¶Øb‹Ætï|ç;›<°ô£O~ò“eòß—½ìe%qÇSžò”æë_ÿzy¦æÅzÚÓžÖœy晥O¾ô¥/-±pO|â›×¼æ5%ý|\Çúâ+_ùʆ;§zMD¬H /Û"kí«_ýêÒ†7Þ¸lëËì¾ûî7Pˆµ¶´¡myëæ:90ªH7¹Q}2Ù®äÀ„qà^÷ºWsþùçç<ö\q;á&÷ç?ÿyÆUîÿüŸÿS„9îj„2îrÊÅ×õ?i~ÀËuÖYghBþüZ±ò¥`kIpk®p¬yÜ1Åᕬ€ 8_ @bE4±ï¿øÅ’¬ã¯xE±!æ«rŒ;' ÂŠÈ¥òsŸû\ó‚¼ XO?ýô2±ðöÛo_ælce”LÌ¥07Q°ä¯ýë‹{éQGÕX‡zh{mï¸ãŽÍ _øÂR/WTù¸vëL>ÌÝð/' £qû ƒjN:é¤F}öãÑ{ìQ’‹p”,„K¨z¤œw  KJŒ ÃSÊ6&&€„¥Æ}Lòºp À¡ 7 Yk„xkYßhÕ-Qöþçnº\"wu8ð‘|¤Xnu«[M%?5€…?1B·»Ýí °ÇÄ ø †ðˆ¸ÓîÔ<ÿùÏovÞyçrmýÈøìg?[²:È ™ 8âˆ0´ÕV[5ÿôOÿTŽÝýîwo¾ýío7ŸùÌg PQÿ#ñˆÆ„ÀÒÀGa©t É@€ ÷( q.bén$A,Mâ÷öÛo¿fë­·. I–E‰B‚ÊÉÎhþ.slɶøÆ7¾±HÖ!u;vÍ5×4¬XQ·ó’’£Èt“ŧ’mJ$’SÆ‚V,´ï„OÚx ¡•@JH rÜ~ ¥…(8ÓMÀ—,b)¬:lô) ž8ŠcÊø‰~Yƒ…ð%H?•„UƵ€Ï}rË-·,Ǭ{Q `Yp¤†—´ƒ{Û¥ ÃM`9ûì³ge?¤,ðÞh»õK^ò’bMúæ7¿Ùl³Í6d!Þ1 Œm¸á†Å2ÅÒÄ2$Ë"¾|å+_™ižr”®kž-.w¬iøÉ2„‰—sï †fØ–#ÊC#ú`²YÉä@r`Z8Â&Ôv¢„>‚#7¥°Å1Âå_þò—€”`h~½%, ó+=Y¥ôä„uˆð®Dÿ þý/úäR8!>è§?ýi‰Ú|óÍÉv븣ÉB( !‹÷4éÜç Ö]wÝb½q×:$…ü%—\R@Š© €-Ì{å]âr§pc3¡ðSŸúÔb…â6ç~+àˆË^ˆ7âÎg¢`àçꫯnŽ=öØâ2'•¼k›» ±±±Z‰…b!âî—”u$õ'”íK$’SÄAÐ!¼…f›&œ`JP#¼:n 4E,ZÒ­Þö¶·ê¸= @³è_ú¡ŸåE¿²Ø,)ƒˆ€‚·½ímlp‹{Ç;ÞÑœwÞy%¶g‡vh,€—5niâäæC€@#Viók_ûÚf£6*îq,?âxjb¡âZ'ñÇ!‡R,Dâ†W>ŠkꓟüäÒï°ÄÍY‚ Î9çœæö·¿ýLµê±Èˆð/)90.Ș¡qyRÙÎäÀ˜s€Ïüž{î9#`ŒùídóÌ`° ‹O$S¸þúë 7ÇlθüX³¥Uh~ã»ßýn j¦™ ÝÀP"–¡DR€%å• ËÒ x§¿àj×ÙïX»l·ÿÀ 6‰kæLÂåMùš.¿üòæw¸CáK¿6õ;V×—ÛÉQä@‚¡Q|*Ù¦ä@r 90e†,@N€(@‘ÀlàǾ”„b=e,ËÛ]†• K#°?@@ä¿2ö)åpɱ+*ƇeGìORr`š8`hÓm@IDATšžvÞkr 9Q„…ȱà~¡ö¾A „FôŽh³€"à&@QÄ?Šl;ÞB¨FôÖ–Ü,÷—?—ÌÆ¬` 9NcøÐ²ÉÉqä ?á")9Ð1'„U)@ÿY‹®ÖŠ¥[}¹ï†0‡Ízë­wÃS¶GŸ²èOð Pdí¢MSÆ¢¼ÝäÀTq ÁÐT=î¼ÙäÀÊq€ßù/~ñ‹F¤¤ä@7ðqÌ6ø±Mkà‡•(,IÊÚNš›²~‰ªßç>kòJÔý,ÜßôøE™Øïÿ¤Ó³žõ¬2ÏÒ¤ßgÞ_r ÍCmŽäÿä@r`( +ø=)9Ћ!p@àØd (ŽÇºW¹5¸ö% ˜VŠ>æþm× ÀÿkTŸ3É|;á„&ùöòÞ’=9`¨'kò@r 9H,7BrB ¤-~Úëånç¸^/’Œkû—ÚîèSê‰ízm»ý©×Ìó“ÉÑæ@‚¡Ñ~>Ùºä@r 90•h ¥˜h*2 ›æzhIšÍ@öÖÛ³Kå¿ä@r`9`hŸjÞSr`D9Bƈ>˜oVô›XxsG¶y\¿"AÀÈ62¶bxä#Ùœ~úé™Z{Åž@^x¥8p㕺p^79˜.üë¿þks—»Üeºn:ï690B8÷Üs›;ßùÎ#Ô¢lÊ(qàßøFsíµ×ŽR“²-Éeá@Î3´,l΋$’Éä@r 9]°¼æ.òï|§ðâ«_ýjDxrë[ߺ(Ý>…/{ÙËfêŽsc­-h©ü‹úþáþ¡9âˆ#âoóÔ§>µ9ï¼ófþÛxñ‹_Üì»ï¾3û–«ÿÍ\07’cÆþçnŒÉÆ“¤äÀ4q`EÀ€ZÛBíõ4=„Q»×>~¬ißQûبµ=Û“Gpe{Ýë^×¼ð…/lXlX…W8®n[n¹eîõ¨GÍRJÔ`ˆ«›zdƒâ¨°}ñ‹_l€- åœsÎ)€&xÄ‚õØÇ>¶kù0Æj@èè£.@ó›ßüæ ¨Ä\÷ðÃoXð>ö±EÕS¹Æ§G?úѹ÷/}éK…ç,‚ %îÀßb’%°Îþîw¿+J‡½èEÍ6ÛlÓ°.ö¢¥\«W¹?9H,'– µÍ-W®31©^€$LˆòËÉi½VŸôà…ÿk­µVóÀ>°a´ð±?Î);ò'9X‹“N:©ùð‡?ÜÜò–·,V .l,V_ùÊWš~ðƒÅrõÌg>³X`–®3Ï<³¹ßýîWÜÕ¸Ð-ûí·_±ikÒöÛo_,gÀÖÙgŸ]ÜäX¢d÷cqêVž0Íj¤M\û¸Ã)‹¾öµ¯pÅJÆúô‚¼ ¹÷½ï]ŽMóþzNƒ îŽ2{±&.”hVˇ?üá =µ9ꨣšŸþô§Í»Þõ®æê«¯..ª,½h)×êUgîO$’ËÉen€ (¶iˆ[kÃ;ìP©}\ßX ßÜæ€•pOãÝd©Ù`ƒ Š@ý”§<¥YsÍ5‹–?Ü Å6}êSŸ*.oíò÷¿ÿý›wÞ¹Ù}÷Ý‹5ÉtÈ!‡”}Àk“ñ iaà Ê¢d“AÆWî‹À¦x¬üãÍ;ßùÎf»N2Œ§=íiÍ‘GYÀ‰¾á™ÿä'?iöÙgŸ2>{^¯xÅ+šN8¡Ą̈ÜA)÷Þ{ïæW¿úUsŸûܧyÿûß_¾¹Ýêdi|Ç;ÞQ\ 7Ùd“‡¥‰ã~Ú®‡%P|^\Ë÷[(%'wQíÕwºµ“å‹«¥ûÒ7ƒ|k¸tê{÷¸Ç=bw®“ÉäÀð8Ð|—…:ƒäªNÖ£UÁoUG㵪£]Õ¤¡ŸUTŽËÒ†¼Èâ8àu\'Ê3»îºëÊ3ô,;ê«<פä@r`8ðŽõ¢8ö›ßü¦Œ©ír«Âª_þò—íÝå=î$}¸Áþº|Ô}ƒBýŽu+?Mû:±ïívâšVubÕVu„üU/}éKWu@B)ß*«:ÙýVu,|«:ÀcU'îkÕç?ÿùU;í´S9ޱίêXïÊ7´×±ªc,û;îk«: ªìw~Ç*XöwâêVuÜËv'žjU'^®lwâ½V½á 
o(Û½êì˜UPUÊtɪà-۽ꩯ¥íËæ*}©“ÄcU´—s»µ³˜Vu⺠/:‰BV}÷»ß-e}cÞþö·¯êd<\Õqá\uê©§®ê$)Çòg¸¸ç=ïYä²á^%kOŒþ2<¬5SsçÖg\ᤂí|PgAÏÌ‘ä€lVV½X4ÔsMJ$†ÃpKíV{“¡-²íÕåXšºÅŒ°FDLR¯òQw}<¶û‹2¹žÍ|ä#%ãžTé\ e¶c5á)Ù‡¸-|eóÅòצ׼æ5Å 'y„ìyˆû²€w®mÏxÆ3J\Yû¼˜(ÙeTÿW\1S¤[3[ýêQTœ‘¬îƒÕR㉺µSVF.¬H¬TVbã>ñ‰O4¯ý뛋/¾¸$á"*¼˜:“†ÇV¹Þ²æäÀˆr`YÀP!&t È1‘'!Ù¾<·psL 4Ï.[™H,jwÂúŠ’XCŀɌ ¥n—Ô‚‹WÆnà5ê W2`÷I¡þ¾÷½¯¸2K®Á®Mb¿N<ñÄ’Šý‰O|â¬ÃÝêœU úÓ¯Ÿáq·ësPqñóè×N@žK Œ€Îë . xÆeHG‘Üc{ñŒÆmÁÛqkóJ¶·ýÌýOO,‹³wt, D°6h&Wsb<‹x!‹ç›±Cãñ³•ÉäÀð8`"Ûn$>HRÉ1Äà˜÷i·NÖ@)ÔÍß$I…ÄuâŒnõ´÷E‘8Ÿí:Ö`HÌ™ñ˜ÒJ6Bó`Ita_Ì[Õ®§þçÖûúÕåÅ™ÝñŽw,`oƒN¬šTçÀkvk§ÔöíD2 JÄÁBQ'ï&E͵Fj™†0Bp¬µ±Þ…6Ï· ž–¤¹9Ðîþ[ÈG(þÏ]S–Xi, r“†°*„‹\Z†VúñÏÿú5ˆ|h3pzþüË’ÉäÀts€ÿ¶·½­Ì'e.(`àÖ-q†ýs eÞ-ndÝŒ$5àböЇ>´¤gW×V[mÕH\`>­^$©‘$õDº’rôª§¾– vÜþÖYgĤzGÝÚ©,«X;kðÔ-yG¯6/×~rL€œðàò؉Û+ûãx”Y®v ò:æA»ôÒKYåD×€'@PLÞÿëã͈1¿¹eI­Msª ªeÒabò6Ù_'B™å|Ìy9±Í÷2›Ô²\YÒls­àÚÀÕã¦7½éŒ¥hb7–H$†À5åR›zío—óß÷•%e£6šuX|M¤Ä”îp‡;”É4¥ë&°õSfùNk—ñ½¦^õÔשßÃPívöºÇ^ûëv¬Äv t´Ñ÷$6o»=Ç•hçR®)^HßðÍOZÄä‰éãò€¨ö¤Y\­yÖ°9°,–!²f2ˆ8ö fýƒá€çåÙY|<Ëø8 æ YKr 9oH m‚Òp“™ënz нöw«ÏµÚ@H¹B¶#íºR«÷BÊwKÈQ×c›õ'¨¾мWmj·³×=öÚß®o¹ÿÇ÷.¾ñ-´fá»ÛÝî¶ÜMÊë ô±o”ÿ” ú @4‚MÎ&y>ËÁ…D…»{ØÄÜ»ÿþû72áø˜H™Ëá½ï}ï°/?1õ{fžW|â™NÌ æ$’É%r@P'ùkÞé\ÝÌU•´0øÞ¡øþ…bp¹]ý=ôЃ5ŒD2¾üå/ïÉÇŵ™·jÔHŸ–mP² Ù ÅÎikd]\îöF²©qSy¼ÜO`á×[0¤Y!<L,1˜,¼É ;ƒ?2X“Í]sÍ5åEQƒ ä¸~­ñï8:NÏ+žŸ—;^ðqºlkr 9H$ÂòK|ïâ[Hà aw!u-¥ìi§Výa$Ÿ2Yo?ñ—¿üåæä“OžÉ¸”ûô¹RËŠguV Ã`©ÓV±\+AÑ7ÈKÉ!ÿ¨^‰6å5ûs`ÙÀP4£= Äþa­ùo>ÿùÏo6Ûl³2wƒ4¤½HÏÔ´û׿þu·C}÷]uÕUåe¨ yAÛƒ¨—¤ß‹k®À~m­¯5¨mׯ?ƒª7ëI$’Éåáw6ë’Î2@(}¿-õ7yá5ö>ƒ¥ƒ' î\DnøÖ·¾5kþ(ç°àÄù¬•—]vÙLUb¿.ºè¢òMŸÙYmˆãfaqÏAÀ…}wºÓbW9.KâüÇÌì뵡®~eµ à’Ü£–œ‡a “`ÜVAÊJ‚ TC÷l~« üøÎw¾SêEž«¯å±Vo ˆô›š—ƒ¸FÖ1X,ª‘qtŠa£äG>ò‘%­è _øÂ’âSf›vØáô’1©òö¢›ˆ 2©`P“Äñ½îÌì]&»ÍmnÓ|á _ˆbåE¾ÝínW•Éíô •,?>øàRNZÕ+¯¼²1qÞ±Ç[öÑj˜8qýõן5QÝÏ~ö³€*ëÐzë­×¬»îºMgöñ29òÚû²—½læúÃ܈€gW?Ëa^3ëN$’ÉÁq€Pé;’´0Ä7Ïwb¶Ÿ°ó)-¬¶þ¥_ñŠW”å\%fØtÓM²@7b !l³Í6E~0_SLšjŸ8&Ê`ë :éÎÉBÒ”û(˜wª–Ãl“…d”*]öA}™ÔW,ùqA“y×ÍÆo\ê$Ët£~eñ”£MyÌcJêyóJkˆB[Êv ñ‚gÿ¯~õ«ËqsQƒHÆFÙ»Ë]îRb¹ìgñÂK »Ü·L… ËC‡ûd€¡ ‰>Tó[Û’FƒË†êÛÕÕ•a’¸ ¥@—Z¢¾î«^õª2aí-‹—äÿñKƒ8£ /¼°hYøÔî½÷Þ%8Õ‹+)èãÿx™<øQfŸ}ö) “­”§^êüàp™¡›90¥Ü'?ùÉÒ¶ù—ivÚi§™G›ÍÄM#êõö·¿½´E,Ôç?ÿù2‹yh¢ÃZÇKÏoX×Éz“ÉäÀ8r@šë:±À¨ÝCÝmÔÚ6í ¹…ìÚÿA¶û'?ùIóž÷¼§yÈCÒp‰3G“ï·ÉtÍ%Nàâö¬g=«È¦Ar—ýöÛ¯X?ô¡•xiòpð¹Ï}®(w£<`¸y´á»ßýnsüñÇÇáYkmn`¾( ŒW ùÈä»mêW–RøÈ#,€ê#ùHIµÎ‚ä~Bq®>.püã›Ã;¬‘Ñöè£."@'v˜˜@« ðIßwßÞQå†I†ðÓ=XÈM!C óÚY÷â80ôlrÑÚëÅ5wðgé´^ÆO}êSEÛÅÓRÐ6HAªÝ^ò~ðƒåå4÷Ã+_ùÊ񾄼Ä@%)æUÇe â¢ÇDÛ-ƒ€cæq Z&V©{Ýë^Íé§ŸÞ˜¿tÐA™Ícöð—¾ô¥¥­¬\²•@{Í|^*ÀO<¿nU9–”H$¦pI“Åøö…@kMN†@Áw!+˜è6ðbIR? Ñ/€+ ïÖ$sb½æ5¯‰êÊš7 O2Œ j,IÊÀGMs• ™ ÈÙqÇ‹Lå=b¢<Ùà1OúÞ÷¾×œxâ‰ež*Ö°P@l¿ýö šXÈp@bEíÄça`&£Y¢ ëšYïÒ8°"–¡¥5y°g³°Ü^ò’—í‚Î à%ç—ʲ$]*«}L¬gœqFi„—q‰£µ0€…Þ1ÇSætxØÃVÌØ^ÂnéVÍaÖm×Åœ õ Ë ŽLXA!Ú‡aSžXûzYr 9H –ܘ–ã{1ØV¯|m!Èâ!× ² šx±p§—¢½_fBž&ˆ‡ÅÄDµ„nÞ/õ3æ©‚bm^$t³›Ý¬¬É?A¬'äÄUN™nôŠ)B÷¸Ç=fæ$âÑ4P5ÍU6ŽKùäþQ³-Œ ˆeÕ÷ÇÚka ˆ’9ˆ‚z˜`oµ1úk¦ 5LÎ/¾î©C|O‘à@/ž…kmˆÁ€Ö„ï-W7¾²´/L¾' <þÓ¤p—ãk °8æ° ¦,u|Q<2×7èŵ­Ï<óÌ7=庨8?×Éä@r 9˜öÚk¯Xæs^–Y—€B.ëÐ ‰•åßøFqëúïÿþï† / ppøá‡™ƒÜA~’2W…j‹·{1UÍû|#ÎÀþ’ hÞì[‘‚S†$<v B,2”}÷Ý·Ø$Î?ÿüfçw.±D|li?tì ~­Üã,@ ržDæš•¿ÿû¿/þ¹q™QÔ-~ˆ«bƦµóxÙ™?Éä@r 90òà:=Œ ú‘¿ñ)k !7–AÞzXkÒ’ˆFÝi ÈÊp5¯CF‘€a)d®uRî¢ð€©ëäº&ÁàÃm ÜÄ>³,qgã®fÿ\e%3@Âds};bêlpõõ²ýÔ§>µÄ uð~J òÖBêYHYÏ+,ˆ„ìëöRo–†34¼¦®fA{O~ò“‹Ÿ,ŠÌqâ}À‹Í'׋)öç-oyËLzÒ‡?üá%™Ó³ìqÈìÝ’"p‘“e…ÙZLP *^öç=ïyEëÂ=O"9šeùñÊ\×+ƒL¹Hþ$F9ØàC™ &ÕZìQ¼-q ¾áÆ3ŠmÌ6ÍÍI6è¸{Õý-„Ùn­CÁ9wó+A. äñÉâc¸ªµ¯E‰ËíQ²¦ç>÷¹åzÔ£J2ƒù]í¯¥ê{äJ,P»™…·±‘›$2@¼\„¡X˜Ä\GüP¿²’UýøÇ?.rUÄ&q™;餓fñ¿[æ³OV9 g QœH€[„̧ž…” ‹Pô™X/¤Ž,»¼¸Qç! 
5ú]õá?)†°Ï*baŸÃ´+`o%É‹/c›N,}c=@¸/«ö³ÚÔÇ´™HÚI)+kôHÃDµ?ŽÜçÖZk­™—‘;“3Pd.ˆQ"÷k`–BSJoÀÏ éž?ƒßBÌò£toÙ–¥qÀ»a©?Ô±oi5çÙÉÕh» ÇxÓ‹WŸ±2[Ú3 ß³^w?êíëÕîåÞOtdX(* ö¬~d^$¾ád‚õ Ÿ9E(™h>ÙÿÈ%,¬B”¶ƒ @F½ábÖ«N㾸gna¸ñ^v+?WY¼%g¹Aʃžþð"à €H’¡*Ñ­Í‹Ùç“T˵XÇ$w /™ßËóÀÚãÙb®“ç –iú?uNVœn¤s)m2ðm5 X¸ÈÕe$>ˆäõ~ÛíŒ'±A díkåÿäÀ 9àÇø¡H–Þ6ŠãñÐ×Ïú¦ƒÆ^ƒÿ”Hm¿}Qn:¸²ø»¼ç=ïYø·ø¦ãLÖi§ídšå ³ƒC¶×èoƶçÍ•:Ø®º)$áNV_‡ÚêýímïŸ$Só¡¹ÊR®ÖIæSç|Ê,XÀ-×?2›,uÆA¡hO|÷r\ ŽŒþ:ÁО‘Ä ´ Ì­>ÐIÉIç€A>‚@dÊá"Ê7¼ŸfpÒy“÷·<0oÊe—]V‚N§˜–§ó»Š9à.¹ä’Y…iÛ¹b¹–…š$ïáÞÄm+&»Œ}„ÐH;l­:7îš(ø"{˜ýb[).j’¹‹@¼Û€@% _½ßáRÄ{B™ˆu!ȆðǪ@ଉvœÅ%ˆwEbM<X‚VŠt\äx­XÄïŠÉáÆKúgîcø÷‹u2h®Gƒ¦'1/£äTú:pô¢½¨¤F õ‹øNª?ƨa\+ë  -æ²$%¦‰y‚¡(‘m·Aº7LOó^çÏ‚—@…>ÿ†WR|wœ sºp1®Éd–hA&‚4½BM„ð#Ž8bf×k_ûÚ’ÑtfGgƒ[¶ØŽ sÞµAÕ›Þô¦2o‹2ܼ¤_¦!ÂKm$à#©œ•©³¤QpQ¡µ7ÿ‹2ÁçC² EZæ³:SM˜ŒÓX ‰} ×/\Ê2É)ç@|ø­ V‚ïˆPžØ³‹Ÿzê©3ºØYMâGã»Å[`áÜí·ß¾B›Œ #^á¸ãŽ+Ù#­‘¹Äd“Üm·ÝJPûýïÿ²Hb•⦇žò”§4ï~÷»ËvüC°xàe7wÔ¾N·¶˜¸ùÞ÷¾w `—ýHL`„€=÷ܳÜíµëHñ$tk—ò&6×õ˜Ç<¦´i“M6)®?ê{×»ÞU2ZŠ3‘ÞÖZyY.ÍŸƒÔ!p\ûJ‚!}`ÓKX†¤–ÆæOr`‘`Éø?ô¡•IÙcü !7ÜFÝ.óïÿþï%ÎJ,•ÉÕF<à˜º7ß}\y8à€2®y/){jåÆ|ëÉrƒç€q(,BÑw\¥_|+²Æ…p`¬(tË–³›Í²«9ÐÎì³úHn%zsÀ ïÃka-°¬”e8 ”dß8ht¥O5!!2—Ø¡#<²d¸"àòe‚DÊM”¹í¶Û Ðm²c–'–(iv¹¾{î¹%x]`ûOúÓìÈRÔ‹ û@šÃ#\åÚe불ÒΗPvØa‡•si§¹è˜ù¨Ò Of$Û§œrJ¹À­Ý.åñ@†1íüðBSù‹U `ÂC@ˆËàé§Ÿ^À û¦}ÆÓÈ@Ö¾‡åøOè @.^(ÀÐr\;¯1ù \¨Éøa!ØZEÛ½HÜ w^ò‰Q.°½úÕ¯.Vâ8°÷žSvDÙ8fm¼5·Q=%‡ìuR~?ýéO/E)ÚDiÀB´¼ð C1ý$Ǩå} ½ÚX!i93}ïBq÷òR.?Ãʳßýª¹wœ9.k{¸Éù¯ FýêW¿Zâƒ>ûÙÏ‹ « ëŒÄ§vZq›c9"„pQÑvÖ $Ⱦ}ìcDÙ )ŽÇšµ èzÒ“žÔ°}ñ‹_lÄ™»%uÊä8/ÖÀƒ¬[›^©øë¶œwÞyeBC ‡¬íë—}îWP·‰Í´Î’$¶ÛE“Œ+4ÙÇÜ« ð ɦìMozÓòŒYœ¾ô¥/•m1Yà\I 9ü !ú~®+Ù¾¼ödq@ßB5ø‰q°Û² ±I¹Ý7'œpBsñÅ—ÓŒbûŽ:ꨙÉÖ)5(U(e€) Y÷(QúЇ–1Ã{ì|àŠ²r]nc 7]–í 6Ø X·3ûm·§5œ}úŠ%Ƨv?ÎU³ÖÅr`¬Àá‹ é{û¸WŸç%M“új~äÖü8€phæW²™‹åðÃ/‹t©üûùè›S‚©† H¿ŒÚ´Ã;´wÍúÈ6 ˆµãfÐå¹bhj—~V•hKÌ[òÝï~·yüã?S£s]»}oRýš¥EùH‘¬pÅ‚Çì#\BÈz—]v)é— w :ñO+Ia•ÔõÅVW²Mãpm–ÓùL¦9÷²\m4æ7jaÖ>d7c¤<ÅK›Œ#1–5Òt¸ò¿Xƒ)(kŇ÷“ešUÐaYb­vpeÎ&`É{ ,wYuKúà8`Äu6iy8㑱©WYž–äUæËŒš/§²\r`Ê9@ˆÅ`o!Œ®úøÇ?ÞH[ºûî»—'™{…o>âÎ%F…°¯½¯zÕ«fâwÊÁÎO?p¢ WÀ°Ñ겆-T)óæ7¿¹\Ž;^?жDýÜÕ€(‹Ø(.r„œ°êƽÑB¿üå//U·úJ¶ÛqÌþ¸~”!œ¡c;nvâvÝuןÛÇs }2Ö,Þ´æ< ɤÕЇ¥ùMZ8Œw[ýl.×øà=š«Ü1ÇSÞ'@±:³ì„Ö¾w¼ã%-3Ë3E  ÕX°õ}ï­ñÑ|O8IËÇÏ] ]÷ý'iô80V–¡Ñc_¶(90=a3€P¬ ¥+A@ D24±ž°¦Ð„®½öÚÍvÛmW@ ëw6–€A¦½öÚkÁMgÃ?ŸÆè + W‰\‰\Îð‰ûÞÏþó`¼•B}~ÌKÄÕÍ–âxÄ °zù°Ò ÊvŒ¥ËÜJï{ßû XaÉAív¹.·BñR2ôÒK‹{Ÿkq5ìF¬ €™$ úÀJ»Èi£>‡¯ÚCûJød ÔFûô îG⣒’ƒâ€>GÀÕïlë{Ýøàž{Î9甾X—ñîQä¼á o(ï_ÛuÍüY⇂Béá¿øŸ¹”OÆÂƒ¢®¶IËÇý#«Ú¶$.Ò24ºÏ&[–YÔÂ@/`ØÜÏeDv&B>1.s\P-³m¸á†%±áYÌ ôèG?º9è ƒÔ<±(\Êd#̈I"Ì ®+´íÀ7— |¡¥•ÍŽ»‹8€3Ï<³1ß|ˆfùŸøD¹¿}÷Ý·-FFB×óŸöWƹ¾ð…p±>/¨Ý.e¹Ê°$á•vq™;餓ú~¨}ĹÝxæb¦Ä ¬4D-܉ܓ.kïÜþ¢g?ûÙE3¾ÒmÍëO¼ úÝ\D)âTÿuyY)FÔ%ºmé‘ò>ÜèœÖ…ºŽ~ÛÜX¹ÔÕÔ-)L}<·Ïßœ6(ò?i49–¡Ñ|.ÙªäÀÈrÀ€^/ó†u3,„sçˆâG¨‰ó²ÎѲIAÎm“ù:,5\X„LæÊe ÔÌ߉ĴÌEíëtk °Bà'<ù¸JkIXÅXBW\ç`²DµÛå¤Ë6ñ&^šÂÏ9¬N–nqF‘™¯[™åÜÇ в!V3À÷[ßúVÑÆG[Ž8−PÂM4õ5·á¨?³ÖÏMzЃ4ãˆÿÊÔý8żEúFŒÂ¾væNñR,×${íÊ$Û yžjÒ·Ýc>.’±X­û7þĵËüÿ§‘R„:ÂI^WšŒG´ëáŠÄ%hó±š;©@\éöçõÇ‹À« ÆØG†TŒ}d˜¹Æ>r×ZVc–=ý·Mâ|Ô-´hŸä1z$rðsfqJ.<_.Ó!/Àä% 9Ï–²¦¶.·5Yû|9°Z…6ß3²\r 90µ0ÐÇ@õЧЊ›“Ù˸3k·Ýv+·Ð+ÁÀ¸ßß\í—p!h¹y Å¢ uß#¤úÏPªo–V,îI⺸5r?JJ,•1Ω'¶cݯn ðõ*'i‚eDQåz2QŠ[”Àe«­¶DõYÇ9P[çÓ_X}Æ ñ­ýæ7¿Ù|ðÁbÅêjhijX>E¬c›§U»<(>f=+σ»AÝš VÁç=ïy]µž+ßâlÁ¤q€¦E?´¦}ÓÄ•O¬ w£L#=iO~eïG?³Œ2‰-“=“«¥TûÒl‹ŠôÝ£ÜöIi[ô‘XOÊ}Mò}Œ-2QÙ|SÔ.ôò±fÒ$Ñš›# )90®|¹‡Zcï?Â|Ü]•MJ,–ú´áÖ5À’?0A›íž‰FüK_úRó¹Ï}®dƒ‚êYîsî¹çËÿ]nx|xù¸3c·‰`£~×4‡ˆ93,ü?ò‘”DÎá,mgRr`98ôXB­·‡y}ïŠ÷y¼k+Iö½£ý:ÿüóË8Ò¯ÜR}ùË_.óêÈdgbEãÎ4>(^È:–ø? 
÷Ÿ÷8`âUüyEòÂ$‘yÊxê$%Æ™c †ÂʲÁ4›l²I™ÈðW¿úUñH+ˆ–ŨBdh&K‹IV4A†,+’yH€›š\‡%Fö4A¸æoë`Bø „ˆ2ˆìF&d­ºþúëgÒžjÖ[nY>Þ&>”zõ˜cŽiŽ>úèÌ×7'€vcÜ 6|~]“vÜ Õ&Üa‡â”\'†Ê: qBþ@Ô…ÿʾ4L¢,©ÉÊù%„ù;jºêª«fÍQ#“ÓÙgŸ=ËÝ@"õ»ë4¾¶‰"‡¥º&å^÷º×ÍX‹¥‰6ŸÀ8éT÷¿Hª0é÷œ÷7^ø÷ÿ÷âu"õúÎ;ï\<=Ì/Å[Å{:)$v›Œ“”gŒ%â6ÃÅ qUã>öñ¼¤Sõa$p˜ÿëš…õAõ\ ¬G¬?rûïÿûK™ø¹ôÒKK¾ÿøo®€˜£¡N®I€‰›åÇüT€*.oè°Ã+ùÿs¥ëEõ|&RzJ»‰¤Œu¿Ü¹îÕsjôª+÷'ɼ%ÀQ¯ºYOX\ãÃÏò*µ(‹ "$Ph¨‡ ïsdBSʆp[íU¿sÄáq{í0œÏ ë}1÷FL\É‚K™Bɱîºë6ûï¿i‹I*ã‹ó ²-SPBùÁÒŒ(?¸Åpœ ]ð›Tyc†±Æc¥FXºÔ!Ë“óLºéÞ¬ßP3a£±ÅØeL‰±ÀXC „WA@š1mÒ©î{ñìë}“~ÿy£Ï—¼ä% ;(Z¹Öík_k¶Øb‹FÜÅ\Šòkòõ¼Y±ßz!eçR>¿\«Õíév<÷%Ɖc†h[äøç6Ã%N,DP"$™ƒPó•û!Éo¦hÖ¡3Î8£>\ròǤvW\qE‰1ê—Á õ<.xgžyf™9Þ|¦G?úÑ¥n M “Á±d|Ð ¶X ÷?í´ÓŠÀ¥qGê3“5Á.)9°ÜþÃ"ÔÏUŽð.6Žò¼·2Xk½ê$µ¬¹ŠÌiC 6§¬À¬§”sY¼WŸþô§Ë\,6>òf7VL|èâj+ÀÃ¥5ˆ+ˆòæ qnLÎ)^gŸ}ö)mæ2ëÝ6VxŸÃ’c\`ÙX\ƒ"„5qóemÒ>à†«¯1 )KñbRP‹z;î¸rÌxÑN P±O ãÙGŸœ–{Ïû]SÈâø¢Fk)RȨŸ¢äg?ûYQ (P(lxP´Í_iïRÖuû)Ÿ¤£M¶ºãïX22Ö““‰(m\ßXeÌNJŒ;Æ }þóŸ/€ Ï…fTv¡ €%û$Fà Gór·Éd„w9Ï5Ñìð÷}øÃ^„.X€°º\l›ØìðÃ/³=Óàc¬TK²âCLÀâΈ±`qá m8÷8×Õ~ÂZÄÆX±d4LIÉåäÕÚø~×ßi§ŠP  ´×^{•µÿ@½ãˆÅóÈ#,Û?úÑJý\Jº‘wZ¶°ÑíûX\¼k¬A>è÷Š@MÐxæ3Ÿ9ãû.E3ëû¤ie‰5™g›¸knÄz4–¦š¸Ä"µ­q‡] Z¸îv¬O¬Øx€ÜňqåIÂeÑE]Ôl´ÑFe;~¸êr¡ ëQ쟴µgà'¶c=i÷š÷3~ðþzw5mZýõË÷ßþ¹%ƨ‹/¾¸Œ,¾oûۋҔ׊1Á˜I>@ )ÛOùd¼ã}Â"MÎŽÄ/³YŒ“¬\”Vdsy÷’’ãÌ5Æ­ñÜMÚé{»e—¢MáöBkVrDX‚`^lÚZ¨`ÍA„`†ÖÅ5cÎ ¾ù– ÐâúÏèòË//‚W¤ç®Û·Ç{ÁˆFˆàccrê…;K&-LM„ª~ªºln'ÅÂgôÑDÕÀ¨Ûu€ý˜€.ñ0ÄZD3ÉM-\I/¹ä’’Ñ‘F(pà¥ùøêÿæìò~Eó!J’NÞyzm©)„—ø‡(Ve@D9ïþv7¾6:´¦Ú£Œ1åï|ç¬b—׿þõ³ö±‚áI lq#¬Æ4¿ÜöÄzXÒB‘M“²¤&‚âþR»ÚÖe&a[ÿ°Äó´@}î/ïa¼9àý¥Ðœ«OöS”<ö±-L!—•8A”!Æ c'%°¬¸1Χ¬ºŒSÜvQ7åÓCò‚à8¥°Øi×aõv^dñ¥4âúŸ”wŒeh! 'œêuži „ÚåXtg„ÚÇ»ý¿ûÝï>+pº]†yBêïU·c5 öÝwß2Z¸¶ë^©ÿ‚Þ¾ˆY©v¬Äuf8ˆõO<ÇBèE/zQóÑ~´hÞ¸L"µAõûàK¢W6`‚°îCþîw¿»ÄÊx'ìûøÓ6 ÈÝvÛm{Ö+.Fà&ÜDÊ/mðznÜ<,’𠀄5‡†ôä“O.‚Aû>iI½“ÜéÄöp_³Ôu>ÂÒã?â–ÀÅÿvÝö!Š`‹b:׋é´±îwÊs¥£XayštÂ3ü°¶èGí>:é<ÈûMpWåíÑm’u BŒ þJQ"f°¦¶¢$,Êá•0Ήw ΟoYÊ'ã¨k %h'•©•39‰âVöJ+5µÛ_ËíäÀ¸p`¢Áи<„v;¥à¤Ao“O°6¡ËÌ(¡_[6¼˜‡i&™çÃEs!„w¬¬’,š£Jú á3Ö½Úë# ä?a•åâÆz.r€4ëm¤ò|£—` \¸.·q6×׫ ÝösiåJ&NˆPâšÚøÑzCê—¨¥MÚà]•½âÂ}±fµÛ-ÖH½²?"`‰å†»Ü\ô¶·½­¸¨p]aµòþk/b=fª xRNÛ&™‚õ}ÖÛ“|ïyo£Ïœí8d-×»ÞU¼Qô×ù(JŒ‰ó¥ù”õîÌ¥|êõ.¹'S Ô´PÅ_}nn'F…óËF¥ÅSÐãn´NP#ÍïJ-ë…¸, ‘A_ŒG˜öWª}4mÀD[cNø4G!û-oyKε_JP.Sañ[R¬‡v¯ r¹{q3ðÁ£U —&÷,h–Uƒ )ŽÅš…6_=ëöÀæ•r.¡ÛõµukÁúØc-šÇR¨úqïê™/¹Ç¸ß8Ç|Y2¿‰iÃbU—^´A`ÛŠ„QO¬ñðàâ…¶ß~ûR6À ‹'K kýâ÷ÜG$ ˆºêõÆo\²´qãèuíº|½½á†–>ìùÑŽ²s™‹Àfe=G góÍ7ŸÉ@W×Á Fò.rÎ¥±õÞŠ¬IÀ3Kþ¨pÍ5ar­}­Ë×Ûø ®€›®sÅYI§€KZæšüo»ÎÕÇ'i›ÀÖKh›¤ûÌ{?PVrŸgi–<Å8ë;É-η3â“—¢(Y,Wª|ª¯ÃUÏwÞ¸TÕ‰­'—¯Ëævr`œ80v1CãÄÜIj+m77˜¬`„9aÐÀOX¦AÜN{ÄU¦?è0ˆªÇ‚Bh«MîX¿cQŽÀ´ô#B£8 @$âM€ûžk/ ÷ ø”¦]Ús Óòæ7¿¹Êð'õÐ|ц9N/ˆ•ë€ôDzÖ)Ù#º ‡\ ¸œ~úé%ñÆP@#Ð%E0?-¡€y±pfÂ?á;,&åm·M–5–™ŠÞâÐ"™·4ÖB¾€¼òn<ä<ÏH0µƒÕÆ~àÛ"I°×<$%QƳò¡G@b7"ü[‚$, Í~ìãƒ.WÿTÀ pȦÂÖØZæ"çGQßYg<[î%íXD}\lSýñ÷Üd†ÂõçH¹Ï=·¾GÁÎâBn·RŒÕõÙÇBhAÀ•º¹ÙyŽuY†ô% Q"Dð¬'<}Õ|žôûÎûp}æök|ÐWõYnÁÜÉCaAQÂe–¢Ä¸ ëm(J"vyÐw\+Ÿ|s´ËØÈ·ŸòI;|Ó·&«7©«ÛÝÖ¬/9°\¸QçCòWŸ‹!]Qõ¡¡‘ š,’øèóׯ?ò½šâ¥²\MÄ p/4€ .!Ø ¢~< °@äá24ˆºSGÍ˹ÎWÖ€G¸ 9¡• ‹zþåæv òœmʆͪ„A„{®D!€[V{¹éC,.€‹ºÿDÄ là·ý&½ ”Эí‚ÚY¹ú‚@Ý'0 XÔiOh¿|(ì#äƒeí#ÇÕÁ ®0¾â2â«–ÅèÐ61 4ü&zÁ ^P,¬"ÀØ¹Úæ}@i=ñ3áÎå|mä&›!‹‘zKñ@be¼Ç@‘Å{ °Å»¹Øk,å< ²×Ä,HujØ¥\gÔÎìõ1 ç:M2yOi×õkÁÜ”/23ã’Ø e’’ƒàÅ7UcŸñߨ¨Ÿ»Æô~cŸqšÅ–G‡ïN·¾ÙKQ2ˆö÷ªÃØ]+Ÿl‡ò©×9±È£x4¶ú¦'­æ€ç+ɸ1ãyÈãSÄ`­>#·FckòÁß}÷݇ÂC®L:î ‰@ÍFàâ8 ‡ßÚOèbiVhïi¾~$Eù\ -„|ŒôÀ‹0n¤}€™ý2ÿŠã ãä>øA,#„kÖ!Ã=†Ö Ј@U‚p[lûÌ¿Àâ`.À—Û0äƒÃ Š6ÄuÛë~m3àúp± …Rº»çCÒ;³Àøh›«F–¶¥‚¡h_[Òþå–sݶú,çµWòZ\YödÕÀ¤ä@r`t8`œKP¯VqÕµ,'±RY‚"©Süï·¦0ï‹~åòXr`\80–1C45O.K‘âÙŸ#6ü \LÄ ˜‘ÀÖŽÓ ­!0*Çý† ŠhéihNÁ\QLÐ*Í.²fÑ x:F3Dc¤]„2ÖçÕî/4·&LE´ùÚEá>Ó&V/î7Ü·XX-X–¸‘±ZÑü³>, ܾÌ;`Žs´y\‚¤ùÔZ³å"Œ“`Pq®•É-ųJž1°[gB^mm î\ÄnL\Ë<;å€^ä²TyžÈü2€ r=Ï D\#q€8ý0RI—B]~\7ÜzµÍi,&ÝÕwÍëSú?þ³<‰wTñ€µ4™àÚ©ÃæRJL&ò®’Éä@r 9°xŒ¥eH¬Í»LNÀa€é‘\ ‹0!¨ŸPº[Gæ†g©ã4¾¯Xqh.T@`CÀ¦Õ!ðÊvÅÊ`›&ž_>, Âµ‰µÄµøüA2ŠÌO;í´"€sá‰I%ÕA˜å‚åú„wîALïê€Ï¬D1G‹ÂgW»¸ŠÔ`0Î_Ž5Bmi¥ñŽ%…5cï½÷^Ž&”k†…é $†ˆFü =®tZYýÚ„ŸxLKÆ‚#aŠ 
r=}ðÕdÏ“LÀųA¬@µ;–~¡8®®öõ#Ú6‚-·D ºµÍ>V}‹\¤\U¯úh}Í:)9H$’Éä@r 7Æ …•eƒ 6(–Î<óÌ"øbG³W ѰT"bÌàÎtͲB€—½K}5¹-;ŸO ×£°&[o½u)Î…Œ…B¦ @!ÓÖ#@ˆ Ë byËbÑ–ÂuM\«´qOcuàåš,GÖÀÔ\–†ºÎamK«- —%¬úV¿z‰n퀋ö­5‰'òìX–p©ë5A.ÿ_  ¸a‹çÚZ’XðÈÚäùÖqc€m¹šd¿CÝÚæÙ·ï!ξz‹2¹NL2¸¼RBQŒP,Å=tùÄ‹añ°“È£¼§ä@r`º80–nr2Ym·ÝvåI±LpãÒĈ`*‘¥Å oT/sªã4XXXs¦uV0e½×éo Ò!ì××ä~Å!ó¡\–˜*@U¸O© "x³n´©ž=^ì‹,TÈ=º_n€\÷j«@»Ž•ø?*‚x¯vôÚïp³ ¼c- _j} B,sžw¡(?ˆµ¶D’úÚmëuÊö;æxR°Êru­k·3(WÚãE·róÝÇšêÚ’”x·“Æã¯-\féQZ!­IDAT¯¼(©¦aòÙ…pŠÅ™{6ÅO$›YÈùY69HL"Æ o €ë5ñ,?HxnNâ@O›€ ‰¸OqjO&6$¬;âBÄŸH‚Ћ¤pöAf`­W#€µ(æXq®@w¤]\æ|´¹å±Øb"ùÈÓrŠeA¬ê“%LÌHRr`R9 ßr'FðÎqqõ>ö#™”XfAº(jXœ¹×Füâ êŸä:xˆa;G1&«ë¨ï }y¯~úú6E[ž÷¸ºDn"ªx‘ÝS¦Tñ£¾IÉäÀÜ;79 ‚ îknK4õX²ÏÑr3‹4ÆegçØD”–¸×d2Gõ\å.Âêr±ÍÕMÛ¸gHxŒ±R\uLùv×û`øX# \\æÄß –e|ÀX§$+@,¬X€VÌûRäOr`„9à}`Eëe¹çç]®-mÞ –Ý…’º¤4 êVwk¯YÎ>ûìY»#ïedtP‚KMÞU ɤÉõ}DŠÊ‘ºm”R¬G<¡,–”9ÆqŒIÝ9@!™ áévo RÒýLÿ k®qÙXÄr¤_DŸ´æ*Y¿ô×÷¼ç=eüuÌbìgYáŒôMJ8Ï5ê»êùÕžqÍnkÏŸ§É¢Ïî³¾#\¹craŠ5©{Ý—oŠ3nØ”uþ‹átú²ï„L˜¦p=ÓY¸7>¹”SŠƒô}4ivðÐ7Oì¥L•ϸz(ã¸òú¶Š“«Kiç[uÈ!‡4Ûuž…÷FL/¡¸IÛŠRMSD¶y$ý?o ßä¤ÕX¨âų #x”DXÁês+9hs`ìÀMG{N"¤6±ö\ 1â:ÂÄåÌÇM¼¹f‚,8fdërÍp‘òQ­?¬®D« ˜ÑHFzîº}ai¢½ñqÔ–.ª . >xÊÅÇ7Žùõ³PE¹\'VšÅpG¥µ§½gÉåÖ*&N²qUÞAJB™8>‚—þ$ƈ÷¦×ý˜k‡à§“ÜzÕ-9‰¤ÞmB,2GK-B‰ÂìÔ)ÁÒB&E‹d!„Nšv÷ÈÍu9Æ.tpc‡úÔKh$xJþ üq«%èª3H¼!¡Pò~Ê—(?k}Çx©o#‘17Ò;†×È8Ï¥ìèüp=%¬›?ÆÚBHâšÌZðÄquz.1»®tùÊFýɳ{ظ°)¾O<ººM -c©´é&½j¸R²Š-$Ád+¬ Þ% ØAG}t¹'`ˆw0 hâŸ÷2HŒ$þzêDD½x_Äèvë¨ß<€í„BêÔ„ÑÑ.±šÚ€Ô8öîzÎH[<«ºŒÌ‹õô@›ºê2\§e ’ÔÈ” €t.ÒÇ€ëœgŸñ°Mú·ìh·ãÝ/ö­Üéë²øn|±wj饈RWRr`Ú90v`h!̰ýlŸoà«Pû8Íb$2hëõ>ém×]wÝ^§ÏìwíøðÚIèð{uÚî™r#90b n/B”óÅ_\„‚ÁŸæ;bf¶ëh”YVà„0JЙ‹¼„HÉI€¢~us­}ùË_^ÊÉHé\ è!Ô† È@ ´ØvÊfI ‰$ÚÇò¬EaSŒ"ë-¢\Ñ&Â$AÊ=Ó¢rð%,Úq,ÉÀw&)Ò“nÈã¢gëcÀ~I|ˆ ½h®yϪâIû‘~ºPW3Ê2@ÈðüY¸Ü‹ ¥Ü«ciÖ•ˆéÑoôµè À¼$úãBüôºï PïžYŸ|cæ¢:‘¹Ø6ÝtÓÒ§Ç:ä$Ü_Pˆà|Å„Ñbk½ç1a4o˜0Z2$ï‚öôK‚áï½ @jÝ~ö@&×e|ûkòLºŒ± CÀž„HÀ:-ÝÈØÁF©CYj3æ$,mΓ¹6&&ç:/»-^h—¾áù¢Zñbj É~XôXÝ»ñ† ®/|x½¸I¦¤ä@r ;& u¿åÑßë#ok“‰Ø!®uÝ´Jíòù?9°Òà2ê#¿¾{òÉ'—fùˆ ¸€ÒºS°ÔáJ,4Ö„ n<ýê~üã_@ïšø>çX‹‚´‰*e?+2?ün1L4î4÷2D ãn'ë#·#D òÎrgbÕ ÷.€©­Œa1" N!—Jòç`9c­áúEØeqänÆ} å24JI¸H¹’ꄲ øÕïõ`Ý:È®]+ úî›0¼?Q_{Í];¬Üüô¿nTDZՉˆ¼3Àá= ¤_³|K@ºY¦{Ô„Ñ”•¬@ýˆka·Û­¬˜cK›€‰ ˆÅ¡àðÌ€ï~72þPüx×Y‚YÅÏí‚ .(dÍ †—ÆEý–û K1þD¾÷µâ£´QEu[Œ¥ˆr¥™ö¾÷½o™Ï0ÁP·§”û’åÀlÕHre$8@ãñNuƒl>”4ìIÉqà—QÁÚA4È+@Ë‚cŸ¿xªÝ>â¼ù¬)¼#h®º¹æ>V!ÉHÚ.i„9šV6ZIÉiá71–AÕ$î2A„%ng\Ú¸¡ùs¢iæÞD £I^ ͧnšy (ÝÜRÅwˆ]âÚcLØi§Jðw»=V®ÜeÔelèhî#Kûœø/.„QM,Àa7i²9 !פ¿ÕH¬.,Š1¡Ùwj! 
~"Ù`°#´þËã@;$t ¤@bh(0B¨/;;?u""ž ’x}3#–ÞN(¤þ^‚þ$NÝK |´ÿ+.JŠÈfIŸêr±Š£øk1Æxë9Ôž%b°Ú¤ä@r 7ÆÖMŽ6—2 ¢E êAÕO=ªšÊAÝcÖ“hs˜ ÍÞd“M "äqáF#ãÿwZeneÜÒB ¢õø[[XÚõ÷ú?WÝÎã*Ç}„ÙøØîÄu-Ú!+\{lðBsî>,ZýZƒÛ­~û•Übj"ÜŠ!d&MzwÇÄœ ®D‚Ÿ¹Eùnkq=b®ö{We««© „êcõ¶þ@¨ÞÅ;÷Rí~Ç¢Ì$®ZÏ5 7C.“Ý\ûÝ?ëÔ¶R_rÉ%3 «ßùy,90ÍKË[+T˜æ¥ Eün¥C”(»«Mš@DmœSÏ@¢6ðH=)ЛFŒÙ_lÜ|çq àµçUˆÉR pü{“’ÓÄn8À5K9N"ŽÈ;Á‚•ŽÐÅzÄ]ˆVº­5ãõš…©Ü _ÝÎpÚl½O8q=¬54ñ¡•5nXAÔüg„)½ ”j×gŸ1(HðBlE̯bìêÐâ¼\'ÅÉ$l`ÅñžÖiÞuI©Çûc€{ª·{≤&^©÷ÅŽÉ”¸ÝvÛ-¨ZÊãUMd"qfIËÇèÖó±._ËòJ½80–`HÚN†àSàCúJ.5²>Pdy1· @CÛD° dYêù€î;\sLpÈGºKæj€KjU‡_·àUÉ lè¤7eÖ?ì°ÃŠû ¶kµçUB—p%ø4&Vìõ0rr`Ò8$è÷ÞV!Š ©jk-pdYëvïÞ?$n§Q^Nݨ_ÝÝÊ·÷ù˜õŠ•h—eݲ,”Œ!´òÀ”É\–¤ðMJ,ôqJ»PÜ-Ç5ÇéøK´Ûÿ~ÔV‚P˜ÔJçF¬mž#—v±°Gæ@c%«t­xiO©Á¢g бPÒ Ê'J&Šœ…‚ª¨/׋ã@¯þÑkÿâ®’g ’c †XYdo°ÁÅýFj\Z[Á›æ&ƒŸ¶¨A4µüþ"åX{ÿ\)9J¶ÔW“ë,džö¼ ê„hÞRCPs7·§¬¯ÜÀL³Ñ:דÏ—‘Ž{¾åÇ©œ”¿+’GHÅËípLJ$F!ÐZò›Þ–?Ú‰æÃ ®v’ÃÈ€É2nŽ"sVFIËÇèÑW–ïÊy¥År`,ßZS¹õ‘<ûÒŒ2 Â68qWáKY«¸¢ˆ!@2?±m½õÖe²8æe®t²üÑÔÔ~ÿÜuÂe§N®àš´>‚§»Í«T±>%%¦Þ q6–¤Þ`ÝFsÍ…Ò»†<’H “ÛXŒk¬Û!ì²Ø\qÅüü¼ë–€æØNºv`ʤ±d Ù¤åã€þ}¥^/_ òJ åÀØ!ÄÉüš®kx`™íý„N(÷o0щ/´Kìp€(˜$ š `µ]Ç7WêÜ Å<LÏ:nIb⸨'ÖR¤rëÎOPµlR¬E‘õNLÉë˜Ñ¹âÑ»/Ll×.DQw®“Éä@r 9X9„``ˆ\á[.é…¨o: —¸X¯D‹¹è#²PÒà9 / è±B¹E’Qõ“60|K²ÆAp`ìÀ„˜³ÌÙÀõÆ\‘ø°/ÒKšŒOö9 jvøé+/Öç­o}k}¸yÉK^² yºÍ«ÀJuŸûܧ€1•³qããÃrÅm{Œl;±.2³CþIL%Ä1n»í¶7˜ÐµfÅŒìsµ§>¾Ðm“o²~‹µI±””Hü•!ÜRXv-,0®À‰›ÄÕÞ$ÉÃÉá€~ N¶?¼…¬e÷ýÃ]PšLÞܨ£¹¸á”Ò¼OÕ›cÇà Ϭê>²sþ“¿Î"Õëò:Ô|˪ƒUР¹éE49²ÉIëÛ´»=C·rõ>À  ‰t¢õ±QÙ^/•• ¨ó¢ „%/?K˜CZ³FåɧÞA¹>ð@¼~×\sM™cg!ïæpZ¹øZ)DX‰¹ËŽ ±Hs¯ЋXÃ%uœ–JÆæ‡<ä!5ÞkÛ\’cΙ¥Ö?¨óGR²ã‘ñ–`BQ&)90a€r F’c ùÂ(a“Åÿ8fœ$;„[ýŬAÜfÖ1O[,‘ñÆXiì!‘,Æ¥uÖY§ÌÝc“ £YɹI£Å±³ -„}>’–~¤SöBÎÓÙ%`X™W!)9XÍ‚„w­h&LøˆôúHp-Ä» ¹sK`ñaB@òm_ý„—š0ÜZÛeë2ý¶]'2C)7W;ÛuøkÒvÊzÞÓXj"ˆII.UïBø)½¯¬|øŒX…Äz hÕíËíäÀ8r `cÀM°9ÓLX‹³ß;HydI 4ŽO»›«òa¢PFú…>¢?X¢|©þµçÑ•àÀDƒ¡•`h^39XÍšQ‰J¤©÷±ðDzÙŽ(Ë8Ç:ë²×^{• >[mµU#C#·ÒK;ÉL(6>ó™Ï”ùÅ\¡ß¹ÛubûØÇ–Ìu¨È )E­¸>àHimâæÍ7ß¼¤Îg¥fÝ­“uR ~YÝ€8àCjp)Áç¢#Ž8¢17‘{9ýôÓ ˜êÕNÀFÖËÏ}îsö¢³:)vYƒX—wÚi§2ß°#ë%pP¼LÀX¶(ó I"ƒ×{ï½wi{(ÄA†[ñ\ü<æ˜cÊ5JC:?x™Ç͇=)90¶ †B6FPTx½'þÇâL04y½&úB¸Ê(+k‘}úˆ>Q÷¡ÉãÆdÜQ~á&ã9æ]$F’„q“ˆ^xá…Eˆ7/˜D!4¦Ü²Üâ¹Û,&$Pæ ÷HIvÍu.+¹{<ðÀâÂgâAâ¼óÎ+‰T€#má‚ÆÍÌ|Çv20PŽ'ñ}²»Y&Švô#‰ ¹¸EsµSÌâ©§žZî[½'tRá ÀãÜpÓ}×»ÞU¦àŠáw6h.‹ÐiŠŸ¤×½è¢‹ èš‹ŸiÄ]ªðuÎ9ç”Ø%û’’ÓÈ~ÛV! 
zzzeek-alembic-bee044a1c187/docs/build/api/autogenerate.rst000066400000000000000000000612231353106760100236230ustar00rootroot00000000000000.. _alembic.autogenerate.toplevel: ============== Autogeneration ============== .. note:: this section discusses the **internal API of Alembic** as regards the autogeneration feature of the ``alembic revision`` command. This section is only useful for developers who wish to extend the capabilities of Alembic. For general documentation on the autogenerate feature, please see :doc:`/autogenerate`. The autogeneration system has a wide degree of public API, including the following areas: 1. The ability to do a "diff" of a :class:`~sqlalchemy.schema.MetaData` object against a database, and receive a data structure back. This structure is available either as a rudimentary list of changes, or as a :class:`.MigrateOperation` structure. 2. The ability to alter how the ``alembic revision`` command generates revision scripts, including support for multiple revision scripts generated in one pass. 3. 
The ability to add new operation directives to autogeneration, including custom schema/model comparison functions and revision script rendering. Getting Diffs ============== The simplest API autogenerate provides is the "schema comparison" API; these are simple functions that will run all registered "comparison" functions between a :class:`~sqlalchemy.schema.MetaData` object and a database backend to produce a structure showing how they differ. The two functions provided are :func:`.compare_metadata`, which is more of the "legacy" function that produces diff tuples, and :func:`.produce_migrations`, which produces a structure consisting of operation directives detailed in :ref:`alembic.operations.toplevel`. .. autofunction:: alembic.autogenerate.compare_metadata .. autofunction:: alembic.autogenerate.produce_migrations .. _customizing_revision: Customizing Revision Generation ========================================== .. versionadded:: 0.8.0 - the ``alembic revision`` system is now customizable. The ``alembic revision`` command, also available programmatically via :func:`.command.revision`, essentially produces a single migration script after being run. Whether or not the ``--autogenerate`` option was specified basically determines if this script is a blank revision script with empty ``upgrade()`` and ``downgrade()`` functions, or was produced with alembic operation directives as the result of autogenerate. In either case, the system creates a full plan of what is to be done in the form of a :class:`.MigrateOperation` structure, which is then used to produce the script. For example, suppose we ran ``alembic revision --autogenerate``, and the end result was that it produced a new revision ``'eced083f5df'`` with the following contents:: """create the organization table.""" # revision identifiers, used by Alembic. revision = 'eced083f5df' down_revision = 'beafc7d709f' from alembic import op import sqlalchemy as sa def upgrade(): op.create_table( 'organization', sa.Column('id', sa.Integer(), primary_key=True), sa.Column('name', sa.String(50), nullable=False) ) op.add_column( 'user', sa.Column('organization_id', sa.Integer()) ) op.create_foreign_key( 'org_fk', 'user', 'organization', ['organization_id'], ['id'] ) def downgrade(): op.drop_constraint('org_fk', 'user') op.drop_column('user', 'organization_id') op.drop_table('organization') The above script is generated by a :class:`.MigrateOperation` structure that looks like this:: from alembic.operations import ops import sqlalchemy as sa migration_script = ops.MigrationScript( 'eced083f5df', ops.UpgradeOps( ops=[ ops.CreateTableOp( 'organization', [ sa.Column('id', sa.Integer(), primary_key=True), sa.Column('name', sa.String(50), nullable=False) ] ), ops.ModifyTableOps( 'user', ops=[ ops.AddColumnOp( 'user', sa.Column('organization_id', sa.Integer()) ), ops.CreateForeignKeyOp( 'org_fk', 'user', 'organization', ['organization_id'], ['id'] ) ] ) ] ), ops.DowngradeOps( ops=[ ops.ModifyTableOps( 'user', ops=[ ops.DropConstraintOp('org_fk', 'user'), ops.DropColumnOp('user', 'organization_id') ] ), ops.DropTableOp('organization') ] ), message='create the organization table.' ) When we deal with a :class:`.MigrationScript` structure, we can render the upgrade/downgrade sections into strings for debugging purposes using the :func:`.render_python_code` helper function:: from alembic.autogenerate import render_python_code print(render_python_code(migration_script.upgrade_ops)) Renders:: ### commands auto generated by Alembic - please adjust! 
### op.create_table('organization', sa.Column('id', sa.Integer(), nullable=False), sa.Column('name', sa.String(length=50), nullable=False), sa.PrimaryKeyConstraint('id') ) op.add_column('user', sa.Column('organization_id', sa.Integer(), nullable=True)) op.create_foreign_key('org_fk', 'user', 'organization', ['organization_id'], ['id']) ### end Alembic commands ### Given that structures like the above are used to generate new revision files, and that we'd like to be able to alter these as they are created, we then need a system to access this structure when the :func:`.command.revision` command is used. The :paramref:`.EnvironmentContext.configure.process_revision_directives` parameter gives us a way to alter this. This is a function that is passed the above structure as generated by Alembic, giving us a chance to alter it. For example, if we wanted to put all the "upgrade" operations into a certain branch, and we wanted our script to not have any "downgrade" operations at all, we could build an extension as follows, illustrated within an ``env.py`` script:: def process_revision_directives(context, revision, directives): script = directives[0] # set specific branch script.head = "mybranch@head" # erase downgrade operations script.downgrade_ops.ops[:] = [] # ... def run_migrations_online(): # ... with engine.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata, process_revision_directives=process_revision_directives) with context.begin_transaction(): context.run_migrations() Above, the ``directives`` argument is a Python list. We may alter the given structure within this list in-place, or replace it with a new structure consisting of zero or more :class:`.MigrationScript` directives. The :func:`.command.revision` command will then produce scripts corresponding to whatever is in this list. .. autofunction:: alembic.autogenerate.render_python_code .. _autogen_rewriter: Fine-Grained Autogenerate Generation with Rewriters --------------------------------------------------- The preceding example illustrated how we can make a simple change to the structure of the operation directives to produce new autogenerate output. For the case where we want to affect very specific parts of the autogenerate stream, we can make a function for :paramref:`.EnvironmentContext.configure.process_revision_directives` which traverses through the whole :class:`.MigrationScript` structure, locates the elements we care about and modifies them in-place as needed. However, to reduce the boilerplate associated with this task, we can use the :class:`.Rewriter` object to make this easier. :class:`.Rewriter` gives us an object that we can pass directly to :paramref:`.EnvironmentContext.configure.process_revision_directives` which we can also attach handler functions onto, keyed to specific types of constructs. Below is an example where we rewrite :class:`.ops.AddColumnOp` directives; based on whether or not the new column is "nullable", we either return the existing directive, or we return the existing directive with the nullable flag changed, inside of a list with a second directive to alter the nullable flag in a second step:: # ... fragmented env.py script .... 
from alembic.autogenerate import rewriter from alembic.operations import ops writer = rewriter.Rewriter() @writer.rewrites(ops.AddColumnOp) def add_column(context, revision, op): if op.column.nullable: return op else: op.column.nullable = True return [ op, ops.AlterColumnOp( op.table_name, op.column.name, modify_nullable=False, existing_type=op.column.type, ) ] # ... later ... def run_migrations_online(): # ... with connectable.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata, process_revision_directives=writer ) with context.begin_transaction(): context.run_migrations() Above, in a full :class:`.ops.MigrationScript` structure, the :class:`.AddColumn` directives would be present within the paths ``MigrationScript->UpgradeOps->ModifyTableOps`` and ``MigrationScript->DowngradeOps->ModifyTableOps``. The :class:`.Rewriter` handles traversing into these structures as well as rewriting them as needed so that we only need to code for the specific object we care about. .. autoclass:: alembic.autogenerate.rewriter.Rewriter :members: .. _autogen_customizing_multiengine_revision: Revision Generation with Multiple Engines / ``run_migrations()`` calls ---------------------------------------------------------------------- A lesser-used technique which allows autogenerated migrations to run against multiple database backends at once, generating changes into a single migration script, is illustrated in the provided ``multidb`` template. This template features a special ``env.py`` which iterates through multiple :class:`~sqlalchemy.engine.Engine` instances and calls upon :meth:`.MigrationContext.run_migrations` for each:: for name, rec in engines.items(): logger.info("Migrating database %s" % name) context.configure( connection=rec['connection'], upgrade_token="%s_upgrades" % name, downgrade_token="%s_downgrades" % name, target_metadata=target_metadata.get(name) ) context.run_migrations(engine_name=name) Above, :meth:`.MigrationContext.run_migrations` is run multiple times, once for each engine. Within the context of autogeneration, each time the method is called the :paramref:`~.EnvironmentContext.configure.upgrade_token` and :paramref:`~.EnvironmentContext.configure.downgrade_token` parameters are changed, so that the collection of template variables gains distinct entries for each engine, which are then referred to explicitly within ``script.py.mako``. In terms of the :paramref:`.EnvironmentContext.configure.process_revision_directives` hook, the behavior here is that the ``process_revision_directives`` hook is invoked **multiple times, once for each call to context.run_migrations()**. This means that if a multi-``run_migrations()`` approach is to be combined with the ``process_revision_directives`` hook, care must be taken to use the hook appropriately. The first point to note is that when a **second** call to ``run_migrations()`` occurs, the ``.upgrade_ops`` and ``.downgrade_ops`` attributes are **converted into Python lists**, and new :class:`.UpgradeOps` and :class:`.DowngradeOps` objects are appended to these lists. Each :class:`.UpgradeOps` and :class:`.DowngradeOps` object maintains an ``.upgrade_token`` and a ``.downgrade_token`` attribute respectively, which serves to render their contents into the appropriate template token. For example, a multi-engine run that has the engine names ``engine1`` and ``engine2`` will generate tokens of ``engine1_upgrades``, ``engine1_downgrades``, ``engine2_upgrades`` and ``engine2_downgrades`` as it runs. 
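In the revision files generated by such an environment, each of these tokens ultimately populates a per-engine section of the script. Very roughly, and depending entirely on how the template's ``script.py.mako`` is written, an emitted revision file might contain per-engine functions along these lines (a hypothetical sketch only; ``engine1`` and ``engine2`` stand in for whatever engine names are configured)::

    def upgrade(engine_name):
        # dispatch to the function that was rendered from the
        # "<engine_name>_upgrades" template token
        globals()["upgrade_%s" % engine_name]()


    def downgrade(engine_name):
        globals()["downgrade_%s" % engine_name]()


    def upgrade_engine1():
        # directives rendered from the "engine1_upgrades" token
        pass


    def downgrade_engine1():
        # directives rendered from the "engine1_downgrades" token
        pass


    def upgrade_engine2():
        # directives rendered from the "engine2_upgrades" token
        pass


    def downgrade_engine2():
        # directives rendered from the "engine2_downgrades" token
        pass
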
The resulting migration structure would look like this:: from alembic.operations import ops import sqlalchemy as sa migration_script = ops.MigrationScript( 'eced083f5df', [ ops.UpgradeOps( ops=[ # upgrade operations for "engine1" ], upgrade_token="engine1_upgrades" ), ops.UpgradeOps( ops=[ # upgrade operations for "engine2" ], upgrade_token="engine2_upgrades" ), ], [ ops.DowngradeOps( ops=[ # downgrade operations for "engine1" ], downgrade_token="engine1_downgrades" ), ops.DowngradeOps( ops=[ # downgrade operations for "engine2" ], downgrade_token="engine2_downgrades" ) ], message='migration message' ) Given the above, the following guidelines should be considered when the ``env.py`` script calls upon :meth:`.MigrationContext.run_migrations` multiple times when running autogenerate: * If the ``process_revision_directives`` hook aims to **add elements based on inspection of the current database / connection**, it should do its operation **on each iteration**. This is so that each time the hook runs, the database is available. * Alternatively, if the ``process_revision_directives`` hook aims to **modify the list of migration directives in place**, this should be called **only on the last iteration**. This is so that the hook isn't being given an ever-growing structure each time which it has already modified previously. * The :class:`.Rewriter` object, if used, should be called **only on the last iteration**, because it will always deliver all directives every time, so again to avoid double/triple/etc. processing of directives it should be called only when the structure is complete. * The :attr:`.MigrationScript.upgrade_ops_list` and :attr:`.MigrationScript.downgrade_ops_list` attributes should be consulted when referring to the collection of :class:`.UpgradeOps` and :class:`.DowngradeOps` objects. .. versionchanged:: 0.8.1 - multiple calls to :meth:`.MigrationContext.run_migrations` within an autogenerate operation, such as that proposed within the ``multidb`` script template, are now accommodated by the new extensible migration system introduced in 0.8.0. .. _autogen_custom_ops: Autogenerating Custom Operation Directives ========================================== In the section :ref:`operation_plugins`, we talked about adding new subclasses of :class:`.MigrateOperation` in order to add new ``op.`` directives. In the preceding section :ref:`customizing_revision`, we also learned that these same :class:`.MigrateOperation` structures are at the base of how the autogenerate system knows what Python code to render. Using this knowledge, we can create additional functions that plug into the autogenerate system so that our new operations can be generated into migration scripts when ``alembic revision --autogenerate`` is run. The following sections will detail an example of this using the ``CreateSequenceOp`` and ``DropSequenceOp`` directives we created in :ref:`operation_plugins`, which correspond to the SQLAlchemy :class:`~sqlalchemy.schema.Sequence` construct. .. versionadded:: 0.8.0 - custom operations can be added to the autogenerate system to support new kinds of database objects. Tracking our Object with the Model ---------------------------------- The basic job of an autogenerate comparison function is to inspect a series of objects in the database and compare them against a series of objects defined in our model. 
By "in our model", we mean anything defined in Python code that we want to track, however most commonly we're talking about a series of :class:`~sqlalchemy.schema.Table` objects present in a :class:`~sqlalchemy.schema.MetaData` collection. Let's propose a simple way of seeing what :class:`~sqlalchemy.schema.Sequence` objects we want to ensure exist in the database when autogenerate runs. While these objects do have some integrations with :class:`~sqlalchemy.schema.Table` and :class:`~sqlalchemy.schema.MetaData` already, let's assume they don't, as the example here intends to illustrate how we would do this for most any kind of custom construct. We associate the object with the :attr:`~sqlalchemy.schema.MetaData.info` collection of :class:`~sqlalchemy.schema.MetaData`, which is a dictionary we can use for anything, which we also know will be passed to the autogenerate process:: from sqlalchemy.schema import Sequence def add_sequence_to_model(sequence, metadata): metadata.info.setdefault("sequences", set()).add( (sequence.schema, sequence.name) ) my_seq = Sequence("my_sequence") add_sequence_to_model(my_seq, model_metadata) The :attr:`~sqlalchemy.schema.MetaData.info` dictionary is a good place to put things that we want our autogeneration routines to be able to locate, which can include any object such as custom DDL objects representing views, triggers, special constraints, or anything else we want to support. Registering a Comparison Function --------------------------------- We now need to register a comparison hook, which will be used to compare the database to our model and produce ``CreateSequenceOp`` and ``DropSequenceOp`` directives to be included in our migration script. Note that we are assuming a Postgresql backend:: from alembic.autogenerate import comparators @comparators.dispatch_for("schema") def compare_sequences(autogen_context, upgrade_ops, schemas): all_conn_sequences = set() for sch in schemas: all_conn_sequences.update([ (sch, row[0]) for row in autogen_context.connection.execute( "SELECT relname FROM pg_class c join " "pg_namespace n on n.oid=c.relnamespace where " "relkind='S' and n.nspname=%(nspname)s", # note that we consider a schema of 'None' in our # model to be the "default" name in the PG database; # this usually is the name 'public' nspname=autogen_context.dialect.default_schema_name if sch is None else sch ) ]) # get the collection of Sequence objects we're storing with # our MetaData metadata_sequences = autogen_context.metadata.info.setdefault( "sequences", set()) # for new names, produce CreateSequenceOp directives for sch, name in metadata_sequences.difference(all_conn_sequences): upgrade_ops.ops.append( CreateSequenceOp(name, schema=sch) ) # for names that are going away, produce DropSequenceOp # directives for sch, name in all_conn_sequences.difference(metadata_sequences): upgrade_ops.ops.append( DropSequenceOp(name, schema=sch) ) Above, we've built a new function ``compare_sequences()`` and registered it as a "schema" level comparison function with autogenerate. The job that it performs is that it compares the list of sequence names present in each database schema with that of a list of sequence names that we are maintaining in our :class:`~sqlalchemy.schema.MetaData` object. 
When autogenerate completes, it will have a series of ``CreateSequenceOp`` and ``DropSequenceOp`` directives in the list of "upgrade" operations; the list of "downgrade" operations is generated directly from these using the ``CreateSequenceOp.reverse()`` and ``DropSequenceOp.reverse()`` methods that we've implemented on these objects. The registration of our function at the scope of "schema" means our autogenerate comparison function is called outside of the context of any specific table or column. The three available scopes are "schema", "table", and "column", summarized as follows: * **Schema level** - these hooks are passed a :class:`.AutogenContext`, an :class:`.UpgradeOps` collection, and a collection of string schema names to be operated upon. If the :class:`.UpgradeOps` collection contains changes after all hooks are run, it is included in the migration script: :: @comparators.dispatch_for("schema") def compare_schema_level(autogen_context, upgrade_ops, schemas): pass * **Table level** - these hooks are passed a :class:`.AutogenContext`, a :class:`.ModifyTableOps` collection, a schema name, table name, a :class:`~sqlalchemy.schema.Table` reflected from the database if any or ``None``, and a :class:`~sqlalchemy.schema.Table` present in the local :class:`~sqlalchemy.schema.MetaData`. If the :class:`.ModifyTableOps` collection contains changes after all hooks are run, it is included in the migration script: :: @comparators.dispatch_for("table") def compare_table_level(autogen_context, modify_ops, schemaname, tablename, conn_table, metadata_table): pass * **Column level** - these hooks are passed a :class:`.AutogenContext`, an :class:`.AlterColumnOp` object, a schema name, table name, column name, a :class:`~sqlalchemy.schema.Column` reflected from the database and a :class:`~sqlalchemy.schema.Column` present in the local table. If the :class:`.AlterColumnOp` contains changes after all hooks are run, it is included in the migration script; a "change" is considered to be present if any of the ``modify_`` attributes are set to a non-default value, or there are any keys in the ``.kw`` collection with the prefix ``"modify_"``: :: @comparators.dispatch_for("column") def compare_column_level(autogen_context, alter_column_op, schemaname, tname, cname, conn_col, metadata_col): pass The :class:`.AutogenContext` passed to these hooks is documented below. .. autoclass:: alembic.autogenerate.api.AutogenContext :members: Creating a Render Function -------------------------- The second autogenerate integration hook is to provide a "render" function; since the autogenerate system renders Python code, we need to build a function that renders the correct "op" instructions for our directive:: from alembic.autogenerate import renderers @renderers.dispatch_for(CreateSequenceOp) def render_create_sequence(autogen_context, op): return "op.create_sequence(%r, **%r)" % ( op.sequence_name, {"schema": op.schema} ) @renderers.dispatch_for(DropSequenceOp) def render_drop_sequence(autogen_context, op): return "op.drop_sequence(%r, **%r)" % ( op.sequence_name, {"schema": op.schema} ) The above functions will render Python code corresponding to the presence of ``CreateSequenceOp`` and ``DropSequenceOp`` instructions in the list that our comparison function generates. 
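As a quick sanity check of these render functions, the :func:`.render_python_code` helper shown earlier in this chapter can be pointed at a hand-built structure. The following is a minimal sketch, not part of the original example, which assumes the ``CreateSequenceOp`` class and the renderer above have already been defined and registered::

    from alembic.autogenerate import render_python_code
    from alembic.operations import ops

    # build an UpgradeOps container holding one of our custom directives;
    # CreateSequenceOp is the custom operation class defined previously
    upgrade_ops = ops.UpgradeOps(
        ops=[CreateSequenceOp("my_sequence", schema=None)]
    )

    # render_python_code() returns the Python source that would be placed
    # into the upgrade() portion of a generated revision file
    print(render_python_code(upgrade_ops))

If the renderer is registered correctly, the output should include an ``op.create_sequence()`` line similar to the one shown in the next section.
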
Running It ---------- All the above code can be organized however the developer sees fit; the only thing that needs to make it work is that when the Alembic environment ``env.py`` is invoked, it either imports modules which contain all the above routines, or they are locally present, or some combination thereof. If we then have code in our model (which of course also needs to be invoked when ``env.py`` runs!) like this:: from sqlalchemy.schema import Sequence my_seq_1 = Sequence("my_sequence_1") add_sequence_to_model(my_seq_1, target_metadata) When we first run ``alembic revision --autogenerate``, we'll see this in our migration file:: def upgrade(): ### commands auto generated by Alembic - please adjust! ### op.create_sequence('my_sequence_1', **{'schema': None}) ### end Alembic commands ### def downgrade(): ### commands auto generated by Alembic - please adjust! ### op.drop_sequence('my_sequence_1', **{'schema': None}) ### end Alembic commands ### These are our custom directives that will invoke when ``alembic upgrade`` or ``alembic downgrade`` is run. zzzeek-alembic-bee044a1c187/docs/build/api/commands.rst000066400000000000000000000032711353106760100227400ustar00rootroot00000000000000.. _alembic.command.toplevel: ========= Commands ========= .. note:: this section discusses the **internal API of Alembic** as regards its command invocation system. This section is only useful for developers who wish to extend the capabilities of Alembic. For documentation on using Alembic commands, please see :doc:`/tutorial`. Alembic commands are all represented by functions in the :ref:`alembic.command.toplevel` package. They all accept the same style of usage, being sent the :class:`.Config` object as the first argument. Commands can be run programmatically, by first constructing a :class:`.Config` object, as in:: from alembic.config import Config from alembic import command alembic_cfg = Config("/path/to/yourapp/alembic.ini") command.upgrade(alembic_cfg, "head") In many cases, and perhaps more often than not, an application will wish to call upon a series of Alembic commands and/or other features. It is usually a good idea to link multiple commands along a single connection and transaction, if feasible. This can be achieved using the :attr:`.Config.attributes` dictionary in order to share a connection:: with engine.begin() as connection: alembic_cfg.attributes['connection'] = connection command.upgrade(alembic_cfg, "head") This recipe requires that ``env.py`` consumes this connection argument; see the example in :ref:`connection_sharing` for details. To write small API functions that make direct use of database and script directory information, rather than just running one of the built-in commands, use the :class:`.ScriptDirectory` and :class:`.MigrationContext` classes directly. .. automodule:: alembic.command :members: zzzeek-alembic-bee044a1c187/docs/build/api/config.rst000066400000000000000000000023171353106760100224040ustar00rootroot00000000000000.. _alembic.config.toplevel: ============== Configuration ============== .. note:: this section discusses the **internal API of Alembic** as regards internal configuration constructs. This section is only useful for developers who wish to extend the capabilities of Alembic. For documentation on configuration of an Alembic environment, please see :doc:`/tutorial`. The :class:`.Config` object represents the configuration passed to the Alembic environment. 
From an API usage perspective, it is needed for the following use cases: * to create a :class:`.ScriptDirectory`, which allows you to work with the actual script files in a migration environment * to create an :class:`.EnvironmentContext`, which allows you to actually run the ``env.py`` module within the migration environment * to programatically run any of the commands in the :ref:`alembic.command.toplevel` module. The :class:`.Config` is *not* needed for these cases: * to instantiate a :class:`.MigrationContext` directly - this object only needs a SQLAlchemy connection or dialect name. * to instantiate a :class:`.Operations` object - this object only needs a :class:`.MigrationContext`. .. automodule:: alembic.config :members: zzzeek-alembic-bee044a1c187/docs/build/api/ddl.rst000066400000000000000000000021021353106760100216720ustar00rootroot00000000000000.. _alembic.ddl.toplevel: ============= DDL Internals ============= These are some of the constructs used to generate migration instructions. The APIs here build off of the :class:`sqlalchemy.schema.DDLElement` and :ref:`sqlalchemy.ext.compiler_toplevel` systems. For programmatic usage of Alembic's migration directives, the easiest route is to use the higher level functions given by :ref:`alembic.operations.toplevel`. .. automodule:: alembic.ddl :members: :undoc-members: .. automodule:: alembic.ddl.base :members: :undoc-members: .. automodule:: alembic.ddl.impl :members: :undoc-members: MySQL ============= .. automodule:: alembic.ddl.mysql :members: :undoc-members: :show-inheritance: MS-SQL ============= .. automodule:: alembic.ddl.mssql :members: :undoc-members: :show-inheritance: Postgresql ============= .. automodule:: alembic.ddl.postgresql :members: :undoc-members: :show-inheritance: SQLite ============= .. automodule:: alembic.ddl.sqlite :members: :undoc-members: :show-inheritance: zzzeek-alembic-bee044a1c187/docs/build/api/index.rst000066400000000000000000000017651353106760100222540ustar00rootroot00000000000000.. _api: =========== API Details =========== Alembic's internal API has many public integration points that can be used to extend Alembic's functionality as well as to re-use its functionality in new ways. As the project has grown, more APIs are created and exposed for this purpose. Direct use of the vast majority of API details discussed here is not needed for rudimentary use of Alembic; the only API that is used normally by end users is the methods provided by the :class:`.Operations` class, which is discussed outside of this subsection, and the parameters that can be passed to the :meth:`.EnvironmentContext.configure` method, used when configuring one's ``env.py`` environment. However, real-world applications will usually end up using more of the internal API, in particular being able to run commands programmatically, as discussed in the section :doc:`/api/commands`. .. toctree:: :maxdepth: 2 overview runtime config commands operations autogenerate script ddl zzzeek-alembic-bee044a1c187/docs/build/api/operations.rst000066400000000000000000000151571353106760100233300ustar00rootroot00000000000000.. _alembic.operations.toplevel: ===================== Operation Directives ===================== .. note:: this section discusses the **internal API of Alembic** as regards the internal system of defining migration operation directives. This section is only useful for developers who wish to extend the capabilities of Alembic. For end-user guidance on Alembic migration operations, please see :ref:`ops`. 
Within migration scripts, actual database migration operations are handled via an instance of :class:`.Operations`. The :class:`.Operations` class lists out available migration operations that are linked to a :class:`.MigrationContext`, which communicates instructions originated by the :class:`.Operations` object into SQL that is sent to a database or SQL output stream. Most methods on the :class:`.Operations` class are generated dynamically using a "plugin" system, described in the next section :ref:`operation_plugins`. Additionally, when Alembic migration scripts actually run, the methods on the current :class:`.Operations` object are proxied out to the ``alembic.op`` module, so that they are available using module-style access. For an overview of how to use an :class:`.Operations` object directly in programs, as well as for reference to the standard operation methods as well as "batch" methods, see :ref:`ops`. .. _operation_plugins: Operation Plugins ===================== The Operations object is extensible using a plugin system. This system allows one to add new ``op.`` methods at runtime. The steps to use this system are to first create a subclass of :class:`.MigrateOperation`, register it using the :meth:`.Operations.register_operation` class decorator, then build a default "implementation" function which is established using the :meth:`.Operations.implementation_for` decorator. .. versionadded:: 0.8.0 - the :class:`.Operations` class is now an open namespace that is extensible via the creation of new :class:`.MigrateOperation` subclasses. Below we illustrate a very simple operation ``CreateSequenceOp`` which will implement a new method ``op.create_sequence()`` for use in migration scripts:: from alembic.operations import Operations, MigrateOperation @Operations.register_operation("create_sequence") class CreateSequenceOp(MigrateOperation): """Create a SEQUENCE.""" def __init__(self, sequence_name, schema=None): self.sequence_name = sequence_name self.schema = schema @classmethod def create_sequence(cls, operations, sequence_name, **kw): """Issue a "CREATE SEQUENCE" instruction.""" op = CreateSequenceOp(sequence_name, **kw) return operations.invoke(op) def reverse(self): # only needed to support autogenerate return DropSequenceOp(self.sequence_name, schema=self.schema) @Operations.register_operation("drop_sequence") class DropSequenceOp(MigrateOperation): """Drop a SEQUENCE.""" def __init__(self, sequence_name, schema=None): self.sequence_name = sequence_name self.schema = schema @classmethod def drop_sequence(cls, operations, sequence_name, **kw): """Issue a "DROP SEQUENCE" instruction.""" op = DropSequenceOp(sequence_name, **kw) return operations.invoke(op) def reverse(self): # only needed to support autogenerate return CreateSequenceOp(self.sequence_name, schema=self.schema) Above, the ``CreateSequenceOp`` and ``DropSequenceOp`` classes represent new operations that will be available as ``op.create_sequence()`` and ``op.drop_sequence()``. The reason the operations are represented as stateful classes is so that an operation and a specific set of arguments can be represented generically; the state can then correspond to different kinds of operations, such as invoking the instruction against a database, or autogenerating Python code for the operation into a script. 
In order to establish the migrate-script behavior of the new operations, we use the :meth:`.Operations.implementation_for` decorator:: @Operations.implementation_for(CreateSequenceOp) def create_sequence(operations, operation): if operation.schema is not None: name = "%s.%s" % (operation.schema, operation.sequence_name) else: name = operation.sequence_name operations.execute("CREATE SEQUENCE %s" % name) @Operations.implementation_for(DropSequenceOp) def drop_sequence(operations, operation): if operation.schema is not None: name = "%s.%s" % (operation.schema, operation.sequence_name) else: name = operation.sequence_name operations.execute("DROP SEQUENCE %s" % name) Above, we use the simplest possible technique of invoking our DDL, which is just to call :meth:`.Operations.execute` with literal SQL. If this is all a custom operation needs, then this is fine. However, options for more comprehensive support include building out a custom SQL construct, as documented at :ref:`sqlalchemy.ext.compiler_toplevel`. With the above two steps, a migration script can now use new methods ``op.create_sequence()`` and ``op.drop_sequence()`` that will proxy to our object as a classmethod:: def upgrade(): op.create_sequence("my_sequence") def downgrade(): op.drop_sequence("my_sequence") The registration of new operations only needs to occur in time for the ``env.py`` script to invoke :meth:`.MigrationContext.run_migrations`; within the module level of the ``env.py`` script is sufficient. .. seealso:: :ref:`autogen_custom_ops` - how to add autogenerate support to custom operations. .. versionadded:: 0.8 - the migration operations available via the :class:`.Operations` class as well as the ``alembic.op`` namespace is now extensible using a plugin system. .. _operation_objects: .. _alembic.operations.ops.toplevel: Built-in Operation Objects ============================== The migration operations present on :class:`.Operations` are themselves delivered via operation objects that represent an operation and its arguments. All operations descend from the :class:`.MigrateOperation` class, and are registered with the :class:`.Operations` class using the :meth:`.Operations.register_operation` class decorator. The :class:`.MigrateOperation` objects also serve as the basis for how the autogenerate system renders new migration scripts. .. seealso:: :ref:`operation_plugins` :ref:`customizing_revision` The built-in operation objects are listed below. .. automodule:: alembic.operations.ops :members: zzzeek-alembic-bee044a1c187/docs/build/api/overview.rst000066400000000000000000000065211353106760100230060ustar00rootroot00000000000000======== Overview ======== .. note:: this section is a technical overview of the **internal API of Alembic**. This section is only useful for developers who wish to extend the capabilities of Alembic; for regular users, reading this section is **not necessary**. A visualization of the primary features of Alembic's internals is presented in the following figure. The module and class boxes do not list out all the operations provided by each unit; only a small set of representative elements intended to convey the primary purpose of each system. .. image:: api_overview.png The script runner for Alembic is present in the :ref:`alembic.config.toplevel` module. This module produces a :class:`.Config` object and passes it to the appropriate function in :ref:`alembic.command.toplevel`. 
Functions within :ref:`alembic.command.toplevel` will typically instantiate an :class:`.ScriptDirectory` instance, which represents the collection of version files, and an :class:`.EnvironmentContext`, which is a configurational facade passed to the environment's ``env.py`` script. The :class:`.EnvironmentContext` object is the primary object used within the ``env.py`` script, whose main purpose is that of a facade for creating and using a :class:`.MigrationContext` object, which is the actual migration engine that refers to a database implementation. The primary method called on this object within an ``env.py`` script is the :meth:`.EnvironmentContext.configure` method, which sets up the :class:`.MigrationContext` with database connectivity and behavioral configuration. It also supplies methods for transaction demarcation and migration running, but these methods ultimately call upon the :class:`.MigrationContext` that's been configured. :class:`.MigrationContext` is the gateway to the database for other parts of the application, and produces a :class:`.DefaultImpl` object which does the actual database communication, and knows how to create the specific SQL text of the various DDL directives such as ALTER TABLE; :class:`.DefaultImpl` has subclasses that are per-database-backend. In "offline" mode (e.g. ``--sql``), the :class:`.MigrationContext` will produce SQL to a file output stream instead of a database. During an upgrade or downgrade operation, a specific series of migration scripts are invoked starting with the :class:`.MigrationContext` in conjunction with the :class:`.ScriptDirectory`; the actual scripts themselves make use of the :class:`.Operations` object, which provide the end-user interface to specific database operations. The :class:`.Operations` object is generated based on a series of "operation directive" objects that are user-extensible, and start out in the :ref:`alembic.operations.ops.toplevel` module. Another prominent feature of Alembic is the "autogenerate" feature, which produces new migration scripts that contain Python code. The autogenerate feature starts in :ref:`alembic.autogenerate.toplevel`, and is used exclusively by the :func:`.alembic.command.revision` command when the ``--autogenerate`` flag is passed. Autogenerate refers to the :class:`.MigrationContext` and :class:`.DefaultImpl` in order to access database connectivity and access per-backend rules for autogenerate comparisons. It also makes use of :ref:`alembic.operations.ops.toplevel` in order to represent the operations that it will render into scripts. zzzeek-alembic-bee044a1c187/docs/build/api/runtime.rst000066400000000000000000000027401353106760100226220ustar00rootroot00000000000000.. _alembic.runtime.environment.toplevel: ======================= Runtime Objects ======================= The "runtime" of Alembic involves the :class:`.EnvironmentContext` and :class:`.MigrationContext` objects. These are the objects that are in play once the ``env.py`` script is loaded up by a command and a migration operation proceeds. The Environment Context ======================= The :class:`.EnvironmentContext` class provides most of the API used within an ``env.py`` script. Within ``env.py``, the instantated :class:`.EnvironmentContext` is made available via a special *proxy module* called ``alembic.context``. That is, you can import ``alembic.context`` like a regular Python module, and each name you call upon it is ultimately routed towards the current :class:`.EnvironmentContext` in use. 
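To make the proxy module pattern concrete, a minimal online-mode fragment of an ``env.py`` script might look like the following; this is an illustrative sketch along the lines of what the standard templates generate, with ``target_metadata`` left as ``None`` for brevity::

    from alembic import context
    from sqlalchemy import engine_from_config, pool

    # every attribute accessed on "context" is proxied to the
    # EnvironmentContext created by the currently running command
    config = context.config

    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        # hand connectivity and behavioral options to the MigrationContext
        context.configure(connection=connection, target_metadata=None)

        with context.begin_transaction():
            context.run_migrations()
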
In particular, the key method used within ``env.py`` is :meth:`.EnvironmentContext.configure`, which establishes all the details about how the database will be accessed. .. automodule:: alembic.runtime.environment :members: EnvironmentContext .. _alembic.runtime.migration.toplevel: The Migration Context ===================== The :class:`.MigrationContext` handles the actual work to be performed against a database backend as migration operations proceed. It is generally not exposed to the end-user, except when the :paramref:`~.EnvironmentContext.configure.on_version_apply` callback hook is used. .. automodule:: alembic.runtime.migration :members: MigrationContext zzzeek-alembic-bee044a1c187/docs/build/api/script.rst000066400000000000000000000007021353106760100224370ustar00rootroot00000000000000.. _alembic.script.toplevel: ================ Script Directory ================ The :class:`.ScriptDirectory` object provides programmatic access to the Alembic version files present in the filesystem. .. automodule:: alembic.script :members: Revision ======== The :class:`.RevisionMap` object serves as the basis for revision management, used exclusively by :class:`.ScriptDirectory`. .. automodule:: alembic.script.revision :members: zzzeek-alembic-bee044a1c187/docs/build/assets/000077500000000000000000000000001353106760100211335ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/docs/build/assets/api_overview.graffle000066400000000000000000002072571353106760100251770ustar00rootroot00000000000000 ActiveLayerIndex 0 ApplicationVersion com.omnigroup.OmniGrafflePro 139.18.0.187838 AutoAdjust BackgroundGraphic Bounds {{0, 0}, {1176, 768}} Class SolidGraphic ID 2 Style shadow Draws NO stroke Draws NO BaseZoom 0 CanvasOrigin {0, 0} ColumnAlign 1 ColumnSpacing 36 CreationDate 2012-01-24 21:51:07 +0000 Creator classic DisplayScale 1 0/72 in = 1.0000 in GraphDocumentVersion 8 GraphicsList Bounds {{601.74580087231288, 420}, {84, 12}} Class ShapedGraphic FitText YES Flow Resize ID 2140 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720 \f0\fs20 \cf0 <<instantiates>>} VerticalPad 0 Wrap NO Class TableGroup Graphics Bounds {{191, 107.40116119384766}, {102.9071044921875, 14}} Class ShapedGraphic FitText Vertical Flow Resize ID 2132 Shape Rectangle Style fill GradientCenter {-0.29411799999999999, -0.264706} Text Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720\qc \f0\b\fs24 \cf0 PostgresqlImpl} VerticalPad 0 TextPlacement 0 GroupConnect YES ID 2131 Class TableGroup Graphics Bounds {{230.9169921875, 132.80233001708984}, {102.9071044921875, 14}} Class ShapedGraphic FitText Vertical Flow Resize ID 2130 Shape Rectangle Style fill GradientCenter {-0.29411799999999999, -0.264706} Text Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720\qc \f0\b\fs24 \cf0 MSSQLImpl} VerticalPad 0 TextPlacement 0 GroupConnect YES ID 2129 Class TableGroup Graphics Bounds {{226, 82}, {102.9071044921875, 
14}} Class ShapedGraphic FitText Vertical Flow Resize ID 2127 Shape Rectangle Style fill GradientCenter {-0.29411799999999999, -0.264706} Text Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720\qc \f0\b\fs24 \cf0 MySQLImpl} VerticalPad 0 TextPlacement 0 GroupConnect YES ID 2126 Class LineGraphic Head ID 2055 ID 2135 Points {280.22809604806071, 146.80233001708984} {272.46503226582109, 172.16651000976572} Style stroke HeadArrow UMLInheritance Legacy TailArrow 0 Tail ID 2129 Class LineGraphic Head ID 2055 ID 2134 Points {243.64926792598939, 121.40116119384763} {252.32082843664148, 172.16651000976572} Style stroke HeadArrow UMLInheritance Legacy TailArrow 0 Tail ID 2131 Class LineGraphic Head ID 2055 ID 2133 Points {276.4518773872507, 95.999999999999986} {265.55272336402226, 172.16651000976572} Style stroke HeadArrow UMLInheritance Legacy TailArrow 0 Tail ID 2126 Bounds {{504, 310}, {84, 12}} Class ShapedGraphic FitText YES Flow Resize ID 2125 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720 \f0\fs20 \cf0 <<instantiates>>} VerticalPad 0 Wrap NO Class LineGraphic Head ID 33 ID 2124 OrthogonalBarAutomatic OrthogonalBarPoint {0, 0} OrthogonalBarPosition 16 Points {563, 340.34042553191489} {497.13201904296875, 327.88251038766401} Style stroke HeadArrow StickArrow Legacy LineType 2 Pattern 1 TailArrow 0 Tail ID 2072 Bounds {{494.00001409542369, 415.9000186920166}, {55, 12}} Class ShapedGraphic FitText YES Flow Resize ID 2123 Line ID 2139 Position 0.37128287553787231 RotationType 0 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720 \f0\fs20 \cf0 <<uses>>} VerticalPad 0 Wrap NO Bounds {{713.35945466160774, 356.11699358749399}, {55, 12}} Class ShapedGraphic FitText YES Flow Resize ID 2122 Line ID 2121 Position 0.49189183115959167 RotationType 0 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720 \f0\fs20 \cf0 <<uses>>} VerticalPad 0 Wrap NO Class LineGraphic Head ID 2081 Info 5 ID 2121 Points {702, 363.10150901307452} {781, 361.10002136230463} Style stroke HeadArrow StickArrow HopLines HopType 102 Legacy Pattern 1 TailArrow 0 Tail ID 2072 Class LineGraphic Head ID 2059 ID 2120 OrthogonalBarAutomatic OrthogonalBarPoint {0, 0} OrthogonalBarPosition -1 Points {637, 406} {565.78369522094727, 454.05202861384231} Style stroke HeadArrow StickArrow Legacy LineType 2 Pattern 1 TailArrow 0 Tail ID 2072 Bounds {{717, 400}, {68, 12}} Class ShapedGraphic FitText YES Flow Resize ID 2119 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 
Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720 \f0\fs20 \cf0 <<invokes>>} VerticalPad 0 Wrap NO Class LineGraphic Head ID 2072 Info 5 ID 2118 OrthogonalBarAutomatic OrthogonalBarPoint {0, 0} OrthogonalBarPosition -1 Points {759.34192925872742, 429.89997863769531} {702, 384.99999999999994} Style stroke HeadArrow StickArrow Legacy LineType 2 Pattern 1 TailArrow 0 Tail ID 2048 Info 3 Bounds {{603.74580087231288, 470.3107529903566}, {80, 12}} Class ShapedGraphic FitText YES Flow Resize ID 2117 Line ID 2116 Position 0.47171458601951599 RotationType 0 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720 \f0\fs20 \cf0 <<configures>>} VerticalPad 0 Wrap NO Class LineGraphic Head ID 2059 ID 2116 Points {713.35941696166992, 476.88540101271974} {565.78369522094727, 475.66718967115884} Style stroke HeadArrow StickArrow HopLines HopType 102 Legacy Pattern 1 TailArrow 0 Tail ID 2048 Bounds {{816, 258.37493918977634}, {69, 24}} Class ShapedGraphic FitText YES Flow Resize ID 2113 Line ID 2109 Position 0.46421170234680176 RotationType 0 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720 \f0\fs20 \cf0 <<generates,\ renders>>} VerticalPad 0 Wrap NO Bounds {{705.05227716905051, 191.22492316822797}, {69, 24}} Class ShapedGraphic FitText YES Flow Resize ID 2112 Line ID 2108 Position 0.46593526005744934 RotationType 0 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720 \f0\fs20 \cf0 <<provides\ operations>>} VerticalPad 0 Wrap NO Class LineGraphic Head ID 2098 ID 2109 Points {850.5, 298.10002136230469} {850.50001322861976, 238.37493896484375} Style stroke HeadArrow StickArrow HopLines HopType 102 Legacy Pattern 1 TailArrow 0 Tail ID 2081 Class LineGraphic Head ID 38 ID 2108 Points {781.00002098083496, 203.28096591495026} {692.04400634765625, 203.16068579982147} Style stroke HeadArrow StickArrow Legacy Pattern 1 TailArrow 0 Tail ID 2098 Bounds {{623.48996514081955, 291.09998092651369}, {55, 12}} Class ShapedGraphic FitText YES Flow Resize ID 2107 Line ID 2105 Position 0.43473681807518005 RotationType 0 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 \cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} \pard\tx560\tx1120\tx1680\tx2240\tx2800\tx3360\tx3920\tx4480\tx5040\tx5600\tx6160\tx6720 \f0\fs20 \cf0 <<uses>>} VerticalPad 0 Wrap NO Bounds {{513.14304282962803, 197.37493856351756}, {55, 12}} Class ShapedGraphic FitText YES Flow Resize ID 2106 Line ID 2104 Position 0.3995765745639801 RotationType 0 Shape Rectangle Style shadow Draws NO stroke Draws NO Text Align 0 Text {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570 
[OmniGraffle source data for the Alembic API overview diagram, omitted here. The diagram relates alembic.config, alembic.command, the env.py script (EnvironmentContext, MigrationContext, Operations, DefaultImpl), alembic.operations.op, alembic.autogenerate, ScriptDirectory, the migration scripts on the filesystem, and the database.]
{548, 421.9000186920166} {601.38076234099412, 436} {713.35941696166992, 443.8999786376952} Style stroke HeadArrow StickArrow Legacy LineType 2 Pattern 1 TailArrow 0 Tail ID 33 Info 2 GridInfo GuidesLocked NO GuidesVisible YES HPages 2 ImageCounter 1 KeepToScale Layers Lock NO Name Layer 1 Print YES View YES LayoutInfo Animate NO circoMinDist 18 circoSeparation 0.0 layoutEngine dot neatoSeparation 0.0 twopiSeparation 0.0 LinksVisible NO MagnetsVisible NO MasterSheets ModificationDate 2015-07-02 23:12:07 +0000 Modifier classic NotesVisible NO Orientation 2 OriginVisible NO OutlineStyle Basic PageBreaks NO PrintInfo NSBottomMargin float 12 NSHorizonalPagination coded BAtzdHJlYW10eXBlZIHoA4QBQISEhAhOU051bWJlcgCEhAdOU1ZhbHVlAISECE5TT2JqZWN0AIWEASqEhAFxlwCG NSLeftMargin float 12 NSPaperSize size {612, 792} NSPrintReverseOrientation int 0 NSRightMargin float 12 NSTopMargin float 12 PrintOnePage ReadOnly NO RowAlign 1 RowSpacing 36 SheetTitle Canvas 1 SmartAlignmentGuidesActive YES SmartDistanceGuidesActive YES UniqueID 1 UseEntirePage VPages 1 WindowInfo CurrentSheet 0 ExpandedCanvases Frame {{130, 128}, {1193, 852}} ListView OutlineWidth 142 RightSidebar Sidebar SidebarWidth 138 VisibleRegion {{-8, 1}, {1193, 755}} Zoom 1 ZoomValues Canvas 1 1 1 zzzeek-alembic-bee044a1c187/docs/build/autogenerate.rst000066400000000000000000000477201353106760100230600ustar00rootroot00000000000000Auto Generating Migrations =========================== Alembic can view the status of the database and compare against the table metadata in the application, generating the "obvious" migrations based on a comparison. This is achieved using the ``--autogenerate`` option to the ``alembic revision`` command, which places so-called *candidate* migrations into our new migrations file. We review and modify these by hand as needed, then proceed normally. To use autogenerate, we first need to modify our ``env.py`` so that it gets access to a table metadata object that contains the target. Suppose our application has a :ref:`declarative base ` in ``myapp.mymodel``. This base contains a :class:`~sqlalchemy.schema.MetaData` object which contains :class:`~sqlalchemy.schema.Table` objects defining our database. We make sure this is loaded in ``env.py`` and then passed to :meth:`.EnvironmentContext.configure` via the ``target_metadata`` argument. The ``env.py`` sample script used in the generic template already has a variable declaration near the top for our convenience, where we replace ``None`` with our :class:`~sqlalchemy.schema.MetaData`. Starting with:: # add your model's MetaData object here # for 'autogenerate' support # from myapp import mymodel # target_metadata = mymodel.Base.metadata target_metadata = None we change to:: from myapp.mymodel import Base target_metadata = Base.metadata .. note:: The above example refers to the **generic alembic env.py template**, e.g. the one created by default when calling upon ``alembic init``, and not the special-use templates such as ``multidb``. Please consult the source code and comments within the ``env.py`` script directly for specific guidance on where and how the autogenerate metadata is established. 
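For orientation, a minimal sketch of what such a ``myapp/mymodel.py`` module might contain is shown here; the ``Account`` class and its columns are purely illustrative, the point being only that ``Base.metadata`` is the :class:`~sqlalchemy.schema.MetaData` collection that autogenerate will compare against::

    from sqlalchemy import Column, Integer, String
    from sqlalchemy.ext.declarative import declarative_base

    Base = declarative_base()


    class Account(Base):
        __tablename__ = 'account'

        id = Column(Integer, primary_key=True)
        name = Column(String(50), nullable=False)

    # Base.metadata now contains the Table for "account"; env.py hands
    # this collection to context.configure() as target_metadata.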
If we look later in the script, down in ``run_migrations_online()``, we can see the directive passed to :meth:`.EnvironmentContext.configure`:: def run_migrations_online(): engine = engine_from_config( config.get_section(config.config_ini_section), prefix='sqlalchemy.') with engine.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata ) with context.begin_transaction(): context.run_migrations() We can then use the ``alembic revision`` command in conjunction with the ``--autogenerate`` option. Suppose our :class:`~sqlalchemy.schema.MetaData` contained a definition for the ``account`` table, and the database did not. We'd get output like:: $ alembic revision --autogenerate -m "Added account table" INFO [alembic.context] Detected added table 'account' Generating /path/to/foo/alembic/versions/27c6a30d7c24.py...done We can then view our file ``27c6a30d7c24.py`` and see that a rudimentary migration is already present:: """empty message Revision ID: 27c6a30d7c24 Revises: None Create Date: 2011-11-08 11:40:27.089406 """ # revision identifiers, used by Alembic. revision = '27c6a30d7c24' down_revision = None from alembic import op import sqlalchemy as sa def upgrade(): ### commands auto generated by Alembic - please adjust! ### op.create_table( 'account', sa.Column('id', sa.Integer()), sa.Column('name', sa.String(length=50), nullable=False), sa.Column('description', sa.VARCHAR(200)), sa.Column('last_transaction_date', sa.DateTime()), sa.PrimaryKeyConstraint('id') ) ### end Alembic commands ### def downgrade(): ### commands auto generated by Alembic - please adjust! ### op.drop_table("account") ### end Alembic commands ### The migration hasn't actually run yet, of course. We do that via the usual ``upgrade`` command. We should also go into our migration file and alter it as needed, including adjustments to the directives as well as the addition of other directives which these may be dependent on - specifically data changes in between creates/alters/drops. What does Autogenerate Detect (and what does it *not* detect?) -------------------------------------------------------------- The vast majority of user issues with Alembic centers on the topic of what kinds of changes autogenerate can and cannot detect reliably, as well as how it renders Python code for what it does detect. it is critical to note that **autogenerate is not intended to be perfect**. It is *always* necessary to manually review and correct the **candidate migrations** that autogenererate produces. The feature is getting more and more comprehensive and error-free as releases continue, but one should take note of the current limitations. Autogenerate **will detect**: * Table additions, removals. * Column additions, removals. * Change of nullable status on columns. * Basic changes in indexes and explicitly-named unique constraints .. versionadded:: 0.6.1 Support for autogenerate of indexes and unique constraints. * Basic changes in foreign key constraints .. versionadded:: 0.7.1 Support for autogenerate of foreign key constraints. Autogenerate can **optionally detect**: * Change of column type. This will occur if you set the :paramref:`.EnvironmentContext.configure.compare_type` parameter to ``True``, or to a custom callable function. The default implementation **only detects major type changes**, such as between ``Numeric`` and ``String``, and does not detect changes in arguments such as lengths, precisions, or enumeration members. 
The type comparison logic is extensible to work around these limitations, see :ref:`compare_types` for details. * Change of server default. This will occur if you set the :paramref:`.EnvironmentContext.configure.compare_server_default` parameter to ``True``, or to a custom callable function. This feature works well for simple cases but cannot always produce accurate results. The Postgresql backend will actually invoke the "detected" and "metadata" values against the database to determine equivalence. The feature is off by default so that it can be tested on the target schema first. Like type comparison, it can also be customized by passing a callable; see the function's documentation for details. Autogenerate **can not detect**: * Changes of table name. These will come out as an add/drop of two different tables, and should be hand-edited into a name change instead. * Changes of column name. Like table name changes, these are detected as a column add/drop pair, which is not at all the same as a name change. * Anonymously named constraints. Give your constraints a name, e.g. ``UniqueConstraint('col1', 'col2', name="my_name")``. See the section :doc:`naming` for background on how to configure automatic naming schemes for constraints. * Special SQLAlchemy types such as :class:`~sqlalchemy.types.Enum` when generated on a backend which doesn't support ENUM directly - this because the representation of such a type in the non-supporting database, i.e. a CHAR+ CHECK constraint, could be any kind of CHAR+CHECK. For SQLAlchemy to determine that this is actually an ENUM would only be a guess, something that's generally a bad idea. To implement your own "guessing" function here, use the :meth:`sqlalchemy.events.DDLEvents.column_reflect` event to detect when a CHAR (or whatever the target type is) is reflected, and change it to an ENUM (or whatever type is desired) if it is known that that's the intent of the type. The :meth:`sqlalchemy.events.DDLEvents.after_parent_attach` can be used within the autogenerate process to intercept and un-attach unwanted CHECK constraints. Autogenerate can't currently, but **will eventually detect**: * Some free-standing constraint additions and removals may not be supported, including PRIMARY KEY, EXCLUDE, CHECK; these are not necessarily implemented within the autogenerate detection system and also may not be supported by the supporting SQLAlchemy dialect. * Sequence additions, removals - not yet implemented. Autogenerating Multiple MetaData collections -------------------------------------------- The ``target_metadata`` collection may also be defined as a sequence if an application has multiple :class:`~sqlalchemy.schema.MetaData` collections involved:: from myapp.mymodel1 import Model1Base from myapp.mymodel2 import Model2Base target_metadata = [Model1Base.metadata, Model2Base.metadata] The sequence of :class:`~sqlalchemy.schema.MetaData` collections will be consulted in order during the autogenerate process. Note that each :class:`~sqlalchemy.schema.MetaData` must contain **unique** table keys (e.g. the "key" is the combination of the table's name and schema); if two :class:`~sqlalchemy.schema.MetaData` objects contain a table with the same schema/name combination, an error is raised. .. versionchanged:: 0.9.0 the :paramref:`.EnvironmentContext.configure.target_metadata` parameter may now be passed a sequence of :class:`~sqlalchemy.schema.MetaData` objects to support autogeneration of multiple :class:`~sqlalchemy.schema.MetaData` collections. 
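As a brief illustration, a hypothetical ``env.py`` that uses such a sequence is otherwise unchanged; only the value assigned to ``target_metadata`` differs (the module names here are assumptions, following the example above)::

    from myapp.mymodel1 import Model1Base
    from myapp.mymodel2 import Model2Base

    # each MetaData collection is consulted in order; table keys must be
    # unique across the collections or an error is raised
    target_metadata = [Model1Base.metadata, Model2Base.metadata]

    def run_migrations_online():
        # ...
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
        )
        # ...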
Comparing and Rendering Types ------------------------------ The area of autogenerate's behavior of comparing and rendering Python-based type objects in migration scripts presents a challenge, in that there's a very wide variety of types to be rendered in scripts, including those part of SQLAlchemy as well as user-defined types. A few options are given to help out with this task. .. _autogen_module_prefix: Controlling the Module Prefix ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ When types are rendered, they are generated with a **module prefix**, so that they are available based on a relatively small number of imports. The rules for what the prefix is is based on the kind of datatype as well as configurational settings. For example, when Alembic renders SQLAlchemy types, it will by default prefix the type name with the prefix ``sa.``:: Column("my_column", sa.Integer()) The use of the ``sa.`` prefix is controllable by altering the value of :paramref:`.EnvironmentContext.configure.sqlalchemy_module_prefix`:: def run_migrations_online(): # ... context.configure( connection=connection, target_metadata=target_metadata, sqlalchemy_module_prefix="sqla.", # ... ) # ... In either case, the ``sa.`` prefix, or whatever prefix is desired, should also be included in the imports section of ``script.py.mako``; it also defaults to ``import sqlalchemy as sa``. For user-defined types, that is, any custom type that is not within the ``sqlalchemy.`` module namespace, by default Alembic will use the **value of __module__ for the custom type**:: Column("my_column", myapp.models.utils.types.MyCustomType()) The imports for the above type again must be made present within the migration, either manually, or by adding it to ``script.py.mako``. .. versionchanged:: 0.7.0 The default module prefix rendering for a user-defined type now makes use of the type's ``__module__`` attribute to retrieve the prefix, rather than using the value of :paramref:`~.EnvironmentContext.configure.sqlalchemy_module_prefix`. The above custom type has a long and cumbersome name based on the use of ``__module__`` directly, which also implies that lots of imports would be needed in order to accomodate lots of types. For this reason, it is recommended that user-defined types used in migration scripts be made available from a single module. Suppose we call it ``myapp.migration_types``:: # myapp/migration_types.py from myapp.models.utils.types import MyCustomType We can first add an import for ``migration_types`` to our ``script.py.mako``:: from alembic import op import sqlalchemy as sa import myapp.migration_types ${imports if imports else ""} We then override Alembic's use of ``__module__`` by providing a fixed prefix, using the :paramref:`.EnvironmentContext.configure.user_module_prefix` option:: def run_migrations_online(): # ... context.configure( connection=connection, target_metadata=target_metadata, user_module_prefix="myapp.migration_types.", # ... ) # ... Above, we now would get a migration like:: Column("my_column", myapp.migration_types.MyCustomType()) Now, when we inevitably refactor our application to move ``MyCustomType`` somewhere else, we only need modify the ``myapp.migration_types`` module, instead of searching and replacing all instances within our migration scripts. .. versionadded:: 0.6.3 Added :paramref:`.EnvironmentContext.configure.user_module_prefix`. .. 
_autogen_render_types: Affecting the Rendering of Types Themselves ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ The methodology Alembic uses to generate SQLAlchemy and user-defined type constructs as Python code is plain old ``__repr__()``. SQLAlchemy's built-in types for the most part have a ``__repr__()`` that faithfully renders a Python-compatible constructor call, but there are some exceptions, particularly in those cases when a constructor accepts arguments that aren't compatible with ``__repr__()``, such as a pickling function. When building a custom type that will be rendered into a migration script, it is often necessary to explicitly give the type a ``__repr__()`` that will faithfully reproduce the constructor for that type. This, in combination with :paramref:`.EnvironmentContext.configure.user_module_prefix`, is usually enough. However, if additional behaviors are needed, a more comprehensive hook is the :paramref:`.EnvironmentContext.configure.render_item` option. This hook allows one to provide a callable function within ``env.py`` that will fully take over how a type is rendered, including its module prefix:: def render_item(type_, obj, autogen_context): """Apply custom rendering for selected items.""" if type_ == 'type' and isinstance(obj, MySpecialType): return "mypackage.%r" % obj # default rendering for other objects return False def run_migrations_online(): # ... context.configure( connection=connection, target_metadata=target_metadata, render_item=render_item, # ... ) # ... In the above example, we'd ensure our ``MySpecialType`` includes an appropriate ``__repr__()`` method, which is invoked when we call it against ``"%r"``. The callable we use for :paramref:`.EnvironmentContext.configure.render_item` can also add imports to our migration script. The :class:`.AutogenContext` passed in contains a datamember called :attr:`.AutogenContext.imports`, which is a Python ``set()`` for which we can add new imports. For example, if ``MySpecialType`` were in a module called ``mymodel.types``, we can add the import for it as we encounter the type:: def render_item(type_, obj, autogen_context): """Apply custom rendering for selected items.""" if type_ == 'type' and isinstance(obj, MySpecialType): # add import for this type autogen_context.imports.add("from mymodel import types") return "types.%r" % obj # default rendering for other objects return False .. versionchanged:: 0.8 The ``autogen_context`` data member passed to the ``render_item`` callable is now an instance of :class:`.AutogenContext`. .. versionchanged:: 0.8.3 The "imports" data member of the autogen context is restored to the new :class:`.AutogenContext` object as :attr:`.AutogenContext.imports`. The finished migration script will include our imports where the ``${imports}`` expression is used, producing output such as:: from alembic import op import sqlalchemy as sa from mymodel import types def upgrade(): op.add_column('sometable', Column('mycolumn', types.MySpecialType())) .. _compare_types: Comparing Types ^^^^^^^^^^^^^^^^ The default type comparison logic will work for SQLAlchemy built in types as well as basic user defined types. This logic is only enabled if the :paramref:`.EnvironmentContext.configure.compare_type` parameter is set to True:: context.configure( # ... compare_type = True ) .. note:: The default type comparison logic (which is end-user extensible) currently works for **major changes in type only**, such as between ``Numeric`` and ``String``. 
The logic will **not** detect changes such as: * changes between types that have the same "type affinity", such as between ``VARCHAR`` and ``TEXT``, or ``FLOAT`` and ``NUMERIC`` * changes between the arguments within the type, such as the lengths of strings, precision values for numerics, the elements inside of an enumeration. Detection of these kinds of parameters is a long term project on the SQLAlchemy side. Alternatively, the :paramref:`.EnvironmentContext.configure.compare_type` parameter accepts a callable function which may be used to implement custom type comparison logic, for cases such as where special user defined types are being used:: def my_compare_type(context, inspected_column, metadata_column, inspected_type, metadata_type): # return False if the metadata_type is the same as the inspected_type # or None to allow the default implementation to compare these # types. a return value of True means the two types do not # match and should result in a type change operation. return None context.configure( # ... compare_type = my_compare_type ) Above, ``inspected_column`` is a :class:`sqlalchemy.schema.Column` as returned by :meth:`sqlalchemy.engine.reflection.Inspector.reflecttable`, whereas ``metadata_column`` is a :class:`sqlalchemy.schema.Column` from the local model environment. A return value of ``None`` indicates that default type comparison to proceed. Additionally, custom types that are part of imported or third party packages which have special behaviors such as per-dialect behavior should implement a method called ``compare_against_backend()`` on their SQLAlchemy type. If this method is present, it will be called where it can also return True or False to specify the types compare as equivalent or not; if it returns None, default type comparison logic will proceed:: class MySpecialType(TypeDecorator): # ... def compare_against_backend(self, dialect, conn_type): # return True if this type is the same as the given database type, # or None to allow the default implementation to compare these # types. a return value of False means the given type does not # match this type. if dialect.name == 'postgresql': return isinstance(conn_type, postgresql.UUID) else: return isinstance(conn_type, String) .. warning:: The boolean return values for the above ``compare_against_backend`` method, which is part of SQLAlchemy and not Alembic,are **the opposite** of that of the :paramref:`.EnvironmentContext.configure.compare_type` callable, returning ``True`` for types that are the same vs. ``False`` for types that are different.The :paramref:`.EnvironmentContext.configure.compare_type` callable on the other hand should return ``True`` for types that are **different**. The order of precedence regarding the :paramref:`.EnvironmentContext.configure.compare_type` callable vs. the type itself implementing ``compare_against_backend`` is that the :paramref:`.EnvironmentContext.configure.compare_type` callable is favored first; if it returns ``None``, then the ``compare_against_backend`` method will be used, if present on the metadata type. If that returns ``None``, then a basic check for type equivalence is run. .. versionadded:: 0.7.6 - added support for the ``compare_against_backend()`` method. zzzeek-alembic-bee044a1c187/docs/build/batch.rst000066400000000000000000000405551353106760100214550ustar00rootroot00000000000000.. _batch_migrations: Running "Batch" Migrations for SQLite and Other Databases ========================================================= .. 
note:: "Batch mode" for SQLite and other databases is a new and intricate feature within the 0.7.0 series of Alembic, and should be considered as "beta" for the next several releases. .. versionadded:: 0.7.0 The SQLite database presents a challenge to migration tools in that it has almost no support for the ALTER statement upon which relational schema migrations rely upon. The rationale for this stems from philosophical and architectural concerns within SQLite, and they are unlikely to be changed. Migration tools are instead expected to produce copies of SQLite tables that correspond to the new structure, transfer the data from the existing table to the new one, then drop the old table. For our purposes here we'll call this **"move and copy"** workflow, and in order to accommodate it in a way that is reasonably predictable, while also remaining compatible with other databases, Alembic provides the **batch** operations context. Within this context, a relational table is named, and then a series of mutation operations to that table alone are specified within the block. When the context is complete, a process begins whereby the "move and copy" procedure begins; the existing table structure is reflected from the database, a new version of this table is created with the given changes, data is copied from the old table to the new table using "INSERT from SELECT", and finally the old table is dropped and the new one renamed to the original name. The :meth:`.Operations.batch_alter_table` method provides the gateway to this process:: with op.batch_alter_table("some_table") as batch_op: batch_op.add_column(Column('foo', Integer)) batch_op.drop_column('bar') When the above directives are invoked within a migration script, on a SQLite backend we would see SQL like: .. sourcecode:: sql CREATE TABLE _alembic_batch_temp ( id INTEGER NOT NULL, foo INTEGER, PRIMARY KEY (id) ); INSERT INTO _alembic_batch_temp (id) SELECT some_table.id FROM some_table; DROP TABLE some_table; ALTER TABLE _alembic_batch_temp RENAME TO some_table; On other backends, we'd see the usual ``ALTER`` statements done as though there were no batch directive - the batch context by default only does the "move and copy" process if SQLite is in use, and if there are migration directives other than :meth:`.Operations.add_column` present, which is the one kind of column-level ALTER statement that SQLite supports. :meth:`.Operations.batch_alter_table` can be configured to run "move and copy" unconditionally in all cases, including on databases other than SQLite; more on this is below. .. _batch_controlling_table_reflection: Controlling Table Reflection ---------------------------- The :class:`~sqlalchemy.schema.Table` object that is reflected when "move and copy" proceeds is performed using the standard ``autoload=True`` approach. This call can be affected using the :paramref:`~.Operations.batch_alter_table.reflect_args` and :paramref:`~.Operations.batch_alter_table.reflect_kwargs` arguments. 
For example, to override a :class:`~sqlalchemy.schema.Column` within the reflection process such that a :class:`~sqlalchemy.types.Boolean` object is reflected with the ``create_constraint`` flag set to ``False``:: with self.op.batch_alter_table( "bar", reflect_args=[Column('flag', Boolean(create_constraint=False))] ) as batch_op: batch_op.alter_column( 'flag', new_column_name='bflag', existing_type=Boolean) Another use case, add a listener to the :class:`~sqlalchemy.schema.Table` as it is reflected so that special logic can be applied to columns or types, using the :meth:`~sqlalchemy.events.DDLEvents.column_reflect` event:: def listen_for_reflect(inspector, table, column_info): "correct an ENUM type" if column_info['name'] == 'my_enum': column_info['type'] = Enum('a', 'b', 'c') with self.op.batch_alter_table( "bar", reflect_kwargs=dict( listeners=[ ('column_reflect', listen_for_reflect) ] ) ) as batch_op: batch_op.alter_column( 'flag', new_column_name='bflag', existing_type=Boolean) The reflection process may also be bypassed entirely by sending a pre-fabricated :class:`~sqlalchemy.schema.Table` object; see :ref:`batch_offline_mode` for an example. .. versionadded:: 0.7.1 added :paramref:`.Operations.batch_alter_table.reflect_args` and :paramref:`.Operations.batch_alter_table.reflect_kwargs` options. .. _sqlite_batch_constraints: Dealing with Constraints ------------------------ There are a variety of issues when using "batch" mode with constraints, such as FOREIGN KEY, CHECK and UNIQUE constraints. This section will attempt to detail many of these scenarios. .. _dropping_sqlite_foreign_keys: Dropping Unnamed or Named Foreign Key Constraints ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ SQLite, unlike any other database, allows constraints to exist in the database that have no identifying name. On all other backends, the target database will always generate some kind of name, if one is not given. The first challenge this represents is that an unnamed constraint can't by itself be targeted by the :meth:`.BatchOperations.drop_constraint` method. An unnamed FOREIGN KEY constraint is implicit whenever the :class:`~sqlalchemy.schema.ForeignKey` or :class:`~sqlalchemy.schema.ForeignKeyConstraint` objects are used without passing them a name. Only on SQLite will these constraints remain entirely unnamed when they are created on the target database; an automatically generated name will be assigned in the case of all other database backends. A second issue is that SQLAlchemy itself has inconsistent behavior in dealing with SQLite constraints as far as names. Prior to version 1.0, SQLAlchemy omits the name of foreign key constraints when reflecting them against the SQLite backend. So even if the target application has gone through the steps to apply names to the constraints as stated in the database, they still aren't targetable within the batch reflection process prior to SQLAlchemy 1.0. Within the scope of batch mode, this presents the issue that the :meth:`.BatchOperations.drop_constraint` method requires a constraint name in order to target the correct constraint. In order to overcome this, the :meth:`.Operations.batch_alter_table` method supports a :paramref:`~.Operations.batch_alter_table.naming_convention` argument, so that all reflected constraints, including foreign keys that are unnamed, or were named but SQLAlchemy isn't loading this name, may be given a name, as described in :ref:`autogen_naming_conventions`. 
Usage is as follows:: naming_convention = { "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s", } with self.op.batch_alter_table( "bar", naming_convention=naming_convention) as batch_op: batch_op.drop_constraint( "fk_bar_foo_id_foo", type_="foreignkey") Note that the naming convention feature requires at least **SQLAlchemy 0.9.4** for support. .. versionadded:: 0.7.1 added :paramref:`~.Operations.batch_alter_table.naming_convention` to :meth:`.Operations.batch_alter_table`. Including unnamed UNIQUE constraints ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ A similar, but frustratingly slightly different, issue is that in the case of UNIQUE constraints, we again have the issue that SQLite allows unnamed UNIQUE constraints to exist on the database, however in this case, SQLAlchemy prior to version 1.0 doesn't reflect these constraints at all. It does properly reflect named unique constraints with their names, however. So in this case, the workaround for foreign key names is still not sufficient prior to SQLAlchemy 1.0. If our table includes unnamed unique constraints, and we'd like them to be re-created along with the table, we need to include them directly, which can be via the :paramref:`~.Operations.batch_alter_table.table_args` argument:: with self.op.batch_alter_table( "bar", table_args=(UniqueConstraint('username'),) ): batch_op.add_column(Column('foo', Integer)) Changing the Type of Boolean, Enum and other implicit CHECK datatypes ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ The SQLAlchemy types :class:`~sqlalchemy.types.Boolean` and :class:`~sqlalchemy.types.Enum` are part of a category of types known as "schema" types; this style of type creates other structures along with the type itself, most commonly (but not always) a CHECK constraint. Alembic handles dropping and creating the CHECK constraints here automatically, including in the case of batch mode. When changing the type of an existing column, what's necessary is that the existing type be specified fully:: with self.op.batch_alter_table("some_table"): batch_op.alter_column( 'q', type_=Integer, existing_type=Boolean(create_constraint=True, constraint_name="ck1")) Including CHECK constraints ^^^^^^^^^^^^^^^^^^^^^^^^^^^ SQLAlchemy currently doesn't reflect CHECK constraints on any backend. So again these must be stated explicitly if they are to be included in the recreated table:: with op.batch_alter_table("some_table", table_args=[ CheckConstraint('x > 5') ]) as batch_op: batch_op.add_column(Column('foo', Integer)) batch_op.drop_column('bar') Note this only includes CHECK constraints that are explicitly stated as part of the table definition, not the CHECK constraints that are generated by datatypes such as :class:`~sqlalchemy.types.Boolean` or :class:`~sqlalchemy.types.Enum`. Dealing with Referencing Foreign Keys ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ It is important to note that batch table operations **do not work** with foreign keys that enforce referential integrity. This because the target table is dropped; if foreign keys refer to it, this will raise an error. On SQLite, whether or not foreign keys actually enforce is controlled by the ``PRAGMA FOREIGN KEYS`` pragma; this pragma, if in use, must be disabled when the workflow mode proceeds. When the operation is complete, the batch-migrated table will have the same name that it started with, so those referring foreign keys will again refer to this table. A special case is dealing with self-referring foreign keys. 
Here, Alembic takes a special step of recreating the self-referring foreign key as referring to the original table name, rather than at the "temp" table, so that like in the case of other foreign key constraints, when the table is renamed to its original name, the foreign key again references the correct table. This operation only works when referential integrity is disabled, consistent with the same requirement for referring foreign keys from other tables. .. versionchanged:: 0.8.4 Self-referring foreign keys are created with the target table name in batch mode, even though this table will temporarily not exist when dropped. This requires that the target database is not enforcing referential integrity. When SQLite's ``PRAGMA FOREIGN KEYS`` mode is turned on, it does provide the service that foreign key constraints, including self-referential, will automatically be modified to point to their table across table renames, however this mode prevents the target table from being dropped as is required by a batch migration. Therefore it may be necessary to manipulate the ``PRAGMA FOREIGN KEYS`` setting if a migration seeks to rename a table vs. batch migrate it. .. _batch_offline_mode: Working in Offline Mode ----------------------- In the preceding sections, we've seen how much of an emphasis the "move and copy" process has on using reflection in order to know the structure of the table that is to be copied. This means that in the typical case, "online" mode, where a live database connection is present so that :meth:`.Operations.batch_alter_table` can reflect the table from the database, is required; the ``--sql`` flag **cannot** be used without extra steps. To support offline mode, the system must work without table reflection present, which means the full table as it intends to be created must be passed to :meth:`.Operations.batch_alter_table` using :paramref:`~.Operations.batch_alter_table.copy_from`:: meta = MetaData() some_table = Table( 'some_table', meta, Column('id', Integer, primary_key=True), Column('bar', String(50)) ) with op.batch_alter_table("some_table", copy_from=some_table) as batch_op: batch_op.add_column(Column('foo', Integer)) batch_op.drop_column('bar') The above use pattern is pretty tedious and quite far off from Alembic's preferred style of working; however, if one needs to do SQLite-compatible "move and copy" migrations and need them to generate flat SQL files in "offline" mode, there's not much alternative. .. versionadded:: 0.7.6 Fully implemented the :paramref:`~.Operations.batch_alter_table.copy_from` parameter. Batch mode with Autogenerate ---------------------------- The syntax of batch mode is essentially that :meth:`.Operations.batch_alter_table` is used to enter a batch block, and the returned :class:`.BatchOperations` context works just like the regular :class:`.Operations` context, except that the "table name" and "schema name" arguments are omitted. To support rendering of migration commands in batch mode for autogenerate, configure the :paramref:`.EnvironmentContext.configure.render_as_batch` flag in ``env.py``:: context.configure( connection=connection, target_metadata=target_metadata, render_as_batch=True ) Autogenerate will now generate along the lines of:: def upgrade(): ### commands auto generated by Alembic - please adjust! 
### with op.batch_alter_table('address', schema=None) as batch_op: batch_op.add_column(sa.Column('street', sa.String(length=50), nullable=True)) This mode is safe to use in all cases, as the :meth:`.Operations.batch_alter_table` directive by default only takes place for SQLite; other backends will behave just as they normally do in the absense of the batch directives. Note that autogenerate support does not include "offline" mode, where the :paramref:`.Operations.batch_alter_table.copy_from` parameter is used. The table definition here would need to be entered into migration files manually if this is needed. Batch mode with databases other than SQLite -------------------------------------------- There's an odd use case some shops have, where the "move and copy" style of migration is useful in some cases for databases that do already support ALTER. There's some cases where an ALTER operation may block access to the table for a long time, which might not be acceptable. "move and copy" can be made to work on other backends, though with a few extra caveats. The batch mode directive will run the "recreate" system regardless of backend if the flag ``recreate='always'`` is passed:: with op.batch_alter_table("some_table", recreate='always') as batch_op: batch_op.add_column(Column('foo', Integer)) The issues that arise in this mode are mostly to do with constraints. Databases such as Postgresql and MySQL with InnoDB will enforce referential integrity (e.g. via foreign keys) in all cases. Unlike SQLite, it's not as simple to turn off referential integrity across the board (nor would it be desirable). Since a new table is replacing the old one, existing foreign key constraints which refer to the target table will need to be unconditionally dropped before the batch operation, and re-created to refer to the new table afterwards. Batch mode currently does not provide any automation for this. The Postgresql database and possibly others also have the behavior such that when the new table is created, a naming conflict occurs with the named constraints of the new table, in that they match those of the old table, and on Postgresql, these names need to be unique across all tables. The Postgresql dialect will therefore emit a "DROP CONSTRAINT" directive for all constraints on the old table before the new one is created; this is "safe" in case of a failed operation because Postgresql also supports transactional DDL. Note that also as is the case with SQLite, CHECK constraints need to be moved over between old and new table manually using the :paramref:`.Operations.batch_alter_table.table_args` parameter. zzzeek-alembic-bee044a1c187/docs/build/branches.rst000066400000000000000000001072461353106760100221620ustar00rootroot00000000000000.. _branches: Working with Branches ===================== .. note:: Alembic 0.7.0 features an all-new versioning model that fully supports branch points, merge points, and long-lived, labeled branches, including independent branches originating from multiple bases. A great emphasis has been placed on there being almost no impact on the existing Alembic workflow, including that all commands work pretty much the same as they did before, the format of migration files doesn't require any change (though there are some changes that are recommended), and even the structure of the ``alembic_version`` table does not change at all. However, most alembic commands now offer new features which will break out an Alembic environment into "branch mode", where things become a lot more intricate. 
Working in "branch mode" should be considered as a "beta" feature, with many new paradigms and use cases still to be stress tested in the wild. Please tread lightly! .. versionadded:: 0.7.0 A **branch** describes a point in a migration stream when two or more versions refer to the same parent migration as their anscestor. Branches occur naturally when two divergent source trees, both containing Alembic revision files created independently within those source trees, are merged together into one. When this occurs, the challenge of a branch is to **merge** the branches into a single series of changes, so that databases established from either source tree individually can be upgraded to reference the merged result equally. Another scenario where branches are present are when we create them directly; either at some point in the migration stream we'd like different series of migrations to be managed independently (e.g. we create a tree), or we'd like separate migration streams for different features starting at the root (e.g. a *forest*). We'll illustrate all of these cases, starting with the most common which is a source-merge-originated branch that we'll merge. Starting with the "account table" example we began in :ref:`create_migration`, assume we have our basemost version ``1975ea83b712``, which leads into the second revision ``ae1027a6acf``, and the migration files for these two revisions are checked into our source repository. Consider if we merged into our source repository another code branch which contained a revision for another table called ``shopping_cart``. This revision was made against our first Alembic revision, the one that generated ``account``. After loading the second source tree in, a new file ``27c6a30d7c24_add_shopping_cart_table.py`` exists within our ``versions`` directory. Both it, as well as ``ae1027a6acf_add_a_column.py``, reference ``1975ea83b712_add_account_table.py`` as the "downgrade" revision. To illustrate:: # main source tree: 1975ea83b712 (create account table) -> ae1027a6acf (add a column) # branched source tree 1975ea83b712 (create account table) -> 27c6a30d7c24 (add shopping cart table) Above, we can see ``1975ea83b712`` is our **branch point**; two distinct versions both refer to it as its parent. 
The Alembic command ``branches`` illustrates this fact:: $ alembic branches --verbose Rev: 1975ea83b712 (branchpoint) Parent: Branches into: 27c6a30d7c24, ae1027a6acf Path: foo/versions/1975ea83b712_add_account_table.py create account table Revision ID: 1975ea83b712 Revises: Create Date: 2014-11-20 13:02:46.257104 -> 27c6a30d7c24 (head), add shopping cart table -> ae1027a6acf (head), add a column History shows it too, illustrating two ``head`` entries as well as a ``branchpoint``:: $ alembic history 1975ea83b712 -> 27c6a30d7c24 (head), add shopping cart table 1975ea83b712 -> ae1027a6acf (head), add a column -> 1975ea83b712 (branchpoint), create account table We can get a view of just the current heads using ``alembic heads``:: $ alembic heads --verbose Rev: 27c6a30d7c24 (head) Parent: 1975ea83b712 Path: foo/versions/27c6a30d7c24_add_shopping_cart_table.py add shopping cart table Revision ID: 27c6a30d7c24 Revises: 1975ea83b712 Create Date: 2014-11-20 13:03:11.436407 Rev: ae1027a6acf (head) Parent: 1975ea83b712 Path: foo/versions/ae1027a6acf_add_a_column.py add a column Revision ID: ae1027a6acf Revises: 1975ea83b712 Create Date: 2014-11-20 13:02:54.849677 If we try to run an ``upgrade`` to the usual end target of ``head``, Alembic no longer considers this to be an unambiguous command. As we have more than one ``head``, the ``upgrade`` command wants us to provide more information:: $ alembic upgrade head FAILED: Multiple head revisions are present for given argument 'head'; please specify a specific target revision, '@head' to narrow to a specific head, or 'heads' for all heads The ``upgrade`` command gives us quite a few options in which we can proceed with our upgrade, either giving it information on *which* head we'd like to upgrade towards, or alternatively stating that we'd like *all* heads to be upgraded towards at once. However, in the typical case of two source trees being merged, we will want to pursue a third option, which is that we can **merge** these branches. Merging Branches ---------------- An Alembic merge is a migration file that joins two or more "head" files together. If the two branches we have right now can be said to be a "tree" structure, introducing this merge file will turn it into a "diamond" structure:: -- ae1027a6acf --> / \ --> 1975ea83b712 --> --> mergepoint \ / -- 27c6a30d7c24 --> We create the merge file using ``alembic merge``; with this command, we can pass to it an argument such as ``heads``, meaning we'd like to merge all heads. Or, we can pass it individual revision numbers sequentally:: $ alembic merge -m "merge ae1 and 27c" ae1027 27c6a Generating /path/to/foo/versions/53fffde5ad5_merge_ae1_and_27c.py ... done Looking inside the new file, we see it as a regular migration file, with the only new twist is that ``down_revision`` points to both revisions:: """merge ae1 and 27c Revision ID: 53fffde5ad5 Revises: ae1027a6acf, 27c6a30d7c24 Create Date: 2014-11-20 13:31:50.811663 """ # revision identifiers, used by Alembic. revision = '53fffde5ad5' down_revision = ('ae1027a6acf', '27c6a30d7c24') branch_labels = None from alembic import op import sqlalchemy as sa def upgrade(): pass def downgrade(): pass This file is a regular migration file, and if we wish to, we may place :class:`.Operations` directives into the ``upgrade()`` and ``downgrade()`` functions like any other migration file. Though it is probably best to limit the instructions placed here only to those that deal with any kind of reconciliation that is needed between the two merged branches, if any. 
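If some reconciliation is needed, the merge file's ``upgrade()`` accepts ordinary directives just like any other revision. A purely hypothetical sketch, assuming a data fixup that must only run once both branches are in place::

    def upgrade():
        # runs only after both ae1027a6acf and 27c6a30d7c24 have been
        # applied, so it may rely on structures created by either branch
        op.execute(
            "UPDATE account SET description = '' WHERE description IS NULL"
        )


    def downgrade():
        pass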
The ``heads`` command now illustrates that the multiple heads in our ``versions/`` directory have been resolved into our new head:: $ alembic heads --verbose Rev: 53fffde5ad5 (head) (mergepoint) Merges: ae1027a6acf, 27c6a30d7c24 Path: foo/versions/53fffde5ad5_merge_ae1_and_27c.py merge ae1 and 27c Revision ID: 53fffde5ad5 Revises: ae1027a6acf, 27c6a30d7c24 Create Date: 2014-11-20 13:31:50.811663 History shows a similar result, as the mergepoint becomes our head:: $ alembic history ae1027a6acf, 27c6a30d7c24 -> 53fffde5ad5 (head) (mergepoint), merge ae1 and 27c 1975ea83b712 -> ae1027a6acf, add a column 1975ea83b712 -> 27c6a30d7c24, add shopping cart table -> 1975ea83b712 (branchpoint), create account table With a single ``head`` target, a generic ``upgrade`` can proceed:: $ alembic upgrade head INFO [alembic.migration] Context impl PostgresqlImpl. INFO [alembic.migration] Will assume transactional DDL. INFO [alembic.migration] Running upgrade -> 1975ea83b712, create account table INFO [alembic.migration] Running upgrade 1975ea83b712 -> 27c6a30d7c24, add shopping cart table INFO [alembic.migration] Running upgrade 1975ea83b712 -> ae1027a6acf, add a column INFO [alembic.migration] Running upgrade ae1027a6acf, 27c6a30d7c24 -> 53fffde5ad5, merge ae1 and 27c .. topic:: merge mechanics The upgrade process traverses through all of our migration files using a **topological sorting** algorithm, treating the list of migration files not as a linked list, but as a **directed acyclic graph**. The starting points of this traversal are the **current heads** within our database, and the end point is the "head" revision or revisions specified. When a migration proceeds across a point at which there are multiple heads, the ``alembic_version`` table will at that point store *multiple* rows, one for each head. Our migration process above will emit SQL against ``alembic_version`` along these lines: .. sourcecode:: sql -- Running upgrade -> 1975ea83b712, create account table INSERT INTO alembic_version (version_num) VALUES ('1975ea83b712') -- Running upgrade 1975ea83b712 -> 27c6a30d7c24, add shopping cart table UPDATE alembic_version SET version_num='27c6a30d7c24' WHERE alembic_version.version_num = '1975ea83b712' -- Running upgrade 1975ea83b712 -> ae1027a6acf, add a column INSERT INTO alembic_version (version_num) VALUES ('ae1027a6acf') -- Running upgrade ae1027a6acf, 27c6a30d7c24 -> 53fffde5ad5, merge ae1 and 27c DELETE FROM alembic_version WHERE alembic_version.version_num = 'ae1027a6acf' UPDATE alembic_version SET version_num='53fffde5ad5' WHERE alembic_version.version_num = '27c6a30d7c24' At the point at which both ``27c6a30d7c24`` and ``ae1027a6acf`` exist within our database, both values are present in ``alembic_version``, which now has two rows. 
If we upgrade to these two versions alone, then stop and run ``alembic current``, we will see this:: $ alembic current --verbose Current revision(s) for postgresql://scott:XXXXX@localhost/test: Rev: ae1027a6acf Parent: 1975ea83b712 Path: foo/versions/ae1027a6acf_add_a_column.py add a column Revision ID: ae1027a6acf Revises: 1975ea83b712 Create Date: 2014-11-20 13:02:54.849677 Rev: 27c6a30d7c24 Parent: 1975ea83b712 Path: foo/versions/27c6a30d7c24_add_shopping_cart_table.py add shopping cart table Revision ID: 27c6a30d7c24 Revises: 1975ea83b712 Create Date: 2014-11-20 13:03:11.436407 A key advantage to the ``merge`` process is that it will run equally well on databases that were present on version ``ae1027a6acf`` alone, versus databases that were present on version ``27c6a30d7c24`` alone; whichever version was not yet applied, will be applied before the merge point can be crossed. This brings forth a way of thinking about a merge file, as well as about any Alembic revision file. As they are considered to be "nodes" within a set that is subject to topological sorting, each "node" is a point that cannot be crossed until all of its dependencies are satisfied. Prior to Alembic's support of merge points, the use case of databases sitting on different heads was basically impossible to reconcile; having to manually splice the head files together invariably meant that one migration would occur before the other, thus being incompatible with databases that were present on the other migration. Working with Explicit Branches ------------------------------ The ``alembic upgrade`` command hinted at other options besides merging when dealing with multiple heads. Let's back up and assume we're back where we have as our heads just ``ae1027a6acf`` and ``27c6a30d7c24``:: $ alembic heads 27c6a30d7c24 ae1027a6acf Earlier, when we did ``alembic upgrade head``, it gave us an error which suggested ``please specify a specific target revision, '@head' to narrow to a specific head, or 'heads' for all heads`` in order to proceed without merging. Let's cover those cases. Referring to all heads at once ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ The ``heads`` identifier is a lot like ``head``, except it explicitly refers to *all* heads at once. That is, it's like telling Alembic to do the operation for both ``ae1027a6acf`` and ``27c6a30d7c24`` simultaneously. If we started from a fresh database and ran ``upgrade heads`` we'd see:: $ alembic upgrade heads INFO [alembic.migration] Context impl PostgresqlImpl. INFO [alembic.migration] Will assume transactional DDL. INFO [alembic.migration] Running upgrade -> 1975ea83b712, create account table INFO [alembic.migration] Running upgrade 1975ea83b712 -> ae1027a6acf, add a column INFO [alembic.migration] Running upgrade 1975ea83b712 -> 27c6a30d7c24, add shopping cart table Since we've upgraded to ``heads``, and we do in fact have more than one head, that means these two distinct heads are now in our ``alembic_version`` table. We can see this if we run ``alembic current``:: $ alembic current ae1027a6acf (head) 27c6a30d7c24 (head) That means there's two rows in ``alembic_version`` right now. 
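Those two rows can also be confirmed directly against the database; a quick sketch using plain SQLAlchemy, with a hypothetical connection URL::

    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql://scott:tiger@localhost/test")
    with engine.connect() as conn:
        rows = conn.execute(text("SELECT version_num FROM alembic_version"))
        print([row[0] for row in rows])   # ['ae1027a6acf', '27c6a30d7c24']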
If we downgrade one step at a time, Alembic will **delete** from the ``alembic_version`` table each branch that's closed out, until only one branch remains; then it will continue updating the single value down to the previous versions::

    $ alembic downgrade -1
    INFO [alembic.migration] Running downgrade ae1027a6acf -> 1975ea83b712, add a column

    $ alembic current
    27c6a30d7c24 (head)

    $ alembic downgrade -1
    INFO [alembic.migration] Running downgrade 27c6a30d7c24 -> 1975ea83b712, add shopping cart table

    $ alembic current
    1975ea83b712 (branchpoint)

    $ alembic downgrade -1
    INFO [alembic.migration] Running downgrade 1975ea83b712 -> , create account table

    $ alembic current

Referring to a Specific Version
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

We can pass a specific version number to ``upgrade``.  Alembic will ensure that all revisions upon which this version depends are invoked, and nothing more.  So if we ``upgrade`` either to ``27c6a30d7c24`` or ``ae1027a6acf`` specifically, it guarantees that ``1975ea83b712`` will have been applied, but not that any "sibling" versions are applied::

    $ alembic upgrade 27c6a
    INFO [alembic.migration] Running upgrade -> 1975ea83b712, create account table
    INFO [alembic.migration] Running upgrade 1975ea83b712 -> 27c6a30d7c24, add shopping cart table

With ``1975ea83b712`` and ``27c6a30d7c24`` applied, ``ae1027a6acf`` is just a single additional step::

    $ alembic upgrade ae102
    INFO [alembic.migration] Running upgrade 1975ea83b712 -> ae1027a6acf, add a column

Working with Branch Labels
^^^^^^^^^^^^^^^^^^^^^^^^^^

To satisfy the use case where an environment has long-lived branches, especially independent branches as will be discussed in the next section, Alembic supports the concept of **branch labels**.  These are string values that are present within the migration file, using the new identifier ``branch_labels``.  For example, if we want to refer to the "shopping cart" branch using the name "shoppingcart", we can add that name to our file ``27c6a30d7c24_add_shopping_cart_table.py``::

    """add shopping cart table

    """

    # revision identifiers, used by Alembic.
    revision = '27c6a30d7c24'
    down_revision = '1975ea83b712'
    branch_labels = ('shoppingcart',)

    # ...

The ``branch_labels`` attribute refers to a string name, or a tuple of names, which will now apply to this revision, all descendants of this revision, as well as all ancestors of this revision up until the preceding branch point, in this case ``1975ea83b712``.  We can see the ``shoppingcart`` label applied to this revision::

    $ alembic history
    1975ea83b712 -> 27c6a30d7c24 (shoppingcart) (head), add shopping cart table
    1975ea83b712 -> ae1027a6acf (head), add a column
    <base> -> 1975ea83b712 (branchpoint), create account table

With the label applied, the name ``shoppingcart`` now serves as an alias for the ``27c6a30d7c24`` revision specifically.  We can illustrate this by showing it with ``alembic show``::

    $ alembic show shoppingcart

    Rev: 27c6a30d7c24 (head)
    Parent: 1975ea83b712
    Branch names: shoppingcart
    Path: foo/versions/27c6a30d7c24_add_shopping_cart_table.py

        add shopping cart table

        Revision ID: 27c6a30d7c24
        Revises: 1975ea83b712
        Create Date: 2014-11-20 13:03:11.436407

However, when using branch labels, we usually want to use them with a syntax known as "branch at" syntax; this syntax allows us to state that we want to use a specific revision, let's say a "head" revision, in terms of a *specific* branch.
While normally, we can't refer to ``alembic upgrade head`` when there are multiple heads, we *can* refer to this head specifically using ``shoppingcart@head`` syntax::

    $ alembic upgrade shoppingcart@head
    INFO [alembic.migration] Running upgrade 1975ea83b712 -> 27c6a30d7c24, add shopping cart table

The ``shoppingcart@head`` syntax becomes important to us if we wish to add new migration files to our versions directory while maintaining multiple branches.  Just like the ``upgrade`` command, if we attempted to add a new revision file to our multiple-heads layout without a specific parent revision, we'd get a familiar error::

    $ alembic revision -m "add a shopping cart column"
    FAILED: Multiple heads are present; please specify the head revision on which the new revision should be based, or perform a merge.

The ``alembic revision`` command is pretty clear in what we need to do; to add our new revision specifically to the ``shoppingcart`` branch, we use the ``--head`` argument, either with the specific revision identifier ``27c6a30d7c24``, or more generically using our branchname ``shoppingcart@head``::

    $ alembic revision -m "add a shopping cart column" --head shoppingcart@head
    Generating /path/to/foo/versions/d747a8a8879_add_a_shopping_cart_column.py ... done

``alembic history`` shows both files now part of the ``shoppingcart`` branch::

    $ alembic history
    1975ea83b712 -> ae1027a6acf (head), add a column
    27c6a30d7c24 -> d747a8a8879 (shoppingcart) (head), add a shopping cart column
    1975ea83b712 -> 27c6a30d7c24 (shoppingcart), add shopping cart table
    <base> -> 1975ea83b712 (branchpoint), create account table

We can limit our history operation just to this branch as well::

    $ alembic history -r shoppingcart:
    27c6a30d7c24 -> d747a8a8879 (shoppingcart) (head), add a shopping cart column
    1975ea83b712 -> 27c6a30d7c24 (shoppingcart), add shopping cart table

If we want to illustrate the path of ``shoppingcart`` all the way from the base, we can do that as follows::

    $ alembic history -r :shoppingcart@head
    27c6a30d7c24 -> d747a8a8879 (shoppingcart) (head), add a shopping cart column
    1975ea83b712 -> 27c6a30d7c24 (shoppingcart), add shopping cart table
    <base> -> 1975ea83b712 (branchpoint), create account table

We can run this operation from the "base" side as well, but we get a different result::

    $ alembic history -r shoppingcart@base:
    1975ea83b712 -> ae1027a6acf (head), add a column
    27c6a30d7c24 -> d747a8a8879 (shoppingcart) (head), add a shopping cart column
    1975ea83b712 -> 27c6a30d7c24 (shoppingcart), add shopping cart table
    <base> -> 1975ea83b712 (branchpoint), create account table

When we list from ``shoppingcart@base`` without an endpoint, it's really shorthand for ``-r shoppingcart@base:heads``, e.g. all heads, and since ``shoppingcart@base`` is the same "base" shared by the ``ae1027a6acf`` revision, we get that revision in our listing as well.  The ``@base`` syntax can be useful when we are dealing with individual bases, as we'll see in the next section.

The ``@head`` format can also be used with revision numbers instead of branch names, though this is less convenient.  If we wanted to add a new revision to our branch that includes the un-labeled ``ae1027a6acf``, if this weren't a head already, we could ask for the "head of the branch that includes ``ae1027a6acf``" as follows::

    $ alembic revision -m "add another account column" --head ae10@head
    Generating /path/to/foo/versions/55af2cb1c267_add_another_account_column.py ... done

More Label Syntaxes
^^^^^^^^^^^^^^^^^^^

The ``heads`` symbol can be combined with a branch label, in the case that your labeled branch itself breaks off into multiple branches::

    $ alembic upgrade shoppingcart@heads

Relative identifiers, as introduced in :ref:`relative_migrations`, work with labels too.  For example, upgrading to ``shoppingcart@+2`` means to upgrade from current heads on "shoppingcart" upwards two revisions::

    $ alembic upgrade shoppingcart@+2

This kind of thing works from history as well::

    $ alembic history -r current:shoppingcart@+2

The newer ``relnum+delta`` format can be combined as well, for example if we wanted to list along ``shoppingcart`` up until two revisions before the head::

    $ alembic history -r :shoppingcart@head-2

.. _multiple_bases:

Working with Multiple Bases
---------------------------

.. note:: The multiple base feature is intended to allow for multiple Alembic versioning lineages which **share the same alembic_version table**.  This is so that individual revisions within the lineages can have cross-dependencies on each other.  For the simpler case where one project has multiple, **completely independent** revision lineages that refer to **separate** alembic_version tables, see the example in :ref:`multiple_environments`.

We've seen in the previous section that ``alembic upgrade`` is fine if we have multiple heads, ``alembic revision`` allows us to tell it which "head" we'd like to associate our new revision file with, and branch labels allow us to assign names to branches that we can use in subsequent commands.  Let's put all these together and refer to a new "base", that is, a whole new tree of revision files that will be semi-independent of the account/shopping cart revisions we've been working with.  This new tree will deal with database tables involving "networking".

.. _multiple_version_directories:

Setting up Multiple Version Directories
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

While optional, it is often the case that when working with multiple bases, we'd like different sets of version files to exist within their own directories; typically, if an application is organized into several sub-modules, each one would have a version directory containing migrations pertinent to that module.

So to start out, we can edit ``alembic.ini`` to refer to multiple directories; we'll also state the current ``versions`` directory as one of them::

    # version location specification; this defaults
    # to foo/versions.  When using multiple version
    # directories, initial revisions must be specified with --version-path
    version_locations = %(here)s/model/networking %(here)s/alembic/versions

The new directory ``%(here)s/model/networking`` is in terms of where the ``alembic.ini`` file is, as we are using the symbol ``%(here)s`` which resolves to this location.  When we create our first new revision targeted at this directory, ``model/networking`` will be created automatically if it does not exist yet.  Once we've created a revision here, the path is used automatically when generating subsequent revision files that refer to this revision tree.

Creating a Labeled Base Revision
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

We also want our new branch to have its own name, and for that we want to apply a branch label to the base.  In order to achieve this using the ``alembic revision`` command without editing, we need to ensure our ``script.py.mako`` file, used for generating new revision files, has the appropriate substitutions present.
If Alembic version 0.7.0 or greater was used to generate the original migration environment, this is already done.  However, when working with an older environment, ``script.py.mako`` needs to have this directive added, typically underneath the ``down_revision`` directive::

    # revision identifiers, used by Alembic.
    revision = ${repr(up_revision)}
    down_revision = ${repr(down_revision)}

    # add this here in order to use revision with branch_label
    branch_labels = ${repr(branch_labels)}

With this in place, we can create a new revision file, starting up a branch that will deal with database tables involving networking; we specify the ``--head`` version of ``base``, a ``--branch-label`` of ``networking``, and the directory we want this first revision file to be placed in with ``--version-path``::

    $ alembic revision -m "create networking branch" --head=base --branch-label=networking --version-path=model/networking
    Creating directory /path/to/foo/model/networking ... done
    Generating /path/to/foo/model/networking/3cac04ae8714_create_networking_branch.py ... done

If we ran the above command and we didn't have the newer ``script.py.mako`` directive, we'd get this error::

    FAILED: Version 3cac04ae8714 specified branch_labels networking, however
    the migration file foo/model/networking/3cac04ae8714_create_networking_branch.py
    does not have them; have you upgraded your script.py.mako to include the
    'branch_labels' section?

When we receive the above error, and we would like to try again, we need to either **delete** the incorrectly generated file in order to run ``revision`` again, *or* we can edit the ``3cac04ae8714_create_networking_branch.py`` directly to add the ``branch_labels`` of our choosing.

Running with Multiple Bases
^^^^^^^^^^^^^^^^^^^^^^^^^^^

Once we have a new, permanent (for as long as we desire it to be) base in our system, we'll always have multiple heads present::

    $ alembic heads
    3cac04ae8714 (networking) (head)
    27c6a30d7c24 (shoppingcart) (head)
    ae1027a6acf (head)

When we want to add a new revision file to ``networking``, we specify ``networking@head`` as the ``--head``.  The appropriate version directory is now selected automatically based on the head we choose::

    $ alembic revision -m "add ip number table" --head=networking@head
    Generating /path/to/foo/model/networking/109ec7d132bf_add_ip_number_table.py ... done

It's important that we refer to the head using ``networking@head``; if we only refer to ``networking``, that refers to only ``3cac04ae8714`` specifically; if we specify this and it's not a head, ``alembic revision`` will make sure we didn't mean to specify the head::

    $ alembic revision -m "add DNS table" --head=networking
    FAILED: Revision 3cac04ae8714 is not a head revision; please specify --splice to create a new branch from this revision

As mentioned earlier, as this base is independent, we can view its history from the base using ``history -r networking@base:``::

    $ alembic history -r networking@base:
    109ec7d132bf -> 29f859a13ea (networking) (head), add DNS table
    3cac04ae8714 -> 109ec7d132bf (networking), add ip number table
    <base> -> 3cac04ae8714 (networking), create networking branch

At the moment, this is the same output we'd get at this point if we used ``-r :networking@head``.  However, that will change later on as we use additional directives.
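The commands used in this section are also available programmatically via the ``alembic.command`` module, which can be handy when these operations need to be scripted.  A rough sketch of creating the labeled ``networking`` base revision that way, assuming an ``alembic.ini`` in the current directory, might look like::

    from alembic import command
    from alembic.config import Config

    # load the same configuration the "alembic" console command uses
    cfg = Config("alembic.ini")

    # roughly equivalent to:
    #   alembic revision -m "create networking branch" \
    #       --head=base --branch-label=networking --version-path=model/networking
    command.revision(
        cfg,
        message="create networking branch",
        head="base",
        branch_label="networking",
        version_path="model/networking",
    )

    # roughly equivalent to "alembic heads"
    command.heads(cfg)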
We may now run upgrades or downgrades freely, among individual branches (let's assume a clean database again)::

    $ alembic upgrade networking@head
    INFO [alembic.migration] Running upgrade -> 3cac04ae8714, create networking branch
    INFO [alembic.migration] Running upgrade 3cac04ae8714 -> 109ec7d132bf, add ip number table
    INFO [alembic.migration] Running upgrade 109ec7d132bf -> 29f859a13ea, add DNS table

or against the whole thing using ``heads``::

    $ alembic upgrade heads
    INFO [alembic.migration] Running upgrade -> 1975ea83b712, create account table
    INFO [alembic.migration] Running upgrade 1975ea83b712 -> 27c6a30d7c24, add shopping cart table
    INFO [alembic.migration] Running upgrade 27c6a30d7c24 -> d747a8a8879, add a shopping cart column
    INFO [alembic.migration] Running upgrade 1975ea83b712 -> ae1027a6acf, add a column
    INFO [alembic.migration] Running upgrade ae1027a6acf -> 55af2cb1c267, add another account column

Branch Dependencies
-------------------

When working with multiple roots, it is expected that these different revision streams will need to refer to one another.  For example, a new revision in ``networking`` which needs to refer to the ``account`` table will want to establish ``55af2cb1c267, add another account column``, the last revision that works with the account table, as a dependency.  From a graph perspective, this means nothing more than that the new file will feature both ``55af2cb1c267, add another account column`` and ``29f859a13ea, add DNS table`` as "down" revisions, and looks just as though we had merged these two branches together.  However, we don't want to consider these as "merged"; we want the two revision streams to *remain independent*, even though a version in ``networking`` is going to reach over into the other stream.  To support this use case, Alembic provides a directive known as ``depends_on``, which allows a revision file to refer to another as a "dependency", very similar to an entry in ``down_revision`` from a graph perspective, but different from a semantic perspective.

To use ``depends_on``, we can specify it as part of our ``alembic revision`` command::

    $ alembic revision -m "add ip account table" --head=networking@head --depends-on=55af2cb1c267
    Generating /path/to/foo/model/networking/2a95102259be_add_ip_account_table.py ... done

Within our migration file, we'll see this new directive present::

    # revision identifiers, used by Alembic.
    revision = '2a95102259be'
    down_revision = '29f859a13ea'
    branch_labels = None
    depends_on = '55af2cb1c267'

``depends_on`` may be either a real revision number or a branch name.  When specified at the command line, a resolution from a partial revision number will work as well.  It can refer to any number of dependent revisions as well; for example, if we were to run the command::

    $ alembic revision -m "add ip account table" \
        --head=networking@head \
        --depends-on=55af2cb1c267 --depends-on=d747a --depends-on=fa445
    Generating /path/to/foo/model/networking/2a95102259be_add_ip_account_table.py ... done

We'd see inside the file::

    # revision identifiers, used by Alembic.
    revision = '2a95102259be'
    down_revision = '29f859a13ea'
    branch_labels = None
    depends_on = ('55af2cb1c267', 'd747a8a8879', 'fa4456a9201')

We also can of course add or alter this value within the file manually after it is generated, rather than using the ``--depends-on`` argument.

.. versionadded:: 0.8

    The ``depends_on`` attribute may be set directly from the ``alembic revision`` command, rather than editing the file directly.
``depends_on`` identifiers may also be specified as branch names at the command line or directly within the migration file.  The values may be specified as partial revision numbers from the command line which will be resolved to full revision numbers in the output file.

We can see the effect this directive has when we view the history of the ``networking`` branch in terms of "heads", e.g., all the revisions that are descendants::

    $ alembic history -r :networking@head
    29f859a13ea (55af2cb1c267) -> 2a95102259be (networking) (head), add ip account table
    109ec7d132bf -> 29f859a13ea (networking), add DNS table
    3cac04ae8714 -> 109ec7d132bf (networking), add ip number table
    <base> -> 3cac04ae8714 (networking), create networking branch
    ae1027a6acf -> 55af2cb1c267 (effective head), add another account column
    1975ea83b712 -> ae1027a6acf, Add a column
    <base> -> 1975ea83b712 (branchpoint), create account table

What we see is that the full history of the ``networking`` branch, in terms of an "upgrade" to the "head", will include that the tree building up ``55af2cb1c267, add another account column`` will be pulled in first.  Interestingly, we don't see this displayed when we display history in the other direction, e.g. from ``networking@base``::

    $ alembic history -r networking@base:
    29f859a13ea (55af2cb1c267) -> 2a95102259be (networking) (head), add ip account table
    109ec7d132bf -> 29f859a13ea (networking), add DNS table
    3cac04ae8714 -> 109ec7d132bf (networking), add ip number table
    <base> -> 3cac04ae8714 (networking), create networking branch

The reason for the discrepancy is that displaying history from the base shows us what would occur if we ran a downgrade operation, instead of an upgrade.  If we downgraded all the files in ``networking`` using ``networking@base``, the dependencies aren't affected, they're left in place.

We also see something odd if we view ``heads`` at the moment::

    $ alembic heads
    2a95102259be (networking) (head)
    27c6a30d7c24 (shoppingcart) (head)
    55af2cb1c267 (effective head)

The head file that we used as a "dependency", ``55af2cb1c267``, is displayed as an "effective" head, which we can see also in the history display earlier.  What this means is that at the moment, if we were to upgrade all versions to the top, the ``55af2cb1c267`` revision number would not actually be present in the ``alembic_version`` table; this is because it does not have a branch of its own subsequent to the ``2a95102259be`` revision which depends on it::

    $ alembic upgrade heads
    INFO [alembic.migration] Running upgrade 29f859a13ea, 55af2cb1c267 -> 2a95102259be, add ip account table

    $ alembic current
    2a95102259be (head)
    27c6a30d7c24 (head)

The entry is still displayed in ``alembic heads`` because Alembic knows that even though this revision isn't a "real" head, it's still something that we developers consider semantically to be a head, so it's displayed, noting its special status so that we don't get quite as confused when we don't see it within ``alembic current``.

If we add a new revision onto ``55af2cb1c267``, the branch again becomes a "real" branch which can have its own entry in the database::

    $ alembic revision -m "more account changes" --head=55af2cb@head
    Generating /path/to/foo/versions/34e094ad6ef1_more_account_changes.py ... done
    $ alembic upgrade heads
    INFO [alembic.migration] Running upgrade 55af2cb1c267 -> 34e094ad6ef1, more account changes

    $ alembic current
    2a95102259be (head)
    27c6a30d7c24 (head)
    34e094ad6ef1 (head)

For posterity, the revision tree now looks like::

    $ alembic history
    29f859a13ea (55af2cb1c267) -> 2a95102259be (networking) (head), add ip account table
    109ec7d132bf -> 29f859a13ea (networking), add DNS table
    3cac04ae8714 -> 109ec7d132bf (networking), add ip number table
    <base> -> 3cac04ae8714 (networking), create networking branch
    1975ea83b712 -> 27c6a30d7c24 (shoppingcart) (head), add shopping cart table
    55af2cb1c267 -> 34e094ad6ef1 (head), more account changes
    ae1027a6acf -> 55af2cb1c267, add another account column
    1975ea83b712 -> ae1027a6acf, Add a column
    <base> -> 1975ea83b712 (branchpoint), create account table


                    --- 27c6 --> d747 --> <head>
                   /   (shoppingcart)
    <base> --> 1975 -->
                   \
                    --- ae10 --> 55af --> <head>
                                   ^
                                   +--------+ (dependency)
                                            |
                                            |
    <base> --> 3782 -----> 109e ----> 29f8 ---> 2a95 --> <head>
              (networking)

If there's any point to be made here, it's that if you branch, merge and label too freely, things can get pretty crazy!  Hence the branching system should be used carefully and thoughtfully for best results.

zzzeek-alembic-bee044a1c187/docs/build/changelog.rst000066400000000000000000004106601353106760100223210ustar00rootroot00000000000000
========== Changelog ========== .. changelog:: :version: 1.1.0 :released: August 26, 2019 .. change:: :tags: change Alembic 1.1 bumps the minimum version of SQLAlchemy to 1.1. As was the case before, Python requirements remain at Python 2.7, or in the 3.x series Python 3.4. .. change:: :tags: change, internals The test suite for Alembic now makes use of SQLAlchemy's testing framework directly. Previously, Alembic had its own version of this framework that was mostly copied from that of SQLAlchemy to enable testing with older SQLAlchemy versions. The majority of this code is now removed so that both projects can leverage improvements from a common testing framework. .. change:: :tags: bug, commands :tickets: 562 Fixed bug where the double-percent logic applied to some dialects such as psycopg2 would be rendered in ``--sql`` mode, by allowing dialect options to be passed through to the dialect used to generate SQL and then providing ``paramstyle="named"`` so that percent signs need not be doubled. For users having this issue, existing env.py scripts need to add ``dialect_opts={"paramstyle": "named"}`` to their offline context.configure(). See the ``alembic/templates/generic/env.py`` template for an example. .. change:: :tags: bug, py3k Fixed use of the deprecated "imp" module, which is used to detect pep3147 availability as well as to locate .pyc files, which started emitting deprecation warnings during the test suite. The warnings were not being emitted earlier during the test suite, the change is possibly due to changes in py.test itself but this is not clear. The check for pep3147 is set to True for any Python version 3.5 or greater now and importlib is used when available. Note that some dependencies such as distutils may still be emitting this warning. Tests are adjusted to accommodate for dependencies that emit the warning as well. .. change:: :tags: bug, mysql :tickets: 594 Fixed issue where emitting a change of column name for MySQL did not preserve the column comment, even if it were specified as existing_comment. .. change:: :tags: bug, setup :tickets: 592 Removed the "python setup.py test" feature in favor of a straight run of "tox".
Per Pypa / pytest developers, "setup.py" commands are in general headed towards deprecation in favor of tox. The tox.ini script has been updated such that running "tox" with no arguments will perform a single run of the test suite against the default installed Python interpreter. .. seealso:: https://github.com/pypa/setuptools/issues/1684 https://github.com/pytest-dev/pytest/issues/5534 .. change:: :tags: usecase, commands :tickets: 571 The "alembic init" command will now proceed if the target directory exists as long as it's still empty. Previously, it would not proceed if the directory existed. The new behavior is modeled from what git does, to accommodate for container or other deployments where an Alembic target directory may need to be already mounted instead of being created with alembic init. Pull request courtesy Aviskar KC. .. changelog:: :version: 1.0.11 :released: June 25, 2019 .. change:: :tags: bug, sqlite, autogenerate, batch :tickets: 579 SQLite server default reflection will ensure parenthesis are surrounding a column default expression that is detected as being a non-constant expression, such as a ``datetime()`` default, to accommodate for the requirement that SQL expressions have to be parenthesized when being sent as DDL. Parenthesis are not added to constant expressions to allow for maximum cross-compatibility with other dialects and existing test suites (such as Alembic's), which necessarily entails scanning the expression to eliminate for constant numeric and string values. The logic is added to the two "reflection->DDL round trip" paths which are currently autogenerate and batch migration. Within autogenerate, the logic is on the rendering side, whereas in batch the logic is installed as a column reflection hook. .. change:: :tags: bug, sqlite, autogenerate :tickets: 579 Improved SQLite server default comparison to accommodate for a ``text()`` construct that added parenthesis directly vs. a construct that relied upon the SQLAlchemy SQLite dialect to render the parenthesis, as well as improved support for various forms of constant expressions such as values that are quoted vs. non-quoted. .. change:: :tags: bug, autogenerate Fixed bug where the "literal_binds" flag was not being set when autogenerate would create a server default value, meaning server default comparisons would fail for functions that contained literal values. .. change:: :tags: bug, mysql :tickets: 554 Added support for MySQL "DROP CHECK", which is added as of MySQL 8.0.16, separate from MariaDB's "DROP CONSTRAINT" for CHECK constraints. The MySQL Alembic implementation now checks for "MariaDB" in server_version_info to decide which one to use. .. change:: :tags: bug, mysql, operations :tickets: 564 Fixed issue where MySQL databases need to use CHANGE COLUMN when altering a server default of CURRENT_TIMESTAMP, NOW() and probably other functions that are only usable with DATETIME/TIMESTAMP columns. While MariaDB supports both CHANGE and ALTER COLUMN in this case, MySQL databases only support CHANGE. So the new logic is that if the server default change is against a DateTime-oriented column, the CHANGE format is used unconditionally, as in the vast majority of cases the server default is to be CURRENT_TIMESTAMP which may also be potentially bundled with an "ON UPDATE CURRENT_TIMESTAMP" directive, which SQLAlchemy does not currently support as a distinct field. 
The fix additionally improves the server default comparison logic when the "ON UPDATE" clause is present and there are parentheses to be adjusted for as is the case on some MariaDB versions. .. change:: :tags: bug, environment Warnings emitted by Alembic now include a default stack level of 2, and in some cases it's set to 3, in order to help warnings indicate more closely where they are originating from. Pull request courtesy Ash Berlin-Taylor. .. change:: :tags: bug, py3k :tickets: 563 Replaced the Python compatibility routines for ``getargspec()`` with a fully vendored version based on ``getfullargspec()`` from Python 3.3. Originally, Python was emitting deprecation warnings for this function in Python 3.8 alphas. While this change was reverted, it was observed that Python 3 implementations for ``getfullargspec()`` are an order of magnitude slower as of the 3.4 series where it was rewritten against ``Signature``. While Python plans to improve upon this situation, SQLAlchemy projects for now are using a simple replacement to avoid any future issues. .. changelog:: :version: 1.0.10 :released: April 28, 2019 .. change:: :tags: bug, commands :tickets: 552 Fixed bug introduced in release 0.9.0 where the helptext for commands inadvertently got expanded to include function docstrings from the command.py module. The logic has been adjusted to only refer to the first line(s) preceding the first line break within each docstring, as was the original intent. .. change:: :tags: bug, operations, mysql :tickets: 551 Added an assertion in :meth:`.RevisionMap.get_revisions` and other methods which ensures revision numbers are passed as strings or collections of strings. Driver issues particularly on MySQL may inadvertently be passing bytes here which leads to failures later on. .. change:: :tags: bug, autogenerate, mysql :tickets: 553 Fixed bug when using the :paramref:`.EnvironmentContext.configure.compare_server_default` flag set to ``True`` where a server default that is introduced in the table metadata on an ``Integer`` column, where there is no existing server default in the database, would raise a ``TypeError``. .. changelog:: :version: 1.0.9 :released: April 15, 2019 .. change:: :tags: bug, operations :tickets: 548 Simplified the internal scheme used to generate the ``alembic.op`` namespace to no longer attempt to generate full method signatures (e.g. rather than generic ``*args, **kw``) as this was not working in most cases anyway, while in rare circumstances it would in fact sporadically have access to the real argument names and then fail when generating the function due to missing symbols in the argument signature. .. changelog:: :version: 1.0.8 :released: March 4, 2019 .. change:: :tags: bug, operations :tickets: 528 Removed use of deprecated ``force`` parameter for SQLAlchemy quoting functions as this parameter will be removed in a future release. Pull request courtesy Parth Shandilya (ParthS007). .. change:: :tags: bug, autogenerate, postgresql, py3k :tickets: 541 Fixed issue where server default comparison on the PostgreSQL dialect would fail for a blank string on Python 3.7 only, due to a change in regular expression behavior in Python 3.7. .. changelog:: :version: 1.0.7 :released: January 25, 2019 .. change:: :tags: bug, autogenerate :tickets: 529 Fixed issue in new comment support where autogenerated Python code for comments wasn't using ``repr()`` thus causing issues with quoting. Pull request courtesy Damien Garaud. .. changelog:: :version: 1.0.6 :released: January 13, 2019 ..
change:: :tags: feature, operations :tickets: 422 Added Table and Column level comments for supported backends. New methods :meth:`.Operations.create_table_comment` and :meth:`.Operations.drop_table_comment` are added. New arguments :paramref:`.Operations.alter_column.comment` and :paramref:`.Operations.alter_column.existing_comment` are added to :meth:`.Operations.alter_column`. Autogenerate support is also added to ensure comment add/drops from tables and columns are generated as well as that :meth:`.Operations.create_table`, :meth:`.Operations.add_column` both include the comment field from the source :class:`.Table` or :class:`.Column` object. .. changelog:: :version: 1.0.5 :released: November 27, 2018 .. change:: :tags: bug, py3k :tickets: 507 Resolved remaining Python 3 deprecation warnings, covering the use of inspect.formatargspec() with a vendored version copied from the Python standard library, importing collections.abc above Python 3.3 when testing against abstract base classes, fixed one occurrence of log.warn(), as well as a few invalid escape sequences. .. changelog:: :version: 1.0.4 :released: November 27, 2018 .. change:: :tags: change Code hosting has been moved to GitHub, at https://github.com/sqlalchemy/alembic. Additionally, the main Alembic website documentation URL is now https://alembic.sqlalchemy.org. .. changelog:: :version: 1.0.3 :released: November 14, 2018 .. change:: :tags: bug, mssql :tickets: 516 Fixed regression caused by :ticket:`513`, where the logic to consume ``mssql_include`` was not correctly interpreting the case where the flag was not present, breaking the ``op.create_index`` directive for SQL Server as a whole. .. changelog:: :version: 1.0.2 :released: October 31, 2018 .. change:: :tags: bug, autogenerate :tickets: 515 The ``system=True`` flag on :class:`.Column`, used primarily in conjunction with the Postgresql "xmin" column, now renders within the autogenerate render process, allowing the column to be excluded from DDL. Additionally, adding a system=True column to a model will produce no autogenerate diff as this column is implicitly present in the database. .. change:: :tags: bug, mssql :tickets: 513 Fixed issue where usage of the SQL Server ``mssql_include`` option within a :meth:`.Operations.create_index` would raise a KeyError, as the additional column(s) need to be added to the table object used by the construct internally. .. changelog:: :version: 1.0.1 :released: October 17, 2018 .. change:: :tags: bug, commands :tickets: 497 Fixed an issue where revision descriptions were essentially being formatted twice. Any revision description that contained characters like %, writing output to stdout will fail because the call to config.print_stdout attempted to format any additional args passed to the function. This fix now only applies string formatting if any args are provided along with the output text. .. change:: :tags: bug, autogenerate :tickets: 512 Fixed issue where removed method ``union_update()`` was used when a customized :class:`.MigrationScript` instance included entries in the ``.imports`` data member, raising an AttributeError. .. changelog:: :version: 1.0.0 :released: July 13, 2018 .. change:: :tags: feature, general :tickets: 491 For Alembic 1.0, Python 2.6 / 3.3 support is being dropped, allowing a fixed setup.py to be built as well as universal wheels. Pull request courtesy Hugo. ..
change:: :tags: feature, general With the 1.0 release, Alembic's minimum SQLAlchemy support version moves to 0.9.0, previously 0.7.9. .. change:: :tags: bug, batch :tickets: 502 Fixed issue in batch where dropping a primary key column, then adding it back under the same name but without the primary_key flag, would not remove it from the existing PrimaryKeyConstraint. If a new PrimaryKeyConstraint is added, it is used as-is, as was the case before. .. changelog:: :version: 0.9.10 :released: June 29, 2018 .. change:: :tags: bug, autogenerate The "op.drop_constraint()" directive will now render using ``repr()`` for the schema name, in the same way that "schema" renders for all the other op directives. Pull request courtesy Denis Kataev. .. change:: :tags: bug, autogenerate :tickets: 494 Added basic capabilities for external dialects to support rendering of "nested" types, like arrays, in a manner similar to that of the Postgresql dialect. .. change:: :tags: bug, autogenerate Fixed issue where "autoincrement=True" would not render for a column that specified it, since as of SQLAlchemy 1.1 this is no longer the default value for "autoincrement". Note the behavior only takes effect against the SQLAlchemy 1.1.0 and higher; for pre-1.1 SQLAlchemy, "autoincrement=True" does not render as was the case before. Pull request courtesy Elad Almos. .. changelog:: :version: 0.9.9 :released: March 22, 2018 .. change:: :tags: feature, commands :tickets: 481 Added new flag ``--indicate-current`` to the ``alembic history`` command. When listing versions, it will include the token "(current)" to indicate the given version is a current head in the target database. Pull request courtesy Kazutaka Mise. .. change:: :tags: bug, autogenerate, mysql :tickets: 455 The fix for :ticket:`455` in version 0.9.6 involving MySQL server default comparison was entirely non functional, as the test itself was also broken and didn't reveal that it wasn't working. The regular expression to compare server default values like CURRENT_TIMESTAMP to current_timestamp() is repaired. .. change:: :tags: bug, mysql, autogenerate :tickets: 483 Fixed bug where MySQL server default comparisons were basically not working at all due to incorrect regexp added in :ticket:`455`. Also accommodates for MariaDB 10.2 quoting differences in reporting integer based server defaults. .. change:: :tags: bug, operations, mysql :tickets: 487 Fixed bug in ``op.drop_constraint()`` for MySQL where quoting rules would not be applied to the constraint name. .. changelog:: :version: 0.9.8 :released: February 16, 2018 .. change:: :tags: bug, runtime :tickets: 482 Fixed bug where the :meth:`.Script.as_revision_number` method did not accommodate for the 'heads' identifier, which in turn caused the :meth:`.EnvironmentContext.get_head_revisions` and :meth:`.EnvironmentContext.get_revision_argument` methods to be not usable when multiple heads were present. The :meth:.`EnvironmentContext.get_head_revisions` method returns a tuple in all cases as documented. .. change:: :tags: bug, postgresql, autogenerate :tickets: 478 Fixed bug where autogenerate of :class:`.ExcludeConstraint` would render a raw quoted name for a Column that has case-sensitive characters, which when invoked as an inline member of the Table would produce a stack trace that the quoted name is not found. An incoming Column object is now rendered as ``sa.column('name')``. .. 
change:: :tags: bug, autogenerate :tickets: 468 Fixed bug where the indexes would not be included in a migration that was dropping the owning table. The fix now will also emit DROP INDEX for the indexes ahead of time, but more importantly will include CREATE INDEX in the downgrade migration. .. change:: :tags: bug, postgresql :tickets: 480 Fixed the autogenerate of the module prefix when rendering the text_type parameter of postgresql.HSTORE, in much the same way that we do for ARRAY's type and JSON's text_type. .. change:: :tags: bug, mysql :tickets: 479 Added support for DROP CONSTRAINT to the MySQL Alembic dialect to support MariaDB 10.2 which now has real CHECK constraints. Note this change does **not** add autogenerate support, only support for op.drop_constraint() to work. .. changelog:: :version: 0.9.7 :released: January 16, 2018 .. change:: :tags: bug, autogenerate :tickets: 472 Fixed regression caused by :ticket:`421` which would cause case-sensitive quoting rules to interfere with the comparison logic for index names, thus causing indexes to show as added for indexes that have case-sensitive names. Works with SQLAlchemy 0.9 and later series. .. change:: :tags: bug, postgresql, autogenerate :tickets: 461 Fixed bug where autogenerate would produce a DROP statement for the index implicitly created by a Postgresql EXCLUDE constraint, rather than skipping it as is the case for indexes implicitly generated by unique constraints. Makes use of SQLAlchemy 1.0.x's improved "duplicates index" metadata and requires at least SQLAlchemy version 1.0.x to function correctly. .. changelog:: :version: 0.9.6 :released: October 13, 2017 .. change:: :tags: bug, commands :tickets: 458 Fixed a few Python3.6 deprecation warnings by replacing ``StopIteration`` with ``return``, as well as using ``getfullargspec()`` instead of ``getargspec()`` under Python 3. .. change:: :tags: bug, commands :tickets: 441 An addition to :ticket:`441` fixed in 0.9.5, we forgot to also filter for the ``+`` sign in migration names which also breaks due to the relative migrations feature. .. change:: :tags: bug, autogenerate :tickets: 442 Fixed bug expanding upon the fix for :ticket:`85` which adds the correct module import to the "inner" type for an ``ARRAY`` type, the fix now accommodates for the generic ``sqlalchemy.types.ARRAY`` type added in SQLAlchemy 1.1, rendering the inner type correctly regardless of whether or not the Postgresql dialect is present. .. change:: :tags: bug, mysql :tickets: 455 Fixed bug where server default comparison of CURRENT_TIMESTAMP would fail on MariaDB 10.2 due to a change in how the function is represented by the database during reflection. .. change:: :tags: bug, autogenerate Fixed bug where comparison of ``Numeric`` types would produce a difference if the Python-side ``Numeric`` inadvertently specified a non-None "scale" with a "precision" of None, even though this ``Numeric`` type will pass over the "scale" argument when rendering. Pull request courtesy Ivan Mmelnychuk. .. change:: :tags: feature, commands :tickets: 447 The ``alembic history`` command will now make use of the revision environment ``env.py`` unconditionally if the ``revision_environment`` configuration flag is set to True. Previously, the environment would only be invoked if the history specification were against a database-stored revision token. .. 
change:: :tags: bug, batch :tickets: 457 The name of the temporary table in batch mode is now generated off of the original table name itself, to avoid conflicts for the unusual case of multiple batch operations running against the same database schema at the same time. .. change:: :tags: bug, autogenerate :tickets: 456 A :class:`.ForeignKeyConstraint` can now render correctly if the ``link_to_name`` flag is set, as it will not attempt to resolve the name from a "key" in this case. Additionally, the constraint will render as-is even if the remote column name isn't present on the referenced remote table. .. change:: :tags: bug, runtime, py3k :tickets: 449 Reworked "sourceless" system to be fully capable of handling any combination of: Python2/3x, pep3149 or not, PYTHONOPTIMIZE or not, for locating and loading both env.py files as well as versioning files. This includes: locating files inside of ``__pycache__`` as well as listing out version files that might be only in ``versions/__pycache__``, deduplicating version files that may be in ``versions/__pycache__`` and ``versions/`` at the same time, correctly looking for .pyc or .pyo files based on if pep488 is present or not. The latest Python3x deprecation warnings involving importlib are also corrected. .. changelog:: :version: 0.9.5 :released: August 9, 2017 .. change:: :tags: bug, commands :tickets: 441 A :class:`.CommandError` is raised if the "--rev-id" passed to the :func:`.revision` command contains dashes or at-signs, as this interferes with the command notation used to locate revisions. .. change:: :tags: bug, postgresql :tickets: 424 Added support for the dialect-specific keyword arguments to :meth:`.Operations.drop_index`. This includes support for ``postgresql_concurrently`` and others. .. change:: :tags: bug, commands Fixed bug in timezone feature introduced in :ticket:`425` when the creation date in a revision file is calculated, to accommodate for timezone names that contain mixed-case characters in their name as opposed to all uppercase. Pull request courtesy Nils Philippsen. .. changelog:: :version: 0.9.4 :released: July 31, 2017 .. change:: :tags: bug, runtime Added an additional attribute to the new :paramref:`.EnvironmentContext.configure.on_version_apply` API, :attr:`.MigrationInfo.up_revision_ids`, to accommodate for the uncommon case of the ``alembic stamp`` command being used to move from multiple branches down to a common branchpoint; there will be multiple "up" revisions in this one case. .. changelog:: :version: 0.9.3 :released: July 6, 2017 .. change:: :tags: feature, runtime Added a new callback hook :paramref:`.EnvironmentContext.configure.on_version_apply`, which allows user-defined code to be invoked each time an individual upgrade, downgrade, or stamp operation proceeds against a database. Pull request courtesy John Passaro. .. change:: 433 :tags: bug, autogenerate :tickets: 433 Fixed bug where autogen comparison of a :class:`.Variant` datatype would not compare to the dialect level type for the "default" implementation of the :class:`.Variant`, returning the type as changed between database and table metadata. .. change:: 431 :tags: bug, tests :tickets: 431 Fixed unit tests to run correctly under the SQLAlchemy 1.0.x series prior to version 1.0.10 where a particular bug involving Postgresql exclude constraints was fixed. .. changelog:: :version: 0.9.2 :released: May 18, 2017 .. 
change:: 429 :tags: bug, mssql :tickets: 429 Repaired :meth:`.Operations.rename_table` for SQL Server when the target table is in a remote schema; the schema name is omitted from the "new name" argument. .. change:: 425 :tags: feature, commands :tickets: 425 Added a new configuration option ``timezone``, a string timezone name that will be applied to the create date timestamp rendered inside the revision file as made available to the ``file_template`` used to generate the revision filename. Note this change adds the ``python-dateutil`` package as a dependency. .. change:: 421 :tags: bug, autogenerate :tickets: 421 The autogenerate compare scheme now takes into account the name truncation rules applied by SQLAlchemy's DDL compiler to the names of the :class:`.Index` object, when these names are dynamically truncated due to a too-long identifier name. As the identifier truncation is deterministic, applying the same rule to the metadata name allows correct comparison to the database-derived name. .. change:: 419 :tags: bug, environment :tickets: 419 A warning is emitted when an object that's not a :class:`~sqlalchemy.engine.Connection` is passed to :meth:`.EnvironmentContext.configure`. For the case of a :class:`~sqlalchemy.engine.Engine` passed, the check for "in transaction" introduced in version 0.9.0 has been relaxed to work in the case of an attribute error, as some users appear to be passing an :class:`~sqlalchemy.engine.Engine` and not a :class:`~sqlalchemy.engine.Connection`. .. changelog:: :version: 0.9.1 :released: March 1, 2017 .. change:: 417 :tags: bug, commands :tickets: 417, 369 An adjustment to the bug fix for :ticket:`369` to accommodate for env.py scripts that use an enclosing transaction distinct from the one that the context provides, so that the check for "didn't commit the transaction" doesn't trigger in this scenario. .. changelog:: :version: 0.9.0 :released: February 28, 2017 .. change:: 38 :tags: feature, autogenerate :tickets: 38 The :paramref:`.EnvironmentContext.configure.target_metadata` parameter may now be optionally specified as a sequence of :class:`.MetaData` objects instead of a single :class:`.MetaData` object. The autogenerate process will process the sequence of :class:`.MetaData` objects in order. .. change:: 369 :tags: bug, commands :tickets: 369 A :class:`.CommandError` is now raised when a migration file opens a database transaction and does not close/commit/rollback, when the backend database or environment options also specify transactional_ddl is False. When transactional_ddl is not in use, Alembic doesn't close any transaction so a transaction opened by a migration file will cause the following migrations to fail to apply. .. change:: 413 :tags: bug, autogenerate, mysql :tickets: 413 The ``autoincrement=True`` flag is now rendered within the :meth:`.Operations.alter_column` operation if the source column indicates that this flag should be set to True. The behavior is sensitive to the SQLAlchemy version in place, as the "auto" default option is new in SQLAlchemy 1.1. When the source column indicates autoincrement as True or "auto", the flag will render as True if the original column contextually indicates that it should have "autoincrement" keywords, and when the source column explicitly sets it to False, this is also rendered. The behavior is intended to preserve the AUTO_INCREMENT flag on MySQL as the column is fully recreated on this backend.
Note that this flag does **not** support alteration of a column's "autoincrement" status, as this is not portable across backends. .. change:: 411 :tags: bug, postgresql :tickets: 411 Fixed bug where Postgresql JSON/JSONB types rendered on SQLAlchemy 1.1 would render the "astext_type" argument which defaults to the ``Text()`` type without the module prefix, similarly to the issue with ARRAY fixed in :ticket:`85`. .. change:: 85 :tags: bug, postgresql :tickets: 85 Fixed bug where Postgresql ARRAY type would not render the import prefix for the inner type; additionally, user-defined renderers take place for the inner type as well as the outer type. Pull request courtesy Paul Brackin. .. change:: process_revision_directives_command :tags: feature, autogenerate Added a keyword argument ``process_revision_directives`` to the :func:`.command.revision` API call. This function acts in the same role as the environment-level :paramref:`.EnvironmentContext.configure.process_revision_directives`, and allows API use of the command to drop in an ad-hoc directive process function. This function can be used among other things to place a complete :class:`.MigrationScript` structure in place. .. change:: 412 :tags: feature, postgresql :tickets: 412 Added support for Postgresql EXCLUDE constraints, including the operation directive :meth:`.Operations.create_exclude_constraints` as well as autogenerate render support for the ``ExcludeConstraint`` object as present in a ``Table``. Autogenerate detection for an EXCLUDE constraint added or removed to/from an existing table is **not** implemented as the SQLAlchemy Postgresql dialect does not yet support reflection of EXCLUDE constraints. Additionally, unknown constraint types now warn when encountered within an autogenerate action rather than raise. .. change:: fk_schema_compare :tags: bug, operations Fixed bug in :func:`.ops.create_foreign_key` where the internal table representation would not be created properly if the foreign key referred to a table in a different schema of the same name. Pull request courtesy Konstantin Lebedev. .. changelog:: :version: 0.8.10 :released: January 17, 2017 .. change:: 406 :tags: bug, versioning :tickets: 406 The alembic_version table, when initially created, now establishes a primary key constraint on the "version_num" column, to suit database engines that don't support tables without primary keys. This behavior can be controlled using the parameter :paramref:`.EnvironmentContext.configure.version_table_pk`. Note that this change only applies to the initial creation of the alembic_version table; it does not impact any existing alembic_version table already present. .. change:: 402 :tags: bug, batch :tickets: 402 Fixed bug where doing ``batch_op.drop_constraint()`` against the primary key constraint would fail to remove the "primary_key" flag from the column, resulting in the constraint being recreated. .. change:: update_uq_dedupe :tags: bug, autogenerate, oracle Adjusted the logic originally added for :ticket:`276` that detects MySQL unique constraints which are actually unique indexes to be generalized for any dialect that has this behavior, for SQLAlchemy version 1.0 and greater. This is to allow for upcoming SQLAlchemy support for unique constraint reflection for Oracle, which also has no dedicated concept of "unique constraint" and instead establishes a unique index. .. change:: 356 :tags: bug, versioning :tickets: 356 Added a file ignore for Python files of the form ``.#<name>.py``, which are generated by the Emacs editor.
Pull request courtesy Markus Mattes. .. changelog:: :version: 0.8.9 :released: November 28, 2016 .. change:: 393 :tags: bug, autogenerate :tickets: 393 Adjustment to the "please adjust!" comment in the script.py.mako template so that the generated comment starts with a single pound sign, appeasing flake8. .. change:: :tags: bug, batch :tickets: 391 Batch mode will not use CAST() to copy data if type_ is given, however the basic type affinity matches that of the existing type. This is to avoid SQLite's CAST of TIMESTAMP which results in truncation of the data, in those cases where the user needs to add redundant type_ for other reasons. .. change:: :tags: bug, autogenerate :tickets: 393 Continued pep8 improvements by adding appropriate whitespace in the base template for generated migrations. Pull request courtesy Markus Mattes. .. change:: :tags: bug, revisioning Added an additional check when reading in revision files to detect if the same file is being read twice; this can occur if the same directory or a symlink equivalent is present more than once in version_locations. A warning is now emitted and the file is skipped. Pull request courtesy Jiri Kuncar. .. change:: :tags: bug, autogenerate :tickets: 395 Fixed bug where usage of a custom TypeDecorator which returns a per-dialect type via :meth:`.TypeDecorator.load_dialect_impl` that differs significantly from the default "impl" for the type decorator would fail to compare correctly during autogenerate. .. change:: :tags: bug, autogenerate, postgresql :tickets: 392 Fixed bug in Postgresql "functional index skip" behavior where a functional index that ended in ASC/DESC wouldn't be detected as something we can't compare in autogenerate, leading to duplicate definitions in autogenerated files. .. change:: :tags: bug, versioning Fixed bug where the "base" specifier, as in "base:head", could not be used explicitly when ``--sql`` mode was present. .. changelog:: :version: 0.8.8 :released: September 12, 2016 .. change:: :tags: autogenerate The imports in the default script.py.mako are now at the top so that flake8 editors don't complain by default. PR courtesy Guilherme Mansur. .. change:: :tags: feature, operations, postgresql :tickets: 292 Added support for the USING clause to the ALTER COLUMN operation for Postgresql. Support is via the :paramref:`.op.alter_column.postgresql_using` parameter. Pull request courtesy Frazer McLean. .. change:: :tags: feature, autogenerate Autogenerate with type comparison enabled will pick up on the timezone setting changing between DateTime types. Pull request courtesy David Szotten. .. changelog:: :version: 0.8.7 :released: July 26, 2016 .. change:: :tags: bug, versioning :tickets: 336 Fixed bug where upgrading to the head of a branch which is already present would fail, only if that head were also the dependency of a different branch that is also upgraded, as the revision system would see this as trying to go in the wrong direction. The check here has been refined to distinguish between same-branch revisions out of order vs. movement along sibling branches. .. change:: :tags: bug, versioning :tickets: 379 Adjusted the version traversal on downgrade such that we can downgrade to a version that is a dependency for a version in a different branch, *without* needing to remove that dependent version as well. Previously, the target version would be seen as a "merge point" for its normal up-revision as well as the dependency.
This integrates with the changes for :ticket:`377` and :ticket:`378` to improve treatment of branches with dependencies overall. .. change:: :tags: bug, versioning :tickets: 377 Fixed bug where a downgrade to a version that is also a dependency to a different branch would fail, as the system attempted to treat this as an "unmerge" of a merge point, when in fact it doesn't have the other side of the merge point available for update. .. change:: :tags: bug, versioning :tickets: 378 Fixed bug where the "alembic current" command wouldn't show a revision as a current head if it were also a dependency of a version in a different branch that's also applied. Extra logic is added to extract "implied" versions of different branches from the top-level versions listed in the alembic_version table. .. change:: :tags: bug, versioning Fixed bug where a repr() or str() of a Script object would fail if the script had multiple dependencies. .. change:: :tags: bug, autogenerate Fixed bug in autogen where if the DB connection sends the default schema as "None", this "None" would be removed from the list of schemas to check if include_schemas were set. This could possibly impact using include_schemas with SQLite. .. change:: :tags: bug, batch Small adjustment made to the batch handling for reflected CHECK constraints to accommodate for SQLAlchemy 1.1 now reflecting these. Batch mode still does not support CHECK constraints from the reflected table as these can't be easily differentiated from the ones created by types such as Boolean. .. changelog:: :version: 0.8.6 :released: April 14, 2016 .. change:: :tags: bug, commands :tickets: 367 Errors which occur within the Mako render step are now intercepted and raised as CommandErrors like other failure cases; the Mako exception itself is written using template-line formatting to a temporary file which is named in the exception message. .. change:: :tags: bug, postgresql :tickets: 365 Added a fix to Postgresql server default comparison which first checks if the text of the default is identical to the original, before attempting to actually run the default. This accommodates for default-generation functions that generate a new value each time such as a uuid function. .. change:: :tags: bug, batch :tickets: 361 Fixed bug introduced by the fix for :ticket:`338` in version 0.8.4 where a server default could no longer be dropped in batch mode. Pull request courtesy Martin Domke. .. change:: :tags: bug, batch, mssql Fixed bug where SQL Server arguments for drop_column() would not be propagated when running under a batch block. Pull request courtesy Michal Petrucha. .. changelog:: :version: 0.8.5 :released: March 9, 2016 .. change:: :tags: bug, autogenerate :tickets: 335 Fixed bug where the columns rendered in a ``PrimaryKeyConstraint`` in autogenerate would inappropriately render the "key" of the column, not the name. Pull request courtesy Jesse Dhillon. .. change:: :tags: bug, batch :tickets: 354 Repaired batch migration support for "schema" types which generate constraints, in particular the ``Boolean`` datatype which generates a CHECK constraint. Previously, an alter column operation with this type would fail to correctly accommodate for the CHECK constraint on change both from and to this type. In the former case the operation would fail entirely, in the latter, the CHECK constraint would not get generated. Both of these issues are repaired. ..
change:: :tags: bug, mysql :tickets: 355 Changing a schema type such as ``Boolean`` to a non-schema type would emit a drop constraint operation which emits ``NotImplementedError`` for the MySQL dialect. This drop constraint operation is now skipped when the constraint originates from a schema type. .. changelog:: :version: 0.8.4 :released: December 15, 2015 .. change:: :tags: feature, versioning A major improvement to the hash id generation function, which for some reason used an awkward arithmetic formula against uuid4() that produced values that tended to start with the digits 1-4. Replaced with a simple substring approach which provides an even distribution. Pull request courtesy Antti Haapala. .. change:: :tags: feature, autogenerate Added an autogenerate renderer for the :class:`.ExecuteSQLOp` operation object; only renders if given a plain SQL string, otherwise raises NotImplementedError. Can be of help with custom autogenerate sequences that includes straight SQL execution. Pull request courtesy Jacob Magnusson. .. change:: :tags: bug, batch :tickets: 345 Batch mode generates a FOREIGN KEY constraint that is self-referential using the ultimate table name, rather than ``_alembic_batch_temp``. When the table is renamed from ``_alembic_batch_temp`` back to the original name, the FK now points to the right name. This will **not** work if referential integrity is being enforced (eg. SQLite "PRAGMA FOREIGN_KEYS=ON") since the original table is dropped and the new table then renamed to that name, however this is now consistent with how foreign key constraints on **other** tables already operate with batch mode; these don't support batch mode if referential integrity is enabled in any case. .. change:: :tags: bug, autogenerate :tickets: 341 Added a type-level comparator that distinguishes :class:`.Integer`, :class:`.BigInteger`, and :class:`.SmallInteger` types and dialect-specific types; these all have "Integer" affinity so previously all compared as the same. .. change:: :tags: bug, batch :tickets: 338 Fixed bug where the ``server_default`` parameter of ``alter_column()`` would not function correctly in batch mode. .. change:: :tags: bug, autogenerate :tickets: 337 Adjusted the rendering for index expressions such that a :class:`.Column` object present in the source :class:`.Index` will not be rendered as table-qualified; e.g. the column name will be rendered alone. Table-qualified names here were failing on systems such as Postgresql. .. changelog:: :version: 0.8.3 :released: October 16, 2015 .. change:: :tags: bug, autogenerate :tickets: 332 Fixed an 0.8 regression whereby the "imports" dictionary member of the autogen context was removed; this collection is documented in the "render custom type" documentation as a place to add new imports. The member is now known as :attr:`.AutogenContext.imports` and the documentation is repaired. .. change:: :tags: bug, batch :tickets: 333 Fixed bug in batch mode where a table that had pre-existing indexes would create the same index on the new table with the same name, which on SQLite produces a naming conflict as index names are in a global namespace on that backend. Batch mode now defers the production of both existing and new indexes until after the entire table transfer operation is complete, which also means those indexes no longer take effect during the INSERT from SELECT section as well; the indexes are applied in a single step afterwards. .. 
change:: :tags: bug, tests Added "pytest-xdist" as a tox dependency, so that the -n flag in the test command works if this is not already installed. Pull request courtesy Julien Danjou. .. change:: :tags: bug, autogenerate, postgresql :tickets: 324 Fixed issue in PG server default comparison where model-side defaults configured with Python unicode literals would leak the "u" character from a ``repr()`` into the SQL used for comparison, creating an invalid SQL expression, as the server-side comparison feature in PG currently repurposes the autogenerate Python rendering feature to get a quoted version of a plain string default. .. changelog:: :version: 0.8.2 :released: August 25, 2015 .. change:: :tags: bug, autogenerate :tickets: 321 Added workaround in new foreign key option detection feature for MySQL's consideration of the "RESTRICT" option being the default, for which no value is reported from the database; the MySQL impl now corrects for when the model reports RESTRICT but the database reports nothing. A similar rule is in the default FK comparison to accommodate for the default "NO ACTION" setting being present in the model but not necessarily reported by the database, or vice versa. .. changelog:: :version: 0.8.1 :released: August 22, 2015 .. change:: :tags: feature, autogenerate A custom :paramref:`.EnvironmentContext.configure.process_revision_directives` hook can now generate op directives within the :class:`.UpgradeOps` and :class:`.DowngradeOps` containers that will be generated as Python code even when the ``--autogenerate`` flag is False; provided that ``revision_environment=True``, the full render operation will be run even in "offline" mode. .. change:: :tags: bug, autogenerate Repaired the render operation for the :class:`.ops.AlterColumnOp` object to succeed when the "existing_type" field was not present. .. change:: :tags: bug, autogenerate :tickets: 318 Fixed a regression 0.8 whereby the "multidb" environment template failed to produce independent migration script segments for the output template. This was due to the reorganization of the script rendering system for 0.8. To accommodate this change, the :class:`.MigrationScript` structure will in the case of multiple calls to :meth:`.MigrationContext.run_migrations` produce lists for the :attr:`.MigrationScript.upgrade_ops` and :attr:`.MigrationScript.downgrade_ops` attributes; each :class:`.UpgradeOps` and :class:`.DowngradeOps` instance keeps track of its own ``upgrade_token`` and ``downgrade_token``, and each are rendered individually. .. seealso:: :ref:`autogen_customizing_multiengine_revision` - additional detail on the workings of the :paramref:`.EnvironmentContext.configure.process_revision_directives` parameter when multiple calls to :meth:`.MigrationContext.run_migrations` are made. .. change:: :tags: feature, autogenerate :tickets: 317 Implemented support for autogenerate detection of changes in the ``ondelete``, ``onupdate``, ``initially`` and ``deferrable`` attributes of :class:`.ForeignKeyConstraint` objects on SQLAlchemy backends that support these on reflection (as of SQLAlchemy 1.0.8 currently Postgresql for all four, MySQL for ``ondelete`` and ``onupdate`` only). A constraint object that modifies these values will be reported as a "diff" and come out as a drop/create of the constraint with the modified values. The fields are ignored for backends which don't reflect these attributes (as of SQLA 1.0.8 this includes SQLite, Oracle, SQL Server, others). .. 
changelog:: :version: 0.8.0 :released: August 12, 2015 .. change:: :tags: bug, batch :tickets: 315 Fixed bug in batch mode where the ``batch_op.create_foreign_key()`` directive would be incorrectly rendered with the source table and schema names in the argument list. .. change:: :tags: feature, commands Added new command ``alembic edit``. This command takes the same arguments as ``alembic show``, however runs the target script file within $EDITOR. Makes use of the ``python-editor`` library in order to facilitate the handling of $EDITOR with reasonable default behaviors across platforms. Pull request courtesy Michel Albert. .. change:: :tags: feature, commands :tickets: 311 Added new multiple-capable argument ``--depends-on`` to the ``alembic revision`` command, allowing ``depends_on`` to be established at the command line level rather than having to edit the file after the fact. ``depends_on`` identifiers may also be specified as branch names at the command line or directly within the migration file. The values may be specified as partial revision numbers from the command line which will be resolved to full revision numbers in the output file. .. change:: :tags: change, operations A range of positional argument names have been changed to be clearer and more consistent across methods within the :class:`.Operations` namespace. The most prevalent form of name change is that the descriptive names ``constraint_name`` and ``table_name`` are now used where previously the name ``name`` would be used. This is in support of the newly modularized and extensible system of operation objects in :mod:`alembic.operations.ops`. An argument translation layer is in place across the ``alembic.op`` namespace that will ensure that named argument calling styles that use the old names will continue to function by transparently translating to the new names, also emitting a warning. This, along with the fact that these arguments are positional in any case and aren't normally passed with an explicit name, should ensure that the overwhelming majority of applications should be unaffected by this change. The *only* applications that are impacted are those that: 1. use the :class:`.Operations` object directly in some way, rather than calling upon the ``alembic.op`` namespace, and 2. invoke the methods on :class:`.Operations` using named keyword arguments for positional arguments like ``table_name``, ``constraint_name``, etc., which commonly were named ``name`` as of 0.7.6. 3. any application that is using named keyword arguments in place of positional argument for the recently added :class:`.BatchOperations` object may also be affected. The naming changes are documented as "versionchanged" for 0.8.0: * :meth:`.BatchOperations.create_check_constraint` * :meth:`.BatchOperations.create_foreign_key` * :meth:`.BatchOperations.create_index` * :meth:`.BatchOperations.create_unique_constraint` * :meth:`.BatchOperations.drop_constraint` * :meth:`.BatchOperations.drop_index` * :meth:`.Operations.create_check_constraint` * :meth:`.Operations.create_foreign_key` * :meth:`.Operations.create_primary_key` * :meth:`.Operations.create_index` * :meth:`.Operations.create_table` * :meth:`.Operations.create_unique_constraint` * :meth:`.Operations.drop_constraint` * :meth:`.Operations.drop_index` * :meth:`.Operations.drop_table` .. change:: :tags: feature, tests The default test runner via "python setup.py test" is now py.test. nose still works via run_tests.py. .. 
change:: :tags: feature, operations :tickets: 302 The internal system for Alembic operations has been reworked to now build upon an extensible system of operation objects. New operations can be added to the ``op.`` namespace, and they are also available in custom autogenerate schemes. .. seealso:: :ref:`operation_plugins` .. change:: :tags: feature, autogenerate :tickets: 301, 306 The internal system for autogenerate has been reworked to build upon the extensible system of operation objects present in :ticket:`302`. As part of this change, autogenerate now produces a full object graph representing a list of migration scripts to be written as well as operation objects that will render all the Python code within them; a new hook :paramref:`.EnvironmentContext.configure.process_revision_directives` allows end-user code to fully customize what autogenerate will do, including not just full manipulation of the Python steps to take but also what file or files will be written and where. Additionally, autogenerate is now extensible as far as database objects compared and rendered into scripts; any new operation directive can also be registered into a series of hooks that allow custom database/model comparison functions to run as well as to render new operation directives into autogenerate scripts. .. seealso:: :ref:`alembic.autogenerate.toplevel` .. change:: :tags: bug, versioning :tickets: 314 Fixed bug where in the erroneous case that alembic_version contains duplicate revisions, some commands would fail to process the version history correctly and end up with a KeyError. The fix allows the versioning logic to proceed, however a clear error is emitted later when attempting to update the alembic_version table. .. changelog:: :version: 0.7.7 :released: July 22, 2015 .. change:: :tags: bug, versioning :tickets: 310 Fixed critical issue where a complex series of branches/merges would bog down the iteration algorithm working over redundant nodes for millions of cycles. An internal adjustment has been made so that duplicate nodes are skipped within this iteration. .. change:: :tags: feature, batch :tickets: 305 Implemented support for :meth:`.BatchOperations.create_primary_key` and :meth:`.BatchOperations.create_check_constraint`. Additionally, table keyword arguments are copied from the original reflected table, such as the "mysql_engine" keyword argument. .. change:: :tags: bug, environment :tickets: 300 The :meth:`.MigrationContext.stamp` method, added as part of the versioning refactor in 0.7 as a more granular version of :func:`.command.stamp`, now includes the "create the alembic_version table if not present" step in the same way as the command version, which was previously omitted. .. change:: :tags: bug, autogenerate :tickets: 298 Fixed bug where foreign key options including "onupdate", "ondelete" would not render within the ``op.create_foreign_key()`` directive, even though they render within a full ``ForeignKeyConstraint`` directive. .. change:: :tags: bug, tests Repaired warnings that occur when running unit tests against SQLAlchemy 1.0.5 or greater involving the "legacy_schema_aliasing" flag. .. changelog:: :version: 0.7.6 :released: May 5, 2015 .. change:: :tags: feature, versioning :tickets: 297 Fixed bug where the case of multiple mergepoints that all have the identical set of ancestor revisions would fail to be upgradable, producing an assertion failure.
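Relating to the ``op.create_foreign_key()`` option rendering noted a few entries above, a minimal sketch of the directive with ``ondelete`` and ``onupdate`` options; the constraint, table and column names are invented for the example::

    from alembic import op


    def upgrade():
        op.create_foreign_key(
            "fk_order_customer_id",   # constraint name (illustrative)
            "order",                  # source table
            "customer",               # referent table
            ["customer_id"],          # local columns
            ["id"],                   # remote columns
            ondelete="CASCADE",
            onupdate="RESTRICT",
        )


    def downgrade():
        op.drop_constraint("fk_order_customer_id", "order", type_="foreignkey")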
Merge points were previously assumed to always require at least an UPDATE in alembic_revision from one of the previous revs to the new one, however in this case, if one of the mergepoints has already been reached, the remaining mergepoints have no row to UPDATE therefore they must do an INSERT of their target version. .. change:: :tags: feature, autogenerate :tickets: 296 Added support for type comparison functions to be not just per environment, but also present on the custom types themselves, by supplying a method ``compare_against_backend``. Added a new documentation section :ref:`compare_types` describing type comparison fully. .. change:: :tags: feature, operations :tickets: 255 Added a new option :paramref:`.EnvironmentContext.configure.literal_binds`, which will pass the ``literal_binds`` flag into the compilation of SQL constructs when using "offline" mode. This has the effect that SQL objects like inserts, updates, deletes as well as textual statements sent using ``text()`` will be compiled such that the dialect will attempt to render literal values "inline" automatically. Only a subset of types is typically supported; the :meth:`.Operations.inline_literal` construct remains as the construct used to force a specific literal representation of a value. The :paramref:`.EnvironmentContext.configure.literal_binds` flag is added to the "offline" section of the ``env.py`` files generated in new environments. .. change:: :tags: bug, batch :tickets: 289 Fully implemented the :paramref:`~.Operations.batch_alter_table.copy_from` parameter for batch mode, which previously was not functioning. This allows "batch mode" to be usable in conjunction with ``--sql``. .. change:: :tags: bug, batch :tickets: 287 Repaired support for the :meth:`.BatchOperations.create_index` directive, which was mis-named internally such that the operation within a batch context could not proceed. The create index operation will proceed as part of a larger "batch table recreate" operation only if :paramref:`~.Operations.batch_alter_table.recreate` is set to "always", or if the batch operation includes other instructions that require a table recreate. .. changelog:: :version: 0.7.5 :released: March 19, 2015 .. change:: :tags: bug, autogenerate :tickets: 266 The ``--autogenerate`` option is not valid when used in conjunction with "offline" mode, e.g. ``--sql``. This now raises a ``CommandError``, rather than failing more deeply later on. Pull request courtesy Johannes Erdfelt. .. change:: :tags: bug, operations, mssql :tickets: 284 Fixed bug where the mssql DROP COLUMN directive failed to include modifiers such as "schema" when emitting the DDL. .. change:: :tags: bug, autogenerate, postgresql :tickets: 282 Postgresql "functional" indexes are necessarily skipped from the autogenerate process, as the SQLAlchemy backend currently does not support reflection of these structures. A warning is emitted both from the SQLAlchemy backend as well as from the Alembic backend for Postgresql when such an index is detected. .. change:: :tags: bug, autogenerate, mysql :tickets: 276 Fixed bug where MySQL backend would report dropped unique indexes and/or constraints as both at the same time. This is because MySQL doesn't actually have a "unique constraint" construct that reports differently than a "unique index", so it is present in both lists. 
The net effect though is that the MySQL backend will report a dropped unique index/constraint as an index in cases where the object was first created as a unique constraint, if no other information is available to make the decision. This differs from other backends like Postgresql which can report on unique constraints and unique indexes separately. .. change:: :tags: bug, commands :tickets: 269 Fixed bug where using a partial revision identifier as the "starting revision" in ``--sql`` mode in a downgrade operation would fail to resolve properly. As a side effect of this change, the :meth:`.EnvironmentContext.get_starting_revision_argument` method will return the "starting" revision in its originally- given "partial" form in all cases, whereas previously when running within the :meth:`.command.stamp` command, it would have been resolved to a full number before passing it to the :class:`.EnvironmentContext`. The resolution of this value to a real revision number has basically been moved to a more fundamental level within the offline migration process. .. change:: :tags: feature, commands Added a new feature :attr:`.Config.attributes`, to help with the use case of sharing state such as engines and connections on the outside with a series of Alembic API calls; also added a new cookbook section to describe this simple but pretty important use case. .. seealso:: :ref:`connection_sharing` .. change:: :tags: feature, environment The format of the default ``env.py`` script has been refined a bit; it now uses context managers not only for the scope of the transaction, but also for connectivity from the starting engine. The engine is also now called a "connectable" in support of the use case of an external connection being passed in. .. change:: :tags: feature, versioning :tickets: 267 Added support for "alembic stamp" to work when given "heads" as an argument, when multiple heads are present. .. changelog:: :version: 0.7.4 :released: January 12, 2015 .. change:: :tags: bug, autogenerate, postgresql :tickets: 241 Repaired issue where a server default specified without ``text()`` that represented a numeric or floating point (e.g. with decimal places) value would fail in the Postgresql-specific check for "compare server default"; as PG accepts the value with quotes in the table specification, it's still valid. Pull request courtesy Dimitris Theodorou. .. change:: :tags: bug, autogenerate :tickets: 259 The rendering of a :class:`~sqlalchemy.schema.ForeignKeyConstraint` will now ensure that the names of the source and target columns are the database-side name of each column, and not the value of the ``.key`` attribute as may be set only on the Python side. This is because Alembic generates the DDL for constraints as standalone objects without the need to actually refer to an in-Python :class:`~sqlalchemy.schema.Table` object, so there's no step that would resolve these Python-only key names to database column names. .. change:: :tags: bug, autogenerate :tickets: 260 Fixed bug in foreign key autogenerate where if the in-Python table used custom column keys (e.g. using the ``key='foo'`` kwarg to ``Column``), the comparison of existing foreign keys to those specified in the metadata would fail, as the reflected table would not have these keys available which to match up. Foreign key comparison for autogenerate now ensures it's looking at the database-side names of the columns in all cases; this matches the same functionality within unique constraints and indexes. .. 
change:: :tags: bug, autogenerate :tickets: 261 Fixed issue in autogenerate type rendering where types that belong to modules that have the name "sqlalchemy" in them would be mistaken as being part of the ``sqlalchemy.`` namespace. Pull req courtesy Bartosz Burclaf. .. changelog:: :version: 0.7.3 :released: December 30, 2014 .. change:: :tags: bug, versioning :tickets: 258 Fixed regression in new versioning system where upgrade / history operation would fail on AttributeError if no version files were present at all. .. changelog:: :version: 0.7.2 :released: December 18, 2014 .. change:: :tags: bug, sqlite, autogenerate Adjusted the SQLite backend regarding autogen of unique constraints to work fully with the current SQLAlchemy 1.0, which now will report on UNIQUE constraints that have no name. .. change:: :tags: bug, batch :tickets: 254 Fixed bug in batch where if the target table contained multiple foreign keys to the same target table, the batch mechanics would fail with a "table already exists" error. Thanks for the help on this from Lucas Kahlert. .. change:: :tags: bug, mysql :tickets: 251 Fixed an issue where the MySQL routine to skip foreign-key-implicit indexes would also catch unnamed unique indexes, as they would be named after the column and look like the FK indexes. Pull request courtesy Johannes Erdfelt. .. change:: :tags: bug, mssql, oracle :tickets: 253 Repaired a regression in both the MSSQL and Oracle dialects whereby the overridden ``_exec()`` method failed to return a value, as is needed now in the 0.7 series. .. changelog:: :version: 0.7.1 :released: December 3, 2014 .. change:: :tags: bug, batch The ``render_as_batch`` flag was inadvertently hardcoded to ``True``, so all autogenerates were spitting out batch mode...this has been fixed so that batch mode again is only when selected in env.py. .. change:: :tags: feature, autogenerate :tickets: 178 Support for autogenerate of FOREIGN KEY constraints has been added. These are delivered within the autogenerate process in the same manner as UNIQUE constraints, including ``include_object`` support. Big thanks to Ann Kamyshnikova for doing the heavy lifting here. .. change:: :tags: feature, batch Added :paramref:`~.Operations.batch_alter_table.naming_convention` argument to :meth:`.Operations.batch_alter_table`, as this is necessary in order to drop foreign key constraints; these are often unnamed on the target database, and in the case that they are named, SQLAlchemy is as of the 0.9 series not including these names yet. .. seealso:: :ref:`dropping_sqlite_foreign_keys` .. change:: :tags: bug, batch Fixed bug where the "source_schema" argument was not correctly passed when calling :meth:`.BatchOperations.create_foreign_key`. Pull request courtesy Malte Marquarding. .. change:: :tags: bug, batch :tickets: 249 Repaired the inspection, copying and rendering of CHECK constraints and so-called "schema" types such as Boolean, Enum within the batch copy system; the CHECK constraint will not be "doubled" when the table is copied, and additionally the inspection of the CHECK constraint for its member columns will no longer fail with an attribute error. .. change:: :tags: feature, batch Added two new arguments :paramref:`.Operations.batch_alter_table.reflect_args` and :paramref:`.Operations.batch_alter_table.reflect_kwargs`, so that arguments may be passed directly to suit the :class:`~.sqlalchemy.schema.Table` object that will be reflected. .. seealso:: :ref:`batch_controlling_table_reflection` .. 
changelog:: :version: 0.7.0 :released: November 24, 2014 .. change:: :tags: feature, versioning :tickets: 167 The "multiple heads / branches" feature has now landed. This is by far the most significant change Alembic has seen since its inception; while the workflow of most commands hasn't changed, and the format of version files and the ``alembic_version`` table are unchanged as well, a new suite of features opens up in the case where multiple version files refer to the same parent, or to the "base". Merging of branches, operating across distinct named heads, and multiple independent bases are now all supported. The feature incurs radical changes to the internals of versioning and traversal, and should be treated as "beta mode" for the next several subsequent releases within 0.7. .. seealso:: :ref:`branches` .. change:: :tags: feature, versioning :tickets: 124 In conjunction with support for multiple independent bases, the specific version directories are now also configurable to include multiple, user-defined directories. When multiple directories exist, the creation of a revision file with no down revision requires that the starting directory is indicated; the creation of subsequent revisions along that lineage will then automatically use that directory for new files. .. seealso:: :ref:`multiple_version_directories` .. change:: :tags: feature, operations, sqlite :tickets: 21 Added "move and copy" workflow, where a table to be altered is copied to a new one with the new structure and the old one dropped, is now implemented for SQLite as well as all database backends in general using the new :meth:`.Operations.batch_alter_table` system. This directive provides a table-specific operations context which gathers column- and constraint-level mutations specific to that table, and at the end of the context creates a new table combining the structure of the old one with the given changes, copies data from old table to new, and finally drops the old table, renaming the new one to the existing name. This is required for fully featured SQLite migrations, as SQLite has very little support for the traditional ALTER directive. The batch directive is intended to produce code that is still compatible with other databases, in that the "move and copy" process only occurs for SQLite by default, while still providing some level of sanity to SQLite's requirement by allowing multiple table mutation operations to proceed within one "move and copy" as well as providing explicit control over when this operation actually occurs. The "move and copy" feature may be optionally applied to other backends as well, however dealing with referential integrity constraints from other tables must still be handled explicitly. .. seealso:: :ref:`batch_migrations` .. change:: :tags: feature, commands Relative revision identifiers as used with ``alembic upgrade``, ``alembic downgrade`` and ``alembic history`` can be combined with specific revisions as well, e.g. ``alembic upgrade ae10+3``, to produce a migration target relative to the given exact version. .. change:: :tags: bug, commands :tickets: 248 The ``alembic revision`` command accepts the ``--sql`` option to suit some very obscure use case where the ``revision_environment`` flag is set up, so that ``env.py`` is run when ``alembic revision`` is run even though autogenerate isn't specified. 
As this flag is otherwise confusing, error messages are now raised if ``alembic revision`` is invoked with both ``--sql`` and ``--autogenerate`` or with ``--sql`` without ``revision_environment`` being set. .. change:: :tags: bug, autogenerate, postgresql :tickets: 247 Added a rule for Postgresql to not render a "drop unique" and "drop index" given the same name; for now it is assumed that the "index" is the implicit one Postgresql generates. Future integration with new SQLAlchemy 1.0 features will improve this to be more resilient. .. change:: :tags: bug, autogenerate :tickets: 247 A change in the ordering when columns and constraints are dropped; autogenerate will now place the "drop constraint" calls *before* the "drop column" calls, so that columns involved in those constraints still exist when the constraint is dropped. .. change:: :tags: feature, commands New commands added: ``alembic show``, ``alembic heads`` and ``alembic merge``. Also, a new option ``--verbose`` has been added to several informational commands, such as ``alembic history``, ``alembic current``, ``alembic branches``, and ``alembic heads``. ``alembic revision`` also contains several new options used within the new branch management system. The output of commands has been altered in many cases to support new fields and attributes; the ``history`` command in particular now returns its "verbose" output only if ``--verbose`` is sent; without this flag it reverts to its older behavior of short line items (which was never changed in the docs). .. change:: :tags: changed, commands The ``--head_only`` option to the ``alembic current`` command is deprecated; the ``current`` command now lists just the version numbers alone by default; use ``--verbose`` to get at additional output. .. change:: :tags: feature, config Added new argument :paramref:`.Config.config_args`, allows a dictionary of replacement variables to be passed which will serve as substitution values when an API-produced :class:`.Config` consumes the ``.ini`` file. Pull request courtesy Noufal Ibrahim. .. change:: :tags: bug, oracle :tickets: 245 The Oracle dialect sets "transactional DDL" to False by default, as Oracle does not support transactional DDL. .. change:: :tags: bug, autogenerate :tickets: 243 Fixed a variety of issues surrounding rendering of Python code that contains unicode literals. The first is that the "quoted_name" construct that SQLAlchemy uses to represent table and column names as well as schema names does not ``repr()`` correctly on Py2K when the value contains unicode characters; therefore an explicit stringification is added to these. Additionally, SQL expressions such as server defaults were not being generated in a unicode-safe fashion leading to decode errors if server defaults contained non-ascii characters. .. change:: :tags: bug, operations :tickets: 174 The :meth:`.Operations.add_column` directive will now additionally emit the appropriate ``CREATE INDEX`` statement if the :class:`~sqlalchemy.schema.Column` object specifies ``index=True``. Pull request courtesy David Szotten. .. change:: :tags: feature, operations :tickets: 205 The :class:`~sqlalchemy.schema.Table` object is now returned when the :meth:`.Operations.create_table` method is used. This ``Table`` is suitable for use in subsequent SQL operations, in particular the :meth:`.Operations.bulk_insert` operation.
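For illustration of that last point, a minimal sketch in which the ``Table`` returned by ``op.create_table()`` is fed straight into ``op.bulk_insert()``; the table, columns and rows are invented for the example::

    from alembic import op
    import sqlalchemy as sa


    def upgrade():
        # create_table() returns the Table it created, which bulk_insert() accepts
        account = op.create_table(
            "account",
            sa.Column("id", sa.Integer, primary_key=True),
            sa.Column("name", sa.String(50), nullable=False),
        )
        op.bulk_insert(
            account,
            [
                {"id": 1, "name": "alice"},
                {"id": 2, "name": "bob"},
            ],
        )


    def downgrade():
        op.drop_table("account")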
.. change:: :tags: feature, autogenerate :tickets: 203 Indexes and unique constraints are now included in the :paramref:`.EnvironmentContext.configure.include_object` hook. Indexes are sent with type ``"index"`` and unique constraints with type ``"unique_constraint"``. .. change:: :tags: bug, autogenerate :tickets: 219 Bound parameters are now resolved as "literal" values within the SQL expression inside of a CheckConstraint(), when rendering the SQL as a text string; supported for SQLAlchemy 0.8.0 and forward. .. change:: :tags: bug, autogenerate :tickets: 199 Added a workaround for SQLAlchemy issue #3023 (fixed in 0.9.5) where a column that's part of an explicit PrimaryKeyConstraint would not have its "nullable" flag set to False, thus producing a false autogenerate. Also added a related correction to MySQL which will correct for MySQL's implicit server default of '0' when a NULL integer column is turned into a primary key column. .. change:: :tags: bug, autogenerate, mysql :tickets: 240 Repaired issue related to the fix for #208 and others; a composite foreign key reported by MySQL would cause a KeyError as Alembic attempted to remove MySQL's implicitly generated indexes from the autogenerate list. .. change:: :tags: bug, autogenerate :tickets: 28 If the "alembic_version" table is present in the target metadata, autogenerate will skip this also. Pull request courtesy Dj Gilcrease. .. change:: :tags: bug, autogenerate :tickets: 77 The :paramref:`.EnvironmentContext.configure.version_table` and :paramref:`.EnvironmentContext.configure.version_table_schema` arguments are now honored during the autogenerate process, such that these names will be used as the "skip" names on both the database reflection and target metadata sides. .. change:: :tags: changed, autogenerate :tickets: 229 The default value of the :paramref:`.EnvironmentContext.configure.user_module_prefix` parameter is **no longer the same as the SQLAlchemy prefix**. When omitted, user-defined types will now use the ``__module__`` attribute of the type class itself when rendering in an autogenerated module. .. change:: :tags: bug, templates :tickets: 234 Revision files are now written out using the ``'wb'`` modifier to ``open()``, since Mako reads the templates with ``'rb'``, thus preventing CRs from being doubled up as has been observed on Windows. The encoding of the output now defaults to 'utf-8', which can be configured using a newly added config file parameter ``output_encoding``. .. change:: :tags: bug, operations :tickets: 230 Added support for use of the :class:`~sqlalchemy.sql.elements.quoted_name` construct when using the ``schema`` argument within operations. This allows a name containing a dot to be fully quoted, as well as to provide configurable quoting on a per-name basis. .. change:: :tags: bug, autogenerate, postgresql :tickets: 73 Added a routine by which the Postgresql Alembic dialect inspects the server default of INTEGER/BIGINT columns as they are reflected during autogenerate for the pattern ``nextval(...)`` containing a potential sequence name, then queries ``pg_catalog`` to see if this sequence is "owned" by the column being reflected; if so, it assumes this is a SERIAL or BIGSERIAL column and the server default is omitted from the column reflection as well as any kind of server_default comparison or rendering, along with an INFO message in the logs indicating this has taken place. This allows SERIAL/BIGSERIAL columns to keep the SEQUENCE from being unnecessarily present within the autogenerate operation.
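Relating to the ``quoted_name`` support for the ``schema`` argument noted above, a minimal sketch; the schema, table and column names are invented for the example::

    from alembic import op
    import sqlalchemy as sa
    from sqlalchemy.sql.elements import quoted_name


    def upgrade():
        # quote=True keeps the dotted value as one quoted identifier rather than
        # having it interpreted as a database.schema pair
        op.add_column(
            "account",
            sa.Column("status", sa.String(20)),
            schema=quoted_name("my.remote.schema", quote=True),
        )


    def downgrade():
        op.drop_column(
            "account",
            "status",
            schema=quoted_name("my.remote.schema", quote=True),
        )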
.. change:: :tags: bug, autogenerate :tickets: 197, 64, 196 The system by which autogenerate renders expressions within a :class:`~sqlalchemy.schema.Index`, the ``server_default`` of :class:`~sqlalchemy.schema.Column`, and the ``existing_server_default`` of :meth:`.Operations.alter_column` has been overhauled to anticipate arbitrary SQLAlchemy SQL constructs, such as ``func.somefunction()``, ``cast()``, ``desc()``, and others. The system does not, as might be preferred, render the full-blown Python expression as originally created within the application's source code, as this would be exceedingly complex and difficult. Instead, it renders the SQL expression against the target backend that's subject to the autogenerate, and then renders that SQL inside of a :func:`~sqlalchemy.sql.expression.text` construct as a literal SQL string. This approach still has the downside that the rendered SQL construct may not be backend-agnostic in all cases, so there is still a need for manual intervention in that small number of cases, but overall the majority of cases should work correctly now. Big thanks to Carlos Rivera for pull requests and support on this. .. change:: :tags: feature SQLAlchemy's testing infrastructure is now used to run tests. This system supports both nose and pytest and opens the way for Alembic testing to support any number of backends, parallel testing, and 3rd party dialect testing. .. change:: :tags: changed, compatibility Minimum SQLAlchemy version is now 0.7.6, however at least 0.8.4 is strongly recommended. The overhaul of the test suite allows for fully passing tests on all SQLAlchemy versions from 0.7.6 on forward. .. change:: :tags: bug, operations The "match" keyword is not sent to :class:`.ForeignKeyConstraint` by :meth:`.Operations.create_foreign_key` when SQLAlchemy 0.7 is in use; this keyword was added to SQLAlchemy as of 0.8.0. .. changelog:: :version: 0.6.7 :released: September 9, 2014 .. change:: :tags: bug, mssql Fixed bug in MSSQL dialect where "rename table" wasn't using ``sp_rename()`` as is required on SQL Server. Pull request courtesy Łukasz Bołdys. .. change:: :tags: feature :tickets: 222 Added support for functional indexes when using the :meth:`.Operations.create_index` directive. Within the list of columns, the SQLAlchemy ``text()`` construct can be sent, embedding a literal SQL expression; the :meth:`.Operations.create_index` will perform some hackery behind the scenes to get the :class:`.Index` construct to cooperate. This works around some current limitations in :class:`.Index` which should be resolved on the SQLAlchemy side at some point. .. changelog:: :version: 0.6.6 :released: August 7, 2014 .. change:: :tags: bug :tickets: 95 A file named ``__init__.py`` in the ``versions/`` directory is now ignored by Alembic when the collection of version files is retrieved. Pull request courtesy Michael Floering. .. change:: :tags: bug Fixed Py3K bug where an attempt would be made to sort None against string values when autogenerate would detect tables across multiple schemas, including the default schema. Pull request courtesy paradoxxxzero. .. change:: :tags: bug Autogenerate render will render the arguments within a Table construct using ``*[...]`` when the number of columns/elements is greater than 255. Pull request courtesy Ryan P. Kelly. .. change:: :tags: bug Fixed bug where foreign key constraints would fail to render in autogenerate when a schema name was present. Pull request courtesy Andreas Zeidler. ..
change:: :tags: bug :tickets: 212 Some deep-in-the-weeds fixes to try to get "server default" comparison working better across platforms and expressions, in particular on the Postgresql backend, mostly dealing with quoting/not quoting of various expressions at the appropriate time and on a per-backend basis. Repaired and tested support for such defaults as Postgresql interval and array defaults. .. change:: :tags: enhancement :tickets: 209 When a run of Alembic command line fails due to ``CommandError``, the output now prefixes the string with ``"FAILED:"``, and the error is also written to the log output using ``log.error()``. .. change:: :tags: bug :tickets: 208 Liberalized even more the check for MySQL indexes that shouldn't be counted in autogenerate as "drops"; this time it's been reported that an implicitly created index might be named the same as a composite foreign key constraint, and not the actual columns, so we now skip those when detected as well. .. change:: :tags: feature Added a new accessor :attr:`.MigrationContext.config`, when used in conjunction with a :class:`.EnvironmentContext` and :class:`.Config`, this config will be returned. Patch courtesy Marc Abramowitz. .. changelog:: :version: 0.6.5 :released: May 3, 2014 .. change:: :tags: bug, autogenerate, mysql :tickets: 202 This releases' "autogenerate index detection" bug, when a MySQL table includes an Index with the same name as a column, autogenerate reported it as an "add" even though its not; this is because we ignore reflected indexes of this nature due to MySQL creating them implicitly. Indexes that are named the same as a column are now ignored on MySQL if we see that the backend is reporting that it already exists; this indicates that we can still detect additions of these indexes but not drops, as we cannot distinguish a backend index same-named as the column as one that is user generated or mysql-generated. .. change:: :tags: feature, environment :tickets: 201 Added new feature :paramref:`.EnvironmentContext.configure.transaction_per_migration`, which when True causes the BEGIN/COMMIT pair to incur for each migration individually, rather than for the whole series of migrations. This is to assist with some database directives that need to be within individual transactions, without the need to disable transactional DDL entirely. .. change:: :tags: bug, autogenerate :tickets: 200 Fixed bug where the ``include_object()`` filter would not receive the original :class:`.Column` object when evaluating a database-only column to be dropped; the object would not include the parent :class:`.Table` nor other aspects of the column that are important for generating the "downgrade" case where the column is recreated. .. change:: :tags: bug, environment :tickets: 195 Fixed bug where :meth:`.EnvironmentContext.get_x_argument` would fail if the :class:`.Config` in use didn't actually originate from a command line call. .. change:: :tags: bug, autogenerate :tickets: 194 Fixed another bug regarding naming conventions, continuing from :ticket:`183`, where add_index() drop_index() directives would not correctly render the ``f()`` construct when the index contained a convention-driven name. .. changelog:: :version: 0.6.4 :released: March 28, 2014 .. change:: :tags: bug, mssql :tickets: 186 Added quoting to the table name when the special EXEC is run to drop any existing server defaults or constraints when the :paramref:`.drop_column.mssql_drop_check` or :paramref:`.drop_column.mssql_drop_default` arguments are used. .. 
change:: :tags: bug, mysql :tickets: 103 Added/fixed support for MySQL "SET DEFAULT" / "DROP DEFAULT" phrases, which will now be rendered if only the server default is changing or being dropped (e.g. specify None to alter_column() to indicate "DROP DEFAULT"). Also added support for rendering MODIFY rather than CHANGE when the column name isn't changing. .. change:: :tags: bug :tickets: 190 Added support for the ``initially``, ``match`` keyword arguments as well as dialect-specific keyword arguments to :meth:`.Operations.create_foreign_key`. :tags: feature :tickets: 163 Altered the support for "sourceless" migration files (e.g. only .pyc or .pyo present) so that the flag "sourceless=true" needs to be in alembic.ini for this behavior to take effect. .. change:: :tags: bug, mssql :tickets: 185 The feature that keeps on giving, index/unique constraint autogenerate detection, has even more fixes, this time to accommodate database dialects that both don't yet report on unique constraints, but the backend does report unique constraints as indexes. The logic Alembic uses to distinguish between "this is an index!" vs. "this is a unique constraint that is also reported as an index!" has now been further enhanced to not produce unwanted migrations when the dialect is observed to not yet implement get_unique_constraints() (e.g. mssql). Note that such a backend will no longer report index drops for unique indexes, as these cannot be distinguished from an unreported unique index. .. change:: :tags: bug :tickets: 183 Extensive changes have been made to more fully support SQLAlchemy's new naming conventions feature. Note that while SQLAlchemy has added this feature as of 0.9.2, some additional fixes in 0.9.4 are needed to resolve some of the issues: 1. The :class:`.Operations` object now takes into account the naming conventions that are present on the :class:`.MetaData` object that's associated using :paramref:`~.EnvironmentContext.configure.target_metadata`. When :class:`.Operations` renders a constraint directive like ``ADD CONSTRAINT``, it now will make use of this naming convention when it produces its own temporary :class:`.MetaData` object. 2. Note however that the autogenerate feature in most cases generates constraints like foreign keys and unique constraints with the final names intact; the only exception are the constraints implicit with a schema-type like Boolean or Enum. In most of these cases, the naming convention feature will not take effect for these constraints and will instead use the given name as is, with one exception.... 3. Naming conventions which use the ``"%(constraint_name)s"`` token, that is, produce a new name that uses the original name as a component, will still be pulled into the naming convention converter and be converted. The problem arises when autogenerate renders a constraint with it's already-generated name present in the migration file's source code, the name will be doubled up at render time due to the combination of #1 and #2. So to work around this, autogenerate now renders these already-tokenized names using the new :meth:`.Operations.f` component. This component is only generated if **SQLAlchemy 0.9.4** or greater is in use. Therefore it is highly recommended that an upgrade to Alembic 0.6.4 be accompanied by an upgrade of SQLAlchemy 0.9.4, if the new naming conventions feature is used. .. seealso:: :ref:`autogen_naming_conventions` .. 
change:: :tags: bug :tickets: 160 Suppressed IOErrors which can raise when program output pipe is closed under a program like ``head``; however this only works on Python 2. On Python 3, there is not yet a known way to suppress the BrokenPipeError warnings without prematurely terminating the program via signals. .. change:: :tags: bug :tickets: 179 Fixed bug where :meth:`.Operations.bulk_insert` would not function properly when :meth:`.Operations.inline_literal` values were used, either in --sql or non-sql mode. The values will now render directly in --sql mode. For compatibility with "online" mode, a new flag :paramref:`~.Operations.bulk_insert.multiinsert` can be set to False which will cause each parameter set to be compiled and executed with individual INSERT statements. .. change:: :tags: bug, py3k :tickets: 175 Fixed a failure of the system that allows "legacy keyword arguments" to be understood, which arose as of a change in Python 3.4 regarding decorators. A workaround is applied that allows the code to work across Python 3 versions. .. change:: :tags: feature The :func:`.command.revision` command now returns the :class:`.Script` object corresponding to the newly generated revision. From this structure, one can get the revision id, the module documentation, and everything else, for use in scripts that call upon this command. Pull request courtesy Robbie Coomber. .. changelog:: :version: 0.6.3 :released: February 2, 2014 .. change:: :tags: bug :tickets: 172 Added a workaround for when we call ``fcntl.ioctl()`` to get at ``TERMWIDTH``; if the function returns zero, as is reported to occur in some pseudo-ttys, the message wrapping system is disabled in the same way as if ``ioctl()`` failed. .. change:: :tags: feature :tickets: 171 Added new argument :paramref:`.EnvironmentContext.configure.user_module_prefix`. This prefix is applied when autogenerate renders a user-defined type, which here is defined as any type that is from a module outside of the ``sqlalchemy.`` hierarchy. This prefix defaults to ``None``, in which case the :paramref:`.EnvironmentContext.configure.sqlalchemy_module_prefix` is used, thus preserving the current behavior. .. change:: :tags: bug :tickets: 170 Added support for autogenerate covering the use case where :class:`.Table` objects specified in the metadata have an explicit ``schema`` attribute whose name matches that of the connection's default schema (e.g. "public" for Postgresql). Previously, it was assumed that "schema" was ``None`` when it matched the "default" schema, now the comparison adjusts for this. .. change:: :tags: bug The :func:`.compare_metadata` public API function now takes into account the settings for :paramref:`.EnvironmentContext.configure.include_object`, :paramref:`.EnvironmentContext.configure.include_symbol`, and :paramref:`.EnvironmentContext.configure.include_schemas`, in the same way that the ``--autogenerate`` command does. Pull request courtesy Roman Podoliaka. .. change:: :tags: bug :tickets: 168 Calling :func:`.bulk_insert` with an empty list will not emit any commands on the current connection. This was already the case with ``--sql`` mode, so is now the case with "online" mode. .. change:: :tags: bug Enabled schema support for index and unique constraint autodetection; previously these were non-functional and could in some cases lead to attribute errors. Pull request courtesy Dimitris Theodorou. .. 
change:: :tags: bug :tickets: 164 More fixes to index autodetection; indexes created with expressions like DESC or functional indexes will no longer cause AttributeError exceptions when attempting to compare the columns. .. change:: :tags: feature :tickets: 163 The :class:`.ScriptDirectory` system that loads migration files from a ``versions/`` directory now supports so-called "sourceless" operation, where the ``.py`` files are not present and instead ``.pyc`` or ``.pyo`` files are directly present where the ``.py`` files should be. Note that while Python 3.3 has a new system of locating ``.pyc``/``.pyo`` files within a directory called ``__pycache__`` (e.g. PEP-3147), PEP-3147 maintains support for the "source-less imports" use case, where the ``.pyc``/``.pyo`` are present in the "old" location, e.g. next to the ``.py`` file; this is the usage that's supported even when running Python 3.3. .. changelog:: :version: 0.6.2 :released: Fri Dec 27 2013 .. change:: :tags: bug Autogenerate for ``op.create_table()`` will not include a ``PrimaryKeyConstraint()`` that has no columns. .. change:: :tags: bug Fixed bug in the not-internally-used :meth:`.ScriptDirectory.get_base` method which would fail if called on an empty versions directory. .. change:: :tags: bug :tickets: 157 An almost-rewrite of the new unique constraint/index autogenerate detection, to accommodate a variety of issues. The emphasis is on not generating false positives for those cases where no net change is present, as these errors are the ones that impact all autogenerate runs: * Fixed an issue with unique constraint autogenerate detection where a named ``UniqueConstraint`` on both sides with column changes would render with the "add" operation before the "drop", requiring the user to reverse the order manually. * Corrected for MySQL's apparent addition of an implicit index for a foreign key column, so that it doesn't show up as "removed". This required that the index/constraint autogen system query the dialect-specific implementation for special exceptions. * reworked the "dedupe" logic to accommodate MySQL's bi-directional duplication of unique indexes as unique constraints, and unique constraints as unique indexes. Postgresql's slightly different logic of duplicating unique constraints into unique indexes continues to be accommodated as well. Note that a unique index or unique constraint removal on a backend that duplicates these may show up as a distinct "remove_constraint()" / "remove_index()" pair, which may need to be corrected in the post-autogenerate if multiple backends are being supported. * added another dialect-specific exception to the SQLite backend when dealing with unnamed unique constraints, as the backend can't currently report on constraints that were made with this technique, hence they'd come out as "added" on every run. * the ``op.create_table()`` directive will be auto-generated with the ``UniqueConstraint`` objects inline, but will not double them up with a separate ``create_unique_constraint()`` call, which may have been occurring. Indexes still get rendered as distinct ``op.create_index()`` calls even when the corresponding table was created in the same script. * the inline ``UniqueConstraint`` within ``op.create_table()`` includes all the options like ``deferrable``, ``initially``, etc. Previously these weren't rendering. .. change:: :tags: feature, mssql Added new argument ``mssql_drop_foreign_key`` to :meth:`.Operations.drop_column`.
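For illustration, a minimal sketch of the new argument in use; the table and column names are invented for the example::

    from alembic import op


    def upgrade():
        # looks up the single foreign key constraint on this column and drops it
        # before dropping the column itself (SQL Server only)
        op.drop_column("order", "customer_id", mssql_drop_foreign_key=True)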
Like ``mssql_drop_default`` and ``mssql_drop_check``, it will do an inline lookup for a single foreign key which applies to this column, and drop it. For a column with more than one FK, you'd still need to explicitly use :meth:`.Operations.drop_constraint` given the name, even though only MSSQL has this limitation in the first place. .. change:: :tags: bug, mssql The MSSQL backend will add the batch separator (e.g. ``"GO"``) in ``--sql`` mode after the final ``COMMIT`` statement, to ensure that statement is also processed in batch mode. Courtesy Derek Harland. .. changelog:: :version: 0.6.1 :released: Wed Nov 27 2013 .. change:: :tags: bug, mysql :tickets: 152 Fixed bug where :func:`.op.alter_column` in the MySQL dialect would fail to apply quotes to column names that had mixed casing or spaces. .. change:: :tags: feature Expanded the size of the "slug" generated by "revision" to 40 characters, which is also configurable by new field ``truncate_slug_length``; and also split on the word rather than the character; courtesy Frozenball. .. change:: :tags: bug :tickets: 135 Fixed the output wrapping for Alembic message output, so that we either get the terminal width for "pretty printing" with indentation, or if not we just output the text as is; in any case the text won't be wrapped too short. .. change:: :tags: bug Fixes to Py3k in-place compatibility regarding output encoding and related; the use of the new io.* package introduced some incompatibilities on Py2k. These should be resolved, due to the introduction of new adapter types for translating from io.* to Py2k file types, StringIO types. Thanks to Javier Santacruz for help with this. .. change:: :tags: bug :tickets: 145 Fixed py3k bug where the wrong form of ``next()`` was being called when using the list_templates command. Courtesy Chris Wilkes. .. change:: :tags: feature :tickets: 107 Support for autogeneration detection and rendering of indexes and unique constraints has been added. The logic goes through some effort in order to differentiate between true unique constraints and unique indexes, where there are some quirks on backends like Postgresql. The effort here in producing the feature and tests is courtesy of IJL. .. change:: :tags: bug Fixed bug introduced by new ``include_object`` argument where the inspected column would be misinterpreted when using a user-defined type comparison function, causing a KeyError or similar expression-related error. Fix courtesy Maarten van Schaik. .. change:: :tags: bug Added the "deferrable" keyword argument to :func:`.op.create_foreign_key` so that ``DEFERRABLE`` constraint generation is supported; courtesy Pedro Romano. .. change:: :tags: bug :tickets: 137 Ensured that strings going to stdout go through an encode/decode phase, so that any non-ASCII characters get to the output stream correctly in both Py2k and Py3k. Also added source encoding detection using Mako's parse_encoding() routine in Py2k so that the __doc__ of a non-ascii revision file can be treated as unicode in Py2k. .. changelog:: :version: 0.6.0 :released: Fri July 19 2013 .. change:: :tags: feature :tickets: 101 Added new kw argument to :meth:`.EnvironmentContext.configure` ``include_object``. This is a more flexible version of the ``include_symbol`` argument which allows filtering of columns as well as tables from the autogenerate process, and in the future will also work for types, constraints and other constructs.
The fully constructed schema object is passed, including its name and type as well as a flag indicating if the object is from the local application metadata or is reflected. .. change:: :tags: feature The output of the ``alembic history`` command is now expanded to show information about each change on multiple lines, including the full top message, resembling the formatting of git log. .. change:: :tags: feature Added :attr:`alembic.config.Config.cmd_opts` attribute, allows access to the ``argparse`` options passed to the ``alembic`` runner. .. change:: :tags: feature :tickets: 120 Added new command line argument ``-x``, allows extra arguments to be appended to the command line which can be consumed within an ``env.py`` script by looking at ``context.config.cmd_opts.x``, or more simply a new method :meth:`.EnvironmentContext.get_x_argument`. .. change:: :tags: bug :tickets: 125 Added support for options like "name" etc. to be rendered within CHECK constraints in autogenerate. Courtesy Sok Ann Yap. .. change:: :tags: misc Source repository has been moved from Mercurial to Git. .. change:: :tags: bug Repaired autogenerate rendering of ForeignKeyConstraint to include use_alter argument, if present. .. change:: :tags: feature Added ``-r`` argument to ``alembic history`` command, allows specification of ``[start]:[end]`` to view a slice of history. Accepts revision numbers, symbols "base", "head", a new symbol "current" representing the current migration, as well as relative ranges for one side at a time (i.e. ``-r-5:head``, ``-rcurrent:+3``). Courtesy Atsushi Odagiri for this feature. .. change:: :tags: feature :tickets: 55 Source base is now in-place for Python 2.6 through 3.3, without the need for 2to3. Support for Python 2.5 and below has been dropped. Huge thanks to Hong Minhee for all the effort on this! .. changelog:: :version: 0.5.0 :released: Thu Apr 4 2013 .. note:: Alembic 0.5.0 now requires at least version 0.7.3 of SQLAlchemy to run properly. Support for 0.6 has been dropped. .. change:: :tags: feature :tickets: 76 Added ``version_table_schema`` argument to :meth:`.EnvironmentContext.configure`, complements the ``version_table`` argument to set an optional remote schema for the version table. Courtesy Christian Blume. .. change:: :tags: bug, postgresql :tickets: 32 Fixed format of RENAME for table that includes schema with Postgresql; the schema name shouldn't be in the "TO" field. .. change:: :tags: feature :tickets: 90 Added ``output_encoding`` option to :meth:`.EnvironmentContext.configure`, used with ``--sql`` mode to apply an encoding to the output stream. .. change:: :tags: feature :tickets: 93 Added :meth:`.Operations.create_primary_key` operation, will generate an ADD CONSTRAINT for a primary key. .. change:: :tags: bug, mssql :tickets: 109 Fixed bug whereby double quoting would be applied to target column name during an ``sp_rename`` operation. .. change:: :tags: bug, sqlite, mysql :tickets: 112 transactional_ddl flag for SQLite, MySQL dialects set to False. MySQL doesn't support it, SQLite does but current pysqlite driver does not. .. change:: :tags: feature :tickets: 115 upgrade and downgrade commands will list the first line of the docstring out next to the version number. Courtesy Hong Minhee. .. change:: :tags: feature Added --head-only option to "alembic current", will print current version plus the symbol "(head)" if this version is the head or not. Courtesy Charles-Axel Dein.
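For illustration, a minimal ``env.py`` fragment combining two of the configure options noted above (``include_object`` from 0.6.0 and ``version_table_schema`` from 0.5.0); the engine URL, metadata, schema name and filtering rule are invented for the example::

    from alembic import context
    from sqlalchemy import create_engine, MetaData

    target_metadata = MetaData()  # stand-in for the application's metadata


    def include_object(object_, name, type_, reflected, compare_to):
        # skip a hypothetical legacy table from autogenerate comparisons
        return not (type_ == "table" and name == "legacy_table")


    def run_migrations_online():
        connectable = create_engine("postgresql://scott:tiger@localhost/test")
        with connectable.connect() as connection:
            context.configure(
                connection=connection,
                target_metadata=target_metadata,
                include_object=include_object,
                version_table_schema="audit",  # remote schema for alembic_version
            )
            with context.begin_transaction():
                context.run_migrations()


    # a real env.py would branch on context.is_offline_mode() here
    run_migrations_online()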
change:: :tags: bug :tickets: 110 Autogenerate will render additional table keyword arguments like "mysql_engine" and others within op.create_table(). .. change:: :tags: feature :tickets: 108 The rendering of any construct during autogenerate can be customized, in particular to allow special rendering for user-defined column, constraint subclasses, using new ``render_item`` argument to :meth:`.EnvironmentContext.configure`. .. change:: :tags: bug Fixed bug whereby create_index() would include in the constraint columns that are added to all Table objects using events, externally to the generation of the constraint. This is the same issue that was fixed for unique constraints in version 0.3.2. .. change:: :tags: bug Worked around a backwards-incompatible regression in Python3.3 regarding argparse; running "alembic" with no arguments now yields an informative error in py3.3 as with all previous versions. Courtesy Andrey Antukh. .. change:: :tags: change SQLAlchemy 0.6 is no longer supported by Alembic - minimum version is 0.7.3, full support is as of 0.7.9. .. change:: :tags: bug :tickets: 104 A host of argument name changes within migration operations for consistency. Keyword arguments will continue to work on the old name for backwards compatibility, however required positional arguments will not: :meth:`.Operations.alter_column` - ``name`` -> ``new_column_name`` - old name will work for backwards compatibility. :meth:`.Operations.create_index` - ``tablename`` -> ``table_name`` - argument is positional. :meth:`.Operations.drop_index` - ``tablename`` -> ``table_name`` - old name will work for backwards compatibility. :meth:`.Operations.drop_constraint` - ``tablename`` -> ``table_name`` - argument is positional. :meth:`.Operations.drop_constraint` - ``type`` -> ``type_`` - old name will work for backwards compatibility .. changelog:: :version: 0.4.2 :released: Fri Jan 11 2013 .. change:: :tags: bug, autogenerate :tickets: 99 Fixed bug where autogenerate would fail if a Column to be added to a table made use of the ".key" paramter. .. change:: :tags: bug, sqlite :tickets: 98 The "implicit" constraint generated by a type such as Boolean or Enum will not generate an ALTER statement when run on SQlite, which does not support ALTER for the purpose of adding/removing constraints separate from the column def itself. While SQLite supports adding a CHECK constraint at the column level, SQLAlchemy would need modification to support this. A warning is emitted indicating this constraint cannot be added in this scenario. .. change:: :tags: bug :tickets: 96 Added a workaround to setup.py to prevent "NoneType" error from occuring when "setup.py test" is run. .. change:: :tags: bug :tickets: 96 Added an append_constraint() step to each condition within test_autogenerate:AutogenRenderTest.test_render_fk_constraint_kwarg if the SQLAlchemy version is less than 0.8, as ForeignKeyConstraint does not auto-append prior to 0.8. .. change:: :tags: feature :tickets: 96 Added a README.unittests with instructions for running the test suite fully. .. changelog:: :version: 0.4.1 :released: Sun Dec 9 2012 .. change:: :tags: bug :tickets: 92 Added support for autogenerate render of ForeignKeyConstraint options onupdate, ondelete, initially, and deferred. .. change:: :tags: bug :tickets: 94 Autogenerate will include "autoincrement=False" in the rendered table metadata if this flag was set to false on the source :class:`.Column` object. .. 
change:: :tags: feature :tickets: 66 Explicit error message describing the case when downgrade --sql is used without specifying specific start/end versions. .. change:: :tags: bug :tickets: 81 Removed erroneous "emit_events" attribute from operations.create_table() documentation. .. change:: :tags: bug :tickets: Fixed the minute component in file_template which returned the month part of the create date. .. changelog:: :version: 0.4.0 :released: Mon Oct 01 2012 .. change:: :tags: feature :tickets: 33 Support for tables in alternate schemas has been added fully to all operations, as well as to the autogenerate feature. When using autogenerate, specifying the flag include_schemas=True to Environment.configure() will also cause autogenerate to scan all schemas located by Inspector.get_schema_names(), which is supported by *some* (but not all) SQLAlchemy dialects including Postgresql. *Enormous* thanks to Bruno Binet for a huge effort in implementing as well as writing tests. . .. change:: :tags: feature :tickets: 70 The command line runner has been organized into a reusable CommandLine object, so that other front-ends can re-use the argument parsing built in. .. change:: :tags: feature :tickets: 43 Added "stdout" option to Config, provides control over where the "print" output of commands like "history", "init", "current" etc. are sent. .. change:: :tags: bug :tickets: 71 Fixed the "multidb" template which was badly out of date. It now generates revision files using the configuration to determine the different upgrade_() methods needed as well, instead of needing to hardcode these. Huge thanks to BryceLohr for doing the heavy lifting here. .. change:: :tags: bug :tickets: 72 Fixed the regexp that was checking for .py files in the version directory to allow any .py file through. Previously it was doing some kind of defensive checking, probably from some early notions of how this directory works, that was prohibiting various filename patterns such as those which begin with numbers. .. change:: :tags: bug :tickets: Fixed MySQL rendering for server_default which didn't work if the server_default was a generated SQL expression. Courtesy Moriyoshi Koizumi. .. change:: :tags: feature :tickets: Added support for alteration of MySQL columns that have AUTO_INCREMENT, as well as enabling this flag. Courtesy Moriyoshi Koizumi. .. changelog:: :version: 0.3.6 :released: Wed Aug 15 2012 .. change:: :tags: feature :tickets: 27 Added include_symbol option to EnvironmentContext.configure(), specifies a callable which will include/exclude tables in their entirety from the autogeneration process based on name. .. change:: :tags: feature :tickets: 59 Added year, month, day, hour, minute, second variables to file_template. .. change:: :tags: feature :tickets: Added 'primary' to the list of constraint types recognized for MySQL drop_constraint(). .. change:: :tags: feature :tickets: Added --sql argument to the "revision" command, for the use case where the "revision_environment" config option is being used but SQL access isn't desired. .. change:: :tags: bug :tickets: Repaired create_foreign_key() for self-referential foreign keys, which weren't working at all. .. change:: :tags: bug :tickets: 63 'alembic' command reports an informative error message when the configuration is missing the 'script_directory' key. .. change:: :tags: bug :tickets: 62 Fixes made to the constraints created/dropped alongside so-called "schema" types such as Boolean and Enum. 
The create/drop constraint logic does not kick in when using a dialect that doesn't use constraints for these types, such as postgresql, even when existing_type is specified to alter_column(). Additionally, the constraints are not affected if existing_type is passed but type\_ is not, i.e. there's no net change in type. .. change:: :tags: bug :tickets: 66 Improved error message when specifiying non-ordered revision identifiers to cover the case when the "higher" rev is None, improved message overall. .. changelog:: :version: 0.3.5 :released: Sun Jul 08 2012 .. change:: :tags: bug :tickets: 31 Fixed issue whereby reflected server defaults wouldn't be quoted correctly; uses repr() now. .. change:: :tags: bug :tickets: 58 Fixed issue whereby when autogenerate would render create_table() on the upgrade side for a table that has a Boolean type, an unnecessary CheckConstraint() would be generated. .. change:: :tags: feature :tickets: Implemented SQL rendering for CheckConstraint() within autogenerate upgrade, including for literal SQL as well as SQL Expression Language expressions. .. changelog:: :version: 0.3.4 :released: Sat Jun 02 2012 .. change:: :tags: bug :tickets: Fixed command-line bug introduced by the "revision_environment" feature. .. changelog:: :version: 0.3.3 :released: Sat Jun 02 2012 .. change:: :tags: feature :tickets: New config argument "revision_environment=true", causes env.py to be run unconditionally when the "revision" command is run, to support script.py.mako templates with dependencies on custom "template_args". .. change:: :tags: feature :tickets: Added "template_args" option to configure() so that an env.py can add additional arguments to the template context when running the "revision" command. This requires either --autogenerate or the configuration directive "revision_environment=true". .. change:: :tags: bug :tickets: 44 Added "type" argument to op.drop_constraint(), and implemented full constraint drop support for MySQL. CHECK and undefined raise an error. MySQL needs the constraint type in order to emit a DROP CONSTRAINT. .. change:: :tags: feature :tickets: 34 Added version_table argument to EnvironmentContext.configure(), allowing for the configuration of the version table name. .. change:: :tags: feature :tickets: Added support for "relative" migration identifiers, i.e. "alembic upgrade +2", "alembic downgrade -1". Courtesy Atsushi Odagiri for this feature. .. change:: :tags: bug :tickets: 49 Fixed bug whereby directories inside of the template directories, such as __pycache__ on Pypy, would mistakenly be interpreted as files which are part of the template. .. changelog:: :version: 0.3.2 :released: Mon Apr 30 2012 .. change:: :tags: feature :tickets: 40 Basic support for Oracle added, courtesy shgoh. .. change:: :tags: feature :tickets: Added support for UniqueConstraint in autogenerate, courtesy Atsushi Odagiri .. change:: :tags: bug :tickets: Fixed support of schema-qualified ForeignKey target in column alter operations, courtesy Alexander Kolov. .. change:: :tags: bug :tickets: Fixed bug whereby create_unique_constraint() would include in the constraint columns that are added to all Table objects using events, externally to the generation of the constraint. .. changelog:: :version: 0.3.1 :released: Sat Apr 07 2012 .. change:: :tags: bug :tickets: 41 bulk_insert() fixes: 1. bulk_insert() operation was not working most likely since the 0.2 series when used with an engine. 2. 
Repaired bulk_insert() to complete when used against a lower-case-t table and executing with only one set of parameters, working around SQLAlchemy bug #2461 in this regard. 3. bulk_insert() uses "inline=True" so that phrases like RETURNING and such don't get invoked for single-row bulk inserts. 4. bulk_insert() will check that you're passing a list of dictionaries in, raises TypeError if not detected. .. changelog:: :version: 0.3.0 :released: Thu Apr 05 2012 .. change:: :tags: general :tickets: The focus of 0.3 is to clean up and more fully document the public API of Alembic, including better accessors on the MigrationContext and ScriptDirectory objects. Methods that are not considered to be public on these objects have been underscored, and methods which should be public have been cleaned up and documented, including: MigrationContext.get_current_revision() ScriptDirectory.iterate_revisions() ScriptDirectory.get_current_head() ScriptDirectory.get_heads() ScriptDirectory.get_base() ScriptDirectory.generate_revision() .. change:: :tags: feature :tickets: Added a bit of autogenerate to the public API in the form of the function alembic.autogenerate.compare_metadata. .. changelog:: :version: 0.2.2 :released: Mon Mar 12 2012 .. change:: :tags: feature :tickets: Informative error message when op.XYZ directives are invoked at module import time. .. change:: :tags: bug :tickets: 35 Fixed inappropriate direct call to util.err() and therefore sys.exit() when Config failed to locate the config file within library usage. .. change:: :tags: bug :tickets: Autogenerate will emit CREATE TABLE and DROP TABLE directives according to foreign key dependency order. .. change:: :tags: bug :tickets: implement 'tablename' parameter on drop_index() as this is needed by some backends. .. change:: :tags: feature :tickets: Added execution_options parameter to op.execute(), will call execution_options() on the Connection before executing. The immediate use case here is to allow access to the new no_parameters option in SQLAlchemy 0.7.6, which allows some DBAPIs (psycopg2, MySQLdb) to allow percent signs straight through without escaping, thus providing cross-compatible operation with DBAPI execution and static script generation. .. change:: :tags: bug :tickets: setup.py won't install argparse if on Python 2.7/3.2 .. change:: :tags: feature :tickets: 29 script_location can be interpreted by pkg_resources.resource_filename(), if it is a non-absolute URI that contains colons. This scheme is the same one used by Pyramid. .. change:: :tags: feature :tickets: added missing support for onupdate/ondelete flags for ForeignKeyConstraint, courtesy Giacomo Bagnoli .. change:: :tags: bug :tickets: 30 fixed a regression regarding an autogenerate error message, as well as various glitches in the Pylons sample template. The Pylons sample template requires that you tell it where to get the Engine from now. courtesy Marcin Kuzminski .. change:: :tags: bug :tickets: drop_index() ensures a dummy column is added when it calls "Index", as SQLAlchemy 0.7.6 will warn on index with no column names. .. changelog:: :version: 0.2.1 :released: Tue Jan 31 2012 .. change:: :tags: bug :tickets: 26 Fixed the generation of CHECK constraint, regression from 0.2.0 .. changelog:: :version: 0.2.0 :released: Mon Jan 30 2012 .. change:: :tags: feature :tickets: 19 API rearrangement allows everything Alembic does to be represented by contextual objects, including EnvironmentContext, MigrationContext, and Operations. 
Other libraries and applications can now use things like "alembic.op" without relying upon global configuration variables. The rearrangement was done such that existing migrations should be OK, as long as they use the pattern of "from alembic import context" and "from alembic import op", as these are now contextual objects, not modules. .. change:: :tags: feature :tickets: 24 The naming of revision files can now be customized to be some combination of "rev id" and "slug", the latter of which is based on the revision message. By default, the pattern "_" is used for new files. New script files should include the "revision" variable for this to work, which is part of the newer script.py.mako scripts. .. change:: :tags: bug :tickets: 25 env.py templates call connection.close() to better support programmatic usage of commands; use NullPool in conjunction with create_engine() as well so that no connection resources remain afterwards. .. change:: :tags: bug :tickets: 22 fix the config.main() function to honor the arguments passed, remove no longer used "scripts/alembic" as setuptools creates this for us. .. change:: :tags: bug :tickets: Fixed alteration of column type on MSSQL to not include the keyword "TYPE". .. change:: :tags: feature :tickets: 23 Can create alembic.config.Config with no filename, use set_main_option() to add values. Also added set_section_option() which will add sections. .. changelog:: :version: 0.1.1 :released: Wed Jan 04 2012 .. change:: :tags: bug :tickets: Clean up file write operations so that file handles are closed. .. change:: :tags: feature :tickets: PyPy is supported. .. change:: :tags: feature :tickets: Python 2.5 is supported, needs __future__.with_statement .. change:: :tags: bug :tickets: Fix autogenerate so that "pass" is generated between the two comments if no net migrations were present. .. change:: :tags: bug :tickets: 16 Fix autogenerate bug that prevented correct reflection of a foreign-key referenced table in the list of "to remove". .. change:: :tags: bug :tickets: 17 Fix bug where create_table() didn't handle self-referential foreign key correctly .. change:: :tags: bug :tickets: 18 Default prefix for autogenerate directives is "op.", matching the mako templates. .. change:: :tags: feature :tickets: 18 Add alembic_module_prefix argument to configure() to complement sqlalchemy_module_prefix. .. change:: :tags: bug :tickets: 14 fix quotes not being rendered in ForeignKeConstraint during autogenerate .. changelog:: :version: 0.1.0 :released: Wed Nov 30 2011 .. change:: :tags: :tickets: Initial release. Status of features: .. change:: :tags: :tickets: Alembic is used in at least one production environment, but should still be considered ALPHA LEVEL SOFTWARE as of this release, particularly in that many features are expected to be missing / unimplemented. Major API changes are not anticipated but for the moment nothing should be assumed. The author asks that you *please* report all issues, missing features, workarounds etc. to the bugtracker. .. change:: :tags: :tickets: Python 3 is supported and has been tested. .. change:: :tags: :tickets: The "Pylons" and "MultiDB" environment templates have not been directly tested - these should be considered to be samples to be modified as needed. Multiple database support itself is well tested, however. .. change:: :tags: :tickets: Postgresql and MS SQL Server environments have been tested for several weeks in a production environment. 
In particular, some involved workarounds were implemented to allow fully-automated dropping of default- or constraint-holding columns with SQL Server. .. change:: :tags: :tickets: MySQL support has also been implemented to a basic degree, including MySQL's awkward style of modifying columns being accommodated. .. change:: :tags: :tickets: Other database environments not included among those three have *not* been tested, *at all*. This includes Firebird, Oracle, Sybase. Adding support for these backends should be straightforward. Please report all missing/ incorrect behaviors to the bugtracker! Patches are welcome here but are optional - please just indicate the exact format expected by the target database. .. change:: :tags: :tickets: SQLite, as a backend, has almost no support for schema alterations to existing databases. The author would strongly recommend that SQLite not be used in a migration context - just dump your SQLite database into an intermediary format, then dump it back into a new schema. For dev environments, the dev installer should be building the whole DB from scratch. Or just use Postgresql, which is a much better database for non-trivial schemas. Requests for full ALTER support on SQLite should be reported to SQLite's bug tracker at http://www.sqlite.org/src/wiki?name=Bug+Reports, as Alembic will not be implementing the "rename the table to a temptable then copy the data into a new table" workaround. Note that Alembic will at some point offer an extensible API so that you can implement commands like this yourself. .. change:: :tags: :tickets: Well-tested directives include add/drop table, add/drop column, including support for SQLAlchemy "schema" types which generate additional CHECK constraints, i.e. Boolean, Enum. Other directives not included here have *not* been strongly tested in production, i.e. rename table, etc. .. change:: :tags: :tickets: Both "online" and "offline" migrations, the latter being generated SQL scripts to hand off to a DBA, have been strongly production tested against Postgresql and SQL Server. .. change:: :tags: :tickets: Modify column type, default status, nullable, is functional and tested across PG, MSSQL, MySQL, but not yet widely tested in production usage. .. change:: :tags: :tickets: Many migrations are still outright missing, i.e. create/add sequences, etc. As a workaround, execute() can be used for those which are missing, though posting of tickets for new features/missing behaviors is strongly encouraged. .. change:: :tags: :tickets: Autogenerate feature is implemented and has been tested, though only a little bit in a production setting. In particular, detection of type and server default changes are optional and are off by default; they can also be customized by a callable. Both features work but can have surprises particularly the disparity between BIT/TINYINT and boolean, which hasn't yet been worked around, as well as format changes performed by the database on defaults when it reports back. When enabled, the PG dialect will execute the two defaults to be compared to see if they are equivalent. Other backends may need to do the same thing. The autogenerate feature only generates "candidate" commands which must be hand-tailored in any case, so is still a useful feature and is safe to use. Please report missing/broken features of autogenerate! This will be a great feature and will also improve SQLAlchemy's reflection services. .. change:: :tags: :tickets: Support for non-ASCII table, column and constraint names is mostly nonexistent. 
This is also a straightforward feature add as SQLAlchemy itself supports unicode identifiers; Alembic itself will likely need fixes to logging, column identification by key, etc. for full support here. zzzeek-alembic-bee044a1c187/docs/build/conf.py000066400000000000000000000167131353106760100211400ustar00rootroot00000000000000# -*- coding: utf-8 -*- # # Alembic documentation build configuration file, created by # sphinx-quickstart on Sat May 1 12:47:55 2010. # # This file is execfile()d with the current directory set to its containing # dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import os import sys # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. sys.path.append(os.path.abspath(".")) # If your extensions are in another directory, add it here. If the directory # is relative to the documentation root, use os.path.abspath to make it # absolute, like shown here. sys.path.insert(0, os.path.abspath("../../")) if True: import alembic # noqa # -- General configuration --------------------------------------------------- # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = [ "sphinx.ext.autodoc", "sphinx.ext.intersphinx", "changelog", "sphinx_paramlinks", ] # tags to sort on inside of sections changelog_sections = [ "changed", "feature", "bug", "usecase", "moved", "removed", ] changelog_render_ticket = "https://github.com/sqlalchemy/alembic/issues/%s" changelog_render_pullreq = "https://github.com/sqlalchemy/alembic/pull/%s" changelog_render_pullreq = { "default": "https://github.com/sqlalchemy/alembic/pull/%s", "github": "https://github.com/sqlalchemy/alembic/pull/%s", } autodoc_default_flags = ["members"] # Add any paths that contain templates here, relative to this directory. templates_path = ["_templates"] # The suffix of source filenames. source_suffix = ".rst" # The encoding of source files. # source_encoding = 'utf-8' nitpicky = True # The master toctree document. master_doc = "index" # General information about the project. project = u"Alembic" copyright = u"2010-2019, Mike Bayer" # noqa # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. version = alembic.__version__ # The full version, including alpha/beta/rc tags. release = "1.1.0" release_date = "August 26, 2019" # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. # language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: # today = '' # Else, today_fmt is used as the format for a strftime call. # today_fmt = '%B %d, %Y' # List of documents that shouldn't be included in the build. # unused_docs = [] # List of directories, relative to source directory, that shouldn't be searched # for source files. exclude_trees = [] # The reST default role (used for this markup: `text`) to use for all # documents. # default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. 
# add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). # add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. # show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = "sphinx" # A list of ignored prefixes for module index sorting. # modindex_common_prefix = [] # -- Options for HTML output ------------------------------------------------ # The theme to use for HTML and HTML Help pages. Major themes that come with # Sphinx are currently 'default' and 'sphinxdoc'. html_theme = "nature" html_style = "nature_override.css" # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. # html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. # html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". # html_title = None # A shorter title for the navigation bar. Default is the same as html_title. # html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. # html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. # html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ["_static"] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. # html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. # html_use_smartypants = True # Custom sidebar templates, maps document names to template names. html_sidebars = { "**": [ "site_custom_sidebars.html", "localtoc.html", "searchbox.html", "relations.html", ] } # Additional templates that should be rendered to pages, maps page names to # template names. # html_additional_pages = {} # If false, no module index is generated. # html_use_modindex = True # If false, no index is generated. # html_use_index = True # If true, the index is split into individual pages for each letter. # html_split_index = False # If true, links to the reST sources are added to the pages. # html_show_sourcelink = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. # html_use_opensearch = '' # If nonempty, this is the file name suffix for HTML files (e.g. ".xhtml"). # html_file_suffix = '' # Output file base name for HTML help builder. htmlhelp_basename = "Alembicdoc" # -- Options for LaTeX output ----------------------------------------------- # The paper size ('letter' or 'a4'). # latex_paper_size = 'letter' # The font size ('10pt', '11pt' or '12pt'). # latex_font_size = '10pt' # Grouping the document tree into LaTeX files. List of tuples (source start # file, target name, title, author, documentclass [howto/manual]). 
latex_documents = [ ("index", "Alembic.tex", u"Alembic Documentation", u"Mike Bayer", "manual") ] # The name of an image file (relative to this directory) to place at the top of # the title page. # latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. # latex_use_parts = False # Additional stuff for the LaTeX preamble. # latex_preamble = '' # Documents to append as an appendix to all manuals. # latex_appendices = [] # If false, no module index is generated. # latex_use_modindex = True # {'python': ('http://docs.python.org/3.2', None)} autoclass_content = "both" intersphinx_mapping = { "sqla": ("http://www.sqlalchemy.org/docs/", None), "python": ("http://docs.python.org/", None), } zzzeek-alembic-bee044a1c187/docs/build/cookbook.rst000066400000000000000000001266511353106760100222040ustar00rootroot00000000000000======== Cookbook ======== A collection of "How-Tos" highlighting popular ways to extend Alembic. .. note:: This is a new section where we catalogue various "how-tos" based on user requests. It is often the case that users will request a feature only to learn it can be provided with a simple customization. .. _building_uptodate: Building an Up to Date Database from Scratch ============================================= There's a theory of database migrations that says that the revisions in existence for a database should be able to go from an entirely blank schema to the finished product, and back again. Alembic can roll this way. Though we think it's kind of overkill, considering that SQLAlchemy itself can emit the full CREATE statements for any given model using :meth:`~sqlalchemy.schema.MetaData.create_all`. If you check out a copy of an application, running this will give you the entire database in one shot, without the need to run through all those migration files, which are instead tailored towards applying incremental changes to an existing database. Alembic can integrate with a :meth:`~sqlalchemy.schema.MetaData.create_all` script quite easily. After running the create operation, tell Alembic to create a new version table, and to stamp it with the most recent revision (i.e. ``head``):: # inside of a "create the database" script, first create # tables: my_metadata.create_all(engine) # then, load the Alembic configuration and generate the # version table, "stamping" it with the most recent rev: from alembic.config import Config from alembic import command alembic_cfg = Config("/path/to/yourapp/alembic.ini") command.stamp(alembic_cfg, "head") When this approach is used, the application can generate the database using normal SQLAlchemy techniques instead of iterating through hundreds of migration scripts. Now, the purpose of the migration scripts is relegated just to movement between versions on out-of-date databases, not *new* databases. You can now remove old migration files that are no longer represented on any existing environments. To prune old migration files, simply delete the files. Then, in the earliest, still-remaining migration file, set ``down_revision`` to ``None``:: # replace this: #down_revision = '290696571ad2' # with this: down_revision = None That file now becomes the "base" of the migration series. Conditional Migration Elements ============================== This example features the basic idea of a common need, that of affecting how a migration runs based on command line switches. 
The technique to use here is simple; within a migration script, inspect the :meth:`.EnvironmentContext.get_x_argument` collection for any additional, user-defined parameters. Then take action based on the presence of those arguments. To make it such that the logic to inspect these flags is easy to use and modify, we modify our ``script.py.mako`` template to make this feature available in all new revision files: .. code-block:: mako """${message} Revision ID: ${up_revision} Revises: ${down_revision} Create Date: ${create_date} """ # revision identifiers, used by Alembic. revision = ${repr(up_revision)} down_revision = ${repr(down_revision)} from alembic import op import sqlalchemy as sa ${imports if imports else ""} from alembic import context def upgrade(): schema_upgrades() if context.get_x_argument(as_dictionary=True).get('data', None): data_upgrades() def downgrade(): if context.get_x_argument(as_dictionary=True).get('data', None): data_downgrades() schema_downgrades() def schema_upgrades(): """schema upgrade migrations go here.""" ${upgrades if upgrades else "pass"} def schema_downgrades(): """schema downgrade migrations go here.""" ${downgrades if downgrades else "pass"} def data_upgrades(): """Add any optional data upgrade migrations here!""" pass def data_downgrades(): """Add any optional data downgrade migrations here!""" pass Now, when we create a new migration file, the ``data_upgrades()`` and ``data_downgrades()`` placeholders will be available, where we can add optional data migrations:: """rev one Revision ID: 3ba2b522d10d Revises: None Create Date: 2014-03-04 18:05:36.992867 """ # revision identifiers, used by Alembic. revision = '3ba2b522d10d' down_revision = None from alembic import op import sqlalchemy as sa from sqlalchemy import String, Column from sqlalchemy.sql import table, column from alembic import context def upgrade(): schema_upgrades() if context.get_x_argument(as_dictionary=True).get('data', None): data_upgrades() def downgrade(): if context.get_x_argument(as_dictionary=True).get('data', None): data_downgrades() schema_downgrades() def schema_upgrades(): """schema upgrade migrations go here.""" op.create_table("my_table", Column('data', String)) def schema_downgrades(): """schema downgrade migrations go here.""" op.drop_table("my_table") def data_upgrades(): """Add any optional data upgrade migrations here!""" my_table = table('my_table', column('data', String), ) op.bulk_insert(my_table, [ {'data': 'data 1'}, {'data': 'data 2'}, {'data': 'data 3'}, ] ) def data_downgrades(): """Add any optional data downgrade migrations here!""" op.execute("delete from my_table") To invoke our migrations with data included, we use the ``-x`` flag:: alembic -x data=true upgrade head The :meth:`.EnvironmentContext.get_x_argument` is an easy way to support new commandline options within environment and migration scripts. .. _connection_sharing: Sharing a Connection with a Series of Migration Commands and Environments ========================================================================= It is often the case that an application will need to call upon a series of commands within :ref:`alembic.command.toplevel`, where it would be advantageous for all operations to proceed along a single transaction. The connectivity for a migration is typically solely determined within the ``env.py`` script of a migration environment, which is called within the scope of a command. The steps to take here are: 1. Produce the :class:`~sqlalchemy.engine.Connection` object to use. 2. 
Place it somewhere that ``env.py`` will be able to access it. This can be either a. a module-level global somewhere, or b. an attribute which we place into the :attr:`.Config.attributes` dictionary (if we are on an older Alembic version, we may also attach an attribute directly to the :class:`.Config` object). 3. The ``env.py`` script is modified such that it looks for this :class:`~sqlalchemy.engine.Connection` and makes use of it, in lieu of building up its own :class:`~sqlalchemy.engine.Engine` instance. We illustrate using :attr:`.Config.attributes`:: from alembic import command, config cfg = config.Config("/path/to/yourapp/alembic.ini") with engine.begin() as connection: cfg.attributes['connection'] = connection command.upgrade(cfg, "head") Then in ``env.py``:: def run_migrations_online(): connectable = config.attributes.get('connection', None) if connectable is None: # only create Engine if we don't have a Connection # from the outside connectable = engine_from_config( config.get_section(config.config_ini_section), prefix='sqlalchemy.', poolclass=pool.NullPool) # when connectable is already a Connection object, calling # connect() gives us a *branched connection*. with connectable.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata ) with context.begin_transaction(): context.run_migrations() .. topic:: Branched Connections Note that we are calling the ``connect()`` method, **even if we are using a** :class:`~sqlalchemy.engine.Connection` **object to start with**. The effect this has when calling :meth:`~sqlalchemy.engine.Connection.connect` is that SQLAlchemy passes us a **branch** of the original connection; it is in every way the same as the :class:`~sqlalchemy.engine.Connection` we started with, except it provides **nested scope**; the context we have here as well as the :meth:`~sqlalchemy.engine.Connection.close` method of this branched connection doesn't actually close the outer connection, which stays active for continued use. .. versionadded:: 0.7.5 Added :attr:`.Config.attributes`. .. _replaceable_objects: Replaceable Objects =================== This recipe proposes a hypothetical way of dealing with what we might call a *replaceable* schema object. A replaceable object is a schema object that needs to be created and dropped all at once. Examples of such objects include views, stored procedures, and triggers. Replaceable objects present a problem in that in order to make incremental changes to them, we have to refer to the whole definition at once. If we need to add a new column to a view, for example, we have to drop it entirely and recreate it fresh with the extra column added, referring to the whole structure; but to make it even tougher, if we wish to support downgrade operations in our migration scripts, we need to refer to the *previous* version of that construct fully, and we'd much rather not have to type out the whole definition in multiple places. This recipe proposes that we may refer to the older version of a replaceable construct by directly naming the migration version in which it was created, and having a migration refer to that previous file as migrations run. We will also demonstrate how to integrate this logic within the :ref:`operation_plugins` feature introduced in Alembic 0.8. It may be very helpful to review this section first to get an overview of this API. 
The Replaceable Object Structure -------------------------------- We first need to devise a simple format that represents the "CREATE XYZ" / "DROP XYZ" aspect of what it is we're building. We will work with an object that represents a textual definition; while a SQL view is an object that we can define using a `table-metadata-like system `_, this is not so much the case for things like stored procedures, where we pretty much need to have a full string definition written down somewhere. We'll use a simple value object called ``ReplaceableObject`` that can represent any named set of SQL text to send to a "CREATE" statement of some kind:: class ReplaceableObject(object): def __init__(self, name, sqltext): self.name = name self.sqltext = sqltext Using this object in a migration script, assuming a Postgresql-style syntax, looks like:: customer_view = ReplaceableObject( "customer_view", "SELECT name, order_count FROM customer WHERE order_count > 0" ) add_customer_sp = ReplaceableObject( "add_customer_sp(name varchar, order_count integer)", """ RETURNS integer AS $$ BEGIN insert into customer (name, order_count) VALUES (in_name, in_order_count); END; $$ LANGUAGE plpgsql; """ ) The ``ReplaceableObject`` class is only one very simplistic way to do this. The structure of how we represent our schema objects is not too important for the purposes of this example; we can just as well put strings inside of tuples or dictionaries, as well as that we could define any kind of series of fields and class structures we want. The only important part is that below we will illustrate how organize the code that can consume the structure we create here. Create Operations for the Target Objects ---------------------------------------- We'll use the :class:`.Operations` extension API to make new operations for create, drop, and replace of views and stored procedures. Using this API is also optional; we can just as well make any kind of Python function that we would invoke from our migration scripts. However, using this API gives us operations built directly into the Alembic ``op.*`` namespace very nicely. The most intricate class is below. This is the base of our "replaceable" operation, which includes not just a base operation for emitting CREATE and DROP instructions on a ``ReplaceableObject``, it also assumes a certain model of "reversibility" which makes use of references to other migration files in order to refer to the "previous" version of an object:: from alembic.operations import Operations, MigrateOperation class ReversibleOp(MigrateOperation): def __init__(self, target): self.target = target @classmethod def invoke_for_target(cls, operations, target): op = cls(target) return operations.invoke(op) def reverse(self): raise NotImplementedError() @classmethod def _get_object_from_version(cls, operations, ident): version, objname = ident.split(".") module = operations.get_context().script.get_revision(version).module obj = getattr(module, objname) return obj @classmethod def replace(cls, operations, target, replaces=None, replace_with=None): if replaces: old_obj = cls._get_object_from_version(operations, replaces) drop_old = cls(old_obj).reverse() create_new = cls(target) elif replace_with: old_obj = cls._get_object_from_version(operations, replace_with) drop_old = cls(target).reverse() create_new = cls(old_obj) else: raise TypeError("replaces or replace_with is required") operations.invoke(drop_old) operations.invoke(create_new) The workings of this class should become clear as we walk through the example. 
To create usable operations from this base, we will build a series of stub classes and use :meth:`.Operations.register_operation` to make them part of the ``op.*`` namespace:: @Operations.register_operation("create_view", "invoke_for_target") @Operations.register_operation("replace_view", "replace") class CreateViewOp(ReversibleOp): def reverse(self): return DropViewOp(self.target) @Operations.register_operation("drop_view", "invoke_for_target") class DropViewOp(ReversibleOp): def reverse(self): return CreateViewOp(self.target) @Operations.register_operation("create_sp", "invoke_for_target") @Operations.register_operation("replace_sp", "replace") class CreateSPOp(ReversibleOp): def reverse(self): return DropSPOp(self.target) @Operations.register_operation("drop_sp", "invoke_for_target") class DropSPOp(ReversibleOp): def reverse(self): return CreateSPOp(self.target) To actually run the SQL like "CREATE VIEW" and "DROP SEQUENCE", we'll provide implementations using :meth:`.Operations.implementation_for` that run straight into :meth:`.Operations.execute`:: @Operations.implementation_for(CreateViewOp) def create_view(operations, operation): operations.execute("CREATE VIEW %s AS %s" % ( operation.target.name, operation.target.sqltext )) @Operations.implementation_for(DropViewOp) def drop_view(operations, operation): operations.execute("DROP VIEW %s" % operation.target.name) @Operations.implementation_for(CreateSPOp) def create_sp(operations, operation): operations.execute( "CREATE FUNCTION %s %s" % ( operation.target.name, operation.target.sqltext ) ) @Operations.implementation_for(DropSPOp) def drop_sp(operations, operation): operations.execute("DROP FUNCTION %s" % operation.target.name) All of the above code can be present anywhere within an application's source tree; the only requirement is that when the ``env.py`` script is invoked, it includes imports that ultimately call upon these classes as well as the :meth:`.Operations.register_operation` and :meth:`.Operations.implementation_for` sequences. Create Initial Migrations ------------------------- We can now illustrate how these objects look during use. For the first step, we'll create a new migration to create a "customer" table:: $ alembic revision -m "create table" We build the first revision as follows:: """create table Revision ID: 3ab8b2dfb055 Revises: Create Date: 2015-07-27 16:22:44.918507 """ # revision identifiers, used by Alembic. revision = '3ab8b2dfb055' down_revision = None branch_labels = None depends_on = None from alembic import op import sqlalchemy as sa def upgrade(): op.create_table( "customer", sa.Column('id', sa.Integer, primary_key=True), sa.Column('name', sa.String), sa.Column('order_count', sa.Integer), ) def downgrade(): op.drop_table('customer') For the second migration, we will create a view and a stored procedure which act upon this table:: $ alembic revision -m "create views/sp" This migration will use the new directives:: """create views/sp Revision ID: 28af9800143f Revises: 3ab8b2dfb055 Create Date: 2015-07-27 16:24:03.589867 """ # revision identifiers, used by Alembic. 
revision = '28af9800143f' down_revision = '3ab8b2dfb055' branch_labels = None depends_on = None from alembic import op import sqlalchemy as sa from foo import ReplaceableObject customer_view = ReplaceableObject( "customer_view", "SELECT name, order_count FROM customer WHERE order_count > 0" ) add_customer_sp = ReplaceableObject( "add_customer_sp(name varchar, order_count integer)", """ RETURNS integer AS $$ BEGIN insert into customer (name, order_count) VALUES (in_name, in_order_count); END; $$ LANGUAGE plpgsql; """ ) def upgrade(): op.create_view(customer_view) op.create_sp(add_customer_sp) def downgrade(): op.drop_view(customer_view) op.drop_sp(add_customer_sp) We see the use of our new ``create_view()``, ``create_sp()``, ``drop_view()``, and ``drop_sp()`` directives. Running these to "head" we get the following (this includes an edited view of SQL emitted):: $ alembic upgrade 28af9800143 INFO [alembic.runtime.migration] Context impl PostgresqlImpl. INFO [alembic.runtime.migration] Will assume transactional DDL. INFO [sqlalchemy.engine.base.Engine] BEGIN (implicit) INFO [sqlalchemy.engine.base.Engine] select relname from pg_class c join pg_namespace n on n.oid=c.relnamespace where pg_catalog.pg_table_is_visible(c.oid) and relname=%(name)s INFO [sqlalchemy.engine.base.Engine] {'name': u'alembic_version'} INFO [sqlalchemy.engine.base.Engine] SELECT alembic_version.version_num FROM alembic_version INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] select relname from pg_class c join pg_namespace n on n.oid=c.relnamespace where pg_catalog.pg_table_is_visible(c.oid) and relname=%(name)s INFO [sqlalchemy.engine.base.Engine] {'name': u'alembic_version'} INFO [alembic.runtime.migration] Running upgrade -> 3ab8b2dfb055, create table INFO [sqlalchemy.engine.base.Engine] CREATE TABLE customer ( id SERIAL NOT NULL, name VARCHAR, order_count INTEGER, PRIMARY KEY (id) ) INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] INSERT INTO alembic_version (version_num) VALUES ('3ab8b2dfb055') INFO [sqlalchemy.engine.base.Engine] {} INFO [alembic.runtime.migration] Running upgrade 3ab8b2dfb055 -> 28af9800143f, create views/sp INFO [sqlalchemy.engine.base.Engine] CREATE VIEW customer_view AS SELECT name, order_count FROM customer WHERE order_count > 0 INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] CREATE FUNCTION add_customer_sp(name varchar, order_count integer) RETURNS integer AS $$ BEGIN insert into customer (name, order_count) VALUES (in_name, in_order_count); END; $$ LANGUAGE plpgsql; INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] UPDATE alembic_version SET version_num='28af9800143f' WHERE alembic_version.version_num = '3ab8b2dfb055' INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] COMMIT We see that our CREATE TABLE proceeded as well as the CREATE VIEW and CREATE FUNCTION operations produced by our new directives. Create Revision Migrations -------------------------- Finally, we can illustrate how we would "revise" these objects. Let's consider we added a new column ``email`` to our ``customer`` table:: $ alembic revision -m "add email col" The migration is:: """add email col Revision ID: 191a2d20b025 Revises: 28af9800143f Create Date: 2015-07-27 16:25:59.277326 """ # revision identifiers, used by Alembic. 
revision = '191a2d20b025' down_revision = '28af9800143f' branch_labels = None depends_on = None from alembic import op import sqlalchemy as sa def upgrade(): op.add_column("customer", sa.Column("email", sa.String())) def downgrade(): op.drop_column("customer", "email") We now need to recreate the ``customer_view`` view and the ``add_customer_sp`` function. To include downgrade capability, we will need to refer to the **previous** version of the construct; the ``replace_view()`` and ``replace_sp()`` operations we've created make this possible, by allowing us to refer to a specific, previous revision. the ``replaces`` and ``replace_with`` arguments accept a dot-separated string, which refers to a revision number and an object name, such as ``"28af9800143f.customer_view"``. The ``ReversibleOp`` class makes use of the :meth:`.Operations.get_context` method to locate the version file we refer to:: $ alembic revision -m "update views/sp" The migration:: """update views/sp Revision ID: 199028bf9856 Revises: 191a2d20b025 Create Date: 2015-07-27 16:26:31.344504 """ # revision identifiers, used by Alembic. revision = '199028bf9856' down_revision = '191a2d20b025' branch_labels = None depends_on = None from alembic import op import sqlalchemy as sa from foo import ReplaceableObject customer_view = ReplaceableObject( "customer_view", "SELECT name, order_count, email " "FROM customer WHERE order_count > 0" ) add_customer_sp = ReplaceableObject( "add_customer_sp(name varchar, order_count integer, email varchar)", """ RETURNS integer AS $$ BEGIN insert into customer (name, order_count, email) VALUES (in_name, in_order_count, email); END; $$ LANGUAGE plpgsql; """ ) def upgrade(): op.replace_view(customer_view, replaces="28af9800143f.customer_view") op.replace_sp(add_customer_sp, replaces="28af9800143f.add_customer_sp") def downgrade(): op.replace_view(customer_view, replace_with="28af9800143f.customer_view") op.replace_sp(add_customer_sp, replace_with="28af9800143f.add_customer_sp") Above, instead of using ``create_view()``, ``create_sp()``, ``drop_view()``, and ``drop_sp()`` methods, we now use ``replace_view()`` and ``replace_sp()``. The replace operation we've built always runs a DROP *and* a CREATE. Running an upgrade to head we see:: $ alembic upgrade head INFO [alembic.runtime.migration] Context impl PostgresqlImpl. INFO [alembic.runtime.migration] Will assume transactional DDL. 
INFO [sqlalchemy.engine.base.Engine] BEGIN (implicit) INFO [sqlalchemy.engine.base.Engine] select relname from pg_class c join pg_namespace n on n.oid=c.relnamespace where pg_catalog.pg_table_is_visible(c.oid) and relname=%(name)s INFO [sqlalchemy.engine.base.Engine] {'name': u'alembic_version'} INFO [sqlalchemy.engine.base.Engine] SELECT alembic_version.version_num FROM alembic_version INFO [sqlalchemy.engine.base.Engine] {} INFO [alembic.runtime.migration] Running upgrade 28af9800143f -> 191a2d20b025, add email col INFO [sqlalchemy.engine.base.Engine] ALTER TABLE customer ADD COLUMN email VARCHAR INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] UPDATE alembic_version SET version_num='191a2d20b025' WHERE alembic_version.version_num = '28af9800143f' INFO [sqlalchemy.engine.base.Engine] {} INFO [alembic.runtime.migration] Running upgrade 191a2d20b025 -> 199028bf9856, update views/sp INFO [sqlalchemy.engine.base.Engine] DROP VIEW customer_view INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] CREATE VIEW customer_view AS SELECT name, order_count, email FROM customer WHERE order_count > 0 INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] DROP FUNCTION add_customer_sp(name varchar, order_count integer) INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] CREATE FUNCTION add_customer_sp(name varchar, order_count integer, email varchar) RETURNS integer AS $$ BEGIN insert into customer (name, order_count, email) VALUES (in_name, in_order_count, email); END; $$ LANGUAGE plpgsql; INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] UPDATE alembic_version SET version_num='199028bf9856' WHERE alembic_version.version_num = '191a2d20b025' INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] COMMIT After adding our new ``email`` column, we see that both ``customer_view`` and ``add_customer_sp()`` are dropped before the new version is created. If we downgrade back to the old version, we see the old version of these recreated again within the downgrade for this migration:: $ alembic downgrade 28af9800143 INFO [alembic.runtime.migration] Context impl PostgresqlImpl. INFO [alembic.runtime.migration] Will assume transactional DDL. 
INFO [sqlalchemy.engine.base.Engine] BEGIN (implicit) INFO [sqlalchemy.engine.base.Engine] select relname from pg_class c join pg_namespace n on n.oid=c.relnamespace where pg_catalog.pg_table_is_visible(c.oid) and relname=%(name)s INFO [sqlalchemy.engine.base.Engine] {'name': u'alembic_version'} INFO [sqlalchemy.engine.base.Engine] SELECT alembic_version.version_num FROM alembic_version INFO [sqlalchemy.engine.base.Engine] {} INFO [alembic.runtime.migration] Running downgrade 199028bf9856 -> 191a2d20b025, update views/sp INFO [sqlalchemy.engine.base.Engine] DROP VIEW customer_view INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] CREATE VIEW customer_view AS SELECT name, order_count FROM customer WHERE order_count > 0 INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] DROP FUNCTION add_customer_sp(name varchar, order_count integer, email varchar) INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] CREATE FUNCTION add_customer_sp(name varchar, order_count integer) RETURNS integer AS $$ BEGIN insert into customer (name, order_count) VALUES (in_name, in_order_count); END; $$ LANGUAGE plpgsql; INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] UPDATE alembic_version SET version_num='191a2d20b025' WHERE alembic_version.version_num = '199028bf9856' INFO [sqlalchemy.engine.base.Engine] {} INFO [alembic.runtime.migration] Running downgrade 191a2d20b025 -> 28af9800143f, add email col INFO [sqlalchemy.engine.base.Engine] ALTER TABLE customer DROP COLUMN email INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] UPDATE alembic_version SET version_num='28af9800143f' WHERE alembic_version.version_num = '191a2d20b025' INFO [sqlalchemy.engine.base.Engine] {} INFO [sqlalchemy.engine.base.Engine] COMMIT Don't Generate Empty Migrations with Autogenerate ================================================= A common request is to have the ``alembic revision --autogenerate`` command not actually generate a revision file if no changes to the schema is detected. Using the :paramref:`.EnvironmentContext.configure.process_revision_directives` hook, this is straightforward; place a ``process_revision_directives`` hook in :meth:`.MigrationContext.configure` which removes the single :class:`.MigrationScript` directive if it is empty of any operations:: def run_migrations_online(): # ... def process_revision_directives(context, revision, directives): if config.cmd_opts.autogenerate: script = directives[0] if script.upgrade_ops.is_empty(): directives[:] = [] # connectable = ... with connectable.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata, process_revision_directives=process_revision_directives ) with context.begin_transaction(): context.run_migrations() Don't emit DROP INDEX when the table is to be dropped as well ============================================================= MySQL may complain when dropping an index that is against a column that also has a foreign key constraint on it. If the table is to be dropped in any case, the DROP INDEX isn't necessary. This recipe will process the set of autogenerate directives such that all :class:`.DropIndexOp` directives are removed against tables that themselves are to be dropped:: def run_migrations_online(): # ... 
from alembic.operations import ops def process_revision_directives(context, revision, directives): script = directives[0] # process both "def upgrade()", "def downgrade()" for directive in (script.upgrade_ops, script.downgrade_ops): # make a set of tables that are being dropped within # the migration function tables_dropped = set() for op in directive.ops: if isinstance(op, ops.DropTableOp): tables_dropped.add((op.table_name, op.schema)) # now rewrite the list of "ops" such that DropIndexOp # is removed for those tables. Needs a recursive function. directive.ops = list( _filter_drop_indexes(directive.ops, tables_dropped) ) def _filter_drop_indexes(directives, tables_dropped): # given a set of (tablename, schemaname) to be dropped, filter # out DropIndexOp from the list of directives and yield the result. for directive in directives: # ModifyTableOps is a container of ALTER TABLE types of # commands. process those in place recursively. if isinstance(directive, ops.ModifyTableOps) and \ (directive.table_name, directive.schema) in tables_dropped: directive.ops = list( _filter_drop_indexes(directive.ops, tables_dropped) ) # if we emptied out the directives, then skip the # container altogether. if not directive.ops: continue elif isinstance(directive, ops.DropIndexOp) and \ (directive.table_name, directive.schema) in tables_dropped: # we found a target DropIndexOp. keep looping continue # otherwise if not filtered, yield out the directive yield directive # connectable = ... with connectable.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata, process_revision_directives=process_revision_directives ) with context.begin_transaction(): context.run_migrations() Whereas autogenerate, when dropping two tables with a foreign key and an index, would previously generate something like:: def downgrade(): # ### commands auto generated by Alembic - please adjust! ### op.drop_index(op.f('ix_b_aid'), table_name='b') op.drop_table('b') op.drop_table('a') # ### end Alembic commands ### With the above rewriter, it generates as:: def downgrade(): # ### commands auto generated by Alembic - please adjust! ### op.drop_table('b') op.drop_table('a') # ### end Alembic commands ### Don't generate any DROP TABLE directives with autogenerate ========================================================== When running autogenerate against a database that has existing tables outside of the application's autogenerated metadata, it may be desirable to prevent autogenerate from considering any of those existing tables to be dropped. This will also prevent autogenerate from detecting tables removed from the local metadata; however, this is only a small caveat. The most direct way to achieve this is to use the :paramref:`.EnvironmentContext.configure.include_object` hook.
There is no need to hardcode a fixed "whitelist" of table names; the hook gives enough information in the given arguments to determine if a particular table name is not part of the local :class:`.MetaData` being autogenerated, by checking first that the type of object is ``"table"``, then that ``reflected`` is ``True``, indicating this table name is from the local database connection, not the :class:`.MetaData`, and finally that ``compare_to`` is ``None``, indicating autogenerate is not comparing this :class:`.Table` to any :class:`.Table` in the local :class:`.MetaData` collection:: # in env.py def include_object(object, name, type_, reflected, compare_to): if type_ == "table" and reflected and compare_to is None: return False else: return True context.configure( # ... include_object = include_object ) Don't emit CREATE TABLE statements for Views ============================================ It is sometimes convenient to create :class:`~sqlalchemy.schema.Table` instances for views so that they can be queried using normal SQLAlchemy techniques. Unfortunately this causes Alembic to treat them as tables in need of creation and to generate spurious ``create_table()`` operations. This is easily fixable by flagging such Tables and using the :paramref:`~.EnvironmentContext.configure.include_object` hook to exclude them:: my_view = Table('my_view', metadata, autoload=True, info=dict(is_view=True)) # Flag this as a view Then define ``include_object`` as:: def include_object(object, name, type_, reflected, compare_to): """ Exclude views from Alembic's consideration. """ return not object.info.get('is_view', False) Finally, in ``env.py`` pass your ``include_object`` as a keyword argument to :meth:`.EnvironmentContext.configure`. .. _multiple_environments: Run Multiple Alembic Environments from one .ini file ==================================================== Long before Alembic had the "multiple bases" feature described in :ref:`multiple_bases`, projects had a need to maintain more than one Alembic version history in a single project, where these version histories are completely independent of each other and each refer to their own alembic_version table, either across multiple databases, schemas, or namespaces. A simple approach was added to support this, the ``--name`` flag on the commandline. First, one would create an alembic.ini file of this form:: [DEFAULT] # all defaults shared between environments go here sqlalchemy.url = postgresql://scott:tiger@hostname/mydatabase [schema1] # path to env.py and migration scripts for schema1 script_location = myproject/revisions/schema1 [schema2] # path to env.py and migration scripts for schema2 script_location = myproject/revisions/schema2 [schema3] # path to env.py and migration scripts for schema3 script_location = myproject/revisions/db2 # this schema uses a different database URL as well sqlalchemy.url = postgresql://scott:tiger@hostname/myotherdatabase Above, in the ``[DEFAULT]`` section we set up a default database URL. Then we create three sections corresponding to different revision lineages in our project. Each of these directories would have its own ``env.py`` and set of versioning files. Then when we run the ``alembic`` command, we simply give it the name of the configuration we want to use:: alembic --name schema2 revision -m "new rev for schema 2" --autogenerate Above, the ``alembic`` command makes use of the configuration in ``[schema2]``, populated with defaults from the ``[DEFAULT]`` section. 
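Within each environment's ``env.py``, the options from the selected section are what appear on the :class:`.Config` object, with values from ``[DEFAULT]`` filled in automatically by ``ConfigParser``. As a minimal sketch (assuming the standard ``env.py`` layout produced by ``alembic init``, and the section names from the example above), the per-environment database URL can be read like this::

    # e.g. in myproject/revisions/schema2/env.py
    from alembic import context

    config = context.config

    # returns the value from [schema2], falling back to [DEFAULT];
    # for the [schema3] environment this would instead be the
    # "myotherdatabase" URL
    url = config.get_main_option("sqlalchemy.url")
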
The above approach can be automated by creating a custom front-end to the Alembic commandline as well. Print Python Code to Generate Particular Database Tables ======================================================== Suppose you have a database already, and want to generate some ``op.create_table()`` and other directives that you'd have in a migration file. How can we automate generating that code? Suppose the database schema looks like (assume MySQL):: CREATE TABLE IF NOT EXISTS `users` ( `id` int(11) NOT NULL, KEY `id` (`id`) ); CREATE TABLE IF NOT EXISTS `user_properties` ( `users_id` int(11) NOT NULL, `property_name` varchar(255) NOT NULL, `property_value` mediumtext NOT NULL, UNIQUE KEY `property_name_users_id` (`property_name`,`users_id`), KEY `users_id` (`users_id`), CONSTRAINT `user_properties_ibfk_1` FOREIGN KEY (`users_id`) REFERENCES `users` (`id`) ON DELETE CASCADE ) ENGINE=InnoDB DEFAULT CHARSET=utf8; Using :class:`.ops.UpgradeOps`, :class:`.ops.CreateTableOp`, and :class:`.ops.CreateIndexOp`, we create a migration file structure, using :class:`.Table` objects that we get from SQLAlchemy reflection. The structure is passed to :func:`.autogenerate.render_python_code` to produce the Python code for a migration file:: from sqlalchemy import create_engine from sqlalchemy import MetaData, Table from alembic import autogenerate from alembic.operations import ops e = create_engine("mysql://scott:tiger@localhost/test") with e.connect() as conn: m = MetaData() user_table = Table('users', m, autoload_with=conn) user_property_table = Table('user_properties', m, autoload_with=conn) print(autogenerate.render_python_code( ops.UpgradeOps( ops=[ ops.CreateTableOp.from_table(table) for table in m.tables.values() ] + [ ops.CreateIndexOp.from_index(idx) for table in m.tables.values() for idx in table.indexes ] )) ) Output:: # ### commands auto generated by Alembic - please adjust! ### op.create_table('users', sa.Column('id', mysql.INTEGER(display_width=11), autoincrement=False, nullable=False), mysql_default_charset='latin1', mysql_engine='InnoDB' ) op.create_table('user_properties', sa.Column('users_id', mysql.INTEGER(display_width=11), autoincrement=False, nullable=False), sa.Column('property_name', mysql.VARCHAR(length=255), nullable=False), sa.Column('property_value', mysql.MEDIUMTEXT(), nullable=False), sa.ForeignKeyConstraint(['users_id'], ['users.id'], name='user_properties_ibfk_1', ondelete='CASCADE'), mysql_comment='user properties', mysql_default_charset='utf8', mysql_engine='InnoDB' ) op.create_index('id', 'users', ['id'], unique=False) op.create_index('users_id', 'user_properties', ['users_id'], unique=False) op.create_index('property_name_users_id', 'user_properties', ['property_name', 'users_id'], unique=True) # ### end Alembic commands ### Test current database revision is at head(s) ============================================ A recipe to determine if a database schema is up to date in terms of applying Alembic migrations. May be useful for test or installation suites to determine if the target database is up to date. 
Makes use of the :meth:`.MigrationContext.get_current_heads` as well as :meth:`.ScriptDirectory.get_heads` methods so that it accommodates for a branched revision tree:: from alembic import config, script from alembic.runtime import migration from sqlalchemy import engine def check_current_head(alembic_cfg, connectable): # type: (config.Config, engine.Engine) -> bool directory = script.ScriptDirectory.from_config(alembic_cfg) with connectable.begin() as connection: context = migration.MigrationContext.configure(connection) return set(context.get_current_heads()) == set(directory.get_heads()) e = engine.create_engine("mysql://scott:tiger@localhost/test", echo=True) cfg = config.Config("alembic.ini") print(check_current_head(cfg, e)) .. seealso:: :meth:`.MigrationContext.get_current_heads` :meth:`.ScriptDirectory.get_heads` Support Non-Ascii Migration Scripts / Messages under Python 2 ============================================================== To work with a migration file that has non-ascii characters in it under Python 2, the ``script.py.mako`` file inside of the Alembic environment has to have an encoding comment added to the top that will render into a ``.py`` file: .. code-block:: mako <%text># coding: utf-8 Additionally, individual fields if they are to have non-ascii characters in them may require decode operations on the template values. Such as, if the revision message given on the command line to ``alembic revision`` has non-ascii characters in it, under Python 2 the command interface passes this through as bytes, and Alembic has no decode step built in for this as it is not necessary under Python 3. To decode, add a decoding step to the template for each variable that potentially may have non-ascii characters within it. An example of applying this to the "message" field is as follows: .. code-block:: mako <%! import sys %>\ <%text># coding: utf-8 """${message.decode("utf-8") \ if sys.version_info < (3, ) \ and isinstance(message, str) else message} Revision ID: ${up_revision} Revises: ${down_revision | comma,n} Create Date: ${create_date} """ from alembic import op import sqlalchemy as sa ${imports if imports else ""} # revision identifiers, used by Alembic. revision = ${repr(up_revision)} down_revision = ${repr(down_revision)} branch_labels = ${repr(branch_labels)} depends_on = ${repr(depends_on)} def upgrade(): ${upgrades if upgrades else "pass"} def downgrade(): ${downgrades if downgrades else "pass"} zzzeek-alembic-bee044a1c187/docs/build/front.rst000066400000000000000000000036561353106760100215250ustar00rootroot00000000000000============ Front Matter ============ Information about the Alembic project. Project Homepage ================ Alembic is hosted on GitHub at https://github.com/sqlalchemy/alembic under the SQLAlchemy organization. Releases and project status are available on Pypi at https://pypi.python.org/pypi/alembic. The most recent published version of this documentation should be at https://alembic.sqlalchemy.org. .. _installation: Installation ============ Install released versions of Alembic from the Python package index with `pip `_ or a similar tool:: pip install alembic Installation via source distribution is via the ``setup.py`` script:: python setup.py install The install will add the ``alembic`` command to the environment. All operations with Alembic then proceed through the usage of this command. Dependencies ------------ Alembic's install process will ensure that SQLAlchemy_ is installed, in addition to other dependencies. 
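At the time of this writing, those dependencies as listed in Alembic's own ``setup.py`` are SQLAlchemy, Mako, python-editor and python-dateutil. To confirm what was installed, ordinary pip usage (not an Alembic command) will list the package along with its declared requirements::

    $ pip show alembic
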
Alembic will work with SQLAlchemy as of version **0.9.0**, however more features are available with newer versions such as the 1.1 or 1.2 series. .. versionchanged:: 1.0.0 Support for SQLAlchemy 0.8 and 0.7.9 was dropped. Alembic supports Python versions 2.7, 3.4 and above. .. versionchanged:: 1.0.0 Support for Python 2.6 and 3.3 was dropped. Community ========= Alembic is developed by `Mike Bayer `_, and is loosely associated with the SQLAlchemy_, `Pylons `_, and `Openstack `_ projects. User issues, discussion of potential bugs and features should be posted to the Alembic Google Group at `sqlalchemy-alembic `_. .. _bugs: Bugs ==== Bugs and feature enhancements to Alembic should be reported on the `GitHub issue tracker `_. .. _SQLAlchemy: https://www.sqlalchemy.org zzzeek-alembic-bee044a1c187/docs/build/index.rst000066400000000000000000000010251353106760100214700ustar00rootroot00000000000000=================================== Welcome to Alembic's documentation! =================================== `Alembic `_ is a lightweight database migration tool for usage with the `SQLAlchemy `_ Database Toolkit for Python. .. toctree:: :maxdepth: 3 front tutorial autogenerate offline naming batch branches ops cookbook api/index changelog Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` zzzeek-alembic-bee044a1c187/docs/build/make.bat000066400000000000000000000060051353106760100212370ustar00rootroot00000000000000@ECHO OFF REM Command file for Sphinx documentation set SPHINXBUILD=sphinx-build set BUILDDIR=build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% source if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. changes to make an overview over all changed/added/deprecated items echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp echo. 
echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\Alembic.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\Alembic.ghc goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) :end zzzeek-alembic-bee044a1c187/docs/build/naming.rst000066400000000000000000000217341353106760100216430ustar00rootroot00000000000000.. _tutorial_constraint_names: The Importance of Naming Constraints ==================================== An important topic worth mentioning is that of constraint naming conventions. As we've proceeded here, we've talked about adding tables and columns, and we've also hinted at lots of other operations listed in :ref:`ops` such as those which support adding or dropping constraints like foreign keys and unique constraints. The way these constraints are referred to in migration scripts is by name, however these names by default are in most cases generated by the relational database in use, when the constraint is created. For example, if you emitted two CREATE TABLE statements like this on Postgresql:: test=> CREATE TABLE user_account (id INTEGER PRIMARY KEY); CREATE TABLE test=> CREATE TABLE user_order ( test(> id INTEGER PRIMARY KEY, test(> user_account_id INTEGER REFERENCES user_account(id)); CREATE TABLE Suppose we wanted to DROP the REFERENCES that we just applied to the ``user_order.user_account_id`` column, how do we do that? At the prompt, we'd use ``ALTER TABLE DROP CONSTRAINT ``, or if using Alembic we'd be using :meth:`.Operations.drop_constraint`. But both of those functions need a name - what's the name of this constraint? It does have a name, which in this case we can figure out by looking at the Postgresql catalog tables:: test=> SELECT r.conname FROM test-> pg_catalog.pg_class c JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace test-> JOIN pg_catalog.pg_constraint r ON c.oid = r.conrelid test-> WHERE c.relname='user_order' AND r.contype = 'f' test-> ; conname --------------------------------- user_order_user_account_id_fkey (1 row) The name above is not something that Alembic or SQLAlchemy created; ``user_order_user_account_id_fkey`` is a naming scheme used internally by Postgresql to name constraints that are otherwise not named. This scheme doesn't seem so complicated, and we might want to just use our knowledge of it so that we know what name to use for our :meth:`.Operations.drop_constraint` call. But is that a good idea? What if for example we needed our code to run on Oracle as well. OK, certainly Oracle uses this same scheme, right? Or if not, something similar. Let's check:: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production SQL> CREATE TABLE user_account (id INTEGER PRIMARY KEY); Table created. 
SQL> CREATE TABLE user_order ( 2 id INTEGER PRIMARY KEY, 3 user_account_id INTEGER REFERENCES user_account(id)); Table created. SQL> SELECT constraint_name FROM all_constraints WHERE 2 table_name='USER_ORDER' AND constraint_type in ('R'); CONSTRAINT_NAME ----------------------------------------------------- SYS_C0029334 Oh, we can see that is.....much worse. Oracle's names are entirely unpredictable alphanumeric codes, and this will make being able to write migrations quite tedious, as we'd need to look up all these names. The solution to having to look up names is to make your own names. This is an easy, though tedious thing to do manually. For example, to create our model in SQLAlchemy ensuring we use names for foreign key constraints would look like:: from sqlalchemy import MetaData, Table, Column, Integer, ForeignKey meta = MetaData() user_account = Table('user_account', meta, Column('id', Integer, primary_key=True) ) user_order = Table('user_order', meta, Column('id', Integer, primary_key=True), Column('user_order_id', Integer, ForeignKey('user_account.id', name='fk_user_order_id')) ) Simple enough, though this has some disadvantages. The first is that it's tedious; we need to remember to use a name for every :class:`~sqlalchemy.schema.ForeignKey` object, not to mention every :class:`~sqlalchemy.schema.UniqueConstraint`, :class:`~sqlalchemy.schema.CheckConstraint`, :class:`~sqlalchemy.schema.Index`, and maybe even :class:`~sqlalchemy.schema.PrimaryKeyConstraint` as well if we wish to be able to alter those too, and beyond all that, all the names have to be globally unique. Even with all that effort, if we have a naming scheme in mind, it's easy to get it wrong when doing it manually each time. What's worse is that manually naming constraints (and indexes) gets even more tedious in that we can no longer use convenience features such as the ``.unique=True`` or ``.index=True`` flag on :class:`~sqlalchemy.schema.Column`:: user_account = Table('user_account', meta, Column('id', Integer, primary_key=True), Column('name', String(50), unique=True) ) Above, the ``unique=True`` flag creates a :class:`~sqlalchemy.schema.UniqueConstraint`, but again, it's not named. If we want to name it, manually we have to forego the usage of ``unique=True`` and type out the whole constraint:: user_account = Table('user_account', meta, Column('id', Integer, primary_key=True), Column('name', String(50)), UniqueConstraint('name', name='uq_user_account_name') ) There's a solution to all this naming work, which is to use an **automated naming convention**. For some years, SQLAlchemy has encourgaged the use of DDL Events in order to create naming schemes. The :meth:`~sqlalchemy.events.DDLEvents.after_parent_attach` event in particular is the best place to intercept when :class:`~sqlalchemy.schema.Constraint` and :class:`~sqlalchemy.schema.Index` objects are being associated with a parent :class:`~sqlalchemy.schema.Table` object, and to assign a ``.name`` to the constraint while making use of the name of the table and associated columns. But there is also a better way to go, which is to make use of a feature new in SQLAlchemy 0.9.2 which makes use of the events behind the scenes known as :paramref:`~sqlalchemy.schema.MetaData.naming_convention`. 
Here, we can create a new :class:`~sqlalchemy.schema.MetaData` object while passing a dictionary referring to a naming scheme:: convention = { "ix": "ix_%(column_0_label)s", "uq": "uq_%(table_name)s_%(column_0_name)s", "ck": "ck_%(table_name)s_%(constraint_name)s", "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s", "pk": "pk_%(table_name)s" } metadata = MetaData(naming_convention=convention) If we define our models using a :class:`~sqlalchemy.schema.MetaData` as above, the given naming convention dictionary will be used to provide names for all constraints and indexes. .. _autogen_naming_conventions: Integration of Naming Conventions into Operations, Autogenerate ---------------------------------------------------------------- As of Alembic 0.6.4, the naming convention feature is integrated into the :class:`.Operations` object, so that the convention takes effect for any constraint that is otherwise unnamed. The naming convention is passed to :class:`.Operations` using the :paramref:`.EnvironmentContext.configure.target_metadata` parameter in ``env.py``, which is normally configured when autogenerate is used:: # in your application's model: meta = MetaData(naming_convention={ "ix": "ix_%(column_0_label)s", "uq": "uq_%(table_name)s_%(column_0_name)s", "ck": "ck_%(table_name)s_%(constraint_name)s", "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s", "pk": "pk_%(table_name)s" }) Base = declarative_base(metadata=meta) # .. in your Alembic env.py: # add your model's MetaData object here # for 'autogenerate' support from myapp import mymodel target_metadata = mymodel.Base.metadata # ... def run_migrations_online(): # ... context.configure( connection=connection, target_metadata=target_metadata ) Above, when we render a directive like the following:: op.add_column('sometable', Column('q', Boolean(name='q_bool'))) The Boolean type will render a CHECK constraint with the name ``"ck_sometable_q_bool"``, assuming the backend in use does not support native boolean types. We can also use op directives with constraints and not give them a name at all, if the naming convention doesn't require one. The value of ``None`` will be converted into a name that follows the appropriate naming conventions:: def upgrade(): op.create_unique_constraint(None, 'some_table', 'x') When autogenerate renders constraints in a migration script, it typically renders them with their completed name. If using at least Alembic 0.6.4 as well as SQLAlchemy 0.9.4, these will be rendered with a special directive :meth:`.Operations.f` which denotes that the string has already been tokenized:: def upgrade(): op.create_unique_constraint(op.f('uq_const_x'), 'some_table', 'x') For more detail on the naming convention feature, see :ref:`sqla:constraint_naming_conventions`. zzzeek-alembic-bee044a1c187/docs/build/offline.rst000066400000000000000000000127641353106760100220170ustar00rootroot00000000000000Generating SQL Scripts (a.k.a. "Offline Mode") ============================================== A major capability of Alembic is to generate migrations as SQL scripts, instead of running them against the database - this is also referred to as *offline mode*. This is a critical feature when working in large organizations where access to DDL is restricted, and SQL scripts must be handed off to DBAs. Alembic makes this easy via the ``--sql`` option passed to any ``upgrade`` or ``downgrade`` command.
We can, for example, generate a script that revises up to rev ``ae1027a6acf``:: $ alembic upgrade ae1027a6acf --sql INFO [alembic.context] Context class PostgresqlContext. INFO [alembic.context] Will assume transactional DDL. BEGIN; CREATE TABLE alembic_version ( version_num VARCHAR(32) NOT NULL ); INFO [alembic.context] Running upgrade None -> 1975ea83b712 CREATE TABLE account ( id SERIAL NOT NULL, name VARCHAR(50) NOT NULL, description VARCHAR(200), PRIMARY KEY (id) ); INFO [alembic.context] Running upgrade 1975ea83b712 -> ae1027a6acf ALTER TABLE account ADD COLUMN last_transaction_date TIMESTAMP WITHOUT TIME ZONE; INSERT INTO alembic_version (version_num) VALUES ('ae1027a6acf'); COMMIT; While the logging configuration dumped to standard error, the actual script was dumped to standard output - so in the absence of further configuration (described later in this section), we'd at first be using output redirection to generate a script:: $ alembic upgrade ae1027a6acf --sql > migration.sql Getting the Start Version -------------------------- Notice that our migration script started at the base - this is the default when using offline mode, as no database connection is present and there's no ``alembic_version`` table to read from. One way to provide a starting version in offline mode is to provide a range to the command line. This is accomplished by providing the "version" in ``start:end`` syntax:: $ alembic upgrade 1975ea83b712:ae1027a6acf --sql > migration.sql The ``start:end`` syntax is only allowed in offline mode; in "online" mode, the ``alembic_version`` table is always used to get at the current version. It's also possible to have the ``env.py`` script retrieve the "last" version from the local environment, such as from a local file. A scheme like this would basically treat a local file in the same way ``alembic_version`` works:: if context.is_offline_mode(): version_file = os.path.join(os.path.dirname(config.config_file_name), "version.txt") if os.path.exists(version_file): current_version = open(version_file).read() else: current_version = None context.configure(dialect_name=engine.name, starting_rev=current_version) context.run_migrations() end_version = context.get_revision_argument() if end_version and end_version != current_version: open(version_file, 'w').write(end_version) Writing Migration Scripts to Support Script Generation ------------------------------------------------------ The challenge of SQL script generation is that the scripts we generate can't rely upon any client/server database access. This means a migration script that pulls some rows into memory via a ``SELECT`` statement will not work in ``--sql`` mode. It's also important that the Alembic directives, all of which are designed specifically to work in both "live execution" as well as "offline SQL generation" mode, are used. Customizing the Environment --------------------------- Users of the ``--sql`` option are encouraged to hack their ``env.py`` files to suit their needs. The ``env.py`` script as provided is broken into two sections: ``run_migrations_online()`` and ``run_migrations_offline()``. Which function is run is determined at the bottom of the script by reading :meth:`.EnvironmentContext.is_offline_mode`, which basically determines if the ``--sql`` flag was enabled. 
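In the stock templates this determination is just a short conditional at the bottom of ``env.py``; a minimal sketch of the pattern::

    # at the bottom of env.py, as generated by the standard templates
    if context.is_offline_mode():
        run_migrations_offline()
    else:
        run_migrations_online()
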
For example, a multiple database configuration may want to run through each database and set the output of the migrations to different named files - the :meth:`.EnvironmentContext.configure` function accepts a parameter ``output_buffer`` for this purpose. Below we illustrate this within the ``run_migrations_offline()`` function:: from alembic import context import myapp import sys db_1 = myapp.db_1 db_2 = myapp.db_2 def run_migrations_offline(): """Run migrations *without* a SQL connection.""" for name, engine, file_ in [ ("db1", db_1, "db1.sql"), ("db2", db_2, "db2.sql"), ]: context.configure( url=engine.url, transactional_ddl=False, output_buffer=open(file_, 'w')) context.execute("-- running migrations for '%s'" % name) context.run_migrations(name=name) sys.stderr.write("Wrote file '%s'" % file_) def run_migrations_online(): """Run migrations *with* a SQL connection.""" for name, engine in [ ("db1", db_1), ("db2", db_2), ]: connection = engine.connect() context.configure(connection=connection) try: context.run_migrations(name=name) session.commit() except: session.rollback() raise if context.is_offline_mode(): run_migrations_offline() else: run_migrations_online() zzzeek-alembic-bee044a1c187/docs/build/ops.rst000066400000000000000000000036161353106760100211720ustar00rootroot00000000000000.. _ops: =================== Operation Reference =================== This file provides documentation on Alembic migration directives. The directives here are used within user-defined migration files, within the ``upgrade()`` and ``downgrade()`` functions, as well as any functions further invoked by those. All directives exist as methods on a class called :class:`.Operations`. When migration scripts are run, this object is made available to the script via the ``alembic.op`` datamember, which is a *proxy* to an actual instance of :class:`.Operations`. Currently, ``alembic.op`` is a real Python module, populated with individual proxies for each method on :class:`.Operations`, so symbols can be imported safely from the ``alembic.op`` namespace. The :class:`.Operations` system is also fully extensible. See :ref:`operation_plugins` for details on this. A key design philosophy to the :ref:`alembic.operations.toplevel` methods is that to the greatest degree possible, they internally generate the appropriate SQLAlchemy metadata, typically involving :class:`~sqlalchemy.schema.Table` and :class:`~sqlalchemy.schema.Constraint` objects. This so that migration instructions can be given in terms of just the string names and/or flags involved. The exceptions to this rule include the :meth:`~.Operations.add_column` and :meth:`~.Operations.create_table` directives, which require full :class:`~sqlalchemy.schema.Column` objects, though the table metadata is still generated here. The functions here all require that a :class:`.MigrationContext` has been configured within the ``env.py`` script first, which is typically via :meth:`.EnvironmentContext.configure`. Under normal circumstances they are called from an actual migration script, which itself would be invoked by the :meth:`.EnvironmentContext.run_migrations` method. .. module:: alembic.operations .. autoclass:: Operations .. 
autoclass:: BatchOperations zzzeek-alembic-bee044a1c187/docs/build/requirements.txt000066400000000000000000000004051353106760100231140ustar00rootroot00000000000000git+https://github.com/sqlalchemyorg/changelog.git#egg=changelog git+https://github.com/sqlalchemyorg/sphinx-paramlinks.git#egg=sphinx-paramlinks git+https://github.com/sqlalchemy/sqlalchemy.git python-dateutil # because there's a dependency in pyfiles.py Mako zzzeek-alembic-bee044a1c187/docs/build/tutorial.rst000066400000000000000000000534561353106760100222430ustar00rootroot00000000000000======== Tutorial ======== Alembic provides for the creation, management, and invocation of *change management* scripts for a relational database, using SQLAlchemy as the underlying engine. This tutorial will provide a full introduction to the theory and usage of this tool. To begin, make sure Alembic is installed as described at :ref:`installation`. The Migration Environment ========================== Usage of Alembic starts with creation of the *Migration Environment*. This is a directory of scripts that is specific to a particular application. The migration environment is created just once, and is then maintained along with the application's source code itself. The environment is created using the ``init`` command of Alembic, and is then customizable to suit the specific needs of the application. The structure of this environment, including some generated migration scripts, looks like:: yourproject/ alembic/ env.py README script.py.mako versions/ 3512b954651e_add_account.py 2b1ae634e5cd_add_order_id.py 3adcc9a56557_rename_username_field.py The directory includes these directories/files: * ``yourproject`` - this is the root of your application's source code, or some directory within it. * ``alembic`` - this directory lives within your application's source tree and is the home of the migration environment. It can be named anything, and a project that uses multiple databases may even have more than one. * ``env.py`` - This is a Python script that is run whenever the alembic migration tool is invoked. At the very least, it contains instructions to configure and generate a SQLAlchemy engine, procure a connection from that engine along with a transaction, and then invoke the migration engine, using the connection as a source of database connectivity. The ``env.py`` script is part of the generated environment so that the way migrations run is entirely customizable. The exact specifics of how to connect are here, as well as the specifics of how the migration environment are invoked. The script can be modified so that multiple engines can be operated upon, custom arguments can be passed into the migration environment, application-specific libraries and models can be loaded in and made available. Alembic includes a set of initialization templates which feature different varieties of ``env.py`` for different use cases. * ``README`` - included with the various environment templates, should have something informative. * ``script.py.mako`` - This is a `Mako `_ template file which is used to generate new migration scripts. Whatever is here is used to generate new files within ``versions/``. This is scriptable so that the structure of each migration file can be controlled, including standard imports to be within each, as well as changes to the structure of the ``upgrade()`` and ``downgrade()`` functions. For example, the ``multidb`` environment allows for multiple functions to be generated using a naming scheme ``upgrade_engine1()``, ``upgrade_engine2()``. 
* ``versions/`` - This directory holds the individual version scripts. Users of other migration tools may notice that the files here don't use ascending integers, and instead use a partial GUID approach. In Alembic, the ordering of version scripts is relative to directives within the scripts themselves, and it is theoretically possible to "splice" version files in between others, allowing migration sequences from different branches to be merged, albeit carefully by hand. Creating an Environment ======================= With a basic understanding of what the environment is, we can create one using ``alembic init``. This will create an environment using the "generic" template:: $ cd yourproject $ alembic init alembic Where above, the ``init`` command was called to generate a migrations directory called ``alembic``:: Creating directory /path/to/yourproject/alembic...done Creating directory /path/to/yourproject/alembic/versions...done Generating /path/to/yourproject/alembic.ini...done Generating /path/to/yourproject/alembic/env.py...done Generating /path/to/yourproject/alembic/README...done Generating /path/to/yourproject/alembic/script.py.mako...done Please edit configuration/connection/logging settings in '/path/to/yourproject/alembic.ini' before proceeding. Alembic also includes other environment templates. These can be listed out using the ``list_templates`` command:: $ alembic list_templates Available templates: generic - Generic single-database configuration. multidb - Rudimentary multi-database configuration. pylons - Configuration that reads from a Pylons project environment. Templates are used via the 'init' command, e.g.: alembic init --template pylons ./scripts Editing the .ini File ===================== Alembic placed a file ``alembic.ini`` into the current directory. This is a file that the ``alembic`` script looks for when invoked. This file can be anywhere, either in the same directory from which the ``alembic`` script will normally be invoked, or if in a different directory, can be specified by using the ``--config`` option to the ``alembic`` runner. The file generated with the "generic" configuration looks like:: # A generic, single database configuration. [alembic] # path to migration scripts script_location = alembic # template used to generate migration files # file_template = %%(rev)s_%%(slug)s # timezone to use when rendering the date # within the migration file as well as the filename. # string value is passed to dateutil.tz.gettz() # leave blank for localtime # timezone = # max length of characters to apply to the # "slug" field # truncate_slug_length = 40 # set to 'true' to run the environment during # the 'revision' command, regardless of autogenerate # revision_environment = false # set to 'true' to allow .pyc and .pyo files without # a source .py file to be detected as revisions in the # versions/ directory # sourceless = false # version location specification; this defaults # to alembic/versions. 
When using multiple version # directories, initial revisions must be specified with --version-path # version_locations = %(here)s/bar %(here)s/bat alembic/versions # the output encoding used when revision files # are written from script.py.mako # output_encoding = utf-8 sqlalchemy.url = driver://user:pass@localhost/dbname # Logging configuration [loggers] keys = root,sqlalchemy,alembic [handlers] keys = console [formatters] keys = generic [logger_root] level = WARN handlers = console qualname = [logger_sqlalchemy] level = WARN handlers = qualname = sqlalchemy.engine [logger_alembic] level = INFO handlers = qualname = alembic [handler_console] class = StreamHandler args = (sys.stderr,) level = NOTSET formatter = generic [formatter_generic] format = %(levelname)-5.5s [%(name)s] %(message)s datefmt = %H:%M:%S The file is read using Python's :class:`ConfigParser.SafeConfigParser` object. The ``%(here)s`` variable is provided as a substitution variable, which can be used to produce absolute pathnames to directories and files, as we do above with the path to the Alembic script location. This file contains the following features: * ``[alembic]`` - this is the section read by Alembic to determine configuration. Alembic itself does not directly read any other areas of the file. The name "alembic" can be customized using the ``--name`` commandline flag; see :ref:`multiple_environments` for a basic example of this. * ``script_location`` - this is the location of the Alembic environment. It is normally specified as a filesystem location, either relative or absolute. If the location is a relative path, it's interpreted as relative to the current directory. This is the only key required by Alembic in all cases. The generation of the .ini file by the command ``alembic init alembic`` automatically placed the directory name ``alembic`` here. The special variable ``%(here)s`` can also be used, as in ``%(here)s/alembic``. For support of applications that package themselves into .egg files, the value can also be specified as a `package resource `_, in which case ``resource_filename()`` is used to find the file (new in 0.2.2). Any non-absolute URI which contains colons is interpreted here as a resource name, rather than a straight filename. * ``file_template`` - this is the naming scheme used to generate new migration files. The value present is the default, so is commented out. Tokens available include: * ``%%(rev)s`` - revision id * ``%%(slug)s`` - a truncated string derived from the revision message * ``%%(year)d``, ``%%(month).2d``, ``%%(day).2d``, ``%%(hour).2d``, ``%%(minute).2d``, ``%%(second).2d`` - components of the create date, by default ``datetime.datetime.now()`` unless the ``timezone`` configuration option is also used. * ``timezone`` - an optional timezone name (e.g. ``UTC``, ``EST5EDT``, etc.) that will be applied to the timestamp which renders inside the migration file's comment as well as within the filename. If ``timezone`` is specified, the create date object is no longer derived from ``datetime.datetime.now()`` and is instead generated as:: datetime.datetime.utcnow().replace( tzinfo=dateutil.tz.tzutc() ).astimezone( dateutil.tz.gettz() ) .. versionadded:: 0.9.2 * ``truncate_slug_length`` - defaults to 40, the max number of characters to include in the "slug" field. .. versionadded:: 0.6.1 - added ``truncate_slug_length`` configuration * ``sqlalchemy.url`` - A URL to connect to the database via SQLAlchemy. 
This key is in fact only referenced within the ``env.py`` file that is specific to the "generic" configuration; a file that can be customized by the developer. A multiple database configuration may respond to multiple keys here, or may reference other sections of the file. * ``revision_environment`` - this is a flag which when set to the value 'true', will indicate that the migration environment script ``env.py`` should be run unconditionally when generating new revision files, as well as when running the ``alembic history`` command. .. versionchanged:: 0.9.6 the ``alembic history`` command uses the environment unconditionally when ``revision_environment`` is set to true. * ``sourceless`` - when set to 'true', revision files that only exist as .pyc or .pyo files in the versions directory will be used as versions, allowing "sourceless" versioning folders. When left at the default of 'false', only .py files are consumed as version files. .. versionadded:: 0.6.4 * ``version_locations`` - an optional list of revision file locations, to allow revisions to exist in multiple directories simultaneously. See :ref:`multiple_bases` for examples. .. versionadded:: 0.7.0 * ``output_encoding`` - the encoding to use when Alembic writes the ``script.py.mako`` file into a new migration file. Defaults to ``'utf-8'``. .. versionadded:: 0.7.0 * ``[loggers]``, ``[handlers]``, ``[formatters]``, ``[logger_*]``, ``[handler_*]``, ``[formatter_*]`` - these sections are all part of Python's standard logging configuration, the mechanics of which are documented at `Configuration File Format `_. As is the case with the database connection, these directives are used directly as the result of the ``logging.config.fileConfig()`` call present in the ``env.py`` script, which you're free to modify. For starting up with just a single database and the generic configuration, setting up the SQLAlchemy URL is all that's needed:: sqlalchemy.url = postgresql://scott:tiger@localhost/test .. _create_migration: Create a Migration Script ========================= With the environment in place we can create a new revision, using ``alembic revision``:: $ alembic revision -m "create account table" Generating /path/to/yourproject/alembic/versions/1975ea83b712_create_accoun t_table.py...done A new file ``1975ea83b712_create_account_table.py`` is generated. Looking inside the file:: """create account table Revision ID: 1975ea83b712 Revises: Create Date: 2011-11-08 11:40:27.089406 """ # revision identifiers, used by Alembic. revision = '1975ea83b712' down_revision = None branch_labels = None from alembic import op import sqlalchemy as sa def upgrade(): pass def downgrade(): pass The file contains some header information, identifiers for the current revision and a "downgrade" revision, an import of basic Alembic directives, and empty ``upgrade()`` and ``downgrade()`` functions. Our job here is to populate the ``upgrade()`` and ``downgrade()`` functions with directives that will apply a set of changes to our database. Typically, ``upgrade()`` is required while ``downgrade()`` is only needed if down-revision capability is desired, though it's probably a good idea. Another thing to notice is the ``down_revision`` variable. This is how Alembic knows the correct order in which to apply migrations. When we create the next revision, the new file's ``down_revision`` identifier would point to this one:: # revision identifiers, used by Alembic. 
revision = 'ae1027a6acf' down_revision = '1975ea83b712' Every time Alembic runs an operation against the ``versions/`` directory, it reads all the files in, and composes a list based on how the ``down_revision`` identifiers link together, with the ``down_revision`` of ``None`` representing the first file. In theory, if a migration environment had thousands of migrations, this could begin to add some latency to startup, but in practice a project should probably prune old migrations anyway (see the section :ref:`building_uptodate` for a description on how to do this, while maintaining the ability to build the current database fully). We can then add some directives to our script, suppose adding a new table ``account``:: def upgrade(): op.create_table( 'account', sa.Column('id', sa.Integer, primary_key=True), sa.Column('name', sa.String(50), nullable=False), sa.Column('description', sa.Unicode(200)), ) def downgrade(): op.drop_table('account') :meth:`~.Operations.create_table` and :meth:`~.Operations.drop_table` are Alembic directives. Alembic provides all the basic database migration operations via these directives, which are designed to be as simple and minimalistic as possible; there's no reliance upon existing table metadata for most of these directives. They draw upon a global "context" that indicates how to get at a database connection (if any; migrations can dump SQL/DDL directives to files as well) in order to invoke the command. This global context is set up, like everything else, in the ``env.py`` script. An overview of all Alembic directives is at :ref:`ops`. Running our First Migration =========================== We now want to run our migration. Assuming our database is totally clean, it's as yet unversioned. The ``alembic upgrade`` command will run upgrade operations, proceeding from the current database revision, in this example ``None``, to the given target revision. We can specify ``1975ea83b712`` as the revision we'd like to upgrade to, but it's easier in most cases just to tell it "the most recent", in this case ``head``:: $ alembic upgrade head INFO [alembic.context] Context class PostgresqlContext. INFO [alembic.context] Will assume transactional DDL. INFO [alembic.context] Running upgrade None -> 1975ea83b712 Wow that rocked! Note that the information we see on the screen is the result of the logging configuration set up in ``alembic.ini`` - logging the ``alembic`` stream to the console (standard error, specifically). The process which occurred here included that Alembic first checked if the database had a table called ``alembic_version``, and if not, created it. It looks in this table for the current version, if any, and then calculates the path from this version to the version requested, in this case ``head``, which is known to be ``1975ea83b712``. It then invokes the ``upgrade()`` method in each file to get to the target revision. Running our Second Migration ============================= Let's do another one so we have some things to play with. We again create a revision file:: $ alembic revision -m "Add a column" Generating /path/to/yourapp/alembic/versions/ae1027a6acf_add_a_column.py... done Let's edit this file and add a new column to the ``account`` table:: """Add a column Revision ID: ae1027a6acf Revises: 1975ea83b712 Create Date: 2011-11-08 12:37:36.714947 """ # revision identifiers, used by Alembic. 
revision = 'ae1027a6acf' down_revision = '1975ea83b712' from alembic import op import sqlalchemy as sa def upgrade(): op.add_column('account', sa.Column('last_transaction_date', sa.DateTime)) def downgrade(): op.drop_column('account', 'last_transaction_date') Running again to ``head``:: $ alembic upgrade head INFO [alembic.context] Context class PostgresqlContext. INFO [alembic.context] Will assume transactional DDL. INFO [alembic.context] Running upgrade 1975ea83b712 -> ae1027a6acf We've now added the ``last_transaction_date`` column to the database. Partial Revision Identifiers ============================= Any time we need to refer to a revision number explicitly, we have the option to use a partial number. As long as this number uniquely identifies the version, it may be used in any command in any place that version numbers are accepted:: $ alembic upgrade ae1 Above, we use ``ae1`` to refer to revision ``ae1027a6acf``. Alembic will stop and let you know if more than one version starts with that prefix. .. _relative_migrations: Relative Migration Identifiers ============================== Relative upgrades/downgrades are also supported. To move two versions from the current, a decimal value "+N" can be supplied:: $ alembic upgrade +2 Negative values are accepted for downgrades:: $ alembic downgrade -1 Relative identifiers may also be in terms of a specific revision. For example, to upgrade to revision ``ae1027a6acf`` plus two additional steps:: $ alembic upgrade ae10+2 .. versionadded:: 0.7.0 Support for relative migrations in terms of a specific revision. Getting Information =================== With a few revisions present we can get some information about the state of things. First we can view the current revision:: $ alembic current INFO [alembic.context] Context class PostgresqlContext. INFO [alembic.context] Will assume transactional DDL. Current revision for postgresql://scott:XXXXX@localhost/test: 1975ea83b712 -> ae1027a6acf (head), Add a column ``head`` is displayed only if the revision identifier for this database matches the head revision. We can also view history with ``alembic history``; the ``--verbose`` option (accepted by several commands, including ``history``, ``current``, ``heads`` and ``branches``) will show us full information about each revision:: $ alembic history --verbose Rev: ae1027a6acf (head) Parent: 1975ea83b712 Path: /path/to/yourproject/alembic/versions/ae1027a6acf_add_a_column.py add a column Revision ID: ae1027a6acf Revises: 1975ea83b712 Create Date: 2014-11-20 13:02:54.849677 Rev: 1975ea83b712 Parent: Path: /path/to/yourproject/alembic/versions/1975ea83b712_add_account_table.py create account table Revision ID: 1975ea83b712 Revises: Create Date: 2014-11-20 13:02:46.257104 Viewing History Ranges ---------------------- Using the ``-r`` option to ``alembic history``, we can also view various slices of history. The ``-r`` argument accepts an argument ``[start]:[end]``, where either may be a revision number, symbols like ``head``, ``heads`` or ``base``, ``current`` to specify the current revision(s), as well as negative relative ranges for ``[start]`` and positive relative ranges for ``[end]``:: $ alembic history -r1975ea:ae1027 A relative range starting from three revs ago up to current migration, which will invoke the migration environment against the database to get the current migration:: $ alembic history -r-3:current View all revisions from 1975 to the head:: $ alembic history -r1975ea: .. 
versionadded:: 0.6.0 ``alembic revision`` now accepts the ``-r`` argument to specify specific ranges based on version numbers, symbols, or relative deltas. Downgrading =========== We can illustrate a downgrade back to nothing, by calling ``alembic downgrade`` back to the beginning, which in Alembic is called ``base``:: $ alembic downgrade base INFO [alembic.context] Context class PostgresqlContext. INFO [alembic.context] Will assume transactional DDL. INFO [alembic.context] Running downgrade ae1027a6acf -> 1975ea83b712 INFO [alembic.context] Running downgrade 1975ea83b712 -> None Back to nothing - and up again:: $ alembic upgrade head INFO [alembic.context] Context class PostgresqlContext. INFO [alembic.context] Will assume transactional DDL. INFO [alembic.context] Running upgrade None -> 1975ea83b712 INFO [alembic.context] Running upgrade 1975ea83b712 -> ae1027a6acf Next Steps ========== The vast majority of Alembic environments make heavy use of the "autogenerate" feature. Continue onto the next section, :doc:`autogenerate`. zzzeek-alembic-bee044a1c187/docs/build/unreleased/000077500000000000000000000000001353106760100217605ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/docs/build/unreleased/README.txt000066400000000000000000000006351353106760100234620ustar00rootroot00000000000000Individual per-changelog files go here in .rst format, which are pulled in by changelog (version 0.4.0 or higher) to be rendered into the changelog_xx.rst file. At release time, the files here are removed and written directly into the changelog. Rationale is so that multiple changes being merged into gerrit don't produce conflicts. Note that gerrit does not support custom merge handlers unlike git itself. zzzeek-alembic-bee044a1c187/reap_dbs.py000066400000000000000000000013041353106760100177310ustar00rootroot00000000000000"""Drop Oracle, SQL Server databases that are left over from a multiprocessing test run. Currently the cx_Oracle driver seems to sometimes not release a TCP connection even if close() is called, which prevents the provisioning system from dropping a database in-process. For SQL Server, databases still remain in use after tests run and running a kill of all detected sessions does not seem to release the database in process. """ import logging import sys from sqlalchemy.testing import provision logging.basicConfig() logging.getLogger(provision.__name__).setLevel(logging.INFO) if hasattr(provision, "reap_dbs"): provision.reap_dbs(sys.argv[1]) else: provision.reap_oracle_dbs(sys.argv[1]) zzzeek-alembic-bee044a1c187/run_tests.py000077500000000000000000000012021353106760100202000ustar00rootroot00000000000000####### NOTE: ####### This file is deprecated as is nose support. ####### Please use py.test or the tox test runner to run tests. 
####### See README.unittests.rst import os # use bootstrapping so that test plugins are loaded # without touching the main library before coverage starts bootstrap_file = os.path.join( os.path.dirname(__file__), "alembic", "testing", "plugin", "bootstrap.py" ) with open(bootstrap_file) as f: code = compile(f.read(), "bootstrap.py", 'exec') to_bootstrap = "nose" exec(code, globals(), locals()) from noseplugin import NoseSQLAlchemy import nose nose.main(addplugins=[NoseSQLAlchemy()]) zzzeek-alembic-bee044a1c187/setup.cfg000066400000000000000000000021311353106760100174200ustar00rootroot00000000000000[egg_info] tag_build=dev [bdist_wheel] universal = 1 [upload_docs] upload-dir = docs/build/output/html [upload] sign = 1 identity = C4DAFEE1 [nosetests] with-sqla_testing = true where = tests [flake8] enable-extensions = G # E203 is due to https://github.com/PyCQA/pycodestyle/issues/373 ignore = A003, D, E203,E305,E711,E712,E721,E722,E741, N801,N802,N806, RST304,RST303,RST299,RST399, W503,W504 exclude = .venv,.git,.tox,dist,doc,*egg,build import-order-style = google application-import-names = alembic,tests [sqla_testing] requirement_cls=tests.requirements:DefaultRequirements profile_file=tests/profiles.txt [db] default=sqlite:///:memory: sqlite=sqlite:///:memory: sqlite_file=sqlite:///querytest.db postgresql=postgresql://scott:tiger@127.0.0.1:5432/test mysql=mysql://scott:tiger@127.0.0.1:3306/test?charset=utf8 mssql=mssql+pyodbc://scott:tiger@ms_2008 oracle=oracle://scott:tiger@127.0.0.1:1521 oracle8=oracle://scott:tiger@127.0.0.1:1521/?use_ansi=0 [alembic] [tool:pytest] addopts= --tb native -v -r fxX -p no:warnings -p no:logging python_files=tests/test_*.py zzzeek-alembic-bee044a1c187/setup.py000066400000000000000000000046011353106760100173150ustar00rootroot00000000000000import os import re import sys from setuptools import find_packages from setuptools import setup from setuptools.command.test import test as TestCommand v = open(os.path.join(os.path.dirname(__file__), "alembic", "__init__.py")) VERSION = ( re.compile(r""".*__version__ = ["'](.*?)["']""", re.S) .match(v.read()) .group(1) ) v.close() readme = os.path.join(os.path.dirname(__file__), "README.rst") requires = [ "SQLAlchemy>=1.1.0", "Mako", "python-editor>=0.3", "python-dateutil", ] class UseTox(TestCommand): RED = 31 RESET_SEQ = "\033[0m" BOLD_SEQ = "\033[1m" COLOR_SEQ = "\033[1;%dm" def run_tests(self): sys.stderr.write( "%s%spython setup.py test is deprecated by pypa. 
Please invoke " "'tox' with no arguments for a basic test run.\n%s" % (self.COLOR_SEQ % self.RED, self.BOLD_SEQ, self.RESET_SEQ) ) sys.exit(1) setup( name="alembic", version=VERSION, description="A database migration tool for SQLAlchemy.", long_description=open(readme).read(), python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*", classifiers=[ "Development Status :: 5 - Production/Stable", "Environment :: Console", "License :: OSI Approved :: MIT License", "Intended Audience :: Developers", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: PyPy", "Topic :: Database :: Front-Ends", ], keywords="SQLAlchemy migrations", author="Mike Bayer", author_email="mike@zzzcomputing.com", url="https://alembic.sqlalchemy.org", project_urls={"Issue Tracker": "https://github.com/sqlalchemy/alembic/"}, license="MIT", packages=find_packages(".", exclude=["examples*", "test*"]), include_package_data=True, tests_require=["pytest!=3.9.1,!=3.9.2", "mock", "Mako"], cmdclass={"test": UseTox}, zip_safe=False, install_requires=requires, entry_points={"console_scripts": ["alembic = alembic.config:main"]}, ) zzzeek-alembic-bee044a1c187/tests/000077500000000000000000000000001353106760100167445ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/tests/__init__.py000066400000000000000000000000001353106760100210430ustar00rootroot00000000000000zzzeek-alembic-bee044a1c187/tests/_autogen_fixtures.py000066400000000000000000000213151353106760100230520ustar00rootroot00000000000000from sqlalchemy import CHAR from sqlalchemy import CheckConstraint from sqlalchemy import Column from sqlalchemy import event from sqlalchemy import ForeignKey from sqlalchemy import Index from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import Numeric from sqlalchemy import String from sqlalchemy import Table from sqlalchemy import Text from sqlalchemy import text from sqlalchemy import UniqueConstraint from sqlalchemy.engine.reflection import Inspector from alembic import autogenerate from alembic import util from alembic.autogenerate import api from alembic.ddl.base import _fk_spec from alembic.migration import MigrationContext from alembic.operations import ops from alembic.testing import config from alembic.testing import eq_ from alembic.testing.env import clear_staging_env from alembic.testing.env import staging_env names_in_this_test = set() @event.listens_for(Table, "after_parent_attach") def new_table(table, parent): names_in_this_test.add(table.name) def _default_include_object(obj, name, type_, reflected, compare_to): if type_ == "table": return name in names_in_this_test else: return True _default_object_filters = _default_include_object class ModelOne(object): __requires__ = ("unique_constraint_reflection",) schema = None @classmethod def _get_db_schema(cls): schema = cls.schema m = MetaData(schema=schema) Table( "user", m, Column("id", Integer, primary_key=True), Column("name", String(50)), Column("a1", Text), Column("pw", String(50)), Index("pw_idx", "pw"), ) Table( "address", m, Column("id", Integer, primary_key=True), Column("email_address", String(100), nullable=False), ) Table( "order", m, Column("order_id", Integer, primary_key=True), Column( "amount", 
Numeric(8, 2), nullable=False, server_default=text("0"), ), CheckConstraint("amount >= 0", name="ck_order_amount"), ) Table( "extra", m, Column("x", CHAR), Column("uid", Integer, ForeignKey("user.id")), ) return m @classmethod def _get_model_schema(cls): schema = cls.schema m = MetaData(schema=schema) Table( "user", m, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", Text, server_default="x"), ) Table( "address", m, Column("id", Integer, primary_key=True), Column("email_address", String(100), nullable=False), Column("street", String(50)), UniqueConstraint("email_address", name="uq_email"), ) Table( "order", m, Column("order_id", Integer, primary_key=True), Column( "amount", Numeric(10, 2), nullable=True, server_default=text("0"), ), Column("user_id", Integer, ForeignKey("user.id")), CheckConstraint("amount > -1", name="ck_order_amount"), ) Table( "item", m, Column("id", Integer, primary_key=True), Column("description", String(100)), Column("order_id", Integer, ForeignKey("order.order_id")), CheckConstraint("len(description) > 5"), ) return m class _ComparesFKs(object): def _assert_fk_diff( self, diff, type_, source_table, source_columns, target_table, target_columns, name=None, conditional_name=None, source_schema=None, onupdate=None, ondelete=None, initially=None, deferrable=None, ): # the public API for ForeignKeyConstraint was not very rich # in 0.7, 0.8, so here we use the well-known but slightly # private API to get at its elements ( fk_source_schema, fk_source_table, fk_source_columns, fk_target_schema, fk_target_table, fk_target_columns, fk_onupdate, fk_ondelete, fk_deferrable, fk_initially, ) = _fk_spec(diff[1]) eq_(diff[0], type_) eq_(fk_source_table, source_table) eq_(fk_source_columns, source_columns) eq_(fk_target_table, target_table) eq_(fk_source_schema, source_schema) eq_(fk_onupdate, onupdate) eq_(fk_ondelete, ondelete) eq_(fk_initially, initially) eq_(fk_deferrable, deferrable) eq_([elem.column.name for elem in diff[1].elements], target_columns) if conditional_name is not None: if conditional_name == "servergenerated": fks = Inspector.from_engine(self.bind).get_foreign_keys( source_table ) server_fk_name = fks[0]["name"] eq_(diff[1].name, server_fk_name) else: eq_(diff[1].name, conditional_name) else: eq_(diff[1].name, name) class AutogenTest(_ComparesFKs): def _flatten_diffs(self, diffs): for d in diffs: if isinstance(d, list): for fd in self._flatten_diffs(d): yield fd else: yield d @classmethod def _get_bind(cls): return config.db configure_opts = {} @classmethod def setup_class(cls): staging_env() cls.bind = cls._get_bind() cls.m1 = cls._get_db_schema() cls.m1.create_all(cls.bind) cls.m2 = cls._get_model_schema() @classmethod def teardown_class(cls): cls.m1.drop_all(cls.bind) clear_staging_env() def setUp(self): self.conn = conn = self.bind.connect() ctx_opts = { "compare_type": True, "compare_server_default": True, "target_metadata": self.m2, "upgrade_token": "upgrades", "downgrade_token": "downgrades", "alembic_module_prefix": "op.", "sqlalchemy_module_prefix": "sa.", "include_object": _default_object_filters, } if self.configure_opts: ctx_opts.update(self.configure_opts) self.context = context = MigrationContext.configure( connection=conn, opts=ctx_opts ) self.autogen_context = api.AutogenContext(context, self.m2) def tearDown(self): self.conn.close() def _update_context(self, object_filters=None, include_schemas=None): if include_schemas is not None: self.autogen_context.opts["include_schemas"] = include_schemas if 
object_filters is not None: self.autogen_context._object_filters = [object_filters] return self.autogen_context class AutogenFixtureTest(_ComparesFKs): def _fixture( self, m1, m2, include_schemas=False, opts=None, object_filters=_default_object_filters, return_ops=False, ): self.metadata, model_metadata = m1, m2 for m in util.to_list(self.metadata): m.create_all(self.bind) with self.bind.connect() as conn: ctx_opts = { "compare_type": True, "compare_server_default": True, "target_metadata": model_metadata, "upgrade_token": "upgrades", "downgrade_token": "downgrades", "alembic_module_prefix": "op.", "sqlalchemy_module_prefix": "sa.", "include_object": object_filters, "include_schemas": include_schemas, } if opts: ctx_opts.update(opts) self.context = context = MigrationContext.configure( connection=conn, opts=ctx_opts ) autogen_context = api.AutogenContext(context, model_metadata) uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(autogen_context, uo) if return_ops: return uo else: return uo.as_diffs() reports_unnamed_constraints = False def setUp(self): staging_env() self.bind = config.db def tearDown(self): if hasattr(self, "metadata"): for m in util.to_list(self.metadata): m.drop_all(self.bind) clear_staging_env() zzzeek-alembic-bee044a1c187/tests/_large_map.py000066400000000000000000000162251353106760100214120ustar00rootroot00000000000000from alembic.script.revision import Revision from alembic.script.revision import RevisionMap data = [ Revision("3fc8a578bc0a", ("4878cb1cb7f6", "454a0529f84e")), Revision("69285b0faaa", ("36c31e4e1c37", "3a3b24a31b57")), Revision("3b0452c64639", "2f1a0f3667f3"), Revision("2d9d787a496", "135b5fd31062"), Revision("184f65ed83af", "3b0452c64639"), Revision("430074f99c29", "54f871bfe0b0"), Revision("3ffb59981d9a", "519c9f3ce294"), Revision("454a0529f84e", ("40f6508e4373", "38a936c6ab11")), Revision("24c2620b2e3f", ("430074f99c29", "1f5ceb1ec255")), Revision("169a948471a9", "247ad6880f93"), Revision("2f1a0f3667f3", "17dd0f165262"), Revision("27227dc4fda8", "2a66d7c4d8a1"), Revision("4b2ad1ffe2e7", ("3b409f268da4", "4f8a9b79a063")), Revision("124ef6a17781", "2529684536da"), Revision("4789d9c82ca7", "593b8076fb2c"), Revision("64ed798bcc3", ("44ed1bf512a0", "169a948471a9")), Revision("2588a3c36a0f", "50c7b21c9089"), Revision("359329c2ebb", ("5810e9eff996", "339faa12616")), Revision("540bc5634bd", "3a5db5f31209"), Revision("20fe477817d2", "53d5ff905573"), Revision("4f8a9b79a063", ("3cf34fcd6473", "300209d8594")), Revision("6918589deaf", "3314c17f6e35"), Revision("1755e3b1481c", ("17b66754be21", "31b1d4b7fc95")), Revision("58c988e1aa4e", ("219240032b88", "f067f0b825c")), Revision("593b8076fb2c", "1d94175d221b"), Revision("38d069994064", ("46b70a57edc0", "3ed56beabfb7")), Revision("3e2f6c6d1182", "7f96a01461b"), Revision("1f6969597fe7", "1811bdae9e63"), Revision("17dd0f165262", "3cf02a593a68"), Revision("3cf02a593a68", "25a7ef58d293"), Revision("34dfac7edb2d", "28f4dd53ad3a"), Revision("4009c533e05d", "42ded7355da2"), Revision("5a0003c3b09c", ("3ed56beabfb7", "2028d94d3863")), Revision("38a936c6ab11", "2588a3c36a0f"), Revision("59223c5b7b36", "2f93dd880bae"), Revision("4121bd6e99e9", "540bc5634bd"), Revision("260714a3f2de", "6918589deaf"), Revision("ae77a2ed69b", "274fd2642933"), Revision("18ff1ab3b4c4", "430133b6d46c"), Revision("2b9a327527a9", ("359329c2ebb", "593b8076fb2c")), Revision("4e6167c75ed0", "325b273d61bd"), Revision("21ab11a7c5c4", ("3da31f3323ec", "22f26011d635")), Revision("3b93e98481b1", "4e28e2f4fe2f"), Revision("145d8f1e334d", 
"b4143d129e"), Revision("135b5fd31062", "1d94175d221b"), Revision("300209d8594", ("52804033910e", "593b8076fb2c")), Revision("8dca95cce28", "f034666cd80"), Revision("46b70a57edc0", ("145d8f1e334d", "4cc2960cbe19")), Revision("4d45e479fbb9", "2d9d787a496"), Revision("22f085bf8bbd", "540bc5634bd"), Revision("263e91fd17d8", "2b9a327527a9"), Revision("219240032b88", ("300209d8594", "2b9a327527a9")), Revision("325b273d61bd", "4b2ad1ffe2e7"), Revision("199943ccc774", "1aa674ccfa4e"), Revision("247ad6880f93", "1f6969597fe7"), Revision("4878cb1cb7f6", "28f4dd53ad3a"), Revision("2a66d7c4d8a1", "23f1ccb18d6d"), Revision("42b079245b55", "593b8076fb2c"), Revision("1cccf82219cb", ("20fe477817d2", "915c67915c2")), Revision("b4143d129e", ("159331d6f484", "504d5168afe1")), Revision("53d5ff905573", "3013877bf5bd"), Revision("1f5ceb1ec255", "3ffb59981d9a"), Revision("ef1c1c1531f", "4738812e6ece"), Revision("1f6963d1ae02", "247ad6880f93"), Revision("44d58f1d31f0", "18ff1ab3b4c4"), Revision("c3ebe64dfb5", ("3409c57b0da", "31f352e77045")), Revision("f067f0b825c", "359329c2ebb"), Revision("52ab2d3b57ce", "96d590bd82e"), Revision("3b409f268da4", ("20e90eb3eeb6", "263e91fd17d8")), Revision("5a4ca8889674", "4e6167c75ed0"), Revision("5810e9eff996", ("2d30d79c4093", "52804033910e")), Revision("40f6508e4373", "4ed16fad67a7"), Revision("1811bdae9e63", "260714a3f2de"), Revision("3013877bf5bd", ("8dca95cce28", "3fc8a578bc0a")), Revision("16426dbea880", "28f4dd53ad3a"), Revision("22f26011d635", ("4c93d063d2ba", "3b93e98481b1")), Revision("3409c57b0da", "17b66754be21"), Revision("44373001000f", ("42b079245b55", "219240032b88")), Revision("28f4dd53ad3a", "2e71fd90eb9d"), Revision("4cc2960cbe19", "504d5168afe1"), Revision("31f352e77045", ("17b66754be21", "22f085bf8bbd")), Revision("4ed16fad67a7", "f034666cd80"), Revision("3da31f3323ec", "4c93d063d2ba"), Revision("31b1d4b7fc95", "1cc4459fd115"), Revision("11bc0ff42f87", "28f4dd53ad3a"), Revision("3a5db5f31209", "59742a546b84"), Revision("20e90eb3eeb6", ("58c988e1aa4e", "44373001000f")), Revision("23f1ccb18d6d", "52ab2d3b57ce"), Revision("1d94175d221b", "21ab11a7c5c4"), Revision("36f1a410ed", "54f871bfe0b0"), Revision("181a149173e", "2ee35cac4c62"), Revision("171ad2f0c672", "4a4e0838e206"), Revision("2f93dd880bae", "540bc5634bd"), Revision("25a7ef58d293", None), Revision("7f96a01461b", "184f65ed83af"), Revision("b21f22233f", "3e2f6c6d1182"), Revision("52804033910e", "1d94175d221b"), Revision("1e6240aba5b3", ("4121bd6e99e9", "2c50d8bab6ee")), Revision("1cc4459fd115", "1e6240aba5b3"), Revision("274fd2642933", "4009c533e05d"), Revision("1aa674ccfa4e", ("59223c5b7b36", "42050bf030fd")), Revision("4e28e2f4fe2f", "596d7b9e11"), Revision("49ddec8c7a5e", ("124ef6a17781", "47578179e766")), Revision("3e9bb349cc46", "ef1c1c1531f"), Revision("2028d94d3863", "504d5168afe1"), Revision("159331d6f484", "34dfac7edb2d"), Revision("596d7b9e11", "171ad2f0c672"), Revision("3b96bcc8da76", "f034666cd80"), Revision("4738812e6ece", "78982bf5499"), Revision("3314c17f6e35", "27227dc4fda8"), Revision("30931c545bf", "2e71fd90eb9d"), Revision("2e71fd90eb9d", ("c3ebe64dfb5", "1755e3b1481c")), Revision("3ed56beabfb7", ("11bc0ff42f87", "69285b0faaa")), Revision("96d590bd82e", "3e9bb349cc46"), Revision("339faa12616", "4d45e479fbb9"), Revision("47578179e766", "2529684536da"), Revision("2ee35cac4c62", "b21f22233f"), Revision("50c7b21c9089", ("4ed16fad67a7", "3b96bcc8da76")), Revision("78982bf5499", "ae77a2ed69b"), Revision("519c9f3ce294", "2c50d8bab6ee"), Revision("2720fc75e5fd", "1cccf82219cb"), 
Revision("21638ec787ba", "44d58f1d31f0"), Revision("59742a546b84", "49ddec8c7a5e"), Revision("2d30d79c4093", "135b5fd31062"), Revision("f034666cd80", ("5a0003c3b09c", "38d069994064")), Revision("430133b6d46c", "181a149173e"), Revision("3a3b24a31b57", ("16426dbea880", "4cc2960cbe19")), Revision("2529684536da", ("64ed798bcc3", "1f6963d1ae02")), Revision("17b66754be21", ("19e0db9d806a", "24c2620b2e3f")), Revision("3cf34fcd6473", ("52804033910e", "4789d9c82ca7")), Revision("36c31e4e1c37", "504d5168afe1"), Revision("54f871bfe0b0", "519c9f3ce294"), Revision("4a4e0838e206", "2a7f37cf7770"), Revision("19e0db9d806a", ("430074f99c29", "36f1a410ed")), Revision("44ed1bf512a0", "247ad6880f93"), Revision("42050bf030fd", "2f93dd880bae"), Revision("2c50d8bab6ee", "199943ccc774"), Revision("504d5168afe1", ("28f4dd53ad3a", "30931c545bf")), Revision("915c67915c2", "3fc8a578bc0a"), Revision("2a7f37cf7770", "2720fc75e5fd"), Revision("4c93d063d2ba", "4e28e2f4fe2f"), Revision("42ded7355da2", "21638ec787ba"), ] map_ = RevisionMap(lambda: data) zzzeek-alembic-bee044a1c187/tests/conftest.py000077500000000000000000000011531353106760100211460ustar00rootroot00000000000000#!/usr/bin/env python """ pytest plugin script. This script is an extension to py.test which installs SQLAlchemy's testing plugin into the local environment. """ import os import sqlalchemy # ideally, SQLAlchemy would allow us to just import bootstrap, # but for now we have to use its "load from a file" approach bootstrap_file = os.path.join( os.path.dirname(sqlalchemy.__file__), "testing", "plugin", "bootstrap.py" ) with open(bootstrap_file) as f: code = compile(f.read(), "bootstrap.py", "exec") to_bootstrap = "pytest" exec(code, globals(), locals()) from pytestplugin import * # noqa zzzeek-alembic-bee044a1c187/tests/requirements.py000066400000000000000000000163471353106760100220540ustar00rootroot00000000000000from sqlalchemy.testing import exclusions from alembic.testing.requirements import SuiteRequirements from alembic.util import sqla_compat class DefaultRequirements(SuiteRequirements): @property def alter_column(self): return exclusions.skip_if(["sqlite"], "no ALTER COLUMN support") @property def schemas(self): """Target database must support external schemas, and have one named 'test_schema'.""" return exclusions.skip_if(["sqlite", "firebird"], "no schema support") @property def no_referential_integrity(self): """test will fail if referential integrity is enforced""" return exclusions.fails_on_everything_except("sqlite") @property def non_native_boolean(self): """test will fail if native boolean is provided""" return exclusions.fails_if( exclusions.LambdaPredicate( lambda config: config.db.dialect.supports_native_boolean ) ) @property def check_constraints_w_enforcement(self): return exclusions.fails_on("mysql") @property def unnamed_constraints(self): """constraints without names are supported.""" return exclusions.only_on(["sqlite"]) @property def fk_names(self): """foreign key constraints always have names in the DB""" return exclusions.fails_on("sqlite") @property def no_name_normalize(self): return exclusions.skip_if( lambda config: config.db.dialect.requires_name_normalize ) @property def reflects_fk_options(self): return exclusions.only_on(["postgresql", "mysql", "sqlite"]) @property def fk_initially(self): """backend supports INITIALLY option in foreign keys""" return exclusions.only_on(["postgresql"]) @property def fk_deferrable(self): """backend supports DEFERRABLE option in foreign keys""" return exclusions.only_on(["postgresql"]) 
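    # --- illustrative sketch, not part of the original Alembic sources ---
    # The properties in this class all follow the same pattern: each backend
    # capability is expressed as an ``exclusions`` rule, and individual tests
    # opt in to it by naming the property in ``__requires__``.  As a hedged
    # example only, a hypothetical requirement restricted to PostgreSQL 10 or
    # newer could be written the same way (the property name and the version
    # cutoff below are made up for illustration):
    @property
    def postgresql_10_or_greater(self):
        """backend is PostgreSQL version 10 or newer (illustrative only)"""
        return exclusions.only_if(
            lambda config: exclusions.against(config, "postgresql")
            and config.db.dialect.server_version_info >= (10,),
            "PostgreSQL 10 or greater required",
        )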
@property def flexible_fk_cascades(self): """target database must support ON UPDATE/DELETE..CASCADE with the full range of keywords (e.g. NO ACTION, etc.)""" return exclusions.skip_if( ["oracle"], "target backend has poor FK cascade syntax" ) @property def reflects_unique_constraints_unambiguously(self): return exclusions.fails_on("mysql", "oracle") @property def reflects_pk_names(self): """Target driver reflects the name of primary key constraints.""" return exclusions.fails_on_everything_except( "postgresql", "oracle", "mssql", "sybase", "sqlite" ) @property def postgresql_uuid_ossp(self): def check_uuid_ossp(config): if not exclusions.against(config, "postgresql"): return False try: config.db.execute("SELECT uuid_generate_v4()") return True except: return False return exclusions.only_if(check_uuid_ossp) def _has_pg_extension(self, name): def check(config): if not exclusions.against(config, "postgresql"): return False count = config.db.scalar( "SELECT count(*) FROM pg_extension " "WHERE extname='%s'" % name ) return bool(count) return exclusions.only_if(check, "needs %s extension" % name) @property def hstore(self): return self._has_pg_extension("hstore") @property def btree_gist(self): return self._has_pg_extension("btree_gist") @property def autoincrement_on_composite_pk(self): return exclusions.skip_if(["sqlite"], "not supported by database") @property def integer_subtype_comparisons(self): """if a compare of Integer and BigInteger is supported yet.""" return exclusions.skip_if(["oracle"], "not supported by alembic impl") @property def check_constraint_reflection(self): return exclusions.fails_on_everything_except( "postgresql", "sqlite", "oracle", self._mysql_and_check_constraints_exist, ) @property def mysql_check_reflection_or_none(self): # succeed if: # 1. SQLAlchemy does not reflect CHECK constraints # 2. SQLAlchemy does reflect CHECK constraints, but MySQL does not. def go(config): return ( not self._mysql_check_constraints_exist(config) or self.sqlalchemy_1115.enabled ) return exclusions.succeeds_if(go) @property def mysql_timestamp_reflection(self): def go(config): return ( not self._mariadb_102(config) or self.sqlalchemy_1115.enabled ) return exclusions.only_if(go) def _mariadb_102(self, config): return ( exclusions.against(config, "mysql") and sqla_compat._is_mariadb(config.db.dialect) and sqla_compat._mariadb_normalized_version_info(config.db.dialect) > (10, 2) ) def mysql_check_col_name_change(self, config): # MySQL has check constraints that enforce an reflect, however # they prevent a column's name from being changed due to a bug in # MariaDB 10.2 as well as MySQL 8.0.16 if exclusions.against(config, "mysql"): if sqla_compat._is_mariadb(config.db.dialect): mnvi = sqla_compat._mariadb_normalized_version_info norm_version_info = mnvi(config.db.dialect) return norm_version_info >= (10, 2) and norm_version_info < ( 10, 2, 22, ) else: norm_version_info = config.db.dialect.server_version_info return norm_version_info >= (8, 0, 16) else: return True def _mysql_and_check_constraints_exist(self, config): # 1. we have mysql / mariadb and # 2. it enforces check constraints if exclusions.against(config, "mysql"): if sqla_compat._is_mariadb(config.db.dialect): mnvi = sqla_compat._mariadb_normalized_version_info norm_version_info = mnvi(config.db.dialect) return norm_version_info >= (10, 2) else: norm_version_info = config.db.dialect.server_version_info return norm_version_info >= (8, 0, 16) else: return False def _mysql_check_constraints_exist(self, config): # 1. 
we dont have mysql / mariadb or # 2. we have mysql / mariadb that enforces check constraints return not exclusions.against( config, "mysql" ) or self._mysql_and_check_constraints_exist(config) def _mysql_check_constraints_dont_exist(self, config): # 1. we have mysql / mariadb and # 2. they dont enforce check constraints return not self._mysql_check_constraints_exist(config) def _mysql_not_mariadb_102(self, config): return exclusions.against(config, "mysql") and ( not sqla_compat._is_mariadb(config.db.dialect) or sqla_compat._mariadb_normalized_version_info(config.db.dialect) < (10, 2) ) zzzeek-alembic-bee044a1c187/tests/test_autogen_comments.py000066400000000000000000000143121353106760100237250ustar00rootroot00000000000000import sys from sqlalchemy import Column from sqlalchemy import Float from sqlalchemy import MetaData from sqlalchemy import String from sqlalchemy import Table from alembic.testing import eq_ from alembic.testing import mock from alembic.testing import TestBase from ._autogen_fixtures import AutogenFixtureTest py3k = sys.version_info.major >= 3 class AutogenerateCommentsTest(AutogenFixtureTest, TestBase): __backend__ = True __requires__ = ("comments",) def test_existing_table_comment_no_change(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("test", String(10), primary_key=True), comment="this is some table", ) Table( "some_table", m2, Column("test", String(10), primary_key=True), comment="this is some table", ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_add_table_comment(self): m1 = MetaData() m2 = MetaData() Table("some_table", m1, Column("test", String(10), primary_key=True)) Table( "some_table", m2, Column("test", String(10), primary_key=True), comment="this is some table", ) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "add_table_comment") eq_(diffs[0][1].comment, "this is some table") eq_(diffs[0][2], None) def test_remove_table_comment(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("test", String(10), primary_key=True), comment="this is some table", ) Table("some_table", m2, Column("test", String(10), primary_key=True)) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "remove_table_comment") eq_(diffs[0][1].comment, None) def test_alter_table_comment(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("test", String(10), primary_key=True), comment="this is some table", ) Table( "some_table", m2, Column("test", String(10), primary_key=True), comment="this is also some table", ) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "add_table_comment") eq_(diffs[0][1].comment, "this is also some table") eq_(diffs[0][2], "this is some table") def test_existing_column_comment_no_change(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("test", String(10), primary_key=True), Column("amount", Float, comment="the amount"), ) Table( "some_table", m2, Column("test", String(10), primary_key=True), Column("amount", Float, comment="the amount"), ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_add_column_comment(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("test", String(10), primary_key=True), Column("amount", Float), ) Table( "some_table", m2, Column("test", String(10), primary_key=True), Column("amount", Float, comment="the amount"), ) diffs = self._fixture(m1, m2) eq_( diffs, [ [ ( "modify_comment", None, "some_table", "amount", { "existing_nullable": True, "existing_type": mock.ANY, "existing_server_default": False, }, None, "the amount", ) ] ], ) def 
test_remove_column_comment(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("test", String(10), primary_key=True), Column("amount", Float, comment="the amount"), ) Table( "some_table", m2, Column("test", String(10), primary_key=True), Column("amount", Float), ) diffs = self._fixture(m1, m2) eq_( diffs, [ [ ( "modify_comment", None, "some_table", "amount", { "existing_nullable": True, "existing_type": mock.ANY, "existing_server_default": False, }, "the amount", None, ) ] ], ) def test_alter_column_comment(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("test", String(10), primary_key=True), Column("amount", Float, comment="the amount"), ) Table( "some_table", m2, Column("test", String(10), primary_key=True), Column("amount", Float, comment="the adjusted amount"), ) diffs = self._fixture(m1, m2) eq_( diffs, [ [ ( "modify_comment", None, "some_table", "amount", { "existing_nullable": True, "existing_type": mock.ANY, "existing_server_default": False, }, "the amount", "the adjusted amount", ) ] ], ) zzzeek-alembic-bee044a1c187/tests/test_autogen_composition.py000066400000000000000000000344721353106760100244540ustar00rootroot00000000000000import re from alembic import autogenerate from alembic.migration import MigrationContext from alembic.testing import eq_ from alembic.testing import TestBase from ._autogen_fixtures import _default_include_object from ._autogen_fixtures import AutogenTest from ._autogen_fixtures import ModelOne class AutogenerateDiffTest(ModelOne, AutogenTest, TestBase): __only_on__ = "sqlite" def test_render_nothing(self): context = MigrationContext.configure( connection=self.bind.connect(), opts={ "compare_type": True, "compare_server_default": True, "target_metadata": self.m1, "upgrade_token": "upgrades", "downgrade_token": "downgrades", }, ) template_args = {} autogenerate._render_migration_diffs(context, template_args) eq_( re.sub(r"u'", "'", template_args["upgrades"]), """# ### commands auto generated by Alembic - please adjust! ### pass # ### end Alembic commands ###""", ) eq_( re.sub(r"u'", "'", template_args["downgrades"]), """# ### commands auto generated by Alembic - please adjust! ### pass # ### end Alembic commands ###""", ) def test_render_nothing_batch(self): context = MigrationContext.configure( connection=self.bind.connect(), opts={ "compare_type": True, "compare_server_default": True, "target_metadata": self.m1, "upgrade_token": "upgrades", "downgrade_token": "downgrades", "alembic_module_prefix": "op.", "sqlalchemy_module_prefix": "sa.", "render_as_batch": True, "include_symbol": lambda name, schema: False, }, ) template_args = {} autogenerate._render_migration_diffs(context, template_args) eq_( re.sub(r"u'", "'", template_args["upgrades"]), """# ### commands auto generated by Alembic - please adjust! ### pass # ### end Alembic commands ###""", ) eq_( re.sub(r"u'", "'", template_args["downgrades"]), """# ### commands auto generated by Alembic - please adjust! ### pass # ### end Alembic commands ###""", ) def test_render_diffs_standard(self): """test a full render including indentation""" template_args = {} autogenerate._render_migration_diffs(self.context, template_args) eq_( re.sub(r"u'", "'", template_args["upgrades"]), """# ### commands auto generated by Alembic - please adjust! 
### op.create_table('item', sa.Column('id', sa.Integer(), nullable=False), sa.Column('description', sa.String(length=100), nullable=True), sa.Column('order_id', sa.Integer(), nullable=True), sa.CheckConstraint('len(description) > 5'), sa.ForeignKeyConstraint(['order_id'], ['order.order_id'], ), sa.PrimaryKeyConstraint('id') ) op.drop_table('extra') op.add_column('address', sa.Column('street', sa.String(length=50), \ nullable=True)) op.create_unique_constraint('uq_email', 'address', ['email_address']) op.add_column('order', sa.Column('user_id', sa.Integer(), nullable=True)) op.alter_column('order', 'amount', existing_type=sa.NUMERIC(precision=8, scale=2), type_=sa.Numeric(precision=10, scale=2), nullable=True, existing_server_default=sa.text('0')) op.create_foreign_key(None, 'order', 'user', ['user_id'], ['id']) op.alter_column('user', 'a1', existing_type=sa.TEXT(), server_default='x', existing_nullable=True) op.alter_column('user', 'name', existing_type=sa.VARCHAR(length=50), nullable=False) op.drop_index('pw_idx', table_name='user') op.drop_column('user', 'pw') # ### end Alembic commands ###""", ) eq_( re.sub(r"u'", "'", template_args["downgrades"]), """# ### commands auto generated by Alembic - please adjust! ### op.add_column('user', sa.Column('pw', sa.VARCHAR(length=50), \ nullable=True)) op.create_index('pw_idx', 'user', ['pw'], unique=False) op.alter_column('user', 'name', existing_type=sa.VARCHAR(length=50), nullable=True) op.alter_column('user', 'a1', existing_type=sa.TEXT(), server_default=None, existing_nullable=True) op.drop_constraint(None, 'order', type_='foreignkey') op.alter_column('order', 'amount', existing_type=sa.Numeric(precision=10, scale=2), type_=sa.NUMERIC(precision=8, scale=2), nullable=False, existing_server_default=sa.text('0')) op.drop_column('order', 'user_id') op.drop_constraint('uq_email', 'address', type_='unique') op.drop_column('address', 'street') op.create_table('extra', sa.Column('x', sa.CHAR(), nullable=True), sa.Column('uid', sa.INTEGER(), nullable=True), sa.ForeignKeyConstraint(['uid'], ['user.id'], ) ) op.drop_table('item') # ### end Alembic commands ###""", ) def test_render_diffs_batch(self): """test a full render in batch mode including indentation""" template_args = {} self.context.opts["render_as_batch"] = True autogenerate._render_migration_diffs(self.context, template_args) eq_( re.sub(r"u'", "'", template_args["upgrades"]), """# ### commands auto generated by Alembic - please adjust! 
### op.create_table('item', sa.Column('id', sa.Integer(), nullable=False), sa.Column('description', sa.String(length=100), nullable=True), sa.Column('order_id', sa.Integer(), nullable=True), sa.CheckConstraint('len(description) > 5'), sa.ForeignKeyConstraint(['order_id'], ['order.order_id'], ), sa.PrimaryKeyConstraint('id') ) op.drop_table('extra') with op.batch_alter_table('address', schema=None) as batch_op: batch_op.add_column(sa.Column('street', sa.String(length=50), nullable=True)) batch_op.create_unique_constraint('uq_email', ['email_address']) with op.batch_alter_table('order', schema=None) as batch_op: batch_op.add_column(sa.Column('user_id', sa.Integer(), nullable=True)) batch_op.alter_column('amount', existing_type=sa.NUMERIC(precision=8, scale=2), type_=sa.Numeric(precision=10, scale=2), nullable=True, existing_server_default=sa.text('0')) batch_op.create_foreign_key(None, 'user', ['user_id'], ['id']) with op.batch_alter_table('user', schema=None) as batch_op: batch_op.alter_column('a1', existing_type=sa.TEXT(), server_default='x', existing_nullable=True) batch_op.alter_column('name', existing_type=sa.VARCHAR(length=50), nullable=False) batch_op.drop_index('pw_idx') batch_op.drop_column('pw') # ### end Alembic commands ###""", # noqa, ) eq_( re.sub(r"u'", "'", template_args["downgrades"]), """# ### commands auto generated by Alembic - please adjust! ### with op.batch_alter_table('user', schema=None) as batch_op: batch_op.add_column(sa.Column('pw', sa.VARCHAR(length=50), nullable=True)) batch_op.create_index('pw_idx', ['pw'], unique=False) batch_op.alter_column('name', existing_type=sa.VARCHAR(length=50), nullable=True) batch_op.alter_column('a1', existing_type=sa.TEXT(), server_default=None, existing_nullable=True) with op.batch_alter_table('order', schema=None) as batch_op: batch_op.drop_constraint(None, type_='foreignkey') batch_op.alter_column('amount', existing_type=sa.Numeric(precision=10, scale=2), type_=sa.NUMERIC(precision=8, scale=2), nullable=False, existing_server_default=sa.text('0')) batch_op.drop_column('user_id') with op.batch_alter_table('address', schema=None) as batch_op: batch_op.drop_constraint('uq_email', type_='unique') batch_op.drop_column('street') op.create_table('extra', sa.Column('x', sa.CHAR(), nullable=True), sa.Column('uid', sa.INTEGER(), nullable=True), sa.ForeignKeyConstraint(['uid'], ['user.id'], ) ) op.drop_table('item') # ### end Alembic commands ###""", # noqa, ) def test_imports_maintined(self): template_args = {} self.context.opts["render_as_batch"] = True def render_item(type_, col, autogen_context): autogen_context.imports.add( "from mypackage import my_special_import" ) autogen_context.imports.add("from foobar import bat") self.context.opts["render_item"] = render_item autogenerate._render_migration_diffs(self.context, template_args) eq_( set(template_args["imports"].split("\n")), set( [ "from foobar import bat", "from mypackage import my_special_import", ] ), ) class AutogenerateDiffTestWSchema(ModelOne, AutogenTest, TestBase): __only_on__ = "postgresql" schema = "test_schema" def test_render_nothing(self): context = MigrationContext.configure( connection=self.bind.connect(), opts={ "compare_type": True, "compare_server_default": True, "target_metadata": self.m1, "upgrade_token": "upgrades", "downgrade_token": "downgrades", "alembic_module_prefix": "op.", "sqlalchemy_module_prefix": "sa.", "include_symbol": lambda name, schema: False, }, ) template_args = {} autogenerate._render_migration_diffs(context, template_args) eq_( re.sub(r"u'", 
"'", template_args["upgrades"]), """# ### commands auto generated by Alembic - please adjust! ### pass # ### end Alembic commands ###""", ) eq_( re.sub(r"u'", "'", template_args["downgrades"]), """# ### commands auto generated by Alembic - please adjust! ### pass # ### end Alembic commands ###""", ) def test_render_diffs_extras(self): """test a full render including indentation (include and schema)""" template_args = {} self.context.opts.update( { "include_object": _default_include_object, "include_schemas": True, } ) autogenerate._render_migration_diffs(self.context, template_args) eq_( re.sub(r"u'", "'", template_args["upgrades"]), """# ### commands auto generated by Alembic - please adjust! ### op.create_table('item', sa.Column('id', sa.Integer(), nullable=False), sa.Column('description', sa.String(length=100), nullable=True), sa.Column('order_id', sa.Integer(), nullable=True), sa.CheckConstraint('len(description) > 5'), sa.ForeignKeyConstraint(['order_id'], ['%(schema)s.order.order_id'], ), sa.PrimaryKeyConstraint('id'), schema='%(schema)s' ) op.drop_table('extra', schema='%(schema)s') op.add_column('address', sa.Column('street', sa.String(length=50), \ nullable=True), schema='%(schema)s') op.create_unique_constraint('uq_email', 'address', ['email_address'], \ schema='test_schema') op.add_column('order', sa.Column('user_id', sa.Integer(), nullable=True), \ schema='%(schema)s') op.alter_column('order', 'amount', existing_type=sa.NUMERIC(precision=8, scale=2), type_=sa.Numeric(precision=10, scale=2), nullable=True, existing_server_default=sa.text('0'), schema='%(schema)s') op.create_foreign_key(None, 'order', 'user', ['user_id'], ['id'], \ source_schema='%(schema)s', referent_schema='%(schema)s') op.alter_column('user', 'a1', existing_type=sa.TEXT(), server_default='x', existing_nullable=True, schema='%(schema)s') op.alter_column('user', 'name', existing_type=sa.VARCHAR(length=50), nullable=False, schema='%(schema)s') op.drop_index('pw_idx', table_name='user', schema='test_schema') op.drop_column('user', 'pw', schema='%(schema)s') # ### end Alembic commands ###""" % {"schema": self.schema}, ) eq_( re.sub(r"u'", "'", template_args["downgrades"]), """# ### commands auto generated by Alembic - please adjust! 
### op.add_column('user', sa.Column('pw', sa.VARCHAR(length=50), \ autoincrement=False, nullable=True), schema='%(schema)s') op.create_index('pw_idx', 'user', ['pw'], unique=False, schema='%(schema)s') op.alter_column('user', 'name', existing_type=sa.VARCHAR(length=50), nullable=True, schema='%(schema)s') op.alter_column('user', 'a1', existing_type=sa.TEXT(), server_default=None, existing_nullable=True, schema='%(schema)s') op.drop_constraint(None, 'order', schema='%(schema)s', type_='foreignkey') op.alter_column('order', 'amount', existing_type=sa.Numeric(precision=10, scale=2), type_=sa.NUMERIC(precision=8, scale=2), nullable=False, existing_server_default=sa.text('0'), schema='%(schema)s') op.drop_column('order', 'user_id', schema='%(schema)s') op.drop_constraint('uq_email', 'address', schema='test_schema', type_='unique') op.drop_column('address', 'street', schema='%(schema)s') op.create_table('extra', sa.Column('x', sa.CHAR(length=1), autoincrement=False, nullable=True), sa.Column('uid', sa.INTEGER(), autoincrement=False, nullable=True), sa.ForeignKeyConstraint(['uid'], ['%(schema)s.user.id'], \ name='extra_uid_fkey'), schema='%(schema)s' ) op.drop_table('item', schema='%(schema)s') # ### end Alembic commands ###""" # noqa % {"schema": self.schema}, ) zzzeek-alembic-bee044a1c187/tests/test_autogen_diffs.py000066400000000000000000001366751353106760100232140ustar00rootroot00000000000000import sys from sqlalchemy import BIGINT from sqlalchemy import BigInteger from sqlalchemy import CHAR from sqlalchemy import CheckConstraint from sqlalchemy import Column from sqlalchemy import DateTime from sqlalchemy import DECIMAL from sqlalchemy import ForeignKey from sqlalchemy import ForeignKeyConstraint from sqlalchemy import Index from sqlalchemy import INTEGER from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import Numeric from sqlalchemy import PrimaryKeyConstraint from sqlalchemy import SmallInteger from sqlalchemy import String from sqlalchemy import Table from sqlalchemy import Text from sqlalchemy import text from sqlalchemy import TypeDecorator from sqlalchemy import UniqueConstraint from sqlalchemy import VARCHAR from sqlalchemy.dialects import sqlite from sqlalchemy.engine.reflection import Inspector from sqlalchemy.types import NULLTYPE from sqlalchemy.types import VARBINARY from alembic import autogenerate from alembic.migration import MigrationContext from alembic.operations import ops from alembic.testing import assert_raises_message from alembic.testing import config from alembic.testing import eq_ from alembic.testing import is_ from alembic.testing import is_not_ from alembic.testing import mock from alembic.testing import TestBase from alembic.util import CommandError from ._autogen_fixtures import AutogenFixtureTest from ._autogen_fixtures import AutogenTest py3k = sys.version_info >= (3,) class AutogenCrossSchemaTest(AutogenTest, TestBase): __only_on__ = "postgresql" __backend__ = True @classmethod def _get_db_schema(cls): m = MetaData() Table("t1", m, Column("x", Integer)) Table("t2", m, Column("y", Integer), schema=config.test_schema) Table("t6", m, Column("u", Integer)) Table("t7", m, Column("v", Integer), schema=config.test_schema) return m @classmethod def _get_model_schema(cls): m = MetaData() Table("t3", m, Column("q", Integer)) Table("t4", m, Column("z", Integer), schema=config.test_schema) Table("t6", m, Column("u", Integer)) Table("t7", m, Column("v", Integer), schema=config.test_schema) return m def 
test_default_schema_omitted_upgrade(self): def include_object(obj, name, type_, reflected, compare_to): if type_ == "table": return name == "t3" else: return True self._update_context( object_filters=include_object, include_schemas=True ) uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(self.autogen_context, uo) diffs = uo.as_diffs() eq_(diffs[0][0], "add_table") eq_(diffs[0][1].schema, None) def test_alt_schema_included_upgrade(self): def include_object(obj, name, type_, reflected, compare_to): if type_ == "table": return name == "t4" else: return True self._update_context( object_filters=include_object, include_schemas=True ) uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(self.autogen_context, uo) diffs = uo.as_diffs() eq_(diffs[0][0], "add_table") eq_(diffs[0][1].schema, config.test_schema) def test_default_schema_omitted_downgrade(self): def include_object(obj, name, type_, reflected, compare_to): if type_ == "table": return name == "t1" else: return True self._update_context( object_filters=include_object, include_schemas=True ) uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(self.autogen_context, uo) diffs = uo.as_diffs() eq_(diffs[0][0], "remove_table") eq_(diffs[0][1].schema, None) def test_alt_schema_included_downgrade(self): def include_object(obj, name, type_, reflected, compare_to): if type_ == "table": return name == "t2" else: return True self._update_context( object_filters=include_object, include_schemas=True ) uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(self.autogen_context, uo) diffs = uo.as_diffs() eq_(diffs[0][0], "remove_table") eq_(diffs[0][1].schema, config.test_schema) class AutogenDefaultSchemaTest(AutogenFixtureTest, TestBase): __only_on__ = "postgresql" __backend__ = True def test_uses_explcit_schema_in_default_one(self): default_schema = self.bind.dialect.default_schema_name m1 = MetaData() m2 = MetaData() Table("a", m1, Column("x", String(50))) Table("a", m2, Column("x", String(50)), schema=default_schema) diffs = self._fixture(m1, m2, include_schemas=True) eq_(diffs, []) def test_uses_explcit_schema_in_default_two(self): default_schema = self.bind.dialect.default_schema_name m1 = MetaData() m2 = MetaData() Table("a", m1, Column("x", String(50))) Table("a", m2, Column("x", String(50)), schema=default_schema) Table("a", m2, Column("y", String(50)), schema="test_schema") diffs = self._fixture(m1, m2, include_schemas=True) eq_(len(diffs), 1) eq_(diffs[0][0], "add_table") eq_(diffs[0][1].schema, "test_schema") eq_(diffs[0][1].c.keys(), ["y"]) def test_uses_explcit_schema_in_default_three(self): default_schema = self.bind.dialect.default_schema_name m1 = MetaData() m2 = MetaData() Table("a", m1, Column("y", String(50)), schema="test_schema") Table("a", m2, Column("x", String(50)), schema=default_schema) Table("a", m2, Column("y", String(50)), schema="test_schema") diffs = self._fixture(m1, m2, include_schemas=True) eq_(len(diffs), 1) eq_(diffs[0][0], "add_table") eq_(diffs[0][1].schema, default_schema) eq_(diffs[0][1].c.keys(), ["x"]) class AutogenDefaultSchemaIsNoneTest(AutogenFixtureTest, TestBase): __only_on__ = "sqlite" def setUp(self): super(AutogenDefaultSchemaIsNoneTest, self).setUp() # prerequisite eq_(self.bind.dialect.default_schema_name, None) def test_no_default_schema(self): m1 = MetaData() m2 = MetaData() Table("a", m1, Column("x", String(50))) Table("a", m2, Column("x", String(50))) def _include_object(obj, name, type_, reflected, compare_to): if type_ == "table": return name in "a" and 
obj.schema != "main" else: return True diffs = self._fixture( m1, m2, include_schemas=True, object_filters=_include_object ) eq_(len(diffs), 0) class ModelOne(object): __requires__ = ("unique_constraint_reflection",) schema = None @classmethod def _get_db_schema(cls): schema = cls.schema m = MetaData(schema=schema) Table( "user", m, Column("id", Integer, primary_key=True), Column("name", String(50)), Column("a1", Text), Column("pw", String(50)), Index("pw_idx", "pw"), ) Table( "address", m, Column("id", Integer, primary_key=True), Column("email_address", String(100), nullable=False), ) Table( "order", m, Column("order_id", Integer, primary_key=True), Column( "amount", Numeric(8, 2), nullable=False, server_default=text("0"), ), CheckConstraint("amount >= 0", name="ck_order_amount"), ) Table( "extra", m, Column("x", CHAR), Column("uid", Integer, ForeignKey("user.id")), ) return m @classmethod def _get_model_schema(cls): schema = cls.schema m = MetaData(schema=schema) Table( "user", m, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", Text, server_default="x"), ) Table( "address", m, Column("id", Integer, primary_key=True), Column("email_address", String(100), nullable=False), Column("street", String(50)), UniqueConstraint("email_address", name="uq_email"), ) Table( "order", m, Column("order_id", Integer, primary_key=True), Column( "amount", Numeric(10, 2), nullable=True, server_default=text("0"), ), Column("user_id", Integer, ForeignKey("user.id")), CheckConstraint("amount > -1", name="ck_order_amount"), ) Table( "item", m, Column("id", Integer, primary_key=True), Column("description", String(100)), Column("order_id", Integer, ForeignKey("order.order_id")), CheckConstraint("len(description) > 5"), ) return m class AutogenerateDiffTest(ModelOne, AutogenTest, TestBase): __only_on__ = "sqlite" def test_diffs(self): """test generation of diff rules""" metadata = self.m2 uo = ops.UpgradeOps(ops=[]) ctx = self.autogen_context autogenerate._produce_net_changes(ctx, uo) diffs = uo.as_diffs() eq_(diffs[0], ("add_table", metadata.tables["item"])) eq_(diffs[1][0], "remove_table") eq_(diffs[1][1].name, "extra") eq_(diffs[2][0], "add_column") eq_(diffs[2][1], None) eq_(diffs[2][2], "address") eq_(diffs[2][3], metadata.tables["address"].c.street) eq_(diffs[3][0], "add_constraint") eq_(diffs[3][1].name, "uq_email") eq_(diffs[4][0], "add_column") eq_(diffs[4][1], None) eq_(diffs[4][2], "order") eq_(diffs[4][3], metadata.tables["order"].c.user_id) eq_(diffs[5][0][0], "modify_type") eq_(diffs[5][0][1], None) eq_(diffs[5][0][2], "order") eq_(diffs[5][0][3], "amount") eq_(repr(diffs[5][0][5]), "NUMERIC(precision=8, scale=2)") eq_(repr(diffs[5][0][6]), "Numeric(precision=10, scale=2)") self._assert_fk_diff( diffs[6], "add_fk", "order", ["user_id"], "user", ["id"] ) eq_(diffs[7][0][0], "modify_default") eq_(diffs[7][0][1], None) eq_(diffs[7][0][2], "user") eq_(diffs[7][0][3], "a1") eq_(diffs[7][0][6].arg, "x") eq_(diffs[8][0][0], "modify_nullable") eq_(diffs[8][0][5], True) eq_(diffs[8][0][6], False) eq_(diffs[9][0], "remove_index") eq_(diffs[9][1].name, "pw_idx") eq_(diffs[10][0], "remove_column") eq_(diffs[10][3].name, "pw") eq_(diffs[10][3].table.name, "user") assert isinstance(diffs[10][3].type, String) def test_include_symbol(self): diffs = [] def include_symbol(name, schema=None): return name in ("address", "order") context = MigrationContext.configure( connection=self.bind.connect(), opts={ "compare_type": True, "compare_server_default": True, 
"target_metadata": self.m2, "include_symbol": include_symbol, }, ) diffs = autogenerate.compare_metadata( context, context.opts["target_metadata"] ) alter_cols = set( [ d[2] for d in self._flatten_diffs(diffs) if d[0].startswith("modify") ] ) eq_(alter_cols, set(["order"])) def test_include_object(self): def include_object(obj, name, type_, reflected, compare_to): assert obj.name == name if type_ == "table": if reflected: assert obj.metadata is not self.m2 else: assert obj.metadata is self.m2 return name in ("address", "order", "user") elif type_ == "column": if reflected: assert obj.table.metadata is not self.m2 else: assert obj.table.metadata is self.m2 return name != "street" else: return True context = MigrationContext.configure( connection=self.bind.connect(), opts={ "compare_type": True, "compare_server_default": True, "target_metadata": self.m2, "include_object": include_object, }, ) diffs = autogenerate.compare_metadata( context, context.opts["target_metadata"] ) alter_cols = ( set( [ d[2] for d in self._flatten_diffs(diffs) if d[0].startswith("modify") ] ) .union( d[3].name for d in self._flatten_diffs(diffs) if d[0] == "add_column" ) .union( d[1].name for d in self._flatten_diffs(diffs) if d[0] == "add_table" ) ) eq_(alter_cols, set(["user_id", "order", "user"])) def test_skip_null_type_comparison_reflected(self): ac = ops.AlterColumnOp("sometable", "somecol") autogenerate.compare._compare_type( self.autogen_context, ac, None, "sometable", "somecol", Column("somecol", NULLTYPE), Column("somecol", Integer()), ) diff = ac.to_diff_tuple() assert not diff def test_skip_null_type_comparison_local(self): ac = ops.AlterColumnOp("sometable", "somecol") autogenerate.compare._compare_type( self.autogen_context, ac, None, "sometable", "somecol", Column("somecol", Integer()), Column("somecol", NULLTYPE), ) diff = ac.to_diff_tuple() assert not diff def test_custom_type_compare(self): class MyType(TypeDecorator): impl = Integer def compare_against_backend(self, dialect, conn_type): return isinstance(conn_type, Integer) ac = ops.AlterColumnOp("sometable", "somecol") autogenerate.compare._compare_type( self.autogen_context, ac, None, "sometable", "somecol", Column("somecol", INTEGER()), Column("somecol", MyType()), ) assert not ac.has_changes() ac = ops.AlterColumnOp("sometable", "somecol") autogenerate.compare._compare_type( self.autogen_context, ac, None, "sometable", "somecol", Column("somecol", String()), Column("somecol", MyType()), ) diff = ac.to_diff_tuple() eq_(diff[0][0:4], ("modify_type", None, "sometable", "somecol")) def test_affinity_typedec(self): class MyType(TypeDecorator): impl = CHAR def load_dialect_impl(self, dialect): if dialect.name == "sqlite": return dialect.type_descriptor(Integer()) else: return dialect.type_descriptor(CHAR(32)) uo = ops.AlterColumnOp("sometable", "somecol") autogenerate.compare._compare_type( self.autogen_context, uo, None, "sometable", "somecol", Column("somecol", Integer, nullable=True), Column("somecol", MyType()), ) assert not uo.has_changes() def test_dont_barf_on_already_reflected(self): from sqlalchemy.util import OrderedSet inspector = Inspector.from_engine(self.bind) uo = ops.UpgradeOps(ops=[]) autogenerate.compare._compare_tables( OrderedSet([(None, "extra"), (None, "user")]), OrderedSet(), inspector, uo, self.autogen_context, ) eq_( [(rec[0], rec[1].name) for rec in uo.as_diffs()], [ ("remove_table", "extra"), ("remove_index", "pw_idx"), ("remove_table", "user"), ], ) class AutogenerateDiffTestWSchema(ModelOne, AutogenTest, TestBase): 
__only_on__ = "postgresql" __backend__ = True schema = "test_schema" def test_diffs(self): """test generation of diff rules""" metadata = self.m2 self._update_context(include_schemas=True) uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(self.autogen_context, uo) diffs = uo.as_diffs() eq_(diffs[0], ("add_table", metadata.tables["%s.item" % self.schema])) eq_(diffs[1][0], "remove_table") eq_(diffs[1][1].name, "extra") eq_(diffs[2][0], "add_column") eq_(diffs[2][1], self.schema) eq_(diffs[2][2], "address") eq_(diffs[2][3], metadata.tables["%s.address" % self.schema].c.street) eq_(diffs[3][0], "add_constraint") eq_(diffs[3][1].name, "uq_email") eq_(diffs[4][0], "add_column") eq_(diffs[4][1], self.schema) eq_(diffs[4][2], "order") eq_(diffs[4][3], metadata.tables["%s.order" % self.schema].c.user_id) eq_(diffs[5][0][0], "modify_type") eq_(diffs[5][0][1], self.schema) eq_(diffs[5][0][2], "order") eq_(diffs[5][0][3], "amount") eq_(repr(diffs[5][0][5]), "NUMERIC(precision=8, scale=2)") eq_(repr(diffs[5][0][6]), "Numeric(precision=10, scale=2)") self._assert_fk_diff( diffs[6], "add_fk", "order", ["user_id"], "user", ["id"], source_schema=config.test_schema, ) eq_(diffs[7][0][0], "modify_default") eq_(diffs[7][0][1], self.schema) eq_(diffs[7][0][2], "user") eq_(diffs[7][0][3], "a1") eq_(diffs[7][0][6].arg, "x") eq_(diffs[8][0][0], "modify_nullable") eq_(diffs[8][0][5], True) eq_(diffs[8][0][6], False) eq_(diffs[9][0], "remove_index") eq_(diffs[9][1].name, "pw_idx") eq_(diffs[10][0], "remove_column") eq_(diffs[10][3].name, "pw") class CompareTypeSpecificityTest(TestBase): def _fixture(self): from alembic.ddl import impl from sqlalchemy.engine import default return impl.DefaultImpl( default.DefaultDialect(), None, False, True, None, {} ) def test_typedec_to_nonstandard(self): class PasswordType(TypeDecorator): impl = VARBINARY def copy(self, **kw): return PasswordType(self.impl.length) def load_dialect_impl(self, dialect): if dialect.name == "default": impl = sqlite.NUMERIC(self.length) else: impl = VARBINARY(self.length) return dialect.type_descriptor(impl) impl = self._fixture() impl.compare_type( Column("x", sqlite.NUMERIC(50)), Column("x", PasswordType(50)) ) def test_string(self): t1 = String(30) t2 = String(40) t3 = VARCHAR(30) t4 = Integer impl = self._fixture() is_(impl.compare_type(Column("x", t3), Column("x", t1)), False) is_(impl.compare_type(Column("x", t3), Column("x", t2)), True) is_(impl.compare_type(Column("x", t3), Column("x", t4)), True) def test_numeric(self): t1 = Numeric(10, 5) t2 = Numeric(12, 5) t3 = DECIMAL(10, 5) t4 = DateTime impl = self._fixture() is_(impl.compare_type(Column("x", t3), Column("x", t1)), False) is_(impl.compare_type(Column("x", t3), Column("x", t2)), True) is_(impl.compare_type(Column("x", t3), Column("x", t4)), True) def test_numeric_noprecision(self): t1 = Numeric() t2 = Numeric(scale=5) impl = self._fixture() is_(impl.compare_type(Column("x", t1), Column("x", t2)), False) def test_integer(self): t1 = Integer() t2 = SmallInteger() t3 = BIGINT() t4 = String() t5 = INTEGER() t6 = BigInteger() impl = self._fixture() is_(impl.compare_type(Column("x", t5), Column("x", t1)), False) is_(impl.compare_type(Column("x", t3), Column("x", t1)), True) is_(impl.compare_type(Column("x", t3), Column("x", t6)), False) is_(impl.compare_type(Column("x", t3), Column("x", t2)), True) is_(impl.compare_type(Column("x", t5), Column("x", t2)), True) is_(impl.compare_type(Column("x", t1), Column("x", t4)), True) def test_datetime(self): t1 = DateTime() t2 = 
DateTime(timezone=False) t3 = DateTime(timezone=True) impl = self._fixture() is_(impl.compare_type(Column("x", t1), Column("x", t2)), False) is_(impl.compare_type(Column("x", t1), Column("x", t3)), True) is_(impl.compare_type(Column("x", t2), Column("x", t3)), True) class AutogenSystemColTest(AutogenTest, TestBase): __only_on__ = "postgresql" @classmethod def _get_db_schema(cls): m = MetaData() Table("sometable", m, Column("id", Integer, primary_key=True)) return m @classmethod def _get_model_schema(cls): m = MetaData() # 'xmin' is implicitly present, when added to a model should produce # no change Table( "sometable", m, Column("id", Integer, primary_key=True), Column("xmin", Integer, system=True), ) return m def test_dont_add_system(self): uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(self.autogen_context, uo) diffs = uo.as_diffs() eq_(diffs, []) class AutogenerateVariantCompareTest(AutogenTest, TestBase): __backend__ = True @classmethod def _get_db_schema(cls): m = MetaData() Table( "sometable", m, Column( "id", BigInteger().with_variant(Integer, "sqlite"), primary_key=True, ), Column("value", String(50)), ) return m @classmethod def _get_model_schema(cls): m = MetaData() Table( "sometable", m, Column( "id", BigInteger().with_variant(Integer, "sqlite"), primary_key=True, ), Column("value", String(50)), ) return m def test_variant_no_issue(self): uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(self.autogen_context, uo) diffs = uo.as_diffs() eq_(diffs, []) class AutogenerateCustomCompareTypeTest(AutogenTest, TestBase): __only_on__ = "sqlite" @classmethod def _get_db_schema(cls): m = MetaData() Table( "sometable", m, Column("id", Integer, primary_key=True), Column("value", Integer), ) return m @classmethod def _get_model_schema(cls): m = MetaData() Table( "sometable", m, Column("id", Integer, primary_key=True), Column("value", String), ) return m def test_uses_custom_compare_type_function(self): my_compare_type = mock.Mock() self.context._user_compare_type = my_compare_type uo = ops.UpgradeOps(ops=[]) ctx = self.autogen_context autogenerate._produce_net_changes(ctx, uo) first_table = self.m2.tables["sometable"] first_column = first_table.columns["id"] eq_(len(my_compare_type.mock_calls), 2) # We'll just test the first call _, args, _ = my_compare_type.mock_calls[0] ( context, inspected_column, metadata_column, inspected_type, metadata_type, ) = args eq_(context, self.context) eq_(metadata_column, first_column) eq_(metadata_type, first_column.type) eq_(inspected_column.name, first_column.name) eq_(type(inspected_type), INTEGER) def test_column_type_not_modified_custom_compare_type_returns_False(self): my_compare_type = mock.Mock() my_compare_type.return_value = False self.context._user_compare_type = my_compare_type diffs = [] ctx = self.autogen_context diffs = [] autogenerate._produce_net_changes(ctx, diffs) eq_(diffs, []) def test_column_type_modified_custom_compare_type_returns_True(self): my_compare_type = mock.Mock() my_compare_type.return_value = True self.context._user_compare_type = my_compare_type ctx = self.autogen_context uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(ctx, uo) diffs = uo.as_diffs() eq_(diffs[0][0][0], "modify_type") eq_(diffs[1][0][0], "modify_type") class PKConstraintUpgradesIgnoresNullableTest(AutogenTest, TestBase): __backend__ = True # test workaround for SQLAlchemy issue #3023, alembic issue #199 @classmethod def _get_db_schema(cls): m = MetaData() Table( "person_to_role", m, Column("person_id", Integer, 
autoincrement=False), Column("role_id", Integer, autoincrement=False), PrimaryKeyConstraint("person_id", "role_id"), ) return m @classmethod def _get_model_schema(cls): return cls._get_db_schema() def test_no_change(self): uo = ops.UpgradeOps(ops=[]) ctx = self.autogen_context autogenerate._produce_net_changes(ctx, uo) diffs = uo.as_diffs() eq_(diffs, []) class AutogenKeyTest(AutogenTest, TestBase): __only_on__ = "sqlite" @classmethod def _get_db_schema(cls): m = MetaData() Table( "someothertable", m, Column("id", Integer, primary_key=True), Column("value", Integer, key="somekey"), ) return m @classmethod def _get_model_schema(cls): m = MetaData() Table( "sometable", m, Column("id", Integer, primary_key=True), Column("value", Integer, key="someotherkey"), ) Table( "someothertable", m, Column("id", Integer, primary_key=True), Column("value", Integer, key="somekey"), Column("othervalue", Integer, key="otherkey"), ) return m symbols = ["someothertable", "sometable"] def test_autogen(self): uo = ops.UpgradeOps(ops=[]) ctx = self.autogen_context autogenerate._produce_net_changes(ctx, uo) diffs = uo.as_diffs() eq_(diffs[0][0], "add_table") eq_(diffs[0][1].name, "sometable") eq_(diffs[1][0], "add_column") eq_(diffs[1][3].key, "otherkey") class AutogenVersionTableTest(AutogenTest, TestBase): __only_on__ = "sqlite" version_table_name = "alembic_version" version_table_schema = None @classmethod def _get_db_schema(cls): m = MetaData() Table( cls.version_table_name, m, Column("x", Integer), schema=cls.version_table_schema, ) return m @classmethod def _get_model_schema(cls): m = MetaData() return m def test_no_version_table(self): ctx = self.autogen_context uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(ctx, uo) eq_(uo.as_diffs(), []) def test_version_table_in_target(self): Table( self.version_table_name, self.m2, Column("x", Integer), schema=self.version_table_schema, ) ctx = self.autogen_context uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(ctx, uo) eq_(uo.as_diffs(), []) class AutogenCustomVersionTableSchemaTest(AutogenVersionTableTest): __only_on__ = "postgresql" __backend__ = True version_table_schema = "test_schema" configure_opts = {"version_table_schema": "test_schema"} class AutogenCustomVersionTableTest(AutogenVersionTableTest): version_table_name = "my_version_table" configure_opts = {"version_table": "my_version_table"} class AutogenCustomVersionTableAndSchemaTest(AutogenVersionTableTest): __only_on__ = "postgresql" __backend__ = True version_table_name = "my_version_table" version_table_schema = "test_schema" configure_opts = { "version_table": "my_version_table", "version_table_schema": "test_schema", } class AutogenerateDiffOrderTest(AutogenTest, TestBase): __only_on__ = "sqlite" @classmethod def _get_db_schema(cls): return MetaData() @classmethod def _get_model_schema(cls): m = MetaData() Table("parent", m, Column("id", Integer, primary_key=True)) Table( "child", m, Column("parent_id", Integer, ForeignKey("parent.id")) ) return m def test_diffs_order(self): """ Added in order to test that child tables(tables with FKs) are generated before their parent tables """ ctx = self.autogen_context uo = ops.UpgradeOps(ops=[]) autogenerate._produce_net_changes(ctx, uo) diffs = uo.as_diffs() eq_(diffs[0][0], "add_table") eq_(diffs[0][1].name, "parent") eq_(diffs[1][0], "add_table") eq_(diffs[1][1].name, "child") class CompareMetadataTest(ModelOne, AutogenTest, TestBase): __only_on__ = "sqlite" def test_compare_metadata(self): metadata = self.m2 diffs = 
autogenerate.compare_metadata(self.context, metadata) eq_(diffs[0], ("add_table", metadata.tables["item"])) eq_(diffs[1][0], "remove_table") eq_(diffs[1][1].name, "extra") eq_(diffs[2][0], "add_column") eq_(diffs[2][1], None) eq_(diffs[2][2], "address") eq_(diffs[2][3], metadata.tables["address"].c.street) eq_(diffs[3][0], "add_constraint") eq_(diffs[3][1].name, "uq_email") eq_(diffs[4][0], "add_column") eq_(diffs[4][1], None) eq_(diffs[4][2], "order") eq_(diffs[4][3], metadata.tables["order"].c.user_id) eq_(diffs[5][0][0], "modify_type") eq_(diffs[5][0][1], None) eq_(diffs[5][0][2], "order") eq_(diffs[5][0][3], "amount") eq_(repr(diffs[5][0][5]), "NUMERIC(precision=8, scale=2)") eq_(repr(diffs[5][0][6]), "Numeric(precision=10, scale=2)") self._assert_fk_diff( diffs[6], "add_fk", "order", ["user_id"], "user", ["id"] ) eq_(diffs[7][0][0], "modify_default") eq_(diffs[7][0][1], None) eq_(diffs[7][0][2], "user") eq_(diffs[7][0][3], "a1") eq_(diffs[7][0][6].arg, "x") eq_(diffs[8][0][0], "modify_nullable") eq_(diffs[8][0][5], True) eq_(diffs[8][0][6], False) eq_(diffs[9][0], "remove_index") eq_(diffs[9][1].name, "pw_idx") eq_(diffs[10][0], "remove_column") eq_(diffs[10][3].name, "pw") def test_compare_metadata_include_object(self): metadata = self.m2 def include_object(obj, name, type_, reflected, compare_to): if type_ == "table": return name in ("extra", "order") elif type_ == "column": return name != "amount" else: return True context = MigrationContext.configure( connection=self.bind.connect(), opts={ "compare_type": True, "compare_server_default": True, "include_object": include_object, }, ) diffs = autogenerate.compare_metadata(context, metadata) eq_(diffs[0][0], "remove_table") eq_(diffs[0][1].name, "extra") eq_(diffs[1][0], "add_column") eq_(diffs[1][1], None) eq_(diffs[1][2], "order") eq_(diffs[1][3], metadata.tables["order"].c.user_id) def test_compare_metadata_include_symbol(self): metadata = self.m2 def include_symbol(table_name, schema_name): return table_name in ("extra", "order") context = MigrationContext.configure( connection=self.bind.connect(), opts={ "compare_type": True, "compare_server_default": True, "include_symbol": include_symbol, }, ) diffs = autogenerate.compare_metadata(context, metadata) eq_(diffs[0][0], "remove_table") eq_(diffs[0][1].name, "extra") eq_(diffs[1][0], "add_column") eq_(diffs[1][1], None) eq_(diffs[1][2], "order") eq_(diffs[1][3], metadata.tables["order"].c.user_id) eq_(diffs[2][0][0], "modify_type") eq_(diffs[2][0][1], None) eq_(diffs[2][0][2], "order") eq_(diffs[2][0][3], "amount") eq_(repr(diffs[2][0][5]), "NUMERIC(precision=8, scale=2)") eq_(repr(diffs[2][0][6]), "Numeric(precision=10, scale=2)") eq_(diffs[2][1][0], "modify_nullable") eq_(diffs[2][1][2], "order") eq_(diffs[2][1][5], False) eq_(diffs[2][1][6], True) def test_compare_metadata_as_sql(self): context = MigrationContext.configure( connection=self.bind.connect(), opts={"as_sql": True} ) metadata = self.m2 assert_raises_message( CommandError, "autogenerate can't use as_sql=True as it prevents " "querying the database for schema information", autogenerate.compare_metadata, context, metadata, ) class PGCompareMetaData(ModelOne, AutogenTest, TestBase): __only_on__ = "postgresql" __backend__ = True schema = "test_schema" def test_compare_metadata_schema(self): metadata = self.m2 context = MigrationContext.configure( connection=self.bind.connect(), opts={"include_schemas": True} ) diffs = autogenerate.compare_metadata(context, metadata) eq_(diffs[0], ("add_table", 
metadata.tables["test_schema.item"])) eq_(diffs[1][0], "remove_table") eq_(diffs[1][1].name, "extra") eq_(diffs[2][0], "add_column") eq_(diffs[2][1], "test_schema") eq_(diffs[2][2], "address") eq_(diffs[2][3], metadata.tables["test_schema.address"].c.street) eq_(diffs[3][0], "add_constraint") eq_(diffs[3][1].name, "uq_email") eq_(diffs[4][0], "add_column") eq_(diffs[4][1], "test_schema") eq_(diffs[4][2], "order") eq_(diffs[4][3], metadata.tables["test_schema.order"].c.user_id) eq_(diffs[5][0][0], "modify_nullable") eq_(diffs[5][0][5], False) eq_(diffs[5][0][6], True) class OrigObjectTest(TestBase): def setUp(self): self.metadata = m = MetaData() t = Table( "t", m, Column("id", Integer(), primary_key=True), Column("x", Integer()), ) self.ix = Index("ix1", t.c.id) fk = ForeignKeyConstraint(["t_id"], ["t.id"]) q = Table("q", m, Column("t_id", Integer()), fk) self.table = t self.fk = fk self.ck = CheckConstraint(t.c.x > 5) t.append_constraint(self.ck) self.uq = UniqueConstraint(q.c.t_id) self.pk = t.primary_key def test_drop_fk(self): fk = self.fk op = ops.DropConstraintOp.from_constraint(fk) is_(op.to_constraint(), fk) is_(op.reverse().to_constraint(), fk) def test_add_fk(self): fk = self.fk op = ops.AddConstraintOp.from_constraint(fk) is_(op.to_constraint(), fk) is_(op.reverse().to_constraint(), fk) is_not_(None, op.to_constraint().table) def test_add_check(self): ck = self.ck op = ops.AddConstraintOp.from_constraint(ck) is_(op.to_constraint(), ck) is_(op.reverse().to_constraint(), ck) is_not_(None, op.to_constraint().table) def test_drop_check(self): ck = self.ck op = ops.DropConstraintOp.from_constraint(ck) is_(op.to_constraint(), ck) is_(op.reverse().to_constraint(), ck) is_not_(None, op.to_constraint().table) def test_add_unique(self): uq = self.uq op = ops.AddConstraintOp.from_constraint(uq) is_(op.to_constraint(), uq) is_(op.reverse().to_constraint(), uq) is_not_(None, op.to_constraint().table) def test_drop_unique(self): uq = self.uq op = ops.DropConstraintOp.from_constraint(uq) is_(op.to_constraint(), uq) is_(op.reverse().to_constraint(), uq) is_not_(None, op.to_constraint().table) def test_add_pk_no_orig(self): op = ops.CreatePrimaryKeyOp("pk1", "t", ["x", "y"]) pk = op.to_constraint() eq_(pk.name, "pk1") eq_(pk.table.name, "t") def test_add_pk(self): pk = self.pk op = ops.AddConstraintOp.from_constraint(pk) is_(op.to_constraint(), pk) is_(op.reverse().to_constraint(), pk) is_not_(None, op.to_constraint().table) def test_drop_pk(self): pk = self.pk op = ops.DropConstraintOp.from_constraint(pk) is_(op.to_constraint(), pk) is_(op.reverse().to_constraint(), pk) is_not_(None, op.to_constraint().table) def test_drop_column(self): t = self.table op = ops.DropColumnOp.from_column_and_tablename(None, "t", t.c.x) is_(op.to_column(), t.c.x) is_(op.reverse().to_column(), t.c.x) is_not_(None, op.to_column().table) def test_add_column(self): t = self.table op = ops.AddColumnOp.from_column_and_tablename(None, "t", t.c.x) is_(op.to_column(), t.c.x) is_(op.reverse().to_column(), t.c.x) is_not_(None, op.to_column().table) def test_drop_table(self): t = self.table op = ops.DropTableOp.from_table(t) is_(op.to_table(), t) is_(op.reverse().to_table(), t) is_(self.metadata, op.to_table().metadata) def test_add_table(self): t = self.table op = ops.CreateTableOp.from_table(t) is_(op.to_table(), t) is_(op.reverse().to_table(), t) is_(self.metadata, op.to_table().metadata) def test_drop_index(self): op = ops.DropIndexOp.from_index(self.ix) is_(op.to_index(), self.ix) is_(op.reverse().to_index(), self.ix) def 
test_create_index(self): op = ops.CreateIndexOp.from_index(self.ix) is_(op.to_index(), self.ix) is_(op.reverse().to_index(), self.ix) class MultipleMetaDataTest(AutogenFixtureTest, TestBase): def test_multiple(self): m1a = MetaData() m1b = MetaData() m1c = MetaData() m2a = MetaData() m2b = MetaData() m2c = MetaData() Table("a", m1a, Column("id", Integer, primary_key=True)) Table("b1", m1b, Column("id", Integer, primary_key=True)) Table("b2", m1b, Column("id", Integer, primary_key=True)) Table( "c1", m1c, Column("id", Integer, primary_key=True), Column("x", Integer), ) a = Table( "a", m2a, Column("id", Integer, primary_key=True), Column("q", Integer), ) Table("b1", m2b, Column("id", Integer, primary_key=True)) Table("c1", m2c, Column("id", Integer, primary_key=True)) c2 = Table("c2", m2c, Column("id", Integer, primary_key=True)) diffs = self._fixture([m1a, m1b, m1c], [m2a, m2b, m2c]) eq_(diffs[0], ("add_table", c2)) eq_(diffs[1][0], "remove_table") eq_(diffs[1][1].name, "b2") eq_(diffs[2], ("add_column", None, "a", a.c.q)) eq_(diffs[3][0:3], ("remove_column", None, "c1")) eq_(diffs[3][3].name, "x") def test_empty_list(self): # because they're going to do it.... diffs = self._fixture([], []) eq_(diffs, []) def test_non_list_sequence(self): # we call it "sequence", let's check that m1a = MetaData() m1b = MetaData() m2a = MetaData() m2b = MetaData() Table("a", m1a, Column("id", Integer, primary_key=True)) Table("b", m1b, Column("id", Integer, primary_key=True)) Table("a", m2a, Column("id", Integer, primary_key=True)) b = Table( "b", m2b, Column("id", Integer, primary_key=True), Column("q", Integer), ) diffs = self._fixture((m1a, m1b), (m2a, m2b)) eq_(diffs, [("add_column", None, "b", b.c.q)]) def test_raise_on_dupe(self): m1a = MetaData() m1b = MetaData() m2a = MetaData() m2b = MetaData() Table("a", m1a, Column("id", Integer, primary_key=True)) Table("b1", m1b, Column("id", Integer, primary_key=True)) Table("b2", m1b, Column("id", Integer, primary_key=True)) Table("b3", m1b, Column("id", Integer, primary_key=True)) Table("a", m2a, Column("id", Integer, primary_key=True)) Table("a", m2b, Column("id", Integer, primary_key=True)) Table("b1", m2b, Column("id", Integer, primary_key=True)) Table("b2", m2a, Column("id", Integer, primary_key=True)) Table("b2", m2b, Column("id", Integer, primary_key=True)) assert_raises_message( ValueError, 'Duplicate table keys across multiple MetaData objects: "a", "b2"', self._fixture, [m1a, m1b], [m2a, m2b], ) class AutoincrementTest(AutogenFixtureTest, TestBase): __backend__ = True __requires__ = ("integer_subtype_comparisons",) def test_alter_column_autoincrement_none(self): m1 = MetaData() m2 = MetaData() Table("a", m1, Column("x", Integer, nullable=False)) Table("a", m2, Column("x", Integer, nullable=True)) ops = self._fixture(m1, m2, return_ops=True) assert "autoincrement" not in ops.ops[0].ops[0].kw def test_alter_column_autoincrement_pk_false(self): m1 = MetaData() m2 = MetaData() Table( "a", m1, Column("x", Integer, primary_key=True, autoincrement=False), ) Table( "a", m2, Column("x", BigInteger, primary_key=True, autoincrement=False), ) ops = self._fixture(m1, m2, return_ops=True) is_(ops.ops[0].ops[0].kw["autoincrement"], False) def test_alter_column_autoincrement_pk_implicit_true(self): m1 = MetaData() m2 = MetaData() Table("a", m1, Column("x", Integer, primary_key=True)) Table("a", m2, Column("x", BigInteger, primary_key=True)) ops = self._fixture(m1, m2, return_ops=True) is_(ops.ops[0].ops[0].kw["autoincrement"], True) def 
test_alter_column_autoincrement_pk_explicit_true(self): m1 = MetaData() m2 = MetaData() Table( "a", m1, Column("x", Integer, primary_key=True, autoincrement=True) ) Table( "a", m2, Column("x", BigInteger, primary_key=True, autoincrement=True), ) ops = self._fixture(m1, m2, return_ops=True) is_(ops.ops[0].ops[0].kw["autoincrement"], True) def test_alter_column_autoincrement_nonpk_false(self): m1 = MetaData() m2 = MetaData() Table( "a", m1, Column("id", Integer, primary_key=True), Column("x", Integer, autoincrement=False), ) Table( "a", m2, Column("id", Integer, primary_key=True), Column("x", BigInteger, autoincrement=False), ) ops = self._fixture(m1, m2, return_ops=True) is_(ops.ops[0].ops[0].kw["autoincrement"], False) def test_alter_column_autoincrement_nonpk_implicit_false(self): m1 = MetaData() m2 = MetaData() Table( "a", m1, Column("id", Integer, primary_key=True), Column("x", Integer), ) Table( "a", m2, Column("id", Integer, primary_key=True), Column("x", BigInteger), ) ops = self._fixture(m1, m2, return_ops=True) assert "autoincrement" not in ops.ops[0].ops[0].kw def test_alter_column_autoincrement_nonpk_explicit_true(self): m1 = MetaData() m2 = MetaData() Table( "a", m1, Column("id", Integer, primary_key=True), Column("x", Integer, autoincrement=True), ) Table( "a", m2, Column("id", Integer, primary_key=True), Column("x", BigInteger, autoincrement=True), ) ops = self._fixture(m1, m2, return_ops=True) is_(ops.ops[0].ops[0].kw["autoincrement"], True) def test_alter_column_autoincrement_compositepk_false(self): m1 = MetaData() m2 = MetaData() Table( "a", m1, Column("id", Integer, primary_key=True), Column("x", Integer, primary_key=True, autoincrement=False), ) Table( "a", m2, Column("id", Integer, primary_key=True), Column("x", BigInteger, primary_key=True, autoincrement=False), ) ops = self._fixture(m1, m2, return_ops=True) is_(ops.ops[0].ops[0].kw["autoincrement"], False) def test_alter_column_autoincrement_compositepk_implicit_false(self): m1 = MetaData() m2 = MetaData() Table( "a", m1, Column("id", Integer, primary_key=True), Column("x", Integer, primary_key=True), ) Table( "a", m2, Column("id", Integer, primary_key=True), Column("x", BigInteger, primary_key=True), ) ops = self._fixture(m1, m2, return_ops=True) assert "autoincrement" not in ops.ops[0].ops[0].kw @config.requirements.autoincrement_on_composite_pk def test_alter_column_autoincrement_compositepk_explicit_true(self): m1 = MetaData() m2 = MetaData() Table( "a", m1, Column("id", Integer, primary_key=True, autoincrement=False), Column("x", Integer, primary_key=True, autoincrement=True), # on SQLA 1.0 and earlier, this being present # trips the "add KEY for the primary key" so that the # AUTO_INCREMENT keyword is accepted by MySQL. SQLA 1.1 and # greater the columns are just reorganized. 
mysql_engine="InnoDB", ) Table( "a", m2, Column("id", Integer, primary_key=True, autoincrement=False), Column("x", BigInteger, primary_key=True, autoincrement=True), ) ops = self._fixture(m1, m2, return_ops=True) is_(ops.ops[0].ops[0].kw["autoincrement"], True) zzzeek-alembic-bee044a1c187/tests/test_autogen_fks.py000066400000000000000000000762311353106760100226730ustar00rootroot00000000000000import sys from sqlalchemy import Column from sqlalchemy import ForeignKeyConstraint from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import String from sqlalchemy import Table from alembic.testing import config from alembic.testing import eq_ from alembic.testing import mock from alembic.testing import TestBase from ._autogen_fixtures import AutogenFixtureTest py3k = sys.version_info.major >= 3 class AutogenerateForeignKeysTest(AutogenFixtureTest, TestBase): __backend__ = True def test_remove_fk(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("test", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m1, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("test2", String(10)), ForeignKeyConstraint(["test2"], ["some_table.test"]), mysql_engine="InnoDB", ) Table( "some_table", m2, Column("test", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("test2", String(10)), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) self._assert_fk_diff( diffs[0], "remove_fk", "user", ["test2"], "some_table", ["test"], conditional_name="servergenerated", ) def test_add_fk(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("id", Integer, primary_key=True), Column("test", String(10)), mysql_engine="InnoDB", ) Table( "user", m1, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("test2", String(10)), mysql_engine="InnoDB", ) Table( "some_table", m2, Column("id", Integer, primary_key=True), Column("test", String(10)), mysql_engine="InnoDB", ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("test2", String(10)), ForeignKeyConstraint(["test2"], ["some_table.test"]), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) self._assert_fk_diff( diffs[0], "add_fk", "user", ["test2"], "some_table", ["test"] ) def test_no_change(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("id", Integer, primary_key=True), Column("test", String(10)), mysql_engine="InnoDB", ) Table( "user", m1, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("test2", Integer), ForeignKeyConstraint(["test2"], ["some_table.id"]), mysql_engine="InnoDB", ) Table( "some_table", m2, Column("id", Integer, primary_key=True), Column("test", String(10)), mysql_engine="InnoDB", ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("test2", Integer), ForeignKeyConstraint(["test2"], ["some_table.id"]), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_no_change_composite_fk(self): m1 = MetaData() m2 = 
MetaData() Table( "some_table", m1, Column("id_1", String(10), primary_key=True), Column("id_2", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m1, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("other_id_1", String(10)), Column("other_id_2", String(10)), ForeignKeyConstraint( ["other_id_1", "other_id_2"], ["some_table.id_1", "some_table.id_2"], ), mysql_engine="InnoDB", ) Table( "some_table", m2, Column("id_1", String(10), primary_key=True), Column("id_2", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("other_id_1", String(10)), Column("other_id_2", String(10)), ForeignKeyConstraint( ["other_id_1", "other_id_2"], ["some_table.id_1", "some_table.id_2"], ), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_add_composite_fk_with_name(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("id_1", String(10), primary_key=True), Column("id_2", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m1, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("other_id_1", String(10)), Column("other_id_2", String(10)), mysql_engine="InnoDB", ) Table( "some_table", m2, Column("id_1", String(10), primary_key=True), Column("id_2", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("other_id_1", String(10)), Column("other_id_2", String(10)), ForeignKeyConstraint( ["other_id_1", "other_id_2"], ["some_table.id_1", "some_table.id_2"], name="fk_test_name", ), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) self._assert_fk_diff( diffs[0], "add_fk", "user", ["other_id_1", "other_id_2"], "some_table", ["id_1", "id_2"], name="fk_test_name", ) @config.requirements.no_name_normalize def test_remove_composite_fk(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("id_1", String(10), primary_key=True), Column("id_2", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m1, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("other_id_1", String(10)), Column("other_id_2", String(10)), ForeignKeyConstraint( ["other_id_1", "other_id_2"], ["some_table.id_1", "some_table.id_2"], name="fk_test_name", ), mysql_engine="InnoDB", ) Table( "some_table", m2, Column("id_1", String(10), primary_key=True), Column("id_2", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), Column("other_id_1", String(10)), Column("other_id_2", String(10)), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) self._assert_fk_diff( diffs[0], "remove_fk", "user", ["other_id_1", "other_id_2"], "some_table", ["id_1", "id_2"], conditional_name="fk_test_name", ) def test_add_fk_colkeys(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("id_1", String(10), primary_key=True), Column("id_2", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m1, Column("id", 
Integer, primary_key=True), Column("other_id_1", String(10)), Column("other_id_2", String(10)), mysql_engine="InnoDB", ) Table( "some_table", m2, Column("id_1", String(10), key="tid1", primary_key=True), Column("id_2", String(10), key="tid2", primary_key=True), mysql_engine="InnoDB", ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("other_id_1", String(10), key="oid1"), Column("other_id_2", String(10), key="oid2"), ForeignKeyConstraint( ["oid1", "oid2"], ["some_table.tid1", "some_table.tid2"], name="fk_test_name", ), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) self._assert_fk_diff( diffs[0], "add_fk", "user", ["other_id_1", "other_id_2"], "some_table", ["id_1", "id_2"], name="fk_test_name", ) def test_no_change_colkeys(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("id_1", String(10), primary_key=True), Column("id_2", String(10), primary_key=True), mysql_engine="InnoDB", ) Table( "user", m1, Column("id", Integer, primary_key=True), Column("other_id_1", String(10)), Column("other_id_2", String(10)), ForeignKeyConstraint( ["other_id_1", "other_id_2"], ["some_table.id_1", "some_table.id_2"], ), mysql_engine="InnoDB", ) Table( "some_table", m2, Column("id_1", String(10), key="tid1", primary_key=True), Column("id_2", String(10), key="tid2", primary_key=True), mysql_engine="InnoDB", ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("other_id_1", String(10), key="oid1"), Column("other_id_2", String(10), key="oid2"), ForeignKeyConstraint( ["oid1", "oid2"], ["some_table.tid1", "some_table.tid2"] ), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) eq_(diffs, []) class IncludeHooksTest(AutogenFixtureTest, TestBase): __backend__ = True __requires__ = ("fk_names",) @config.requirements.no_name_normalize def test_remove_connection_fk(self): m1 = MetaData() m2 = MetaData() ref = Table( "ref", m1, Column("id", Integer, primary_key=True), mysql_engine="InnoDB", ) t1 = Table( "t", m1, Column("x", Integer), Column("y", Integer), mysql_engine="InnoDB", ) t1.append_constraint( ForeignKeyConstraint([t1.c.x], [ref.c.id], name="fk1") ) t1.append_constraint( ForeignKeyConstraint([t1.c.y], [ref.c.id], name="fk2") ) ref = Table( "ref", m2, Column("id", Integer, primary_key=True), mysql_engine="InnoDB", ) Table( "t", m2, Column("x", Integer), Column("y", Integer), mysql_engine="InnoDB", ) def include_object(object_, name, type_, reflected, compare_to): return not ( isinstance(object_, ForeignKeyConstraint) and type_ == "foreign_key_constraint" and reflected and name == "fk1" ) diffs = self._fixture(m1, m2, object_filters=include_object) self._assert_fk_diff( diffs[0], "remove_fk", "t", ["y"], "ref", ["id"], conditional_name="fk2", ) eq_(len(diffs), 1) def test_add_metadata_fk(self): m1 = MetaData() m2 = MetaData() Table( "ref", m1, Column("id", Integer, primary_key=True), mysql_engine="InnoDB", ) Table( "t", m1, Column("x", Integer), Column("y", Integer), mysql_engine="InnoDB", ) ref = Table( "ref", m2, Column("id", Integer, primary_key=True), mysql_engine="InnoDB", ) t2 = Table( "t", m2, Column("x", Integer), Column("y", Integer), mysql_engine="InnoDB", ) t2.append_constraint( ForeignKeyConstraint([t2.c.x], [ref.c.id], name="fk1") ) t2.append_constraint( ForeignKeyConstraint([t2.c.y], [ref.c.id], name="fk2") ) def include_object(object_, name, type_, reflected, compare_to): return not ( isinstance(object_, ForeignKeyConstraint) and type_ == "foreign_key_constraint" and not reflected and name == "fk1" ) diffs = self._fixture(m1, m2, 
object_filters=include_object) self._assert_fk_diff( diffs[0], "add_fk", "t", ["y"], "ref", ["id"], name="fk2" ) eq_(len(diffs), 1) @config.requirements.no_name_normalize def test_change_fk(self): m1 = MetaData() m2 = MetaData() r1a = Table( "ref_a", m1, Column("a", Integer, primary_key=True), mysql_engine="InnoDB", ) Table( "ref_b", m1, Column("a", Integer, primary_key=True), Column("b", Integer, primary_key=True), mysql_engine="InnoDB", ) t1 = Table( "t", m1, Column("x", Integer), Column("y", Integer), Column("z", Integer), mysql_engine="InnoDB", ) t1.append_constraint( ForeignKeyConstraint([t1.c.x], [r1a.c.a], name="fk1") ) t1.append_constraint( ForeignKeyConstraint([t1.c.y], [r1a.c.a], name="fk2") ) Table( "ref_a", m2, Column("a", Integer, primary_key=True), mysql_engine="InnoDB", ) r2b = Table( "ref_b", m2, Column("a", Integer, primary_key=True), Column("b", Integer, primary_key=True), mysql_engine="InnoDB", ) t2 = Table( "t", m2, Column("x", Integer), Column("y", Integer), Column("z", Integer), mysql_engine="InnoDB", ) t2.append_constraint( ForeignKeyConstraint( [t2.c.x, t2.c.z], [r2b.c.a, r2b.c.b], name="fk1" ) ) t2.append_constraint( ForeignKeyConstraint( [t2.c.y, t2.c.z], [r2b.c.a, r2b.c.b], name="fk2" ) ) def include_object(object_, name, type_, reflected, compare_to): return not ( isinstance(object_, ForeignKeyConstraint) and type_ == "foreign_key_constraint" and name == "fk1" ) diffs = self._fixture(m1, m2, object_filters=include_object) self._assert_fk_diff( diffs[0], "remove_fk", "t", ["y"], "ref_a", ["a"], name="fk2" ) self._assert_fk_diff( diffs[1], "add_fk", "t", ["y", "z"], "ref_b", ["a", "b"], name="fk2", ) eq_(len(diffs), 2) class AutogenerateFKOptionsTest(AutogenFixtureTest, TestBase): __backend__ = True __requires__ = ("flexible_fk_cascades",) def _fk_opts_fixture(self, old_opts, new_opts): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("id", Integer, primary_key=True), Column("test", String(10)), mysql_engine="InnoDB", ) Table( "user", m1, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("tid", Integer), ForeignKeyConstraint(["tid"], ["some_table.id"], **old_opts), mysql_engine="InnoDB", ) Table( "some_table", m2, Column("id", Integer, primary_key=True), Column("test", String(10)), mysql_engine="InnoDB", ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("tid", Integer), ForeignKeyConstraint(["tid"], ["some_table.id"], **new_opts), mysql_engine="InnoDB", ) return self._fixture(m1, m2) def _expect_opts_supported(self, deferrable=False, initially=False): if not config.requirements.reflects_fk_options.enabled: return False if deferrable and not config.requirements.fk_deferrable.enabled: return False if initially and not config.requirements.fk_initially.enabled: return False return True def test_add_ondelete(self): diffs = self._fk_opts_fixture({}, {"ondelete": "cascade"}) if self._expect_opts_supported(): self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], ondelete=None, conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], ondelete="cascade", ) else: eq_(diffs, []) def test_remove_ondelete(self): diffs = self._fk_opts_fixture({"ondelete": "CASCADE"}, {}) if self._expect_opts_supported(): self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], ondelete="CASCADE", conditional_name="servergenerated", ) self._assert_fk_diff( 
diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], ondelete=None, ) else: eq_(diffs, []) def test_nochange_ondelete(self): """test case sensitivity""" diffs = self._fk_opts_fixture( {"ondelete": "caSCAde"}, {"ondelete": "CasCade"} ) eq_(diffs, []) def test_add_onupdate(self): diffs = self._fk_opts_fixture({}, {"onupdate": "cascade"}) if self._expect_opts_supported(): self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], onupdate=None, conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], onupdate="cascade", ) else: eq_(diffs, []) def test_remove_onupdate(self): diffs = self._fk_opts_fixture({"onupdate": "CASCADE"}, {}) if self._expect_opts_supported(): self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], onupdate="CASCADE", conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], onupdate=None, ) else: eq_(diffs, []) def test_nochange_onupdate(self): """test case sensitivity""" diffs = self._fk_opts_fixture( {"onupdate": "caSCAde"}, {"onupdate": "CasCade"} ) eq_(diffs, []) def test_nochange_ondelete_restrict(self): """test the RESTRICT option which MySQL doesn't report on""" diffs = self._fk_opts_fixture( {"ondelete": "restrict"}, {"ondelete": "restrict"} ) eq_(diffs, []) def test_nochange_onupdate_restrict(self): """test the RESTRICT option which MySQL doesn't report on""" diffs = self._fk_opts_fixture( {"onupdate": "restrict"}, {"onupdate": "restrict"} ) eq_(diffs, []) def test_nochange_ondelete_noaction(self): """test the NO ACTION option which generally comes back as None""" diffs = self._fk_opts_fixture( {"ondelete": "no action"}, {"ondelete": "no action"} ) eq_(diffs, []) def test_nochange_onupdate_noaction(self): """test the NO ACTION option which generally comes back as None""" diffs = self._fk_opts_fixture( {"onupdate": "no action"}, {"onupdate": "no action"} ) eq_(diffs, []) def test_change_ondelete_from_restrict(self): """test the RESTRICT option which MySQL doesn't report on""" # note that this is impossible to detect if we change # from RESTRICT to NO ACTION on MySQL. diffs = self._fk_opts_fixture( {"ondelete": "restrict"}, {"ondelete": "cascade"} ) if self._expect_opts_supported(): self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], onupdate=None, ondelete=mock.ANY, # MySQL reports None, PG reports RESTRICT conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], onupdate=None, ondelete="cascade", ) else: eq_(diffs, []) def test_change_onupdate_from_restrict(self): """test the RESTRICT option which MySQL doesn't report on""" # note that this is impossible to detect if we change # from RESTRICT to NO ACTION on MySQL. 
diffs = self._fk_opts_fixture( {"onupdate": "restrict"}, {"onupdate": "cascade"} ) if self._expect_opts_supported(): self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], onupdate=mock.ANY, # MySQL reports None, PG reports RESTRICT ondelete=None, conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], onupdate="cascade", ondelete=None, ) else: eq_(diffs, []) def test_ondelete_onupdate_combo(self): diffs = self._fk_opts_fixture( {"onupdate": "CASCADE", "ondelete": "SET NULL"}, {"onupdate": "RESTRICT", "ondelete": "RESTRICT"}, ) if self._expect_opts_supported(): self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], onupdate="CASCADE", ondelete="SET NULL", conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], onupdate="RESTRICT", ondelete="RESTRICT", ) else: eq_(diffs, []) @config.requirements.fk_initially def test_add_initially_deferred(self): diffs = self._fk_opts_fixture({}, {"initially": "deferred"}) self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], initially=None, conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], initially="deferred", ) @config.requirements.fk_initially def test_remove_initially_deferred(self): diffs = self._fk_opts_fixture({"initially": "deferred"}, {}) self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], initially="DEFERRED", deferrable=True, conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], initially=None, ) @config.requirements.fk_deferrable @config.requirements.fk_initially def test_add_initially_immediate_plus_deferrable(self): diffs = self._fk_opts_fixture( {}, {"initially": "immediate", "deferrable": True} ) self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], initially=None, conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], initially="immediate", deferrable=True, ) @config.requirements.fk_deferrable @config.requirements.fk_initially def test_remove_initially_immediate_plus_deferrable(self): diffs = self._fk_opts_fixture( {"initially": "immediate", "deferrable": True}, {} ) self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], initially=None, # immediate is the default deferrable=True, conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], initially=None, deferrable=None, ) @config.requirements.fk_initially @config.requirements.fk_deferrable def test_add_initially_deferrable_nochange_one(self): diffs = self._fk_opts_fixture( {"deferrable": True, "initially": "immediate"}, {"deferrable": True, "initially": "immediate"}, ) eq_(diffs, []) @config.requirements.fk_initially @config.requirements.fk_deferrable def test_add_initially_deferrable_nochange_two(self): diffs = self._fk_opts_fixture( {"deferrable": True, "initially": "deferred"}, {"deferrable": True, "initially": "deferred"}, ) eq_(diffs, []) @config.requirements.fk_initially @config.requirements.fk_deferrable def test_add_initially_deferrable_nochange_three(self): diffs = self._fk_opts_fixture( {"deferrable": None, "initially": "deferred"}, {"deferrable": None, "initially": "deferred"}, ) eq_(diffs, []) 
@config.requirements.fk_deferrable def test_add_deferrable(self): diffs = self._fk_opts_fixture({}, {"deferrable": True}) self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], deferrable=None, conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], deferrable=True, ) @config.requirements.fk_deferrable def test_remove_deferrable(self): diffs = self._fk_opts_fixture({"deferrable": True}, {}) self._assert_fk_diff( diffs[0], "remove_fk", "user", ["tid"], "some_table", ["id"], deferrable=True, conditional_name="servergenerated", ) self._assert_fk_diff( diffs[1], "add_fk", "user", ["tid"], "some_table", ["id"], deferrable=None, ) zzzeek-alembic-bee044a1c187/tests/test_autogen_indexes.py000066400000000000000000001106201353106760100235360ustar00rootroot00000000000000import sys from sqlalchemy import Column from sqlalchemy import ForeignKey from sqlalchemy import ForeignKeyConstraint from sqlalchemy import func from sqlalchemy import Index from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import Numeric from sqlalchemy import String from sqlalchemy import Table from sqlalchemy import UniqueConstraint from alembic.testing import assertions from alembic.testing import config from alembic.testing import engines from alembic.testing import eq_ from alembic.testing import TestBase from alembic.testing.env import staging_env from ._autogen_fixtures import AutogenFixtureTest py3k = sys.version_info >= (3,) class NoUqReflection(object): __requires__ = () def setUp(self): staging_env() self.bind = eng = engines.testing_engine() def unimpl(*arg, **kw): raise NotImplementedError() eng.dialect.get_unique_constraints = unimpl def test_add_ix_on_table_create(self): return super(NoUqReflection, self).test_add_ix_on_table_create() def test_add_idx_non_col(self): return super(NoUqReflection, self).test_add_idx_non_col() class AutogenerateUniqueIndexTest(AutogenFixtureTest, TestBase): reports_unique_constraints = True reports_unique_constraints_as_indexes = False __requires__ = ("unique_constraint_reflection",) __only_on__ = "sqlite" def test_index_flag_becomes_named_unique_constraint(self): m1 = MetaData() m2 = MetaData() Table( "user", m1, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False, index=True), Column("a1", String(10), server_default="x"), ) Table( "user", m2, Column("id", Integer, primary_key=True), Column("name", String(50), nullable=False), Column("a1", String(10), server_default="x"), UniqueConstraint("name", name="uq_user_name"), ) diffs = self._fixture(m1, m2) if self.reports_unique_constraints: eq_(diffs[0][0], "add_constraint") eq_(diffs[0][1].name, "uq_user_name") eq_(diffs[1][0], "remove_index") eq_(diffs[1][1].name, "ix_user_name") else: eq_(diffs[0][0], "remove_index") eq_(diffs[0][1].name, "ix_user_name") def test_add_unique_constraint(self): m1 = MetaData() m2 = MetaData() Table( "address", m1, Column("id", Integer, primary_key=True), Column("email_address", String(100), nullable=False), Column("qpr", String(10), index=True), ) Table( "address", m2, Column("id", Integer, primary_key=True), Column("email_address", String(100), nullable=False), Column("qpr", String(10), index=True), UniqueConstraint("email_address", name="uq_email_address"), ) diffs = self._fixture(m1, m2) if self.reports_unique_constraints: eq_(diffs[0][0], "add_constraint") eq_(diffs[0][1].name, "uq_email_address") else: eq_(diffs, []) def 
test_unique_flag_nothing_changed(self): m1 = MetaData() m2 = MetaData() Table( "unq_idx", m1, Column("id", Integer, primary_key=True), Column("x", String(20)), Index("x", "x", unique=True), ) Table( "unq_idx", m2, Column("id", Integer, primary_key=True), Column("x", String(20)), Index("x", "x", unique=True), ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_index_becomes_unique(self): m1 = MetaData() m2 = MetaData() Table( "order", m1, Column("order_id", Integer, primary_key=True), Column("amount", Numeric(10, 2), nullable=True), Column("user_id", Integer), UniqueConstraint( "order_id", "user_id", name="order_order_id_user_id_unique" ), Index("order_user_id_amount_idx", "user_id", "amount"), ) Table( "order", m2, Column("order_id", Integer, primary_key=True), Column("amount", Numeric(10, 2), nullable=True), Column("user_id", Integer), UniqueConstraint( "order_id", "user_id", name="order_order_id_user_id_unique" ), Index( "order_user_id_amount_idx", "user_id", "amount", unique=True ), ) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "remove_index") eq_(diffs[0][1].name, "order_user_id_amount_idx") eq_(diffs[0][1].unique, False) eq_(diffs[1][0], "add_index") eq_(diffs[1][1].name, "order_user_id_amount_idx") eq_(diffs[1][1].unique, True) def test_mismatch_db_named_col_flag(self): m1 = MetaData() m2 = MetaData() Table( "item", m1, Column("x", Integer), UniqueConstraint("x", name="db_generated_name"), ) # test mismatch between unique=True and # named uq constraint Table("item", m2, Column("x", Integer, unique=True)) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_new_table_added(self): m1 = MetaData() m2 = MetaData() Table( "extra", m2, Column("foo", Integer, index=True), Column("bar", Integer), Index("newtable_idx", "bar"), ) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "add_table") eq_(diffs[1][0], "add_index") eq_(diffs[1][1].name, "ix_extra_foo") eq_(diffs[2][0], "add_index") eq_(diffs[2][1].name, "newtable_idx") def test_named_cols_changed(self): m1 = MetaData() m2 = MetaData() Table( "col_change", m1, Column("x", Integer), Column("y", Integer), UniqueConstraint("x", name="nochange"), ) Table( "col_change", m2, Column("x", Integer), Column("y", Integer), UniqueConstraint("x", "y", name="nochange"), ) diffs = self._fixture(m1, m2) if self.reports_unique_constraints: eq_(diffs[0][0], "remove_constraint") eq_(diffs[0][1].name, "nochange") eq_(diffs[1][0], "add_constraint") eq_(diffs[1][1].name, "nochange") else: eq_(diffs, []) def test_nothing_changed_one(self): m1 = MetaData() m2 = MetaData() Table( "nothing_changed", m1, Column("x", String(20), unique=True, index=True), ) Table( "nothing_changed", m2, Column("x", String(20), unique=True, index=True), ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_nothing_changed_two(self): m1 = MetaData() m2 = MetaData() Table( "nothing_changed", m1, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True), Column("x", String(20), unique=True), mysql_engine="InnoDB", ) Table( "nothing_changed_related", m1, Column("id1", Integer), Column("id2", Integer), ForeignKeyConstraint( ["id1", "id2"], ["nothing_changed.id1", "nothing_changed.id2"] ), mysql_engine="InnoDB", ) Table( "nothing_changed", m2, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True), Column("x", String(20), unique=True), mysql_engine="InnoDB", ) Table( "nothing_changed_related", m2, Column("id1", Integer), Column("id2", Integer), ForeignKeyConstraint( ["id1", "id2"], ["nothing_changed.id1", "nothing_changed.id2"] ), 
mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_nothing_changed_unique_w_colkeys(self): m1 = MetaData() m2 = MetaData() Table( "nothing_changed", m1, Column("x", String(20), key="nx"), UniqueConstraint("nx"), ) Table( "nothing_changed", m2, Column("x", String(20), key="nx"), UniqueConstraint("nx"), ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_nothing_changed_index_w_colkeys(self): m1 = MetaData() m2 = MetaData() Table( "nothing_changed", m1, Column("x", String(20), key="nx"), Index("foobar", "nx"), ) Table( "nothing_changed", m2, Column("x", String(20), key="nx"), Index("foobar", "nx"), ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_nothing_changed_index_named_as_column(self): m1 = MetaData() m2 = MetaData() Table( "nothing_changed", m1, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True), Column("x", String(20)), Index("x", "x"), ) Table( "nothing_changed", m2, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True), Column("x", String(20)), Index("x", "x"), ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_nothing_changed_implicit_fk_index_named(self): m1 = MetaData() m2 = MetaData() Table( "nothing_changed", m1, Column("id", Integer, primary_key=True), Column( "other_id", ForeignKey("nc2.id", name="fk_my_table_other_table"), nullable=False, ), Column("foo", Integer), mysql_engine="InnoDB", ) Table( "nc2", m1, Column("id", Integer, primary_key=True), mysql_engine="InnoDB", ) Table( "nothing_changed", m2, Column("id", Integer, primary_key=True), Column( "other_id", ForeignKey("nc2.id", name="fk_my_table_other_table"), nullable=False, ), Column("foo", Integer), mysql_engine="InnoDB", ) Table( "nc2", m2, Column("id", Integer, primary_key=True), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_nothing_changed_implicit_composite_fk_index_named(self): m1 = MetaData() m2 = MetaData() Table( "nothing_changed", m1, Column("id", Integer, primary_key=True), Column("other_id_1", Integer), Column("other_id_2", Integer), Column("foo", Integer), ForeignKeyConstraint( ["other_id_1", "other_id_2"], ["nc2.id1", "nc2.id2"], name="fk_my_table_other_table", ), mysql_engine="InnoDB", ) Table( "nc2", m1, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True), mysql_engine="InnoDB", ) Table( "nothing_changed", m2, Column("id", Integer, primary_key=True), Column("other_id_1", Integer), Column("other_id_2", Integer), Column("foo", Integer), ForeignKeyConstraint( ["other_id_1", "other_id_2"], ["nc2.id1", "nc2.id2"], name="fk_my_table_other_table", ), mysql_engine="InnoDB", ) Table( "nc2", m2, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True), mysql_engine="InnoDB", ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_new_idx_index_named_as_column(self): m1 = MetaData() m2 = MetaData() Table( "new_idx", m1, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True), Column("x", String(20)), ) idx = Index("x", "x") Table( "new_idx", m2, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True), Column("x", String(20)), idx, ) diffs = self._fixture(m1, m2) eq_(diffs, [("add_index", idx)]) def test_removed_idx_index_named_as_column(self): m1 = MetaData() m2 = MetaData() idx = Index("x", "x") Table( "new_idx", m1, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True), Column("x", String(20)), idx, ) Table( "new_idx", m2, Column("id1", 
Integer, primary_key=True), Column("id2", Integer, primary_key=True), Column("x", String(20)), ) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "remove_index") def test_drop_table_w_indexes(self): m1 = MetaData() m2 = MetaData() t = Table( "some_table", m1, Column("id", Integer, primary_key=True), Column("x", String(20)), Column("y", String(20)), ) Index("xy_idx", t.c.x, t.c.y) Index("y_idx", t.c.y) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "remove_index") eq_(diffs[1][0], "remove_index") eq_(diffs[2][0], "remove_table") eq_( set([diffs[0][1].name, diffs[1][1].name]), set(["xy_idx", "y_idx"]) ) def test_drop_table_w_uq_constraint(self): m1 = MetaData() m2 = MetaData() Table( "some_table", m1, Column("id", Integer, primary_key=True), Column("x", String(20)), Column("y", String(20)), UniqueConstraint("y", name="uq_y"), ) diffs = self._fixture(m1, m2) if self.reports_unique_constraints_as_indexes: # for MySQL this UQ will look like an index, so # make sure it at least sets it up correctly eq_(diffs[0][0], "remove_index") eq_(diffs[1][0], "remove_table") eq_(len(diffs), 2) constraints = [ c for c in diffs[1][1].constraints if isinstance(c, UniqueConstraint) ] eq_(len(constraints), 0) else: eq_(diffs[0][0], "remove_table") eq_(len(diffs), 1) constraints = [ c for c in diffs[0][1].constraints if isinstance(c, UniqueConstraint) ] if self.reports_unique_constraints: eq_(len(constraints), 1) def test_unnamed_cols_changed(self): m1 = MetaData() m2 = MetaData() Table( "col_change", m1, Column("x", Integer), Column("y", Integer), UniqueConstraint("x"), ) Table( "col_change", m2, Column("x", Integer), Column("y", Integer), UniqueConstraint("x", "y"), ) diffs = self._fixture(m1, m2) diffs = set( (cmd, ("x" in obj.name) if obj.name is not None else False) for cmd, obj in diffs ) if self.reports_unnamed_constraints: if self.reports_unique_constraints_as_indexes: eq_( diffs, set([("remove_index", True), ("add_constraint", False)]), ) else: eq_( diffs, set( [ ("remove_constraint", True), ("add_constraint", False), ] ), ) def test_remove_named_unique_index(self): m1 = MetaData() m2 = MetaData() Table( "remove_idx", m1, Column("x", Integer), Index("xidx", "x", unique=True), ) Table("remove_idx", m2, Column("x", Integer)) diffs = self._fixture(m1, m2) if self.reports_unique_constraints: diffs = set((cmd, obj.name) for cmd, obj in diffs) eq_(diffs, set([("remove_index", "xidx")])) else: eq_(diffs, []) def test_remove_named_unique_constraint(self): m1 = MetaData() m2 = MetaData() Table( "remove_idx", m1, Column("x", Integer), UniqueConstraint("x", name="xidx"), ) Table("remove_idx", m2, Column("x", Integer)) diffs = self._fixture(m1, m2) if self.reports_unique_constraints: diffs = set((cmd, obj.name) for cmd, obj in diffs) if self.reports_unique_constraints_as_indexes: eq_(diffs, set([("remove_index", "xidx")])) else: eq_(diffs, set([("remove_constraint", "xidx")])) else: eq_(diffs, []) def test_dont_add_uq_on_table_create(self): m1 = MetaData() m2 = MetaData() Table("no_uq", m2, Column("x", String(50), unique=True)) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "add_table") eq_(len(diffs), 1) assert UniqueConstraint in set( type(c) for c in diffs[0][1].constraints ) def test_add_uq_ix_on_table_create(self): m1 = MetaData() m2 = MetaData() Table("add_ix", m2, Column("x", String(50), unique=True, index=True)) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "add_table") eq_(len(diffs), 2) assert UniqueConstraint not in set( type(c) for c in diffs[0][1].constraints ) eq_(diffs[1][0], "add_index") 
eq_(diffs[1][1].unique, True) def test_add_ix_on_table_create(self): m1 = MetaData() m2 = MetaData() Table("add_ix", m2, Column("x", String(50), index=True)) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "add_table") eq_(len(diffs), 2) assert UniqueConstraint not in set( type(c) for c in diffs[0][1].constraints ) eq_(diffs[1][0], "add_index") eq_(diffs[1][1].unique, False) def test_add_idx_non_col(self): m1 = MetaData() m2 = MetaData() Table("add_ix", m1, Column("x", String(50))) t2 = Table("add_ix", m2, Column("x", String(50))) Index("foo_idx", t2.c.x.desc()) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "add_index") def test_unchanged_idx_non_col(self): m1 = MetaData() m2 = MetaData() t1 = Table("add_ix", m1, Column("x", String(50))) Index("foo_idx", t1.c.x.desc()) t2 = Table("add_ix", m2, Column("x", String(50))) Index("foo_idx", t2.c.x.desc()) diffs = self._fixture(m1, m2) eq_(diffs, []) # fails in the 0.8 series where we have truncation rules, # but no control over quoting. passes in 0.7.9 where we don't have # truncation rules either. dropping these ancient versions # is long overdue. def test_unchanged_case_sensitive_implicit_idx(self): m1 = MetaData() m2 = MetaData() Table("add_ix", m1, Column("regNumber", String(50), index=True)) Table("add_ix", m2, Column("regNumber", String(50), index=True)) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_unchanged_case_sensitive_explicit_idx(self): m1 = MetaData() m2 = MetaData() t1 = Table("add_ix", m1, Column("reg_number", String(50))) Index("regNumber_idx", t1.c.reg_number) t2 = Table("add_ix", m2, Column("reg_number", String(50))) Index("regNumber_idx", t2.c.reg_number) diffs = self._fixture(m1, m2) eq_(diffs, []) class PGUniqueIndexTest(AutogenerateUniqueIndexTest): reports_unnamed_constraints = True __only_on__ = "postgresql" __backend__ = True def test_idx_added_schema(self): m1 = MetaData() m2 = MetaData() Table("add_ix", m1, Column("x", String(50)), schema="test_schema") Table( "add_ix", m2, Column("x", String(50)), Index("ix_1", "x"), schema="test_schema", ) diffs = self._fixture(m1, m2, include_schemas=True) eq_(diffs[0][0], "add_index") eq_(diffs[0][1].name, "ix_1") def test_idx_unchanged_schema(self): m1 = MetaData() m2 = MetaData() Table( "add_ix", m1, Column("x", String(50)), Index("ix_1", "x"), schema="test_schema", ) Table( "add_ix", m2, Column("x", String(50)), Index("ix_1", "x"), schema="test_schema", ) diffs = self._fixture(m1, m2, include_schemas=True) eq_(diffs, []) def test_uq_added_schema(self): m1 = MetaData() m2 = MetaData() Table("add_uq", m1, Column("x", String(50)), schema="test_schema") Table( "add_uq", m2, Column("x", String(50)), UniqueConstraint("x", name="ix_1"), schema="test_schema", ) diffs = self._fixture(m1, m2, include_schemas=True) eq_(diffs[0][0], "add_constraint") eq_(diffs[0][1].name, "ix_1") def test_uq_unchanged_schema(self): m1 = MetaData() m2 = MetaData() Table( "add_uq", m1, Column("x", String(50)), UniqueConstraint("x", name="ix_1"), schema="test_schema", ) Table( "add_uq", m2, Column("x", String(50)), UniqueConstraint("x", name="ix_1"), schema="test_schema", ) diffs = self._fixture(m1, m2, include_schemas=True) eq_(diffs, []) @config.requirements.btree_gist def test_exclude_const_unchanged(self): from sqlalchemy.dialects.postgresql import TSRANGE, ExcludeConstraint m1 = MetaData() m2 = MetaData() Table( "add_excl", m1, Column("id", Integer, primary_key=True), Column("period", TSRANGE), ExcludeConstraint(("period", "&&"), name="quarters_period_excl"), ) Table( "add_excl", m2, Column("id", 
Integer, primary_key=True), Column("period", TSRANGE), ExcludeConstraint(("period", "&&"), name="quarters_period_excl"), ) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_same_tname_two_schemas(self): m1 = MetaData() m2 = MetaData() Table("add_ix", m1, Column("x", String(50)), Index("ix_1", "x")) Table("add_ix", m2, Column("x", String(50)), Index("ix_1", "x")) Table("add_ix", m2, Column("x", String(50)), schema="test_schema") diffs = self._fixture(m1, m2, include_schemas=True) eq_(diffs[0][0], "add_table") eq_(len(diffs), 1) def test_uq_dropped(self): m1 = MetaData() m2 = MetaData() Table( "add_uq", m1, Column("id", Integer, primary_key=True), Column("name", String), UniqueConstraint("name", name="uq_name"), ) Table( "add_uq", m2, Column("id", Integer, primary_key=True), Column("name", String), ) diffs = self._fixture(m1, m2, include_schemas=True) eq_(diffs[0][0], "remove_constraint") eq_(diffs[0][1].name, "uq_name") eq_(len(diffs), 1) def test_functional_ix_one(self): m1 = MetaData() m2 = MetaData() t1 = Table( "foo", m1, Column("id", Integer, primary_key=True), Column("email", String(50)), ) Index("email_idx", func.lower(t1.c.email), unique=True) t2 = Table( "foo", m2, Column("id", Integer, primary_key=True), Column("email", String(50)), ) Index("email_idx", func.lower(t2.c.email), unique=True) with assertions.expect_warnings( "Skipped unsupported reflection", "autogenerate skipping functional index", ): diffs = self._fixture(m1, m2) eq_(diffs, []) def test_functional_ix_two(self): m1 = MetaData() m2 = MetaData() t1 = Table( "foo", m1, Column("id", Integer, primary_key=True), Column("email", String(50)), Column("name", String(50)), ) Index( "email_idx", func.coalesce(t1.c.email, t1.c.name).desc(), unique=True, ) t2 = Table( "foo", m2, Column("id", Integer, primary_key=True), Column("email", String(50)), Column("name", String(50)), ) Index( "email_idx", func.coalesce(t2.c.email, t2.c.name).desc(), unique=True, ) with assertions.expect_warnings( "Skipped unsupported reflection", "autogenerate skipping functional index", ): diffs = self._fixture(m1, m2) eq_(diffs, []) class MySQLUniqueIndexTest(AutogenerateUniqueIndexTest): reports_unnamed_constraints = True reports_unique_constraints_as_indexes = True __only_on__ = "mysql" __backend__ = True def test_removed_idx_index_named_as_column(self): try: super( MySQLUniqueIndexTest, self ).test_removed_idx_index_named_as_column() except IndexError: assert True else: assert False, "unexpected success" class OracleUniqueIndexTest(AutogenerateUniqueIndexTest): reports_unnamed_constraints = True reports_unique_constraints_as_indexes = True __only_on__ = "oracle" __backend__ = True class NoUqReflectionIndexTest(NoUqReflection, AutogenerateUniqueIndexTest): reports_unique_constraints = False __only_on__ = "sqlite" def test_unique_not_reported(self): m1 = MetaData() Table( "order", m1, Column("order_id", Integer, primary_key=True), Column("amount", Numeric(10, 2), nullable=True), Column("user_id", Integer), UniqueConstraint( "order_id", "user_id", name="order_order_id_user_id_unique" ), ) diffs = self._fixture(m1, m1) eq_(diffs, []) def test_remove_unique_index_not_reported(self): m1 = MetaData() Table( "order", m1, Column("order_id", Integer, primary_key=True), Column("amount", Numeric(10, 2), nullable=True), Column("user_id", Integer), Index("oid_ix", "order_id", "user_id", unique=True), ) m2 = MetaData() Table( "order", m2, Column("order_id", Integer, primary_key=True), Column("amount", Numeric(10, 2), nullable=True), Column("user_id", Integer), 
) diffs = self._fixture(m1, m2) eq_(diffs, []) def test_remove_plain_index_is_reported(self): m1 = MetaData() Table( "order", m1, Column("order_id", Integer, primary_key=True), Column("amount", Numeric(10, 2), nullable=True), Column("user_id", Integer), Index("oid_ix", "order_id", "user_id"), ) m2 = MetaData() Table( "order", m2, Column("order_id", Integer, primary_key=True), Column("amount", Numeric(10, 2), nullable=True), Column("user_id", Integer), ) diffs = self._fixture(m1, m2) eq_(diffs[0][0], "remove_index") class NoUqReportsIndAsUqTest(NoUqReflectionIndexTest): """this test suite simulates the condition where: a. the dialect doesn't report unique constraints b. the dialect returns unique constraints within the indexes list. Currently the mssql dialect does this, but here we force this condition so that we can test the behavior regardless of if/when mssql supports unique constraint reflection. """ __only_on__ = "sqlite" @classmethod def _get_bind(cls): eng = config.db _get_unique_constraints = eng.dialect.get_unique_constraints _get_indexes = eng.dialect.get_indexes def unimpl(*arg, **kw): raise NotImplementedError() def get_indexes(self, connection, tablename, **kw): indexes = _get_indexes(self, connection, tablename, **kw) for uq in _get_unique_constraints( self, connection, tablename, **kw ): uq["unique"] = True indexes.append(uq) return indexes eng.dialect.get_unique_constraints = unimpl eng.dialect.get_indexes = get_indexes return eng class IncludeHooksTest(AutogenFixtureTest, TestBase): __backend__ = True def test_remove_connection_index(self): m1 = MetaData() m2 = MetaData() t1 = Table("t", m1, Column("x", Integer), Column("y", Integer)) Index("ix1", t1.c.x) Index("ix2", t1.c.y) Table("t", m2, Column("x", Integer), Column("y", Integer)) def include_object(object_, name, type_, reflected, compare_to): if type_ == "unique_constraint": return False return not ( isinstance(object_, Index) and type_ == "index" and reflected and name == "ix1" ) diffs = self._fixture(m1, m2, object_filters=include_object) eq_(diffs[0][0], "remove_index") eq_(diffs[0][1].name, "ix2") eq_(len(diffs), 1) @config.requirements.unique_constraint_reflection @config.requirements.reflects_unique_constraints_unambiguously def test_remove_connection_uq(self): m1 = MetaData() m2 = MetaData() Table( "t", m1, Column("x", Integer), Column("y", Integer), UniqueConstraint("x", name="uq1"), UniqueConstraint("y", name="uq2"), ) Table("t", m2, Column("x", Integer), Column("y", Integer)) def include_object(object_, name, type_, reflected, compare_to): if type_ == "index": return False return not ( isinstance(object_, UniqueConstraint) and type_ == "unique_constraint" and reflected and name == "uq1" ) diffs = self._fixture(m1, m2, object_filters=include_object) eq_(diffs[0][0], "remove_constraint") eq_(diffs[0][1].name, "uq2") eq_(len(diffs), 1) def test_add_metadata_index(self): m1 = MetaData() m2 = MetaData() Table("t", m1, Column("x", Integer)) t2 = Table("t", m2, Column("x", Integer)) Index("ix1", t2.c.x) Index("ix2", t2.c.x) def include_object(object_, name, type_, reflected, compare_to): return not ( isinstance(object_, Index) and type_ == "index" and not reflected and name == "ix1" ) diffs = self._fixture(m1, m2, object_filters=include_object) eq_(diffs[0][0], "add_index") eq_(diffs[0][1].name, "ix2") eq_(len(diffs), 1) @config.requirements.unique_constraint_reflection def test_add_metadata_unique(self): m1 = MetaData() m2 = MetaData() Table("t", m1, Column("x", Integer)) Table( "t", m2, Column("x", Integer), 
UniqueConstraint("x", name="uq1"), UniqueConstraint("x", name="uq2"), ) def include_object(object_, name, type_, reflected, compare_to): return not ( isinstance(object_, UniqueConstraint) and type_ == "unique_constraint" and not reflected and name == "uq1" ) diffs = self._fixture(m1, m2, object_filters=include_object) eq_(diffs[0][0], "add_constraint") eq_(diffs[0][1].name, "uq2") eq_(len(diffs), 1) def test_change_index(self): m1 = MetaData() m2 = MetaData() t1 = Table( "t", m1, Column("x", Integer), Column("y", Integer), Column("z", Integer), ) Index("ix1", t1.c.x) Index("ix2", t1.c.y) t2 = Table( "t", m2, Column("x", Integer), Column("y", Integer), Column("z", Integer), ) Index("ix1", t2.c.x, t2.c.y) Index("ix2", t2.c.x, t2.c.z) def include_object(object_, name, type_, reflected, compare_to): return not ( isinstance(object_, Index) and type_ == "index" and not reflected and name == "ix1" and isinstance(compare_to, Index) ) diffs = self._fixture(m1, m2, object_filters=include_object) eq_(diffs[0][0], "remove_index") eq_(diffs[0][1].name, "ix2") eq_(diffs[1][0], "add_index") eq_(diffs[1][1].name, "ix2") eq_(len(diffs), 2) @config.requirements.unique_constraint_reflection def test_change_unique(self): m1 = MetaData() m2 = MetaData() Table( "t", m1, Column("x", Integer), Column("y", Integer), Column("z", Integer), UniqueConstraint("x", name="uq1"), UniqueConstraint("y", name="uq2"), ) Table( "t", m2, Column("x", Integer), Column("y", Integer), Column("z", Integer), UniqueConstraint("x", "z", name="uq1"), UniqueConstraint("y", "z", name="uq2"), ) def include_object(object_, name, type_, reflected, compare_to): if type_ == "index": return False return not ( isinstance(object_, UniqueConstraint) and type_ == "unique_constraint" and not reflected and name == "uq1" and isinstance(compare_to, UniqueConstraint) ) diffs = self._fixture(m1, m2, object_filters=include_object) eq_(diffs[0][0], "remove_constraint") eq_(diffs[0][1].name, "uq2") eq_(diffs[1][0], "add_constraint") eq_(diffs[1][1].name, "uq2") eq_(len(diffs), 2) class TruncatedIdxTest(AutogenFixtureTest, TestBase): def setUp(self): self.bind = engines.testing_engine() self.bind.dialect.max_identifier_length = 30 def test_idx_matches_long(self): from alembic.operations.base import conv m1 = MetaData() Table( "q", m1, Column("id", Integer, primary_key=True), Column("data", Integer), Index( conv("idx_q_table_this_is_more_than_thirty_characters"), "data" ), ) diffs = self._fixture(m1, m1) eq_(diffs, []) zzzeek-alembic-bee044a1c187/tests/test_autogen_render.py000066400000000000000000002167131353106760100233700ustar00rootroot00000000000000import re import sys import sqlalchemy as sa # noqa from sqlalchemy import BigInteger from sqlalchemy import Boolean from sqlalchemy import cast from sqlalchemy import CHAR from sqlalchemy import CheckConstraint from sqlalchemy import Column from sqlalchemy import DATETIME from sqlalchemy import DateTime from sqlalchemy import DefaultClause from sqlalchemy import Enum from sqlalchemy import ForeignKey from sqlalchemy import ForeignKeyConstraint from sqlalchemy import func from sqlalchemy import Index from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import Numeric from sqlalchemy import PrimaryKeyConstraint from sqlalchemy import String from sqlalchemy import Table from sqlalchemy import text from sqlalchemy import types from sqlalchemy import Unicode from sqlalchemy import UniqueConstraint from sqlalchemy.engine.default import DefaultDialect from sqlalchemy.sql import and_ from 
sqlalchemy.sql import column from sqlalchemy.sql import false from sqlalchemy.sql import literal_column from sqlalchemy.sql import table from sqlalchemy.types import TIMESTAMP from sqlalchemy.types import UserDefinedType from alembic import autogenerate from alembic import op # noqa from alembic.autogenerate import api from alembic.migration import MigrationContext from alembic.operations import ops from alembic.testing import assert_raises from alembic.testing import assertions from alembic.testing import config from alembic.testing import eq_ from alembic.testing import eq_ignore_whitespace from alembic.testing import mock from alembic.testing import TestBase from alembic.testing.fixtures import op_fixture from alembic.util import compat py3k = sys.version_info >= (3,) class AutogenRenderTest(TestBase): """test individual directives""" def setUp(self): ctx_opts = { "sqlalchemy_module_prefix": "sa.", "alembic_module_prefix": "op.", "target_metadata": MetaData(), } context = MigrationContext.configure( dialect=DefaultDialect(), opts=ctx_opts ) self.autogen_context = api.AutogenContext(context) def test_render_add_index(self): """ autogenerate.render._add_index """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), ) idx = Index("test_active_code_idx", t.c.active, t.c.code) op_obj = ops.CreateIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_index('test_active_code_idx', 'test', " "['active', 'code'], unique=False)", ) def test_render_add_index_batch(self): """ autogenerate.render._add_index """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), ) idx = Index("test_active_code_idx", t.c.active, t.c.code) op_obj = ops.CreateIndexOp.from_index(idx) with self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.create_index('test_active_code_idx', " "['active', 'code'], unique=False)", ) def test_render_add_index_schema(self): """ autogenerate.render._add_index using schema """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), schema="CamelSchema", ) idx = Index("test_active_code_idx", t.c.active, t.c.code) op_obj = ops.CreateIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_index('test_active_code_idx', 'test', " "['active', 'code'], unique=False, schema='CamelSchema')", ) def test_render_add_index_schema_batch(self): """ autogenerate.render._add_index using schema """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), schema="CamelSchema", ) idx = Index("test_active_code_idx", t.c.active, t.c.code) op_obj = ops.CreateIndexOp.from_index(idx) with self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.create_index('test_active_code_idx', " "['active', 'code'], unique=False)", ) def test_render_add_index_func(self): m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("code", String(255)), ) idx = Index("test_lower_code_idx", func.lower(t.c.code)) op_obj = ops.CreateIndexOp.from_index(idx) eq_ignore_whitespace( 
autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_index('test_lower_code_idx', 'test', " "[sa.text(!U'lower(code)')], unique=False)", ) def test_render_add_index_cast(self): m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("code", String(255)), ) idx = Index("test_lower_code_idx", cast(t.c.code, String)) op_obj = ops.CreateIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_index('test_lower_code_idx', 'test', " "[sa.text(!U'CAST(code AS VARCHAR)')], unique=False)", ) def test_render_add_index_desc(self): m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("code", String(255)), ) idx = Index("test_desc_code_idx", t.c.code.desc()) op_obj = ops.CreateIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_index('test_desc_code_idx', 'test', " "[sa.text(!U'code DESC')], unique=False)", ) def test_drop_index(self): """ autogenerate.render._drop_index """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), ) idx = Index("test_active_code_idx", t.c.active, t.c.code) op_obj = ops.DropIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_index('test_active_code_idx', table_name='test')", ) def test_drop_index_batch(self): """ autogenerate.render._drop_index """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), ) idx = Index("test_active_code_idx", t.c.active, t.c.code) op_obj = ops.DropIndexOp.from_index(idx) with self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.drop_index('test_active_code_idx')", ) def test_drop_index_schema(self): """ autogenerate.render._drop_index using schema """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), schema="CamelSchema", ) idx = Index("test_active_code_idx", t.c.active, t.c.code) op_obj = ops.DropIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_index('test_active_code_idx', " + "table_name='test', schema='CamelSchema')", ) def test_drop_index_schema_batch(self): """ autogenerate.render._drop_index using schema """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), schema="CamelSchema", ) idx = Index("test_active_code_idx", t.c.active, t.c.code) op_obj = ops.DropIndexOp.from_index(idx) with self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.drop_index('test_active_code_idx')", ) def test_add_unique_constraint(self): """ autogenerate.render._add_unique_constraint """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), ) uq = UniqueConstraint(t.c.code, name="uq_test_code") op_obj = ops.AddConstraintOp.from_constraint(uq) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_unique_constraint('uq_test_code', 'test', ['code'])", ) def test_add_unique_constraint_batch(self): """ autogenerate.render._add_unique_constraint """ m 
= MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), ) uq = UniqueConstraint(t.c.code, name="uq_test_code") op_obj = ops.AddConstraintOp.from_constraint(uq) with self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.create_unique_constraint('uq_test_code', ['code'])", ) def test_add_unique_constraint_schema(self): """ autogenerate.render._add_unique_constraint using schema """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), schema="CamelSchema", ) uq = UniqueConstraint(t.c.code, name="uq_test_code") op_obj = ops.AddConstraintOp.from_constraint(uq) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_unique_constraint('uq_test_code', 'test', " "['code'], schema='CamelSchema')", ) def test_add_unique_constraint_schema_batch(self): """ autogenerate.render._add_unique_constraint using schema """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), schema="CamelSchema", ) uq = UniqueConstraint(t.c.code, name="uq_test_code") op_obj = ops.AddConstraintOp.from_constraint(uq) with self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.create_unique_constraint('uq_test_code', " "['code'])", ) def test_drop_unique_constraint(self): """ autogenerate.render._drop_constraint """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), ) uq = UniqueConstraint(t.c.code, name="uq_test_code") op_obj = ops.DropConstraintOp.from_constraint(uq) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_constraint('uq_test_code', 'test', type_='unique')", ) def test_drop_unique_constraint_schema(self): """ autogenerate.render._drop_constraint using schema """ m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), schema="CamelSchema", ) uq = UniqueConstraint(t.c.code, name="uq_test_code") op_obj = ops.DropConstraintOp.from_constraint(uq) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_constraint('uq_test_code', 'test', " "schema='CamelSchema', type_='unique')", ) def test_drop_unique_constraint_schema_reprobj(self): """ autogenerate.render._drop_constraint using schema """ class SomeObj(str): def __repr__(self): return "foo.camel_schema" op_obj = ops.DropConstraintOp( "uq_test_code", "test", type_="unique", schema=SomeObj("CamelSchema"), ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_constraint('uq_test_code', 'test', " "schema=foo.camel_schema, type_='unique')", ) def test_add_fk_constraint(self): m = MetaData() Table("a", m, Column("id", Integer, primary_key=True)) b = Table("b", m, Column("a_id", Integer, ForeignKey("a.id"))) fk = ForeignKeyConstraint(["a_id"], ["a.id"], name="fk_a_id") b.append_constraint(fk) op_obj = ops.AddConstraintOp.from_constraint(fk) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_foreign_key('fk_a_id', 'b', 'a', ['a_id'], ['id'])", ) def test_add_fk_constraint_batch(self): m = MetaData() Table("a", m, Column("id", Integer, 
primary_key=True)) b = Table("b", m, Column("a_id", Integer, ForeignKey("a.id"))) fk = ForeignKeyConstraint(["a_id"], ["a.id"], name="fk_a_id") b.append_constraint(fk) op_obj = ops.AddConstraintOp.from_constraint(fk) with self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.create_foreign_key" "('fk_a_id', 'a', ['a_id'], ['id'])", ) def test_add_fk_constraint_kwarg(self): m = MetaData() t1 = Table("t", m, Column("c", Integer)) t2 = Table("t2", m, Column("c_rem", Integer)) fk = ForeignKeyConstraint([t1.c.c], [t2.c.c_rem], onupdate="CASCADE") # SQLA 0.9 generates a u'' here for remote cols while 0.8 does not, # so just whack out "'u" here from the generated op_obj = ops.AddConstraintOp.from_constraint(fk) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render_op_text(self.autogen_context, op_obj), ), "op.create_foreign_key(None, 't', 't2', ['c'], ['c_rem'], " "onupdate='CASCADE')", ) fk = ForeignKeyConstraint([t1.c.c], [t2.c.c_rem], ondelete="CASCADE") op_obj = ops.AddConstraintOp.from_constraint(fk) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render_op_text(self.autogen_context, op_obj), ), "op.create_foreign_key(None, 't', 't2', ['c'], ['c_rem'], " "ondelete='CASCADE')", ) fk = ForeignKeyConstraint([t1.c.c], [t2.c.c_rem], deferrable=True) op_obj = ops.AddConstraintOp.from_constraint(fk) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render_op_text(self.autogen_context, op_obj), ), "op.create_foreign_key(None, 't', 't2', ['c'], ['c_rem'], " "deferrable=True)", ) fk = ForeignKeyConstraint([t1.c.c], [t2.c.c_rem], initially="XYZ") op_obj = ops.AddConstraintOp.from_constraint(fk) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render_op_text(self.autogen_context, op_obj), ), "op.create_foreign_key(None, 't', 't2', ['c'], ['c_rem'], " "initially='XYZ')", ) fk = ForeignKeyConstraint( [t1.c.c], [t2.c.c_rem], initially="XYZ", ondelete="CASCADE", deferrable=True, ) op_obj = ops.AddConstraintOp.from_constraint(fk) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render_op_text(self.autogen_context, op_obj), ), "op.create_foreign_key(None, 't', 't2', ['c'], ['c_rem'], " "ondelete='CASCADE', initially='XYZ', deferrable=True)", ) def test_add_fk_constraint_inline_colkeys(self): m = MetaData() Table("a", m, Column("id", Integer, key="aid", primary_key=True)) b = Table( "b", m, Column("a_id", Integer, ForeignKey("a.aid"), key="baid") ) op_obj = ops.CreateTableOp.from_table(b) py_code = autogenerate.render_op_text(self.autogen_context, op_obj) eq_ignore_whitespace( py_code, "op.create_table('b'," "sa.Column('a_id', sa.Integer(), nullable=True)," "sa.ForeignKeyConstraint(['a_id'], ['a.id'], ))", ) context = op_fixture() eval(py_code) context.assert_( "CREATE TABLE b (a_id INTEGER, " "FOREIGN KEY(a_id) REFERENCES a (id))" ) def test_add_fk_constraint_separate_colkeys(self): m = MetaData() Table("a", m, Column("id", Integer, key="aid", primary_key=True)) b = Table("b", m, Column("a_id", Integer, key="baid")) fk = ForeignKeyConstraint(["baid"], ["a.aid"], name="fk_a_id") b.append_constraint(fk) op_obj = ops.CreateTableOp.from_table(b) py_code = autogenerate.render_op_text(self.autogen_context, op_obj) eq_ignore_whitespace( py_code, "op.create_table('b'," "sa.Column('a_id', sa.Integer(), nullable=True)," "sa.ForeignKeyConstraint(['a_id'], ['a.id'], name='fk_a_id'))", ) context = op_fixture() eval(py_code) context.assert_( "CREATE TABLE b (a_id INTEGER, CONSTRAINT " "fk_a_id FOREIGN KEY(a_id) 
REFERENCES a (id))" ) context = op_fixture() op_obj = ops.AddConstraintOp.from_constraint(fk) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_foreign_key('fk_a_id', 'b', 'a', ['a_id'], ['id'])", ) py_code = autogenerate.render_op_text(self.autogen_context, op_obj) eval(py_code) context.assert_( "ALTER TABLE b ADD CONSTRAINT fk_a_id " "FOREIGN KEY(a_id) REFERENCES a (id)" ) def test_add_fk_constraint_schema(self): m = MetaData() Table( "a", m, Column("id", Integer, primary_key=True), schema="CamelSchemaTwo", ) b = Table( "b", m, Column("a_id", Integer, ForeignKey("a.id")), schema="CamelSchemaOne", ) fk = ForeignKeyConstraint( ["a_id"], ["CamelSchemaTwo.a.id"], name="fk_a_id" ) b.append_constraint(fk) op_obj = ops.AddConstraintOp.from_constraint(fk) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_foreign_key('fk_a_id', 'b', 'a', ['a_id'], ['id']," " source_schema='CamelSchemaOne', " "referent_schema='CamelSchemaTwo')", ) def test_add_fk_constraint_schema_batch(self): m = MetaData() Table( "a", m, Column("id", Integer, primary_key=True), schema="CamelSchemaTwo", ) b = Table( "b", m, Column("a_id", Integer, ForeignKey("a.id")), schema="CamelSchemaOne", ) fk = ForeignKeyConstraint( ["a_id"], ["CamelSchemaTwo.a.id"], name="fk_a_id" ) b.append_constraint(fk) op_obj = ops.AddConstraintOp.from_constraint(fk) with self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.create_foreign_key('fk_a_id', 'a', ['a_id'], ['id']," " referent_schema='CamelSchemaTwo')", ) def test_drop_fk_constraint(self): m = MetaData() Table("a", m, Column("id", Integer, primary_key=True)) b = Table("b", m, Column("a_id", Integer, ForeignKey("a.id"))) fk = ForeignKeyConstraint(["a_id"], ["a.id"], name="fk_a_id") b.append_constraint(fk) op_obj = ops.DropConstraintOp.from_constraint(fk) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_constraint('fk_a_id', 'b', type_='foreignkey')", ) def test_drop_fk_constraint_batch(self): m = MetaData() Table("a", m, Column("id", Integer, primary_key=True)) b = Table("b", m, Column("a_id", Integer, ForeignKey("a.id"))) fk = ForeignKeyConstraint(["a_id"], ["a.id"], name="fk_a_id") b.append_constraint(fk) op_obj = ops.DropConstraintOp.from_constraint(fk) with self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.drop_constraint('fk_a_id', type_='foreignkey')", ) def test_drop_fk_constraint_schema(self): m = MetaData() Table( "a", m, Column("id", Integer, primary_key=True), schema="CamelSchemaTwo", ) b = Table( "b", m, Column("a_id", Integer, ForeignKey("a.id")), schema="CamelSchemaOne", ) fk = ForeignKeyConstraint( ["a_id"], ["CamelSchemaTwo.a.id"], name="fk_a_id" ) b.append_constraint(fk) op_obj = ops.DropConstraintOp.from_constraint(fk) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_constraint('fk_a_id', 'b', schema='CamelSchemaOne', " "type_='foreignkey')", ) def test_drop_fk_constraint_batch_schema(self): m = MetaData() Table( "a", m, Column("id", Integer, primary_key=True), schema="CamelSchemaTwo", ) b = Table( "b", m, Column("a_id", Integer, ForeignKey("a.id")), schema="CamelSchemaOne", ) fk = ForeignKeyConstraint( ["a_id"], ["CamelSchemaTwo.a.id"], name="fk_a_id" ) b.append_constraint(fk) op_obj = ops.DropConstraintOp.from_constraint(fk) with 
self.autogen_context._within_batch(): eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "batch_op.drop_constraint('fk_a_id', type_='foreignkey')", ) def test_render_table_upgrade(self): m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("name", Unicode(255)), Column("address_id", Integer, ForeignKey("address.id")), Column("timestamp", DATETIME, server_default="NOW()"), Column("amount", Numeric(5, 2)), UniqueConstraint("name", name="uq_name"), UniqueConstraint("timestamp"), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "sa.Column('id', sa.Integer(), nullable=False)," "sa.Column('name', sa.Unicode(length=255), nullable=True)," "sa.Column('address_id', sa.Integer(), nullable=True)," "sa.Column('timestamp', sa.DATETIME(), " "server_default='NOW()', " "nullable=True)," "sa.Column('amount', sa.Numeric(precision=5, scale=2), " "nullable=True)," "sa.ForeignKeyConstraint(['address_id'], ['address.id'], )," "sa.PrimaryKeyConstraint('id')," "sa.UniqueConstraint('name', name='uq_name')," "sa.UniqueConstraint('timestamp')" ")", ) def test_render_table_w_schema(self): m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("q", Integer, ForeignKey("address.id")), schema="foo", ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "sa.Column('id', sa.Integer(), nullable=False)," "sa.Column('q', sa.Integer(), nullable=True)," "sa.ForeignKeyConstraint(['q'], ['address.id'], )," "sa.PrimaryKeyConstraint('id')," "schema='foo'" ")", ) def test_render_table_w_system(self): m = MetaData() t = Table( "sometable", m, Column("id", Integer, primary_key=True), Column("xmin", Integer, system=True, nullable=False), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('sometable'," "sa.Column('id', sa.Integer(), nullable=False)," "sa.Column('xmin', sa.Integer(), nullable=False, system=True)," "sa.PrimaryKeyConstraint('id'))", ) def test_render_table_w_unicode_name(self): m = MetaData() t = Table( compat.ue("\u0411\u0435\u0437"), m, Column("id", Integer, primary_key=True), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table(%r," "sa.Column('id', sa.Integer(), nullable=False)," "sa.PrimaryKeyConstraint('id'))" % compat.ue("\u0411\u0435\u0437"), ) def test_render_table_w_unicode_schema(self): m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), schema=compat.ue("\u0411\u0435\u0437"), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "sa.Column('id', sa.Integer(), nullable=False)," "sa.PrimaryKeyConstraint('id')," "schema=%r)" % compat.ue("\u0411\u0435\u0437"), ) def test_render_table_w_unsupported_constraint(self): from sqlalchemy.sql.schema import ColumnCollectionConstraint class SomeCustomConstraint(ColumnCollectionConstraint): __visit_name__ = "some_custom" m = MetaData() t = Table("t", m, Column("id", Integer), SomeCustomConstraint("id")) op_obj = ops.CreateTableOp.from_table(t) with assertions.expect_warnings( "No renderer is established for object SomeCustomConstraint" ): eq_ignore_whitespace( 
autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('t'," "sa.Column('id', sa.Integer(), nullable=True)," "[Unknown Python object " "SomeCustomConstraint(Column('id', Integer(), table=))])", ) @mock.patch("alembic.autogenerate.render.MAX_PYTHON_ARGS", 3) def test_render_table_max_cols(self): m = MetaData() t = Table( "test", m, Column("a", Integer), Column("b", Integer), Column("c", Integer), Column("d", Integer), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "*[sa.Column('a', sa.Integer(), nullable=True)," "sa.Column('b', sa.Integer(), nullable=True)," "sa.Column('c', sa.Integer(), nullable=True)," "sa.Column('d', sa.Integer(), nullable=True)])", ) t2 = Table( "test2", m, Column("a", Integer), Column("b", Integer), Column("c", Integer), ) op_obj = ops.CreateTableOp.from_table(t2) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test2'," "sa.Column('a', sa.Integer(), nullable=True)," "sa.Column('b', sa.Integer(), nullable=True)," "sa.Column('c', sa.Integer(), nullable=True))", ) def test_render_table_w_fk_schema(self): m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("q", Integer, ForeignKey("foo.address.id")), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "sa.Column('id', sa.Integer(), nullable=False)," "sa.Column('q', sa.Integer(), nullable=True)," "sa.ForeignKeyConstraint(['q'], ['foo.address.id'], )," "sa.PrimaryKeyConstraint('id')" ")", ) def test_render_table_w_metadata_schema(self): m = MetaData(schema="foo") t = Table( "test", m, Column("id", Integer, primary_key=True), Column("q", Integer, ForeignKey("address.id")), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render_op_text(self.autogen_context, op_obj), ), "op.create_table('test'," "sa.Column('id', sa.Integer(), nullable=False)," "sa.Column('q', sa.Integer(), nullable=True)," "sa.ForeignKeyConstraint(['q'], ['foo.address.id'], )," "sa.PrimaryKeyConstraint('id')," "schema='foo'" ")", ) def test_render_table_w_metadata_schema_override(self): m = MetaData(schema="foo") t = Table( "test", m, Column("id", Integer, primary_key=True), Column("q", Integer, ForeignKey("bar.address.id")), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "sa.Column('id', sa.Integer(), nullable=False)," "sa.Column('q', sa.Integer(), nullable=True)," "sa.ForeignKeyConstraint(['q'], ['bar.address.id'], )," "sa.PrimaryKeyConstraint('id')," "schema='foo'" ")", ) def test_render_addtl_args(self): m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("q", Integer, ForeignKey("bar.address.id")), sqlite_autoincrement=True, mysql_engine="InnoDB", ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "sa.Column('id', sa.Integer(), nullable=False)," "sa.Column('q', sa.Integer(), nullable=True)," "sa.ForeignKeyConstraint(['q'], ['bar.address.id'], )," "sa.PrimaryKeyConstraint('id')," "mysql_engine='InnoDB',sqlite_autoincrement=True)", ) def test_render_drop_table(self): op_obj = ops.DropTableOp.from_table(Table("sometable", MetaData())) eq_ignore_whitespace( 
autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_table('sometable')", ) def test_render_drop_table_w_schema(self): op_obj = ops.DropTableOp.from_table( Table("sometable", MetaData(), schema="foo") ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_table('sometable', schema='foo')", ) def test_render_table_no_implicit_check(self): m = MetaData() t = Table("test", m, Column("x", Boolean())) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "sa.Column('x', sa.Boolean(), nullable=True))", ) def test_render_pk_with_col_name_vs_col_key(self): m = MetaData() t1 = Table("t1", m, Column("x", Integer, key="y", primary_key=True)) op_obj = ops.CreateTableOp.from_table(t1) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('t1'," "sa.Column('x', sa.Integer(), nullable=False)," "sa.PrimaryKeyConstraint('x'))", ) def test_render_empty_pk_vs_nonempty_pk(self): m = MetaData() t1 = Table("t1", m, Column("x", Integer)) t2 = Table("t2", m, Column("x", Integer, primary_key=True)) op_obj = ops.CreateTableOp.from_table(t1) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('t1'," "sa.Column('x', sa.Integer(), nullable=True))", ) op_obj = ops.CreateTableOp.from_table(t2) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('t2'," "sa.Column('x', sa.Integer(), nullable=False)," "sa.PrimaryKeyConstraint('x'))", ) def test_render_table_w_autoincrement(self): m = MetaData() t = Table( "test", m, Column("id1", Integer, primary_key=True), Column("id2", Integer, primary_key=True, autoincrement=True), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "sa.Column('id1', sa.Integer(), nullable=False)," "sa.Column('id2', sa.Integer(), autoincrement=True, " "nullable=False)," "sa.PrimaryKeyConstraint('id1', 'id2')" ")", ) def test_render_add_column(self): op_obj = ops.AddColumnOp( "foo", Column("x", Integer, server_default="5") ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.add_column('foo', sa.Column('x', sa.Integer(), " "server_default='5', nullable=True))", ) def test_render_add_column_system(self): # this would never actually happen since "system" columns # can't be added in any case. However it will render as # part of op.CreateTableOp.
op_obj = ops.AddColumnOp("foo", Column("xmin", Integer, system=True)) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.add_column('foo', sa.Column('xmin', sa.Integer(), " "nullable=True, system=True))", ) def test_render_add_column_w_schema(self): op_obj = ops.AddColumnOp( "bar", Column("x", Integer, server_default="5"), schema="foo" ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.add_column('bar', sa.Column('x', sa.Integer(), " "server_default='5', nullable=True), schema='foo')", ) def test_render_drop_column(self): op_obj = ops.DropColumnOp.from_column_and_tablename( None, "foo", Column("x", Integer, server_default="5") ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_column('foo', 'x')", ) def test_render_drop_column_w_schema(self): op_obj = ops.DropColumnOp.from_column_and_tablename( "foo", "bar", Column("x", Integer, server_default="5") ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_column('bar', 'x', schema='foo')", ) def test_render_quoted_server_default(self): eq_( autogenerate.render._render_server_default( "nextval('group_to_perm_group_to_perm_id_seq'::regclass)", self.autogen_context, ), "\"nextval('group_to_perm_group_to_perm_id_seq'::regclass)\"", ) def test_render_unicode_server_default(self): default = compat.ue( "\u0411\u0435\u0437 " "\u043d\u0430\u0437\u0432\u0430\u043d\u0438\u044f" ) c = Column("x", Unicode, server_default=text(default)) eq_ignore_whitespace( autogenerate.render._render_server_default( c.server_default, self.autogen_context ), "sa.text(%r)" % default, ) def test_render_col_with_server_default(self): c = Column( "updated_at", TIMESTAMP(), server_default='TIMEZONE("utc", CURRENT_TIMESTAMP)', nullable=False, ) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('updated_at', sa.TIMESTAMP(), " "server_default='TIMEZONE(\"utc\", CURRENT_TIMESTAMP)', " "nullable=False)", ) @config.requirements.comments_api def test_render_col_with_comment(self): c = Column("some_key", Integer, comment="This is a comment") Table("some_table", MetaData(), c) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('some_key', sa.Integer(), " "nullable=True, " "comment='This is a comment')", ) @config.requirements.comments_api def test_render_col_comment_with_quote(self): c = Column("some_key", Integer, comment="This is a john's comment") Table("some_table", MetaData(), c) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('some_key', sa.Integer(), " "nullable=True, " 'comment="This is a john\'s comment")', ) def test_render_col_autoinc_false_mysql(self): c = Column("some_key", Integer, primary_key=True, autoincrement=False) Table("some_table", MetaData(), c) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('some_key', sa.Integer(), " "autoincrement=False, " "nullable=False)", ) def test_render_custom(self): class MySpecialType(Integer): pass def render(type_, obj, context): if type_ == "foreign_key": return None if type_ == "column": if obj.name == "y": return None elif obj.name == "q": return False else: return "col(%s)" % obj.name if type_ == "type" and isinstance(obj, MySpecialType): context.imports.add("from mypackage import MySpecialType") return "MySpecialType()" return "render:%s" % 
type_ self.autogen_context.opts.update( render_item=render, alembic_module_prefix="sa." ) t = Table( "t", MetaData(), Column("x", Integer), Column("y", Integer), Column("q", MySpecialType()), PrimaryKeyConstraint("x"), ForeignKeyConstraint(["x"], ["y"]), ) op_obj = ops.CreateTableOp.from_table(t) result = autogenerate.render_op_text(self.autogen_context, op_obj) eq_ignore_whitespace( result, "sa.create_table('t'," "col(x)," "sa.Column('q', MySpecialType(), nullable=True)," "render:primary_key)", ) eq_( self.autogen_context.imports, set(["from mypackage import MySpecialType"]), ) def test_render_modify_type(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", modify_type=CHAR(10), existing_type=CHAR(20), ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "existing_type=sa.CHAR(length=20), type_=sa.CHAR(length=10))", ) def test_render_modify_type_w_schema(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", modify_type=CHAR(10), existing_type=CHAR(20), schema="foo", ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "existing_type=sa.CHAR(length=20), type_=sa.CHAR(length=10), " "schema='foo')", ) def test_render_modify_nullable(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", existing_type=Integer(), modify_nullable=True, ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "existing_type=sa.Integer(), nullable=True)", ) def test_render_modify_nullable_no_existing_type(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", modify_nullable=True ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', nullable=True)", ) def test_render_modify_nullable_w_schema(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", existing_type=Integer(), modify_nullable=True, schema="foo", ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "existing_type=sa.Integer(), nullable=True, schema='foo')", ) def test_render_modify_type_w_autoincrement(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", modify_type=Integer(), existing_type=BigInteger(), autoincrement=True, ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "existing_type=sa.BigInteger(), type_=sa.Integer(), " "autoincrement=True)", ) def test_render_fk_constraint_kwarg(self): m = MetaData() t1 = Table("t", m, Column("c", Integer)) t2 = Table("t2", m, Column("c_rem", Integer)) fk = ForeignKeyConstraint([t1.c.c], [t2.c.c_rem], onupdate="CASCADE") # SQLA 0.9 generates a u'' here for remote cols while 0.8 does not, # so just whack out "'u" here from the generated eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['t2.c_rem'], onupdate='CASCADE')", ) fk = ForeignKeyConstraint([t1.c.c], [t2.c.c_rem], ondelete="CASCADE") eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['t2.c_rem'], ondelete='CASCADE')", ) fk = ForeignKeyConstraint([t1.c.c], [t2.c.c_rem], deferrable=True) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, 
self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['t2.c_rem'], deferrable=True)", ) fk = ForeignKeyConstraint([t1.c.c], [t2.c.c_rem], initially="XYZ") eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['t2.c_rem'], initially='XYZ')", ) fk = ForeignKeyConstraint( [t1.c.c], [t2.c.c_rem], initially="XYZ", ondelete="CASCADE", deferrable=True, ) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['t2.c_rem'], " "ondelete='CASCADE', initially='XYZ', deferrable=True)", ) def test_render_fk_constraint_resolve_key(self): m = MetaData() t1 = Table("t", m, Column("c", Integer)) Table("t2", m, Column("c_rem", Integer, key="c_remkey")) fk = ForeignKeyConstraint(["c"], ["t2.c_remkey"]) t1.append_constraint(fk) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['t2.c_rem'], )", ) def test_render_fk_constraint_bad_table_resolve(self): m = MetaData() t1 = Table("t", m, Column("c", Integer)) Table("t2", m, Column("c_rem", Integer)) fk = ForeignKeyConstraint(["c"], ["t2.nonexistent"]) t1.append_constraint(fk) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['t2.nonexistent'], )", ) def test_render_fk_constraint_bad_table_resolve_dont_get_confused(self): m = MetaData() t1 = Table("t", m, Column("c", Integer)) Table( "t2", m, Column("c_rem", Integer, key="cr_key"), Column("c_rem_2", Integer, key="c_rem"), ) fk = ForeignKeyConstraint(["c"], ["t2.c_rem"], link_to_name=True) t1.append_constraint(fk) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['t2.c_rem'], )", ) def test_render_fk_constraint_link_to_name(self): m = MetaData() t1 = Table("t", m, Column("c", Integer)) Table("t2", m, Column("c_rem", Integer, key="c_remkey")) fk = ForeignKeyConstraint(["c"], ["t2.c_rem"], link_to_name=True) t1.append_constraint(fk) eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['t2.c_rem'], )", ) def test_render_fk_constraint_use_alter(self): m = MetaData() Table("t", m, Column("c", Integer)) t2 = Table( "t2", m, Column( "c_rem", Integer, ForeignKey("t.c", name="fk1", use_alter=True) ), ) const = list(t2.foreign_keys)[0].constraint eq_ignore_whitespace( autogenerate.render._render_constraint( const, self.autogen_context ), "sa.ForeignKeyConstraint(['c_rem'], ['t.c'], " "name='fk1', use_alter=True)", ) def test_render_fk_constraint_w_metadata_schema(self): m = MetaData(schema="foo") t1 = Table("t", m, Column("c", Integer)) t2 = Table("t2", m, Column("c_rem", Integer)) fk = ForeignKeyConstraint([t1.c.c], [t2.c.c_rem], onupdate="CASCADE") eq_ignore_whitespace( re.sub( r"u'", "'", autogenerate.render._render_constraint( fk, self.autogen_context ), ), "sa.ForeignKeyConstraint(['c'], ['foo.t2.c_rem'], " "onupdate='CASCADE')", ) def test_render_check_constraint_literal(self): eq_ignore_whitespace( autogenerate.render._render_check_constraint( CheckConstraint("im a constraint", name="cc1"), self.autogen_context, ), "sa.CheckConstraint(!U'im a constraint', name='cc1')", ) def test_render_check_constraint_sqlexpr(self): c = column("c") five = literal_column("5") 
ten = literal_column("10") eq_ignore_whitespace( autogenerate.render._render_check_constraint( CheckConstraint(and_(c > five, c < ten)), self.autogen_context ), "sa.CheckConstraint(!U'c > 5 AND c < 10')", ) def test_render_check_constraint_literal_binds(self): c = column("c") eq_ignore_whitespace( autogenerate.render._render_check_constraint( CheckConstraint(and_(c > 5, c < 10)), self.autogen_context ), "sa.CheckConstraint(!U'c > 5 AND c < 10')", ) def test_render_unique_constraint_opts(self): m = MetaData() t = Table("t", m, Column("c", Integer)) eq_ignore_whitespace( autogenerate.render._render_unique_constraint( UniqueConstraint(t.c.c, name="uq_1", deferrable="XYZ"), self.autogen_context, ), "sa.UniqueConstraint('c', deferrable='XYZ', name='uq_1')", ) def test_add_unique_constraint_unicode_schema(self): m = MetaData() t = Table( "t", m, Column("c", Integer), schema=compat.ue("\u0411\u0435\u0437"), ) op_obj = ops.AddConstraintOp.from_constraint(UniqueConstraint(t.c.c)) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_unique_constraint(None, 't', ['c'], " "schema=%r)" % compat.ue("\u0411\u0435\u0437"), ) def test_render_modify_nullable_w_default(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", existing_type=Integer(), existing_server_default="5", modify_nullable=True, ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "existing_type=sa.Integer(), nullable=True, " "existing_server_default='5')", ) def test_render_enum(self): eq_ignore_whitespace( autogenerate.render._repr_type( Enum("one", "two", "three", name="myenum"), self.autogen_context, ), "sa.Enum('one', 'two', 'three', name='myenum')", ) eq_ignore_whitespace( autogenerate.render._repr_type( Enum("one", "two", "three"), self.autogen_context ), "sa.Enum('one', 'two', 'three')", ) def test_render_non_native_enum(self): eq_ignore_whitespace( autogenerate.render._repr_type( Enum("one", "two", "three", native_enum=False), self.autogen_context, ), "sa.Enum('one', 'two', 'three', native_enum=False)", ) def test_repr_plain_sqla_type(self): type_ = Integer() eq_ignore_whitespace( autogenerate.render._repr_type(type_, self.autogen_context), "sa.Integer()", ) def test_generic_array_type(self): eq_ignore_whitespace( autogenerate.render._repr_type( types.ARRAY(Integer), self.autogen_context ), "sa.ARRAY(sa.Integer())", ) eq_ignore_whitespace( autogenerate.render._repr_type( types.ARRAY(DateTime(timezone=True)), self.autogen_context ), "sa.ARRAY(sa.DateTime(timezone=True))", ) def test_render_array_no_context(self): uo = ops.UpgradeOps( ops=[ ops.CreateTableOp( "sometable", [Column("x", types.ARRAY(Integer))] ) ] ) eq_( autogenerate.render_python_code(uo), "# ### commands auto generated by Alembic - please adjust! 
###\n" " op.create_table('sometable',\n" " sa.Column('x', sa.ARRAY(sa.Integer()), nullable=True)\n" " )\n" " # ### end Alembic commands ###", ) def test_repr_custom_type_w_sqla_prefix(self): self.autogen_context.opts["user_module_prefix"] = None class MyType(UserDefinedType): pass MyType.__module__ = "sqlalchemy_util.types" type_ = MyType() eq_ignore_whitespace( autogenerate.render._repr_type(type_, self.autogen_context), "sqlalchemy_util.types.MyType()", ) def test_repr_user_type_user_prefix_None(self): class MyType(UserDefinedType): def get_col_spec(self): return "MYTYPE" type_ = MyType() self.autogen_context.opts["user_module_prefix"] = None eq_ignore_whitespace( autogenerate.render._repr_type(type_, self.autogen_context), "tests.test_autogen_render.MyType()", ) def test_repr_user_type_user_prefix_present(self): from sqlalchemy.types import UserDefinedType class MyType(UserDefinedType): def get_col_spec(self): return "MYTYPE" type_ = MyType() self.autogen_context.opts["user_module_prefix"] = "user." eq_ignore_whitespace( autogenerate.render._repr_type(type_, self.autogen_context), "user.MyType()", ) def test_repr_dialect_type(self): from sqlalchemy.dialects.mysql import VARCHAR type_ = VARCHAR(20, charset="utf8", national=True) self.autogen_context.opts["user_module_prefix"] = None eq_ignore_whitespace( autogenerate.render._repr_type(type_, self.autogen_context), "mysql.VARCHAR(charset='utf8', national=True, length=20)", ) eq_( self.autogen_context.imports, set(["from sqlalchemy.dialects import mysql"]), ) def test_render_server_default_text(self): c = Column( "updated_at", TIMESTAMP(), server_default=text("now()"), nullable=False, ) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('updated_at', sa.TIMESTAMP(), " "server_default=sa.text(!U'now()'), " "nullable=False)", ) def test_render_server_default_non_native_boolean(self): c = Column( "updated_at", Boolean(), server_default=false(), nullable=False ) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('updated_at', sa.Boolean(), " "server_default=sa.text(!U'0'), " "nullable=False)", ) def test_render_server_default_func(self): c = Column( "updated_at", TIMESTAMP(), server_default=func.now(), nullable=False, ) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('updated_at', sa.TIMESTAMP(), " "server_default=sa.text(!U'now()'), " "nullable=False)", ) def test_render_server_default_int(self): c = Column("value", Integer, server_default="0") result = autogenerate.render._render_column(c, self.autogen_context) eq_( result, "sa.Column('value', sa.Integer(), " "server_default='0', nullable=True)", ) def test_render_modify_reflected_int_server_default(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", existing_type=Integer(), existing_server_default=DefaultClause(text("5")), modify_nullable=True, ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "existing_type=sa.Integer(), nullable=True, " "existing_server_default=sa.text(!U'5'))", ) def test_render_executesql_plaintext(self): op_obj = ops.ExecuteSQLOp("drop table foo") eq_( autogenerate.render_op_text(self.autogen_context, op_obj), "op.execute('drop table foo')", ) def test_render_executesql_sqlexpr_notimplemented(self): sql = table("x", column("q")).insert() op_obj = ops.ExecuteSQLOp(sql) assert_raises( NotImplementedError, 
autogenerate.render_op_text, self.autogen_context, op_obj, ) @config.requirements.comments_api def test_render_alter_column_modify_comment(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", modify_comment="This is a comment" ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "comment='This is a comment')", ) @config.requirements.comments_api def test_render_alter_column_existing_comment(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", existing_comment="This is a comment" ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "existing_comment='This is a comment')", ) @config.requirements.comments_api def test_render_col_drop_comment(self): op_obj = ops.AlterColumnOp( "sometable", "somecolumn", existing_comment="This is a comment", modify_comment=None, ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.alter_column('sometable', 'somecolumn', " "comment=None, " "existing_comment='This is a comment')", ) @config.requirements.comments_api def test_render_table_with_comment(self): m = MetaData() t = Table( "test", m, Column("id", Integer, primary_key=True), Column("q", Integer, ForeignKey("address.id")), comment="test comment", ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('test'," "sa.Column('id', sa.Integer(), nullable=False)," "sa.Column('q', sa.Integer(), nullable=True)," "sa.ForeignKeyConstraint(['q'], ['address.id'], )," "sa.PrimaryKeyConstraint('id')," "comment='test comment'" ")", ) @config.requirements.comments_api def test_render_add_column_with_comment(self): op_obj = ops.AddColumnOp( "foo", Column("x", Integer, comment="This is a Column") ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.add_column('foo', sa.Column('x', sa.Integer(), " "nullable=True, comment='This is a Column'))", ) @config.requirements.comments_api def test_render_create_table_comment_op(self): op_obj = ops.CreateTableCommentOp("table_name", "comment") eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table_comment(" " 'table_name'," " 'comment'," " existing_comment=None," " schema=None" ")", ) def test_render_create_table_comment_op_with_existing_comment(self): op_obj = ops.CreateTableCommentOp( "table_name", "comment", existing_comment="old comment" ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table_comment(" " 'table_name'," " 'comment'," " existing_comment='old comment'," " schema=None" ")", ) def test_render_create_table_comment_op_with_schema(self): op_obj = ops.CreateTableCommentOp( "table_name", "comment", schema="SomeSchema" ) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table_comment(" " 'table_name'," " 'comment'," " existing_comment=None," " schema='SomeSchema'" ")", ) def test_render_drop_table_comment_op(self): op_obj = ops.DropTableCommentOp("table_name") eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_table_comment(" " 'table_name'," " existing_comment=None," " schema=None" ")", ) class RenderNamingConventionTest(TestBase): def setUp(self): convention = { "ix": "ix_%(custom)s_%(column_0_label)s", "uq": "uq_%(custom)s_%(table_name)s_%(column_0_name)s", "ck": 
"ck_%(custom)s_%(table_name)s", "fk": "fk_%(custom)s_%(table_name)s_" "%(column_0_name)s_%(referred_table_name)s", "pk": "pk_%(custom)s_%(table_name)s", "custom": lambda const, table: "ct", } self.metadata = MetaData(naming_convention=convention) ctx_opts = { "sqlalchemy_module_prefix": "sa.", "alembic_module_prefix": "op.", "target_metadata": MetaData(), } context = MigrationContext.configure( dialect_name="postgresql", opts=ctx_opts ) self.autogen_context = api.AutogenContext(context) def test_schema_type_boolean(self): t = Table("t", self.metadata, Column("c", Boolean(name="xyz"))) op_obj = ops.AddColumnOp.from_column(t.c.c) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.add_column('t', " "sa.Column('c', sa.Boolean(name='xyz'), nullable=True))", ) def test_explicit_unique_constraint(self): t = Table("t", self.metadata, Column("c", Integer)) eq_ignore_whitespace( autogenerate.render._render_unique_constraint( UniqueConstraint(t.c.c, deferrable="XYZ"), self.autogen_context ), "sa.UniqueConstraint('c', deferrable='XYZ', " "name=op.f('uq_ct_t_c'))", ) def test_explicit_named_unique_constraint(self): t = Table("t", self.metadata, Column("c", Integer)) eq_ignore_whitespace( autogenerate.render._render_unique_constraint( UniqueConstraint(t.c.c, name="q"), self.autogen_context ), "sa.UniqueConstraint('c', name='q')", ) def test_render_add_index(self): t = Table( "test", self.metadata, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), ) idx = Index(None, t.c.active, t.c.code) op_obj = ops.CreateIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_index(op.f('ix_ct_test_active'), 'test', " "['active', 'code'], unique=False)", ) def test_render_drop_index(self): t = Table( "test", self.metadata, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), ) idx = Index(None, t.c.active, t.c.code) op_obj = ops.DropIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.drop_index(op.f('ix_ct_test_active'), table_name='test')", ) def test_render_add_index_schema(self): t = Table( "test", self.metadata, Column("id", Integer, primary_key=True), Column("active", Boolean()), Column("code", String(255)), schema="CamelSchema", ) idx = Index(None, t.c.active, t.c.code) op_obj = ops.CreateIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_index(op.f('ix_ct_CamelSchema_test_active'), 'test', " "['active', 'code'], unique=False, schema='CamelSchema')", ) def test_implicit_unique_constraint(self): t = Table("t", self.metadata, Column("c", Integer, unique=True)) uq = [c for c in t.constraints if isinstance(c, UniqueConstraint)][0] eq_ignore_whitespace( autogenerate.render._render_unique_constraint( uq, self.autogen_context ), "sa.UniqueConstraint('c', name=op.f('uq_ct_t_c'))", ) def test_inline_pk_constraint(self): t = Table("t", self.metadata, Column("c", Integer, primary_key=True)) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('t',sa.Column('c', sa.Integer(), nullable=False)," "sa.PrimaryKeyConstraint('c', name=op.f('pk_ct_t')))", ) def test_inline_ck_constraint(self): t = Table( "t", self.metadata, Column("c", Integer), CheckConstraint("c > 5") ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( 
autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('t',sa.Column('c', sa.Integer(), nullable=True)," "sa.CheckConstraint(!U'c > 5', name=op.f('ck_ct_t')))", ) def test_inline_fk(self): t = Table("t", self.metadata, Column("c", Integer, ForeignKey("q.id"))) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(self.autogen_context, op_obj), "op.create_table('t',sa.Column('c', sa.Integer(), nullable=True)," "sa.ForeignKeyConstraint(['c'], ['q.id'], " "name=op.f('fk_ct_t_c_q')))", ) def test_render_check_constraint_renamed(self): """test that constraints from autogenerate render with the naming convention name explicitly. These names should be frozen into the migration scripts so that they remain the same if the application's naming convention changes. However, op.create_table() and others need to be careful that these don't double up when the "%(constraint_name)s" token is used. """ m1 = MetaData( naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) ck = CheckConstraint("im a constraint", name="cc1") Table("t", m1, Column("x"), ck) eq_ignore_whitespace( autogenerate.render._render_check_constraint( ck, self.autogen_context ), "sa.CheckConstraint(!U'im a constraint', name=op.f('ck_t_cc1'))", ) def test_create_table_plus_add_index_in_modify(self): uo = ops.UpgradeOps( ops=[ ops.CreateTableOp( "sometable", [Column("x", Integer), Column("y", Integer)] ), ops.ModifyTableOps( "sometable", ops=[ops.CreateIndexOp("ix1", "sometable", ["x", "y"])], ), ] ) eq_( autogenerate.render_python_code(uo, render_as_batch=True), "# ### commands auto generated by Alembic - please adjust! ###\n" " op.create_table('sometable',\n" " sa.Column('x', sa.Integer(), nullable=True),\n" " sa.Column('y', sa.Integer(), nullable=True)\n" " )\n" " with op.batch_alter_table('sometable', schema=None) " "as batch_op:\n" " batch_op.create_index(" "'ix1', ['x', 'y'], unique=False)\n\n" " # ### end Alembic commands ###", ) zzzeek-alembic-bee044a1c187/tests/test_batch.py000066400000000000000000001567001353106760100214470ustar00rootroot00000000000000from contextlib import contextmanager import re from sqlalchemy import Boolean from sqlalchemy import CheckConstraint from sqlalchemy import Column from sqlalchemy import DateTime from sqlalchemy import Enum from sqlalchemy import exc from sqlalchemy import ForeignKey from sqlalchemy import ForeignKeyConstraint from sqlalchemy import func from sqlalchemy import Index from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import PrimaryKeyConstraint from sqlalchemy import String from sqlalchemy import Table from sqlalchemy import UniqueConstraint from sqlalchemy.engine.reflection import Inspector from sqlalchemy.schema import CreateIndex from sqlalchemy.schema import CreateTable from sqlalchemy.sql import column from sqlalchemy.sql import select from sqlalchemy.sql import text from alembic.operations import Operations from alembic.operations.batch import ApplyBatchImpl from alembic.runtime.migration import MigrationContext from alembic.testing import assert_raises_message from alembic.testing import config from alembic.testing import eq_ from alembic.testing import exclusions from alembic.testing import mock from alembic.testing import TestBase from alembic.testing.fixtures import op_fixture class BatchApplyTest(TestBase): def setUp(self): self.op = Operations(mock.Mock(opts={})) def _simple_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", 
Integer, primary_key=True), Column("x", String(10)), Column("y", Integer), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _uq_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("x", String()), Column("y", Integer), UniqueConstraint("y", name="uq1"), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _ix_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("x", String()), Column("y", Integer), Index("ix1", "y"), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _pk_fixture(self): m = MetaData() t = Table( "tname", m, Column("id", Integer), Column("x", String()), Column("y", Integer), PrimaryKeyConstraint("id", name="mypk"), ) return ApplyBatchImpl(t, (), {}, False) def _literal_ck_fixture( self, copy_from=None, table_args=(), table_kwargs={} ): m = MetaData() if copy_from is not None: t = copy_from else: t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("email", String()), CheckConstraint("email LIKE '%@%'"), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _sql_ck_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("email", String()), ) t.append_constraint(CheckConstraint(t.c.email.like("%@%"))) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _fk_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("email", String()), Column("user_id", Integer, ForeignKey("user.id")), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _multi_fk_fixture(self, table_args=(), table_kwargs={}, schema=None): m = MetaData() if schema: schemaarg = "%s." 
% schema else: schemaarg = "" t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("email", String()), Column("user_id_1", Integer, ForeignKey("%suser.id" % schemaarg)), Column("user_id_2", Integer, ForeignKey("%suser.id" % schemaarg)), Column("user_id_3", Integer), Column("user_id_version", Integer), ForeignKeyConstraint( ["user_id_3", "user_id_version"], ["%suser.id" % schemaarg, "%suser.id_version" % schemaarg], ), schema=schema, ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _named_fk_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("email", String()), Column("user_id", Integer, ForeignKey("user.id", name="ufk")), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _selfref_fk_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("parent_id", Integer, ForeignKey("tname.id")), Column("data", String), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _boolean_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("flag", Boolean), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _boolean_no_ck_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("flag", Boolean(create_constraint=False)), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _enum_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("thing", Enum("a", "b", "c")), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _server_default_fixture(self, table_args=(), table_kwargs={}): m = MetaData() t = Table( "tname", m, Column("id", Integer, primary_key=True), Column("thing", String(), server_default=""), ) return ApplyBatchImpl(t, table_args, table_kwargs, False) def _assert_impl( self, impl, colnames=None, ddl_contains=None, ddl_not_contains=None, dialect="default", schema=None, ): context = op_fixture(dialect=dialect) impl._create(context.impl) if colnames is None: colnames = ["id", "x", "y"] eq_(impl.new_table.c.keys(), colnames) pk_cols = [col for col in impl.new_table.c if col.primary_key] eq_(list(impl.new_table.primary_key), pk_cols) create_stmt = str( CreateTable(impl.new_table).compile(dialect=context.dialect) ) create_stmt = re.sub(r"[\n\t]", "", create_stmt) idx_stmt = "" for idx in impl.indexes.values(): idx_stmt += str(CreateIndex(idx).compile(dialect=context.dialect)) for idx in impl.new_indexes.values(): impl.new_table.name = impl.table.name idx_stmt += str(CreateIndex(idx).compile(dialect=context.dialect)) impl.new_table.name = ApplyBatchImpl._calc_temp_name( impl.table.name ) idx_stmt = re.sub(r"[\n\t]", "", idx_stmt) if ddl_contains: assert ddl_contains in create_stmt + idx_stmt if ddl_not_contains: assert ddl_not_contains not in create_stmt + idx_stmt expected = [create_stmt] if schema: args = {"schema": "%s." 
% schema} else: args = {"schema": ""} args["temp_name"] = impl.new_table.name args["colnames"] = ", ".join( [ impl.new_table.c[name].name for name in colnames if name in impl.table.c ] ) args["tname_colnames"] = ", ".join( "CAST(%(schema)stname.%(name)s AS %(type)s) AS anon_1" % { "schema": args["schema"], "name": name, "type": impl.new_table.c[name].type, } if ( impl.new_table.c[name].type._type_affinity is not impl.table.c[name].type._type_affinity ) else "%(schema)stname.%(name)s" % {"schema": args["schema"], "name": name} for name in colnames if name in impl.table.c ) expected.extend( [ "INSERT INTO %(schema)s%(temp_name)s (%(colnames)s) " "SELECT %(tname_colnames)s FROM %(schema)stname" % args, "DROP TABLE %(schema)stname" % args, "ALTER TABLE %(schema)s%(temp_name)s " "RENAME TO %(schema)stname" % args, ] ) if idx_stmt: expected.append(idx_stmt) context.assert_(*expected) return impl.new_table def test_change_type(self): impl = self._simple_fixture() impl.alter_column("tname", "x", type_=String) new_table = self._assert_impl(impl) assert new_table.c.x.type._type_affinity is String def test_rename_col(self): impl = self._simple_fixture() impl.alter_column("tname", "x", name="q") new_table = self._assert_impl(impl) eq_(new_table.c.x.name, "q") def test_rename_col_boolean(self): impl = self._boolean_fixture() impl.alter_column("tname", "flag", name="bflag") new_table = self._assert_impl( impl, ddl_contains="CHECK (bflag IN (0, 1)", colnames=["id", "flag"], ) eq_(new_table.c.flag.name, "bflag") eq_( len( [ const for const in new_table.constraints if isinstance(const, CheckConstraint) ] ), 1, ) def test_change_type_schematype_to_non(self): impl = self._boolean_fixture() impl.alter_column("tname", "flag", type_=Integer) new_table = self._assert_impl( impl, colnames=["id", "flag"], ddl_not_contains="CHECK" ) assert new_table.c.flag.type._type_affinity is Integer # NOTE: we can't do test_change_type_non_to_schematype # at this level because the "add_constraint" part of this # comes from toimpl.py, which we aren't testing here def test_rename_col_boolean_no_ck(self): impl = self._boolean_no_ck_fixture() impl.alter_column("tname", "flag", name="bflag") new_table = self._assert_impl( impl, ddl_not_contains="CHECK", colnames=["id", "flag"] ) eq_(new_table.c.flag.name, "bflag") eq_( len( [ const for const in new_table.constraints if isinstance(const, CheckConstraint) ] ), 0, ) def test_rename_col_enum(self): impl = self._enum_fixture() impl.alter_column("tname", "thing", name="thang") new_table = self._assert_impl( impl, ddl_contains="CHECK (thang IN ('a', 'b', 'c')", colnames=["id", "thing"], ) eq_(new_table.c.thing.name, "thang") eq_( len( [ const for const in new_table.constraints if isinstance(const, CheckConstraint) ] ), 1, ) def test_rename_col_literal_ck(self): impl = self._literal_ck_fixture() impl.alter_column("tname", "email", name="emol") new_table = self._assert_impl( # note this is wrong, we don't dig into the SQL impl, ddl_contains="CHECK (email LIKE '%@%')", colnames=["id", "email"], ) eq_( len( [ c for c in new_table.constraints if isinstance(c, CheckConstraint) ] ), 1, ) eq_(new_table.c.email.name, "emol") def test_rename_col_literal_ck_workaround(self): impl = self._literal_ck_fixture( copy_from=Table( "tname", MetaData(), Column("id", Integer, primary_key=True), Column("email", String), ), table_args=[CheckConstraint("emol LIKE '%@%'")], ) impl.alter_column("tname", "email", name="emol") new_table = self._assert_impl( impl, ddl_contains="CHECK (emol LIKE '%@%')", 
colnames=["id", "email"], ) eq_( len( [ c for c in new_table.constraints if isinstance(c, CheckConstraint) ] ), 1, ) eq_(new_table.c.email.name, "emol") def test_rename_col_sql_ck(self): impl = self._sql_ck_fixture() impl.alter_column("tname", "email", name="emol") new_table = self._assert_impl( impl, ddl_contains="CHECK (emol LIKE '%@%')", colnames=["id", "email"], ) eq_( len( [ c for c in new_table.constraints if isinstance(c, CheckConstraint) ] ), 1, ) eq_(new_table.c.email.name, "emol") def test_add_col(self): impl = self._simple_fixture() col = Column("g", Integer) # operations.add_column produces a table t = self.op.schema_obj.table("tname", col) # noqa impl.add_column("tname", col) new_table = self._assert_impl(impl, colnames=["id", "x", "y", "g"]) eq_(new_table.c.g.name, "g") def test_add_server_default(self): impl = self._simple_fixture() impl.alter_column("tname", "y", server_default="10") new_table = self._assert_impl(impl, ddl_contains="DEFAULT '10'") eq_(new_table.c.y.server_default.arg, "10") def test_drop_server_default(self): impl = self._server_default_fixture() impl.alter_column("tname", "thing", server_default=None) new_table = self._assert_impl( impl, colnames=["id", "thing"], ddl_not_contains="DEFAULT" ) eq_(new_table.c.thing.server_default, None) def test_rename_col_pk(self): impl = self._simple_fixture() impl.alter_column("tname", "id", name="foobar") new_table = self._assert_impl( impl, ddl_contains="PRIMARY KEY (foobar)" ) eq_(new_table.c.id.name, "foobar") eq_(list(new_table.primary_key), [new_table.c.id]) def test_rename_col_fk(self): impl = self._fk_fixture() impl.alter_column("tname", "user_id", name="foobar") new_table = self._assert_impl( impl, colnames=["id", "email", "user_id"], ddl_contains='FOREIGN KEY(foobar) REFERENCES "user" (id)', ) eq_(new_table.c.user_id.name, "foobar") eq_( list(new_table.c.user_id.foreign_keys)[0]._get_colspec(), "user.id" ) def test_regen_multi_fk(self): impl = self._multi_fk_fixture() self._assert_impl( impl, colnames=[ "id", "email", "user_id_1", "user_id_2", "user_id_3", "user_id_version", ], ddl_contains="FOREIGN KEY(user_id_3, user_id_version) " 'REFERENCES "user" (id, id_version)', ) def test_regen_multi_fk_schema(self): impl = self._multi_fk_fixture(schema="foo_schema") self._assert_impl( impl, colnames=[ "id", "email", "user_id_1", "user_id_2", "user_id_3", "user_id_version", ], ddl_contains="FOREIGN KEY(user_id_3, user_id_version) " 'REFERENCES foo_schema."user" (id, id_version)', schema="foo_schema", ) def test_drop_col(self): impl = self._simple_fixture() impl.drop_column("tname", column("x")) new_table = self._assert_impl(impl, colnames=["id", "y"]) assert "y" in new_table.c assert "x" not in new_table.c def test_drop_col_remove_pk(self): impl = self._simple_fixture() impl.drop_column("tname", column("id")) new_table = self._assert_impl( impl, colnames=["x", "y"], ddl_not_contains="PRIMARY KEY" ) assert "y" in new_table.c assert "id" not in new_table.c assert not new_table.primary_key def test_drop_col_remove_fk(self): impl = self._fk_fixture() impl.drop_column("tname", column("user_id")) new_table = self._assert_impl( impl, colnames=["id", "email"], ddl_not_contains="FOREIGN KEY" ) assert "user_id" not in new_table.c assert not new_table.foreign_keys def test_drop_col_retain_fk(self): impl = self._fk_fixture() impl.drop_column("tname", column("email")) new_table = self._assert_impl( impl, colnames=["id", "user_id"], ddl_contains='FOREIGN KEY(user_id) REFERENCES "user" (id)', ) assert "email" not in new_table.c assert 
new_table.c.user_id.foreign_keys def test_drop_col_retain_fk_selfref(self): impl = self._selfref_fk_fixture() impl.drop_column("tname", column("data")) new_table = self._assert_impl(impl, colnames=["id", "parent_id"]) assert "data" not in new_table.c assert new_table.c.parent_id.foreign_keys def test_add_fk(self): impl = self._simple_fixture() impl.add_column("tname", Column("user_id", Integer)) fk = self.op.schema_obj.foreign_key_constraint( "fk1", "tname", "user", ["user_id"], ["id"] ) impl.add_constraint(fk) new_table = self._assert_impl( impl, colnames=["id", "x", "y", "user_id"], ddl_contains="CONSTRAINT fk1 FOREIGN KEY(user_id) " 'REFERENCES "user" (id)', ) eq_( list(new_table.c.user_id.foreign_keys)[0]._get_colspec(), "user.id" ) def test_drop_fk(self): impl = self._named_fk_fixture() fk = ForeignKeyConstraint([], [], name="ufk") impl.drop_constraint(fk) new_table = self._assert_impl( impl, colnames=["id", "email", "user_id"], ddl_not_contains="CONSTRANT fk1", ) eq_(list(new_table.foreign_keys), []) def test_add_uq(self): impl = self._simple_fixture() uq = self.op.schema_obj.unique_constraint("uq1", "tname", ["y"]) impl.add_constraint(uq) self._assert_impl( impl, colnames=["id", "x", "y"], ddl_contains="CONSTRAINT uq1 UNIQUE", ) def test_drop_uq(self): impl = self._uq_fixture() uq = self.op.schema_obj.unique_constraint("uq1", "tname", ["y"]) impl.drop_constraint(uq) self._assert_impl( impl, colnames=["id", "x", "y"], ddl_not_contains="CONSTRAINT uq1 UNIQUE", ) def test_create_index(self): impl = self._simple_fixture() ix = self.op.schema_obj.index("ix1", "tname", ["y"]) impl.create_index(ix) self._assert_impl( impl, colnames=["id", "x", "y"], ddl_contains="CREATE INDEX ix1" ) def test_drop_index(self): impl = self._ix_fixture() ix = self.op.schema_obj.index("ix1", "tname", ["y"]) impl.drop_index(ix) self._assert_impl( impl, colnames=["id", "x", "y"], ddl_not_contains="CONSTRAINT uq1 UNIQUE", ) def test_add_table_opts(self): impl = self._simple_fixture(table_kwargs={"mysql_engine": "InnoDB"}) self._assert_impl(impl, ddl_contains="ENGINE=InnoDB", dialect="mysql") def test_drop_pk(self): impl = self._pk_fixture() pk = self.op.schema_obj.primary_key_constraint("mypk", "tname", ["id"]) impl.drop_constraint(pk) new_table = self._assert_impl(impl) assert not new_table.c.id.primary_key assert not len(new_table.primary_key) class BatchAPITest(TestBase): @contextmanager def _fixture(self, schema=None): migration_context = mock.Mock( opts={}, impl=mock.MagicMock(__dialect__="sqlite") ) op = Operations(migration_context) batch = op.batch_alter_table( "tname", recreate="never", schema=schema ).__enter__() mock_schema = mock.MagicMock() with mock.patch("alembic.operations.schemaobj.sa_schema", mock_schema): yield batch batch.impl.flush() self.mock_schema = mock_schema def test_drop_col(self): with self._fixture() as batch: batch.drop_column("q") eq_( batch.impl.operations.impl.mock_calls, [ mock.call.drop_column( "tname", self.mock_schema.Column(), schema=None ) ], ) def test_add_col(self): column = Column("w", String(50)) with self._fixture() as batch: batch.add_column(column) assert ( mock.call.add_column("tname", column, schema=None) in batch.impl.operations.impl.mock_calls ) def test_create_fk(self): with self._fixture() as batch: batch.create_foreign_key("myfk", "user", ["x"], ["y"]) eq_( self.mock_schema.ForeignKeyConstraint.mock_calls, [ mock.call( ["x"], ["user.y"], onupdate=None, ondelete=None, name="myfk", initially=None, deferrable=None, match=None, ) ], ) eq_( 
self.mock_schema.Table.mock_calls, [ mock.call( "user", self.mock_schema.MetaData(), self.mock_schema.Column(), schema=None, ), mock.call( "tname", self.mock_schema.MetaData(), self.mock_schema.Column(), schema=None, ), mock.call().append_constraint( self.mock_schema.ForeignKeyConstraint() ), ], ) eq_( batch.impl.operations.impl.mock_calls, [ mock.call.add_constraint( self.mock_schema.ForeignKeyConstraint() ) ], ) def test_create_fk_schema(self): with self._fixture(schema="foo") as batch: batch.create_foreign_key("myfk", "user", ["x"], ["y"]) eq_( self.mock_schema.ForeignKeyConstraint.mock_calls, [ mock.call( ["x"], ["user.y"], onupdate=None, ondelete=None, name="myfk", initially=None, deferrable=None, match=None, ) ], ) eq_( self.mock_schema.Table.mock_calls, [ mock.call( "user", self.mock_schema.MetaData(), self.mock_schema.Column(), schema=None, ), mock.call( "tname", self.mock_schema.MetaData(), self.mock_schema.Column(), schema="foo", ), mock.call().append_constraint( self.mock_schema.ForeignKeyConstraint() ), ], ) eq_( batch.impl.operations.impl.mock_calls, [ mock.call.add_constraint( self.mock_schema.ForeignKeyConstraint() ) ], ) def test_create_uq(self): with self._fixture() as batch: batch.create_unique_constraint("uq1", ["a", "b"]) eq_( self.mock_schema.Table().c.__getitem__.mock_calls, [mock.call("a"), mock.call("b")], ) eq_( self.mock_schema.UniqueConstraint.mock_calls, [ mock.call( self.mock_schema.Table().c.__getitem__(), self.mock_schema.Table().c.__getitem__(), name="uq1", ) ], ) eq_( batch.impl.operations.impl.mock_calls, [mock.call.add_constraint(self.mock_schema.UniqueConstraint())], ) def test_create_pk(self): with self._fixture() as batch: batch.create_primary_key("pk1", ["a", "b"]) eq_( self.mock_schema.Table().c.__getitem__.mock_calls, [mock.call("a"), mock.call("b")], ) eq_( self.mock_schema.PrimaryKeyConstraint.mock_calls, [ mock.call( self.mock_schema.Table().c.__getitem__(), self.mock_schema.Table().c.__getitem__(), name="pk1", ) ], ) eq_( batch.impl.operations.impl.mock_calls, [ mock.call.add_constraint( self.mock_schema.PrimaryKeyConstraint() ) ], ) def test_create_check(self): expr = text("a > b") with self._fixture() as batch: batch.create_check_constraint("ck1", expr) eq_( self.mock_schema.CheckConstraint.mock_calls, [mock.call(expr, name="ck1")], ) eq_( batch.impl.operations.impl.mock_calls, [mock.call.add_constraint(self.mock_schema.CheckConstraint())], ) def test_drop_constraint(self): with self._fixture() as batch: batch.drop_constraint("uq1") eq_(self.mock_schema.Constraint.mock_calls, [mock.call(name="uq1")]) eq_( batch.impl.operations.impl.mock_calls, [mock.call.drop_constraint(self.mock_schema.Constraint())], ) class CopyFromTest(TestBase): def _fixture(self): self.metadata = MetaData() self.table = Table( "foo", self.metadata, Column("id", Integer, primary_key=True), Column("data", String(50)), Column("x", Integer), ) context = op_fixture(dialect="sqlite", as_sql=True) self.op = Operations(context) return context def test_change_type(self): context = self._fixture() with self.op.batch_alter_table( "foo", copy_from=self.table ) as batch_op: batch_op.alter_column("data", type_=Integer) context.assert_( "CREATE TABLE _alembic_tmp_foo (id INTEGER NOT NULL, " "data INTEGER, x INTEGER, PRIMARY KEY (id))", "INSERT INTO _alembic_tmp_foo (id, data, x) SELECT foo.id, " "CAST(foo.data AS INTEGER) AS anon_1, foo.x FROM foo", "DROP TABLE foo", "ALTER TABLE _alembic_tmp_foo RENAME TO foo", ) def test_change_type_from_schematype(self): context = self._fixture() 
self.table.append_column( Column("y", Boolean(create_constraint=True, name="ck1")) ) with self.op.batch_alter_table( "foo", copy_from=self.table ) as batch_op: batch_op.alter_column( "y", type_=Integer, existing_type=Boolean(create_constraint=True, name="ck1"), ) context.assert_( "CREATE TABLE _alembic_tmp_foo (id INTEGER NOT NULL, " "data VARCHAR(50), x INTEGER, y INTEGER, PRIMARY KEY (id))", "INSERT INTO _alembic_tmp_foo (id, data, x, y) SELECT foo.id, " "foo.data, foo.x, CAST(foo.y AS INTEGER) AS anon_1 FROM foo", "DROP TABLE foo", "ALTER TABLE _alembic_tmp_foo RENAME TO foo", ) def test_change_type_to_schematype(self): context = self._fixture() self.table.append_column(Column("y", Integer)) with self.op.batch_alter_table( "foo", copy_from=self.table ) as batch_op: batch_op.alter_column( "y", existing_type=Integer, type_=Boolean(create_constraint=True, name="ck1"), ) context.assert_( "CREATE TABLE _alembic_tmp_foo (id INTEGER NOT NULL, " "data VARCHAR(50), x INTEGER, y BOOLEAN, PRIMARY KEY (id), " "CONSTRAINT ck1 CHECK (y IN (0, 1)))", "INSERT INTO _alembic_tmp_foo (id, data, x, y) SELECT foo.id, " "foo.data, foo.x, CAST(foo.y AS BOOLEAN) AS anon_1 FROM foo", "DROP TABLE foo", "ALTER TABLE _alembic_tmp_foo RENAME TO foo", ) def test_create_drop_index_w_always(self): context = self._fixture() with self.op.batch_alter_table( "foo", copy_from=self.table, recreate="always" ) as batch_op: batch_op.create_index("ix_data", ["data"], unique=True) context.assert_( "CREATE TABLE _alembic_tmp_foo (id INTEGER NOT NULL, " "data VARCHAR(50), " "x INTEGER, PRIMARY KEY (id))", "INSERT INTO _alembic_tmp_foo (id, data, x) " "SELECT foo.id, foo.data, foo.x FROM foo", "DROP TABLE foo", "ALTER TABLE _alembic_tmp_foo RENAME TO foo", "CREATE UNIQUE INDEX ix_data ON foo (data)", ) context.clear_assertions() Index("ix_data", self.table.c.data, unique=True) with self.op.batch_alter_table( "foo", copy_from=self.table, recreate="always" ) as batch_op: batch_op.drop_index("ix_data") context.assert_( "CREATE TABLE _alembic_tmp_foo (id INTEGER NOT NULL, " "data VARCHAR(50), x INTEGER, PRIMARY KEY (id))", "INSERT INTO _alembic_tmp_foo (id, data, x) " "SELECT foo.id, foo.data, foo.x FROM foo", "DROP TABLE foo", "ALTER TABLE _alembic_tmp_foo RENAME TO foo", ) def test_create_drop_index_wo_always(self): context = self._fixture() with self.op.batch_alter_table( "foo", copy_from=self.table ) as batch_op: batch_op.create_index("ix_data", ["data"], unique=True) context.assert_("CREATE UNIQUE INDEX ix_data ON foo (data)") context.clear_assertions() Index("ix_data", self.table.c.data, unique=True) with self.op.batch_alter_table( "foo", copy_from=self.table ) as batch_op: batch_op.drop_index("ix_data") context.assert_("DROP INDEX ix_data") def test_create_drop_index_w_other_ops(self): context = self._fixture() with self.op.batch_alter_table( "foo", copy_from=self.table ) as batch_op: batch_op.alter_column("data", type_=Integer) batch_op.create_index("ix_data", ["data"], unique=True) context.assert_( "CREATE TABLE _alembic_tmp_foo (id INTEGER NOT NULL, " "data INTEGER, x INTEGER, PRIMARY KEY (id))", "INSERT INTO _alembic_tmp_foo (id, data, x) SELECT foo.id, " "CAST(foo.data AS INTEGER) AS anon_1, foo.x FROM foo", "DROP TABLE foo", "ALTER TABLE _alembic_tmp_foo RENAME TO foo", "CREATE UNIQUE INDEX ix_data ON foo (data)", ) context.clear_assertions() Index("ix_data", self.table.c.data, unique=True) with self.op.batch_alter_table( "foo", copy_from=self.table ) as batch_op: batch_op.drop_index("ix_data") 
batch_op.alter_column("data", type_=String) context.assert_( "CREATE TABLE _alembic_tmp_foo (id INTEGER NOT NULL, " "data VARCHAR, x INTEGER, PRIMARY KEY (id))", "INSERT INTO _alembic_tmp_foo (id, data, x) SELECT foo.id, " "foo.data, foo.x FROM foo", "DROP TABLE foo", "ALTER TABLE _alembic_tmp_foo RENAME TO foo", ) class BatchRoundTripTest(TestBase): __only_on__ = "sqlite" def setUp(self): self.conn = config.db.connect() self.metadata = MetaData() t1 = Table( "foo", self.metadata, Column("id", Integer, primary_key=True), Column("data", String(50)), Column("x", Integer), mysql_engine="InnoDB", ) t1.create(self.conn) self.conn.execute( t1.insert(), [ {"id": 1, "data": "d1", "x": 5}, {"id": 2, "data": "22", "x": 6}, {"id": 3, "data": "8.5", "x": 7}, {"id": 4, "data": "9.46", "x": 8}, {"id": 5, "data": "d5", "x": 9}, ], ) context = MigrationContext.configure(self.conn) self.op = Operations(context) @contextmanager def _sqlite_referential_integrity(self): self.conn.execute("PRAGMA foreign_keys=ON") try: yield finally: self.conn.execute("PRAGMA foreign_keys=OFF") def _no_pk_fixture(self): nopk = Table( "nopk", self.metadata, Column("a", Integer), Column("b", Integer), Column("c", Integer), mysql_engine="InnoDB", ) nopk.create(self.conn) self.conn.execute( nopk.insert(), [{"a": 1, "b": 2, "c": 3}, {"a": 2, "b": 4, "c": 5}] ) return nopk def _table_w_index_fixture(self): t = Table( "t_w_ix", self.metadata, Column("id", Integer, primary_key=True), Column("thing", Integer), Column("data", String(20)), ) Index("ix_thing", t.c.thing) t.create(self.conn) return t def _boolean_fixture(self): t = Table( "hasbool", self.metadata, Column("x", Boolean(create_constraint=True, name="ck1")), Column("y", Integer), ) t.create(self.conn) def _timestamp_fixture(self): t = Table("hasts", self.metadata, Column("x", DateTime())) t.create(self.conn) return t def _datetime_server_default_fixture(self): return func.datetime("now", "localtime") def _timestamp_w_expr_default_fixture(self): t = Table( "hasts", self.metadata, Column( "x", DateTime(), server_default=self._datetime_server_default_fixture(), nullable=False, ), ) t.create(self.conn) return t def _int_to_boolean_fixture(self): t = Table("hasbool", self.metadata, Column("x", Integer)) t.create(self.conn) def test_change_type_boolean_to_int(self): self._boolean_fixture() with self.op.batch_alter_table("hasbool") as batch_op: batch_op.alter_column( "x", type_=Integer, existing_type=Boolean(create_constraint=True, name="ck1"), ) insp = Inspector.from_engine(config.db) eq_( [ c["type"]._type_affinity for c in insp.get_columns("hasbool") if c["name"] == "x" ], [Integer], ) def test_no_net_change_timestamp(self): t = self._timestamp_fixture() import datetime self.conn.execute( t.insert(), {"x": datetime.datetime(2012, 5, 18, 15, 32, 5)} ) with self.op.batch_alter_table("hasts") as batch_op: batch_op.alter_column("x", type_=DateTime()) eq_( self.conn.execute(select([t.c.x])).fetchall(), [(datetime.datetime(2012, 5, 18, 15, 32, 5),)], ) @config.requirements.sqlalchemy_12 def test_no_net_change_timestamp_w_default(self): t = self._timestamp_w_expr_default_fixture() with self.op.batch_alter_table("hasts") as batch_op: batch_op.alter_column( "x", type_=DateTime(), nullable=False, server_default=self._datetime_server_default_fixture(), ) self.conn.execute(t.insert()) row = self.conn.execute(select([t.c.x])).fetchone() assert row["x"] is not None def test_drop_col_schematype(self): self._boolean_fixture() with self.op.batch_alter_table("hasbool") as batch_op: 
batch_op.drop_column("x") insp = Inspector.from_engine(config.db) assert "x" not in (c["name"] for c in insp.get_columns("hasbool")) def test_change_type_int_to_boolean(self): self._int_to_boolean_fixture() with self.op.batch_alter_table("hasbool") as batch_op: batch_op.alter_column( "x", type_=Boolean(create_constraint=True, name="ck1") ) insp = Inspector.from_engine(config.db) if exclusions.against(config, "sqlite"): eq_( [ c["type"]._type_affinity for c in insp.get_columns("hasbool") if c["name"] == "x" ], [Boolean], ) elif exclusions.against(config, "mysql"): eq_( [ c["type"]._type_affinity for c in insp.get_columns("hasbool") if c["name"] == "x" ], [Integer], ) def tearDown(self): self.metadata.drop_all(self.conn) self.conn.close() def _assert_data(self, data, tablename="foo"): eq_( [ dict(row) for row in self.conn.execute("select * from %s" % tablename) ], data, ) def test_ix_existing(self): self._table_w_index_fixture() with self.op.batch_alter_table("t_w_ix") as batch_op: batch_op.alter_column("data", type_=String(30)) batch_op.create_index("ix_data", ["data"]) insp = Inspector.from_engine(config.db) eq_( set( (ix["name"], tuple(ix["column_names"])) for ix in insp.get_indexes("t_w_ix") ), set([("ix_data", ("data",)), ("ix_thing", ("thing",))]), ) def test_fk_points_to_me_auto(self): self._test_fk_points_to_me("auto") # in particular, this tests that the failures # on PG and MySQL result in recovery of the batch system, # e.g. that the _alembic_tmp_temp table is dropped @config.requirements.no_referential_integrity def test_fk_points_to_me_recreate(self): self._test_fk_points_to_me("always") @exclusions.only_on("sqlite") @exclusions.fails( "intentionally asserting that this " "doesn't work w/ pragma foreign keys" ) def test_fk_points_to_me_sqlite_refinteg(self): with self._sqlite_referential_integrity(): self._test_fk_points_to_me("auto") def _test_fk_points_to_me(self, recreate): bar = Table( "bar", self.metadata, Column("id", Integer, primary_key=True), Column("foo_id", Integer, ForeignKey("foo.id")), mysql_engine="InnoDB", ) bar.create(self.conn) self.conn.execute(bar.insert(), {"id": 1, "foo_id": 3}) with self.op.batch_alter_table("foo", recreate=recreate) as batch_op: batch_op.alter_column( "data", new_column_name="newdata", existing_type=String(50) ) insp = Inspector.from_engine(self.conn) eq_( [ ( key["referred_table"], key["referred_columns"], key["constrained_columns"], ) for key in insp.get_foreign_keys("bar") ], [("foo", ["id"], ["foo_id"])], ) def test_selfref_fk_auto(self): self._test_selfref_fk("auto") @config.requirements.no_referential_integrity def test_selfref_fk_recreate(self): self._test_selfref_fk("always") @exclusions.only_on("sqlite") @exclusions.fails( "intentionally asserting that this " "doesn't work w/ pragma foreign keys" ) def test_selfref_fk_sqlite_refinteg(self): with self._sqlite_referential_integrity(): self._test_selfref_fk("auto") def _test_selfref_fk(self, recreate): bar = Table( "bar", self.metadata, Column("id", Integer, primary_key=True), Column("bar_id", Integer, ForeignKey("bar.id")), Column("data", String(50)), mysql_engine="InnoDB", ) bar.create(self.conn) self.conn.execute(bar.insert(), {"id": 1, "data": "x", "bar_id": None}) self.conn.execute(bar.insert(), {"id": 2, "data": "y", "bar_id": 1}) with self.op.batch_alter_table("bar", recreate=recreate) as batch_op: batch_op.alter_column( "data", new_column_name="newdata", existing_type=String(50) ) insp = Inspector.from_engine(self.conn) insp = Inspector.from_engine(self.conn) eq_( [ ( 
key["referred_table"], key["referred_columns"], key["constrained_columns"], ) for key in insp.get_foreign_keys("bar") ], [("bar", ["id"], ["bar_id"])], ) def test_change_type(self): with self.op.batch_alter_table("foo") as batch_op: batch_op.alter_column("data", type_=Integer) self._assert_data( [ {"id": 1, "data": 0, "x": 5}, {"id": 2, "data": 22, "x": 6}, {"id": 3, "data": 8, "x": 7}, {"id": 4, "data": 9, "x": 8}, {"id": 5, "data": 0, "x": 9}, ] ) def test_drop_column(self): with self.op.batch_alter_table("foo") as batch_op: batch_op.drop_column("data") self._assert_data( [ {"id": 1, "x": 5}, {"id": 2, "x": 6}, {"id": 3, "x": 7}, {"id": 4, "x": 8}, {"id": 5, "x": 9}, ] ) def test_drop_pk_col_readd_col(self): # drop a column, add it back without primary_key=True, should no # longer be in the constraint with self.op.batch_alter_table("foo") as batch_op: batch_op.drop_column("id") batch_op.add_column(Column("id", Integer)) pk_const = Inspector.from_engine(self.conn).get_pk_constraint("foo") eq_(pk_const["constrained_columns"], []) def test_drop_pk_col_readd_pk_col(self): # drop a column, add it back with primary_key=True, should remain with self.op.batch_alter_table("foo") as batch_op: batch_op.drop_column("id") batch_op.add_column(Column("id", Integer, primary_key=True)) pk_const = Inspector.from_engine(self.conn).get_pk_constraint("foo") eq_(pk_const["constrained_columns"], ["id"]) def test_drop_pk_col_readd_col_also_pk_const(self): # drop a column, add it back without primary_key=True, but then # also make anew PK constraint that includes it, should remain with self.op.batch_alter_table("foo") as batch_op: batch_op.drop_column("id") batch_op.add_column(Column("id", Integer)) batch_op.create_primary_key("newpk", ["id"]) pk_const = Inspector.from_engine(self.conn).get_pk_constraint("foo") eq_(pk_const["constrained_columns"], ["id"]) def test_add_pk_constraint(self): self._no_pk_fixture() with self.op.batch_alter_table("nopk", recreate="always") as batch_op: batch_op.create_primary_key("newpk", ["a", "b"]) pk_const = Inspector.from_engine(self.conn).get_pk_constraint("nopk") with config.requirements.reflects_pk_names.fail_if(): eq_(pk_const["name"], "newpk") eq_(pk_const["constrained_columns"], ["a", "b"]) @config.requirements.check_constraints_w_enforcement def test_add_ck_constraint(self): with self.op.batch_alter_table("foo", recreate="always") as batch_op: batch_op.create_check_constraint("newck", text("x > 0")) # we dont support reflection of CHECK constraints # so test this by just running invalid data in foo = self.metadata.tables["foo"] assert_raises_message( exc.IntegrityError, "newck", self.conn.execute, foo.insert(), {"id": 6, "data": 5, "x": -2}, ) @config.requirements.unnamed_constraints def test_drop_foreign_key(self): bar = Table( "bar", self.metadata, Column("id", Integer, primary_key=True), Column("foo_id", Integer, ForeignKey("foo.id")), mysql_engine="InnoDB", ) bar.create(self.conn) self.conn.execute(bar.insert(), {"id": 1, "foo_id": 3}) naming_convention = { "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s" } with self.op.batch_alter_table( "bar", naming_convention=naming_convention ) as batch_op: batch_op.drop_constraint("fk_bar_foo_id_foo", type_="foreignkey") eq_(Inspector.from_engine(self.conn).get_foreign_keys("bar"), []) def test_drop_column_fk_recreate(self): with self.op.batch_alter_table("foo", recreate="always") as batch_op: batch_op.drop_column("data") self._assert_data( [ {"id": 1, "x": 5}, {"id": 2, "x": 6}, {"id": 3, "x": 7}, {"id": 4, 
"x": 8}, {"id": 5, "x": 9}, ] ) def test_rename_column(self): with self.op.batch_alter_table("foo") as batch_op: batch_op.alter_column("x", new_column_name="y") self._assert_data( [ {"id": 1, "data": "d1", "y": 5}, {"id": 2, "data": "22", "y": 6}, {"id": 3, "data": "8.5", "y": 7}, {"id": 4, "data": "9.46", "y": 8}, {"id": 5, "data": "d5", "y": 9}, ] ) def test_rename_column_boolean(self): bar = Table( "bar", self.metadata, Column("id", Integer, primary_key=True), Column("flag", Boolean()), mysql_engine="InnoDB", ) bar.create(self.conn) self.conn.execute(bar.insert(), {"id": 1, "flag": True}) self.conn.execute(bar.insert(), {"id": 2, "flag": False}) with self.op.batch_alter_table("bar") as batch_op: batch_op.alter_column( "flag", new_column_name="bflag", existing_type=Boolean ) self._assert_data( [{"id": 1, "bflag": True}, {"id": 2, "bflag": False}], "bar" ) @config.requirements.non_native_boolean def test_rename_column_non_native_boolean_no_ck(self): bar = Table( "bar", self.metadata, Column("id", Integer, primary_key=True), Column("flag", Boolean(create_constraint=False)), mysql_engine="InnoDB", ) bar.create(self.conn) self.conn.execute(bar.insert(), {"id": 1, "flag": True}) self.conn.execute(bar.insert(), {"id": 2, "flag": False}) self.conn.execute( # override Boolean type which as of 1.1 coerces numerics # to 1/0 text("insert into bar (id, flag) values (:id, :flag)"), {"id": 3, "flag": 5}, ) with self.op.batch_alter_table( "bar", reflect_args=[Column("flag", Boolean(create_constraint=False))], ) as batch_op: batch_op.alter_column( "flag", new_column_name="bflag", existing_type=Boolean ) self._assert_data( [ {"id": 1, "bflag": True}, {"id": 2, "bflag": False}, {"id": 3, "bflag": 5}, ], "bar", ) def test_drop_column_pk(self): with self.op.batch_alter_table("foo") as batch_op: batch_op.drop_column("id") self._assert_data( [ {"data": "d1", "x": 5}, {"data": "22", "x": 6}, {"data": "8.5", "x": 7}, {"data": "9.46", "x": 8}, {"data": "d5", "x": 9}, ] ) def test_rename_column_pk(self): with self.op.batch_alter_table("foo") as batch_op: batch_op.alter_column("id", new_column_name="ident") self._assert_data( [ {"ident": 1, "data": "d1", "x": 5}, {"ident": 2, "data": "22", "x": 6}, {"ident": 3, "data": "8.5", "x": 7}, {"ident": 4, "data": "9.46", "x": 8}, {"ident": 5, "data": "d5", "x": 9}, ] ) def test_add_column_auto(self): # note this uses ALTER with self.op.batch_alter_table("foo") as batch_op: batch_op.add_column( Column("data2", String(50), server_default="hi") ) self._assert_data( [ {"id": 1, "data": "d1", "x": 5, "data2": "hi"}, {"id": 2, "data": "22", "x": 6, "data2": "hi"}, {"id": 3, "data": "8.5", "x": 7, "data2": "hi"}, {"id": 4, "data": "9.46", "x": 8, "data2": "hi"}, {"id": 5, "data": "d5", "x": 9, "data2": "hi"}, ] ) def test_add_column_recreate(self): with self.op.batch_alter_table("foo", recreate="always") as batch_op: batch_op.add_column( Column("data2", String(50), server_default="hi") ) self._assert_data( [ {"id": 1, "data": "d1", "x": 5, "data2": "hi"}, {"id": 2, "data": "22", "x": 6, "data2": "hi"}, {"id": 3, "data": "8.5", "x": 7, "data2": "hi"}, {"id": 4, "data": "9.46", "x": 8, "data2": "hi"}, {"id": 5, "data": "d5", "x": 9, "data2": "hi"}, ] ) def test_create_drop_index(self): insp = Inspector.from_engine(config.db) eq_(insp.get_indexes("foo"), []) with self.op.batch_alter_table("foo", recreate="always") as batch_op: batch_op.create_index("ix_data", ["data"], unique=True) self._assert_data( [ {"id": 1, "data": "d1", "x": 5}, {"id": 2, "data": "22", "x": 6}, {"id": 3, 
"data": "8.5", "x": 7}, {"id": 4, "data": "9.46", "x": 8}, {"id": 5, "data": "d5", "x": 9}, ] ) insp = Inspector.from_engine(config.db) eq_( [ dict( unique=ix["unique"], name=ix["name"], column_names=ix["column_names"], ) for ix in insp.get_indexes("foo") ], [{"unique": True, "name": "ix_data", "column_names": ["data"]}], ) with self.op.batch_alter_table("foo", recreate="always") as batch_op: batch_op.drop_index("ix_data") insp = Inspector.from_engine(config.db) eq_(insp.get_indexes("foo"), []) class BatchRoundTripMySQLTest(BatchRoundTripTest): __only_on__ = "mysql" __backend__ = True def _datetime_server_default_fixture(self): return func.current_timestamp() @exclusions.fails() def test_drop_pk_col_readd_pk_col(self): super(BatchRoundTripMySQLTest, self).test_drop_pk_col_readd_pk_col() @exclusions.fails() def test_drop_pk_col_readd_col_also_pk_const(self): super( BatchRoundTripMySQLTest, self ).test_drop_pk_col_readd_col_also_pk_const() @exclusions.fails() def test_rename_column_pk(self): super(BatchRoundTripMySQLTest, self).test_rename_column_pk() @exclusions.fails() def test_rename_column(self): super(BatchRoundTripMySQLTest, self).test_rename_column() @exclusions.fails() def test_change_type(self): super(BatchRoundTripMySQLTest, self).test_change_type() def test_create_drop_index(self): super(BatchRoundTripMySQLTest, self).test_create_drop_index() # fails on mariadb 10.2, succeeds on 10.3 @exclusions.fails_if(config.requirements.mysql_check_col_name_change) def test_rename_column_boolean(self): super(BatchRoundTripMySQLTest, self).test_rename_column_boolean() @config.requirements.mysql_check_reflection_or_none def test_change_type_boolean_to_int(self): super(BatchRoundTripMySQLTest, self).test_change_type_boolean_to_int() @config.requirements.mysql_check_reflection_or_none def test_change_type_int_to_boolean(self): super(BatchRoundTripMySQLTest, self).test_change_type_int_to_boolean() class BatchRoundTripPostgresqlTest(BatchRoundTripTest): __only_on__ = "postgresql" __backend__ = True def _datetime_server_default_fixture(self): return func.current_timestamp() @exclusions.fails() def test_drop_pk_col_readd_pk_col(self): super( BatchRoundTripPostgresqlTest, self ).test_drop_pk_col_readd_pk_col() @exclusions.fails() def test_drop_pk_col_readd_col_also_pk_const(self): super( BatchRoundTripPostgresqlTest, self ).test_drop_pk_col_readd_col_also_pk_const() @exclusions.fails() def test_change_type(self): super(BatchRoundTripPostgresqlTest, self).test_change_type() def test_create_drop_index(self): super(BatchRoundTripPostgresqlTest, self).test_create_drop_index() @exclusions.fails() def test_change_type_int_to_boolean(self): super( BatchRoundTripPostgresqlTest, self ).test_change_type_int_to_boolean() @exclusions.fails() def test_change_type_boolean_to_int(self): super( BatchRoundTripPostgresqlTest, self ).test_change_type_boolean_to_int() zzzeek-alembic-bee044a1c187/tests/test_bulk_insert.py000066400000000000000000000236171353106760100227070ustar00rootroot00000000000000from sqlalchemy import Column from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import String from sqlalchemy import Table from sqlalchemy.sql import column from sqlalchemy.sql import table from sqlalchemy.types import TypeEngine from alembic import op from alembic.migration import MigrationContext from alembic.testing import assert_raises_message from alembic.testing import config from alembic.testing import eq_ from alembic.testing.fixtures import op_fixture from alembic.testing.fixtures import 
TestBase class BulkInsertTest(TestBase): def _table_fixture(self, dialect, as_sql): context = op_fixture(dialect, as_sql) t1 = table( "ins_table", column("id", Integer), column("v1", String()), column("v2", String()), ) return context, t1 def _big_t_table_fixture(self, dialect, as_sql): context = op_fixture(dialect, as_sql) t1 = Table( "ins_table", MetaData(), Column("id", Integer, primary_key=True), Column("v1", String()), Column("v2", String()), ) return context, t1 def _test_bulk_insert(self, dialect, as_sql): context, t1 = self._table_fixture(dialect, as_sql) op.bulk_insert( t1, [ {"id": 1, "v1": "row v1", "v2": "row v5"}, {"id": 2, "v1": "row v2", "v2": "row v6"}, {"id": 3, "v1": "row v3", "v2": "row v7"}, {"id": 4, "v1": "row v4", "v2": "row v8"}, ], ) return context def _test_bulk_insert_single(self, dialect, as_sql): context, t1 = self._table_fixture(dialect, as_sql) op.bulk_insert(t1, [{"id": 1, "v1": "row v1", "v2": "row v5"}]) return context def _test_bulk_insert_single_bigt(self, dialect, as_sql): context, t1 = self._big_t_table_fixture(dialect, as_sql) op.bulk_insert(t1, [{"id": 1, "v1": "row v1", "v2": "row v5"}]) return context def test_bulk_insert(self): context = self._test_bulk_insert("default", False) context.assert_( "INSERT INTO ins_table (id, v1, v2) VALUES (:id, :v1, :v2)" ) def test_bulk_insert_wrong_cols(self): context = op_fixture("postgresql") t1 = table( "ins_table", column("id", Integer), column("v1", String()), column("v2", String()), ) op.bulk_insert(t1, [{"v1": "row v1"}]) context.assert_( "INSERT INTO ins_table (id, v1, v2) " "VALUES (%(id)s, %(v1)s, %(v2)s)" ) def test_bulk_insert_no_rows(self): context, t1 = self._table_fixture("default", False) op.bulk_insert(t1, []) context.assert_() def test_bulk_insert_pg(self): context = self._test_bulk_insert("postgresql", False) context.assert_( "INSERT INTO ins_table (id, v1, v2) " "VALUES (%(id)s, %(v1)s, %(v2)s)" ) def test_bulk_insert_pg_single(self): context = self._test_bulk_insert_single("postgresql", False) context.assert_( "INSERT INTO ins_table (id, v1, v2) " "VALUES (%(id)s, %(v1)s, %(v2)s)" ) def test_bulk_insert_pg_single_as_sql(self): context = self._test_bulk_insert_single("postgresql", True) context.assert_( "INSERT INTO ins_table (id, v1, v2) VALUES (1, 'row v1', 'row v5')" ) def test_bulk_insert_pg_single_big_t_as_sql(self): context = self._test_bulk_insert_single_bigt("postgresql", True) context.assert_( "INSERT INTO ins_table (id, v1, v2) " "VALUES (1, 'row v1', 'row v5')" ) def test_bulk_insert_mssql(self): context = self._test_bulk_insert("mssql", False) context.assert_( "INSERT INTO ins_table (id, v1, v2) VALUES (:id, :v1, :v2)" ) def test_bulk_insert_inline_literal_as_sql(self): context = op_fixture("postgresql", True) class MyType(TypeEngine): pass t1 = table("t", column("id", Integer), column("data", MyType())) op.bulk_insert( t1, [ {"id": 1, "data": op.inline_literal("d1")}, {"id": 2, "data": op.inline_literal("d2")}, ], ) context.assert_( "INSERT INTO t (id, data) VALUES (1, 'd1')", "INSERT INTO t (id, data) VALUES (2, 'd2')", ) def test_bulk_insert_as_sql(self): context = self._test_bulk_insert("default", True) context.assert_( "INSERT INTO ins_table (id, v1, v2) " "VALUES (1, 'row v1', 'row v5')", "INSERT INTO ins_table (id, v1, v2) " "VALUES (2, 'row v2', 'row v6')", "INSERT INTO ins_table (id, v1, v2) " "VALUES (3, 'row v3', 'row v7')", "INSERT INTO ins_table (id, v1, v2) " "VALUES (4, 'row v4', 'row v8')", ) def test_bulk_insert_as_sql_pg(self): context = 
self._test_bulk_insert("postgresql", True) context.assert_( "INSERT INTO ins_table (id, v1, v2) " "VALUES (1, 'row v1', 'row v5')", "INSERT INTO ins_table (id, v1, v2) " "VALUES (2, 'row v2', 'row v6')", "INSERT INTO ins_table (id, v1, v2) " "VALUES (3, 'row v3', 'row v7')", "INSERT INTO ins_table (id, v1, v2) " "VALUES (4, 'row v4', 'row v8')", ) def test_bulk_insert_as_sql_mssql(self): context = self._test_bulk_insert("mssql", True) # SQL server requires IDENTITY_INSERT # TODO: figure out if this is safe to enable for a table that # doesn't have an IDENTITY column context.assert_( "SET IDENTITY_INSERT ins_table ON", "GO", "INSERT INTO ins_table (id, v1, v2) " "VALUES (1, 'row v1', 'row v5')", "GO", "INSERT INTO ins_table (id, v1, v2) " "VALUES (2, 'row v2', 'row v6')", "GO", "INSERT INTO ins_table (id, v1, v2) " "VALUES (3, 'row v3', 'row v7')", "GO", "INSERT INTO ins_table (id, v1, v2) " "VALUES (4, 'row v4', 'row v8')", "GO", "SET IDENTITY_INSERT ins_table OFF", "GO", ) def test_bulk_insert_from_new_table(self): context = op_fixture("postgresql", True) t1 = op.create_table( "ins_table", Column("id", Integer), Column("v1", String()), Column("v2", String()), ) op.bulk_insert( t1, [ {"id": 1, "v1": "row v1", "v2": "row v5"}, {"id": 2, "v1": "row v2", "v2": "row v6"}, ], ) context.assert_( "CREATE TABLE ins_table (id INTEGER, v1 VARCHAR, v2 VARCHAR)", "INSERT INTO ins_table (id, v1, v2) VALUES " "(1, 'row v1', 'row v5')", "INSERT INTO ins_table (id, v1, v2) VALUES " "(2, 'row v2', 'row v6')", ) def test_invalid_format(self): context, t1 = self._table_fixture("sqlite", False) assert_raises_message( TypeError, "List expected", op.bulk_insert, t1, {"id": 5} ) assert_raises_message( TypeError, "List of dictionaries expected", op.bulk_insert, t1, [(5,)], ) class RoundTripTest(TestBase): __only_on__ = "sqlite" def setUp(self): self.conn = config.db.connect() self.conn.execute( """ create table foo( id integer primary key, data varchar(50), x integer ) """ ) context = MigrationContext.configure(self.conn) self.op = op.Operations(context) self.t1 = table("foo", column("id"), column("data"), column("x")) def tearDown(self): self.conn.execute("drop table foo") self.conn.close() def test_single_insert_round_trip(self): self.op.bulk_insert(self.t1, [{"data": "d1", "x": "x1"}]) eq_( self.conn.execute("select id, data, x from foo").fetchall(), [(1, "d1", "x1")], ) def test_bulk_insert_round_trip(self): self.op.bulk_insert( self.t1, [ {"data": "d1", "x": "x1"}, {"data": "d2", "x": "x2"}, {"data": "d3", "x": "x3"}, ], ) eq_( self.conn.execute("select id, data, x from foo").fetchall(), [(1, "d1", "x1"), (2, "d2", "x2"), (3, "d3", "x3")], ) def test_bulk_insert_inline_literal(self): class MyType(TypeEngine): pass t1 = table("foo", column("id", Integer), column("data", MyType())) self.op.bulk_insert( t1, [ {"id": 1, "data": self.op.inline_literal("d1")}, {"id": 2, "data": self.op.inline_literal("d2")}, ], multiinsert=False, ) eq_( self.conn.execute("select id, data from foo").fetchall(), [(1, "d1"), (2, "d2")], ) def test_bulk_insert_from_new_table(self): t1 = self.op.create_table( "ins_table", Column("id", Integer), Column("v1", String()), Column("v2", String()), ) self.op.bulk_insert( t1, [ {"id": 1, "v1": "row v1", "v2": "row v5"}, {"id": 2, "v1": "row v2", "v2": "row v6"}, ], ) eq_( self.conn.execute( "select id, v1, v2 from ins_table order by id" ).fetchall(), [(1, u"row v1", u"row v5"), (2, u"row v2", u"row v6")], ) 
zzzeek-alembic-bee044a1c187/tests/test_command.py000066400000000000000000000717441353106760100220100ustar00rootroot00000000000000from contextlib import contextmanager import inspect from io import BytesIO from io import TextIOWrapper import re from sqlalchemy import exc as sqla_exc from alembic import command from alembic import config from alembic import testing from alembic import util from alembic.script import ScriptDirectory from alembic.testing import assert_raises from alembic.testing import assert_raises_message from alembic.testing import eq_ from alembic.testing import mock from alembic.testing.env import _no_sql_testing_config from alembic.testing.env import _sqlite_file_db from alembic.testing.env import _sqlite_testing_config from alembic.testing.env import clear_staging_env from alembic.testing.env import env_file_fixture from alembic.testing.env import staging_env from alembic.testing.env import three_rev_fixture from alembic.testing.env import write_script from alembic.testing.fixtures import capture_context_buffer from alembic.testing.fixtures import TestBase class _BufMixin(object): def _buf_fixture(self): # try to simulate how sys.stdout looks - we send it u'' # but then it's trying to encode to something. buf = BytesIO() wrapper = TextIOWrapper(buf, encoding="ascii", line_buffering=True) wrapper.getvalue = buf.getvalue return wrapper class HistoryTest(_BufMixin, TestBase): @classmethod def setup_class(cls): cls.env = staging_env() cls.cfg = _sqlite_testing_config() cls.a, cls.b, cls.c = three_rev_fixture(cls.cfg) cls._setup_env_file() @classmethod def teardown_class(cls): clear_staging_env() def teardown(self): self.cfg.set_main_option("revision_environment", "false") @classmethod def _setup_env_file(self): env_file_fixture( r""" from sqlalchemy import MetaData, engine_from_config target_metadata = MetaData() engine = engine_from_config( config.get_section(config.config_ini_section), prefix='sqlalchemy.') connection = engine.connect() context.configure( connection=connection, target_metadata=target_metadata ) try: with context.begin_transaction(): config.stdout.write(u"environment included OK\n") context.run_migrations() finally: connection.close() """ ) def _eq_cmd_output(self, buf, expected, env_token=False, currents=()): script = ScriptDirectory.from_config(self.cfg) # test default encode/decode behavior as well, # rev B has a non-ascii char in it + a coding header. 
assert_lines = [] for _id in expected: rev = script.get_revision(_id) if _id in currents: rev._db_current_indicator = True assert_lines.append(rev.log_entry) if env_token: assert_lines.insert(0, "environment included OK") eq_( buf.getvalue().decode("ascii", "replace").strip(), "\n".join(assert_lines) .encode("ascii", "replace") .decode("ascii") .strip(), ) def test_history_full(self): self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, verbose=True) self._eq_cmd_output(buf, [self.c, self.b, self.a]) def test_history_full_environment(self): self.cfg.stdout = buf = self._buf_fixture() self.cfg.set_main_option("revision_environment", "true") command.history(self.cfg, verbose=True) self._eq_cmd_output(buf, [self.c, self.b, self.a], env_token=True) def test_history_num_range(self): self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, "%s:%s" % (self.a, self.b), verbose=True) self._eq_cmd_output(buf, [self.b, self.a]) def test_history_num_range_environment(self): self.cfg.stdout = buf = self._buf_fixture() self.cfg.set_main_option("revision_environment", "true") command.history(self.cfg, "%s:%s" % (self.a, self.b), verbose=True) self._eq_cmd_output(buf, [self.b, self.a], env_token=True) def test_history_base_to_num(self): self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, ":%s" % (self.b), verbose=True) self._eq_cmd_output(buf, [self.b, self.a]) def test_history_num_to_head(self): self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, "%s:" % (self.a), verbose=True) self._eq_cmd_output(buf, [self.c, self.b, self.a]) def test_history_num_to_head_environment(self): self.cfg.stdout = buf = self._buf_fixture() self.cfg.set_main_option("revision_environment", "true") command.history(self.cfg, "%s:" % (self.a), verbose=True) self._eq_cmd_output(buf, [self.c, self.b, self.a], env_token=True) def test_history_num_plus_relative(self): self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, "%s:+2" % (self.a), verbose=True) self._eq_cmd_output(buf, [self.c, self.b, self.a]) def test_history_relative_to_num(self): self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, "-2:%s" % (self.c), verbose=True) self._eq_cmd_output(buf, [self.c, self.b, self.a]) def test_history_too_large_relative_to_num(self): self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, "-5:%s" % (self.c), verbose=True) self._eq_cmd_output(buf, [self.c, self.b, self.a]) def test_history_current_to_head_as_b(self): command.stamp(self.cfg, self.b) self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, "current:", verbose=True) self._eq_cmd_output(buf, [self.c, self.b], env_token=True) def test_history_current_to_head_as_base(self): command.stamp(self.cfg, "base") self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, "current:", verbose=True) self._eq_cmd_output(buf, [self.c, self.b, self.a], env_token=True) def test_history_include_env(self): self.cfg.stdout = buf = self._buf_fixture() self.cfg.set_main_option("revision_environment", "true") command.history(self.cfg, verbose=True) self._eq_cmd_output(buf, [self.c, self.b, self.a], env_token=True) def test_history_indicate_current(self): command.stamp(self.cfg, (self.b,)) self.cfg.stdout = buf = self._buf_fixture() command.history(self.cfg, indicate_current=True, verbose=True) self._eq_cmd_output( buf, [self.c, self.b, self.a], currents=(self.b,), env_token=True ) class CurrentTest(_BufMixin, TestBase): @classmethod def setup_class(cls): cls.env = 
env = staging_env() cls.cfg = _sqlite_testing_config() cls.a1 = env.generate_revision("a1", "a1") cls.a2 = env.generate_revision("a2", "a2") cls.a3 = env.generate_revision("a3", "a3") cls.b1 = env.generate_revision("b1", "b1", head="base") cls.b2 = env.generate_revision("b2", "b2", head="b1", depends_on="a2") cls.b3 = env.generate_revision("b3", "b3", head="b2") @classmethod def teardown_class(cls): clear_staging_env() @contextmanager def _assert_lines(self, revs): self.cfg.stdout = buf = self._buf_fixture() yield lines = set( [ re.match(r"(^.\w)", elem).group(1) for elem in re.split( "\n", buf.getvalue().decode("ascii", "replace").strip() ) if elem ] ) eq_(lines, set(revs)) def test_no_current(self): command.stamp(self.cfg, ()) with self._assert_lines([]): command.current(self.cfg) def test_plain_current(self): command.stamp(self.cfg, ()) command.stamp(self.cfg, self.a3.revision) with self._assert_lines(["a3"]): command.current(self.cfg) def test_two_heads(self): command.stamp(self.cfg, ()) command.stamp(self.cfg, (self.a1.revision, self.b1.revision)) with self._assert_lines(["a1", "b1"]): command.current(self.cfg) def test_heads_one_is_dependent(self): command.stamp(self.cfg, ()) command.stamp(self.cfg, (self.b2.revision,)) with self._assert_lines(["a2", "b2"]): command.current(self.cfg) def test_heads_upg(self): command.stamp(self.cfg, (self.b2.revision,)) command.upgrade(self.cfg, (self.b3.revision)) with self._assert_lines(["a2", "b3"]): command.current(self.cfg) class RevisionTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = _sqlite_testing_config() def tearDown(self): clear_staging_env() def _env_fixture(self, version_table_pk=True): env_file_fixture( """ from sqlalchemy import MetaData, engine_from_config target_metadata = MetaData() engine = engine_from_config( config.get_section(config.config_ini_section), prefix='sqlalchemy.') connection = engine.connect() context.configure( connection=connection, target_metadata=target_metadata, version_table_pk=%r ) try: with context.begin_transaction(): context.run_migrations() finally: connection.close() """ % (version_table_pk,) ) def test_create_rev_plain_db_not_up_to_date(self): self._env_fixture() command.revision(self.cfg) command.revision(self.cfg) # no problem def test_create_rev_autogen(self): self._env_fixture() command.revision(self.cfg, autogenerate=True) def test_create_rev_autogen_db_not_up_to_date(self): self._env_fixture() assert command.revision(self.cfg) assert_raises_message( util.CommandError, "Target database is not up to date.", command.revision, self.cfg, autogenerate=True, ) def test_create_rev_autogen_db_not_up_to_date_multi_heads(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg) rev3a = command.revision(self.cfg) command.revision(self.cfg, head=rev2.revision, splice=True) command.upgrade(self.cfg, "heads") command.revision(self.cfg, head=rev3a.revision) assert_raises_message( util.CommandError, "Target database is not up to date.", command.revision, self.cfg, autogenerate=True, ) def test_create_rev_plain_db_not_up_to_date_multi_heads(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg) rev3a = command.revision(self.cfg) command.revision(self.cfg, head=rev2.revision, splice=True) command.upgrade(self.cfg, "heads") command.revision(self.cfg, head=rev3a.revision) assert_raises_message( util.CommandError, "Multiple heads are present; please specify the head revision " "on which the new revision should be based, or perform a 
merge.", command.revision, self.cfg, ) def test_create_rev_autogen_need_to_select_head(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg) command.revision(self.cfg) command.revision(self.cfg, head=rev2.revision, splice=True) command.upgrade(self.cfg, "heads") # there's multiple heads present assert_raises_message( util.CommandError, "Multiple heads are present; please specify the head revision " "on which the new revision should be based, or perform a merge.", command.revision, self.cfg, autogenerate=True, ) def test_pk_constraint_normally_prevents_dupe_rows(self): self._env_fixture() command.revision(self.cfg) r2 = command.revision(self.cfg) db = _sqlite_file_db() command.upgrade(self.cfg, "head") assert_raises( sqla_exc.IntegrityError, db.execute, "insert into alembic_version values ('%s')" % r2.revision, ) def test_err_correctly_raised_on_dupe_rows_no_pk(self): self._env_fixture(version_table_pk=False) command.revision(self.cfg) r2 = command.revision(self.cfg) db = _sqlite_file_db() command.upgrade(self.cfg, "head") db.execute("insert into alembic_version values ('%s')" % r2.revision) assert_raises_message( util.CommandError, "Online migration expected to match one row when " "updating .* in 'alembic_version'; 2 found", command.downgrade, self.cfg, "-1", ) def test_create_rev_plain_need_to_select_head(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg) command.revision(self.cfg) command.revision(self.cfg, head=rev2.revision, splice=True) command.upgrade(self.cfg, "heads") # there's multiple heads present assert_raises_message( util.CommandError, "Multiple heads are present; please specify the head revision " "on which the new revision should be based, or perform a merge.", command.revision, self.cfg, ) def test_create_rev_plain_post_merge(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg) command.revision(self.cfg) command.revision(self.cfg, head=rev2.revision, splice=True) command.merge(self.cfg, "heads") command.revision(self.cfg) def test_create_rev_autogenerate_post_merge(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg) command.revision(self.cfg) command.revision(self.cfg, head=rev2.revision, splice=True) command.merge(self.cfg, "heads") command.upgrade(self.cfg, "heads") command.revision(self.cfg, autogenerate=True) def test_create_rev_depends_on(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg) rev3 = command.revision(self.cfg, depends_on=rev2.revision) eq_(rev3._resolved_dependencies, (rev2.revision,)) rev4 = command.revision( self.cfg, depends_on=[rev2.revision, rev3.revision] ) eq_(rev4._resolved_dependencies, (rev2.revision, rev3.revision)) def test_create_rev_depends_on_branch_label(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg, branch_label="foobar") rev3 = command.revision(self.cfg, depends_on="foobar") eq_(rev3.dependencies, "foobar") eq_(rev3._resolved_dependencies, (rev2.revision,)) def test_create_rev_depends_on_partial_revid(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg) assert len(rev2.revision) > 7 rev3 = command.revision(self.cfg, depends_on=rev2.revision[0:4]) eq_(rev3.dependencies, rev2.revision) eq_(rev3._resolved_dependencies, (rev2.revision,)) def test_create_rev_invalid_depends_on(self): self._env_fixture() command.revision(self.cfg) assert_raises_message( util.CommandError, "Can't locate 
revision identified by 'invalid'", command.revision, self.cfg, depends_on="invalid", ) def test_create_rev_autogenerate_db_not_up_to_date_post_merge(self): self._env_fixture() command.revision(self.cfg) rev2 = command.revision(self.cfg) command.revision(self.cfg) command.revision(self.cfg, head=rev2.revision, splice=True) command.upgrade(self.cfg, "heads") command.merge(self.cfg, "heads") assert_raises_message( util.CommandError, "Target database is not up to date.", command.revision, self.cfg, autogenerate=True, ) def test_nonsensical_sql_mode_autogen(self): self._env_fixture() assert_raises_message( util.CommandError, "Using --sql with --autogenerate does not make any sense", command.revision, self.cfg, autogenerate=True, sql=True, ) def test_nonsensical_sql_no_env(self): self._env_fixture() assert_raises_message( util.CommandError, "Using --sql with the revision command when revision_environment " "is not configured does not make any sense", command.revision, self.cfg, sql=True, ) def test_sensical_sql_w_env(self): self._env_fixture() self.cfg.set_main_option("revision_environment", "true") command.revision(self.cfg, sql=True) class UpgradeDowngradeStampTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = cfg = _no_sql_testing_config() cfg.set_main_option("dialect_name", "sqlite") cfg.remove_main_option("url") self.a, self.b, self.c = three_rev_fixture(cfg) def tearDown(self): clear_staging_env() def test_version_from_none_insert(self): with capture_context_buffer() as buf: command.upgrade(self.cfg, self.a, sql=True) assert "CREATE TABLE alembic_version" in buf.getvalue() assert "INSERT INTO alembic_version" in buf.getvalue() assert "CREATE STEP 1" in buf.getvalue() assert "CREATE STEP 2" not in buf.getvalue() assert "CREATE STEP 3" not in buf.getvalue() def test_version_from_middle_update(self): with capture_context_buffer() as buf: command.upgrade(self.cfg, "%s:%s" % (self.b, self.c), sql=True) assert "CREATE TABLE alembic_version" not in buf.getvalue() assert "UPDATE alembic_version" in buf.getvalue() assert "CREATE STEP 1" not in buf.getvalue() assert "CREATE STEP 2" not in buf.getvalue() assert "CREATE STEP 3" in buf.getvalue() def test_version_to_none(self): with capture_context_buffer() as buf: command.downgrade(self.cfg, "%s:base" % self.c, sql=True) assert "CREATE TABLE alembic_version" not in buf.getvalue() assert "INSERT INTO alembic_version" not in buf.getvalue() assert "DROP TABLE alembic_version" in buf.getvalue() assert "DROP STEP 3" in buf.getvalue() assert "DROP STEP 2" in buf.getvalue() assert "DROP STEP 1" in buf.getvalue() def test_version_to_middle(self): with capture_context_buffer() as buf: command.downgrade(self.cfg, "%s:%s" % (self.c, self.a), sql=True) assert "CREATE TABLE alembic_version" not in buf.getvalue() assert "INSERT INTO alembic_version" not in buf.getvalue() assert "DROP TABLE alembic_version" not in buf.getvalue() assert "DROP STEP 3" in buf.getvalue() assert "DROP STEP 2" in buf.getvalue() assert "DROP STEP 1" not in buf.getvalue() def test_none_to_head_sql(self): with capture_context_buffer() as buf: command.upgrade(self.cfg, "head", sql=True) assert "CREATE TABLE alembic_version" in buf.getvalue() assert "UPDATE alembic_version" in buf.getvalue() assert "CREATE STEP 1" in buf.getvalue() assert "CREATE STEP 2" in buf.getvalue() assert "CREATE STEP 3" in buf.getvalue() def test_base_to_head_sql(self): with capture_context_buffer() as buf: command.upgrade(self.cfg, "base:head", sql=True) assert "CREATE TABLE alembic_version" in 
buf.getvalue() assert "UPDATE alembic_version" in buf.getvalue() assert "CREATE STEP 1" in buf.getvalue() assert "CREATE STEP 2" in buf.getvalue() assert "CREATE STEP 3" in buf.getvalue() def test_sql_stamp_from_rev(self): with capture_context_buffer() as buf: command.stamp(self.cfg, "%s:head" % self.a, sql=True) assert ( "UPDATE alembic_version " "SET version_num='%s' " "WHERE alembic_version.version_num = '%s';" % (self.c, self.a) ) in buf.getvalue() def test_sql_stamp_from_partial_rev(self): with capture_context_buffer() as buf: command.stamp(self.cfg, "%s:head" % self.a[0:7], sql=True) assert ( "UPDATE alembic_version " "SET version_num='%s' " "WHERE alembic_version.version_num = '%s';" % (self.c, self.a) ) in buf.getvalue() class LiveStampTest(TestBase): __only_on__ = "sqlite" def setUp(self): self.bind = _sqlite_file_db() self.env = staging_env() self.cfg = _sqlite_testing_config() self.a = a = util.rev_id() self.b = b = util.rev_id() script = ScriptDirectory.from_config(self.cfg) script.generate_revision(a, None, refresh=True) write_script( script, a, """ revision = '%s' down_revision = None """ % a, ) script.generate_revision(b, None, refresh=True) write_script( script, b, """ revision = '%s' down_revision = '%s' """ % (b, a), ) def tearDown(self): clear_staging_env() def test_stamp_creates_table(self): command.stamp(self.cfg, "head") eq_( self.bind.scalar("select version_num from alembic_version"), self.b ) def test_stamp_existing_upgrade(self): command.stamp(self.cfg, self.a) command.stamp(self.cfg, self.b) eq_( self.bind.scalar("select version_num from alembic_version"), self.b ) def test_stamp_existing_downgrade(self): command.stamp(self.cfg, self.b) command.stamp(self.cfg, self.a) eq_( self.bind.scalar("select version_num from alembic_version"), self.a ) class EditTest(TestBase): @classmethod def setup_class(cls): cls.env = staging_env() cls.cfg = _sqlite_testing_config() cls.a, cls.b, cls.c = three_rev_fixture(cls.cfg) @classmethod def teardown_class(cls): clear_staging_env() def setUp(self): command.stamp(self.cfg, "base") def test_edit_head(self): expected_call_arg = "%s/scripts/versions/%s_revision_c.py" % ( EditTest.cfg.config_args["here"], EditTest.c, ) with mock.patch("alembic.util.edit") as edit: command.edit(self.cfg, "head") edit.assert_called_with(expected_call_arg) def test_edit_b(self): expected_call_arg = "%s/scripts/versions/%s_revision_b.py" % ( EditTest.cfg.config_args["here"], EditTest.b, ) with mock.patch("alembic.util.edit") as edit: command.edit(self.cfg, self.b[0:3]) edit.assert_called_with(expected_call_arg) @testing.emits_python_deprecation_warning("the imp module is deprecated") def test_edit_with_missing_editor(self): with mock.patch("editor.edit") as edit_mock: edit_mock.side_effect = OSError("file not found") assert_raises_message( util.CommandError, "file not found", util.edit, "/not/a/file.txt", ) def test_edit_no_revs(self): assert_raises_message( util.CommandError, "No revision files indicated by symbol 'base'", command.edit, self.cfg, "base", ) def test_edit_no_current(self): assert_raises_message( util.CommandError, "No current revisions", command.edit, self.cfg, "current", ) def test_edit_current(self): expected_call_arg = "%s/scripts/versions/%s_revision_b.py" % ( EditTest.cfg.config_args["here"], EditTest.b, ) command.stamp(self.cfg, self.b) with mock.patch("alembic.util.edit") as edit: command.edit(self.cfg, "current") edit.assert_called_with(expected_call_arg) class CommandLineTest(TestBase): @classmethod def setup_class(cls): cls.env = 
staging_env() cls.cfg = _sqlite_testing_config() cls.a, cls.b, cls.c = three_rev_fixture(cls.cfg) def test_run_cmd_args_missing(self): canary = mock.Mock() orig_revision = command.revision # the command function has "process_revision_directives" # however the ArgumentParser does not. ensure things work def revision( config, message=None, autogenerate=False, sql=False, head="head", splice=False, branch_label=None, version_path=None, rev_id=None, depends_on=None, process_revision_directives=None, ): canary(config, message=message) revision.__module__ = "alembic.command" # CommandLine() pulls the function into the ArgumentParser # and needs the full signature, so we can't patch the "revision" # command normally as ArgumentParser gives us no way to get to it. config.command.revision = revision try: commandline = config.CommandLine() options = commandline.parser.parse_args(["revision", "-m", "foo"]) commandline.run_cmd(self.cfg, options) finally: config.command.revision = orig_revision eq_(canary.mock_calls, [mock.call(self.cfg, message="foo")]) def test_help_text(self): commands = { fn.__name__ for fn in [getattr(command, n) for n in dir(command)] if inspect.isfunction(fn) and fn.__name__[0] != "_" and fn.__module__ == "alembic.command" } # make sure we found them assert commands.intersection( {"upgrade", "downgrade", "merge", "revision"} ) # catch help text coming intersection with mock.patch("alembic.config.ArgumentParser") as argparse: config.CommandLine() for kall in argparse().add_subparsers().mock_calls: for sub_kall in kall.call_list(): if sub_kall[0] == "add_parser": cmdname = sub_kall[1][0] help_text = sub_kall[2]["help"] if help_text: commands.remove(cmdname) # more than two spaces assert not re.search(r" ", help_text) # no markup stuff assert ":" not in help_text # no newlines assert "\n" not in help_text # ends with a period assert help_text.endswith(".") # not too long assert len(help_text) < 80 assert not commands, "Commands without help text: %s" % commands def test_init_file_exists_and_is_not_empty(self): with mock.patch( "alembic.command.os.listdir", return_value=["file1", "file2"] ), mock.patch("alembic.command.os.access", return_value=True): directory = "alembic" assert_raises_message( util.CommandError, "Directory %s already exists and is not empty" % directory, command.init, self.cfg, directory=directory, ) def test_init_file_exists_and_is_empty(self): def access_(path, mode): if "generic" in path or path == "foobar": return True else: return False def listdir_(path): if path == "foobar": return [] else: return ["file1", "file2", "alembic.ini.mako"] with mock.patch( "alembic.command.os.access", side_effect=access_ ), mock.patch("alembic.command.os.makedirs") as makedirs, mock.patch( "alembic.command.os.listdir", side_effect=listdir_ ), mock.patch( "alembic.command.ScriptDirectory" ): command.init(self.cfg, directory="foobar") eq_(makedirs.mock_calls, [mock.call("foobar/versions")]) def test_init_file_doesnt_exist(self): def access_(path, mode): if "generic" in path: return True else: return False with mock.patch( "alembic.command.os.access", side_effect=access_ ), mock.patch("alembic.command.os.makedirs") as makedirs, mock.patch( "alembic.command.ScriptDirectory" ): command.init(self.cfg, directory="foobar") eq_( makedirs.mock_calls, [mock.call("foobar"), mock.call("foobar/versions")], ) zzzeek-alembic-bee044a1c187/tests/test_config.py000066400000000000000000000120671353106760100216300ustar00rootroot00000000000000#!coding: utf-8 from alembic import config from alembic 
import util from alembic.migration import MigrationContext from alembic.operations import Operations from alembic.script import ScriptDirectory from alembic.testing import assert_raises_message from alembic.testing import eq_ from alembic.testing import mock from alembic.testing.env import _no_sql_testing_config from alembic.testing.env import _write_config_file from alembic.testing.env import clear_staging_env from alembic.testing.env import staging_env from alembic.testing.fixtures import capture_db from alembic.testing.fixtures import TestBase from alembic.util import compat class FileConfigTest(TestBase): def test_config_args(self): cfg = _write_config_file( """ [alembic] migrations = %(base_path)s/db/migrations """ ) test_cfg = config.Config( cfg.config_file_name, config_args=dict(base_path="/home/alembic") ) eq_( test_cfg.get_section_option("alembic", "migrations"), "/home/alembic/db/migrations", ) def tearDown(self): clear_staging_env() class ConfigTest(TestBase): def test_config_no_file_main_option(self): cfg = config.Config() cfg.set_main_option("url", "postgresql://foo/bar") eq_(cfg.get_main_option("url"), "postgresql://foo/bar") def test_config_no_file_section_option(self): cfg = config.Config() cfg.set_section_option("foo", "url", "postgresql://foo/bar") eq_(cfg.get_section_option("foo", "url"), "postgresql://foo/bar") cfg.set_section_option("foo", "echo", "True") eq_(cfg.get_section_option("foo", "echo"), "True") def test_config_set_main_option_percent(self): cfg = config.Config() cfg.set_main_option("foob", "a %% percent") eq_(cfg.get_main_option("foob"), "a % percent") def test_config_set_section_option_percent(self): cfg = config.Config() cfg.set_section_option("some_section", "foob", "a %% percent") eq_(cfg.get_section_option("some_section", "foob"), "a % percent") def test_config_set_section_option_interpolation(self): cfg = config.Config() cfg.set_section_option("some_section", "foob", "foob_value") cfg.set_section_option("some_section", "bar", "bar with %(foob)s") eq_( cfg.get_section_option("some_section", "bar"), "bar with foob_value", ) def test_standalone_op(self): eng, buf = capture_db() env = MigrationContext.configure(eng) op = Operations(env) op.alter_column("t", "c", nullable=True) eq_(buf, ["ALTER TABLE t ALTER COLUMN c DROP NOT NULL"]) def test_no_script_error(self): cfg = config.Config() assert_raises_message( util.CommandError, "No 'script_location' key found in configuration.", ScriptDirectory.from_config, cfg, ) def test_attributes_attr(self): m1 = mock.Mock() cfg = config.Config() cfg.attributes["connection"] = m1 eq_(cfg.attributes["connection"], m1) def test_attributes_construtor(self): m1 = mock.Mock() m2 = mock.Mock() cfg = config.Config(attributes={"m1": m1}) cfg.attributes["connection"] = m2 eq_(cfg.attributes, {"m1": m1, "connection": m2}) class StdoutOutputEncodingTest(TestBase): def test_plain(self): stdout = mock.Mock(encoding="latin-1") cfg = config.Config(stdout=stdout) cfg.print_stdout("test %s %s", "x", "y") eq_( stdout.mock_calls, [mock.call.write("test x y"), mock.call.write("\n")], ) def test_utf8_unicode(self): stdout = mock.Mock(encoding="latin-1") cfg = config.Config(stdout=stdout) cfg.print_stdout(compat.u("méil %s %s"), "x", "y") eq_( stdout.mock_calls, [mock.call.write(compat.u("méil x y")), mock.call.write("\n")], ) def test_ascii_unicode(self): stdout = mock.Mock(encoding=None) cfg = config.Config(stdout=stdout) cfg.print_stdout(compat.u("méil %s %s"), "x", "y") eq_( stdout.mock_calls, [mock.call.write("m?il x y"), 
mock.call.write("\n")],
        )

    def test_only_formats_output_with_args(self):
        stdout = mock.Mock(encoding=None)
        cfg = config.Config(stdout=stdout)
        cfg.print_stdout(compat.u("test 3%"))
        eq_(
            stdout.mock_calls,
            [mock.call.write("test 3%"), mock.call.write("\n")],
        )


class TemplateOutputEncodingTest(TestBase):
    def setUp(self):
        staging_env()
        self.cfg = _no_sql_testing_config()

    def tearDown(self):
        clear_staging_env()

    def test_default(self):
        script = ScriptDirectory.from_config(self.cfg)
        eq_(script.output_encoding, "utf-8")

    def test_setting(self):
        self.cfg.set_main_option("output_encoding", "latin-1")
        script = ScriptDirectory.from_config(self.cfg)
        eq_(script.output_encoding, "latin-1")
zzzeek-alembic-bee044a1c187/tests/test_environment.py000066400000000000000000000072041353106760100227240ustar00rootroot00000000000000#!coding: utf-8
from alembic import command
from alembic.environment import EnvironmentContext
from alembic.migration import MigrationContext
from alembic.script import ScriptDirectory
from alembic.testing import config
from alembic.testing import eq_
from alembic.testing import is_
from alembic.testing import mock
from alembic.testing.assertions import expect_warnings
from alembic.testing.env import _no_sql_testing_config
from alembic.testing.env import _sqlite_file_db
from alembic.testing.env import clear_staging_env
from alembic.testing.env import staging_env
from alembic.testing.env import write_script
from alembic.testing.fixtures import capture_context_buffer
from alembic.testing.fixtures import TestBase


class EnvironmentTest(TestBase):
    def setUp(self):
        staging_env()
        self.cfg = _no_sql_testing_config()

    def tearDown(self):
        clear_staging_env()

    def _fixture(self, **kw):
        script = ScriptDirectory.from_config(self.cfg)
        env = EnvironmentContext(self.cfg, script, **kw)
        return env

    def test_x_arg(self):
        env = self._fixture()
        self.cfg.cmd_opts = mock.Mock(x="y=5")
        eq_(env.get_x_argument(), "y=5")

    def test_x_arg_asdict(self):
        env = self._fixture()
        self.cfg.cmd_opts = mock.Mock(x=["y=5"])
        eq_(env.get_x_argument(as_dictionary=True), {"y": "5"})

    def test_x_arg_no_opts(self):
        env = self._fixture()
        eq_(env.get_x_argument(), [])

    def test_x_arg_no_opts_asdict(self):
        env = self._fixture()
        eq_(env.get_x_argument(as_dictionary=True), {})

    def test_tag_arg(self):
        env = self._fixture(tag="x")
        eq_(env.get_tag_argument(), "x")

    def test_migration_context_has_config(self):
        env = self._fixture()
        env.configure(url="sqlite://")
        ctx = env._migration_context
        is_(ctx.config, self.cfg)

        ctx = MigrationContext(ctx.dialect, None, {})
        is_(ctx.config, None)

    @config.requirements.sqlalchemy_issue_3740
    def test_sql_mode_parameters(self):
        env = self._fixture()

        a_rev = "arev"
        env.script.generate_revision(a_rev, "revision a", refresh=True)
        write_script(
            env.script,
            a_rev,
            """\
"Rev A"
revision = '{}'
down_revision = None

from alembic import op


def upgrade():
    op.execute('''
        do some SQL thing with a % percent sign %
    ''')

""".format(
                a_rev
            ),
        )
        with capture_context_buffer(transactional_ddl=True) as buf:
            command.upgrade(self.cfg, "arev", sql=True)

        assert "do some SQL thing with a % percent sign %" in buf.getvalue()

    def test_warning_on_passing_engine(self):
        env = self._fixture()

        engine = _sqlite_file_db()

        a_rev = "arev"
        env.script.generate_revision(a_rev, "revision a", refresh=True)
        write_script(
            env.script,
            a_rev,
            """\
"Rev A"
revision = '%s'
down_revision = None

from alembic import op


def upgrade():
    pass


def downgrade():
    pass

"""
            % a_rev,
        )
        migration_fn = mock.MagicMock()

        def upgrade(rev, context):
            migration_fn(rev, context)
            return env.script._upgrade_revs(a_rev, rev)

        with expect_warnings(
            r"'connection' argument to configure\(\) is "
            r"expected to be a sqlalchemy.engine.Connection "
        ):
            env.configure(
                connection=engine, fn=upgrade, transactional_ddl=False
            )
            env.run_migrations()

        eq_(migration_fn.mock_calls, [mock.call((), env._migration_context)])
zzzeek-alembic-bee044a1c187/tests/test_external_dialect.py000066400000000000000000000102331353106760100236630ustar00rootroot00000000000000from sqlalchemy import MetaData
from sqlalchemy import types as sqla_types
from sqlalchemy.engine import default

from alembic import autogenerate
from alembic.autogenerate import api
from alembic.autogenerate import render
from alembic.ddl import impl
from alembic.migration import MigrationContext
from alembic.testing import eq_
from alembic.testing import eq_ignore_whitespace
from alembic.testing.fixtures import TestBase


class CustomDialect(default.DefaultDialect):
    name = "custom_dialect"


try:
    from sqlalchemy.dialects import registry
except ImportError:
    pass
else:
    registry.register("custom_dialect", __name__, "CustomDialect")


class CustomDialectImpl(impl.DefaultImpl):
    __dialect__ = "custom_dialect"
    transactional_ddl = False

    def render_type(self, type_, autogen_context):
        if type_.__module__ == __name__:
            autogen_context.imports.add(
                "from %s import custom_dialect_types" % (__name__,)
            )
            is_external = True
        else:
            is_external = False

        if is_external and hasattr(
            self, "_render_%s_type" % type_.__visit_name__
        ):
            meth = getattr(self, "_render_%s_type" % type_.__visit_name__)
            return meth(type_, autogen_context)

        if is_external:
            return "%s.%r" % ("custom_dialect_types", type_)
        else:
            return None

    def _render_EXT_ARRAY_type(self, type_, autogen_context):
        return render._render_type_w_subtype(
            type_,
            autogen_context,
            "item_type",
            r"(.+?\()",
            prefix="custom_dialect_types.",
        )


class EXT_ARRAY(sqla_types.TypeEngine):
    __visit_name__ = "EXT_ARRAY"

    def __init__(self, item_type):
        if isinstance(item_type, type):
            item_type = item_type()
        self.item_type = item_type
        super(EXT_ARRAY, self).__init__()


class FOOBARTYPE(sqla_types.TypeEngine):
    __visit_name__ = "FOOBARTYPE"


class ExternalDialectRenderTest(TestBase):
    def setUp(self):
        ctx_opts = {
            "sqlalchemy_module_prefix": "sa.",
            "alembic_module_prefix": "op.",
            "target_metadata": MetaData(),
            "user_module_prefix": None,
        }
        context = MigrationContext.configure(
            dialect_name="custom_dialect", opts=ctx_opts
        )

        self.autogen_context = api.AutogenContext(context)

    def test_render_type(self):
        eq_ignore_whitespace(
            autogenerate.render._repr_type(FOOBARTYPE(), self.autogen_context),
            "custom_dialect_types.FOOBARTYPE()",
        )

        eq_(
            self.autogen_context.imports,
            set(
                [
                    "from tests.test_external_dialect "
                    "import custom_dialect_types"
                ]
            ),
        )

    def test_external_nested_render_sqla_type(self):
        eq_ignore_whitespace(
            autogenerate.render._repr_type(
                EXT_ARRAY(sqla_types.Integer), self.autogen_context
            ),
            "custom_dialect_types.EXT_ARRAY(sa.Integer())",
        )
        eq_ignore_whitespace(
            autogenerate.render._repr_type(
                EXT_ARRAY(sqla_types.DateTime(timezone=True)),
                self.autogen_context,
            ),
            "custom_dialect_types.EXT_ARRAY(sa.DateTime(timezone=True))",
        )

        eq_(
            self.autogen_context.imports,
            set(
                [
                    "from tests.test_external_dialect "
                    "import custom_dialect_types"
                ]
            ),
        )

    def test_external_nested_render_external_type(self):
        eq_ignore_whitespace(
            autogenerate.render._repr_type(
                EXT_ARRAY(FOOBARTYPE), self.autogen_context
            ),
            "custom_dialect_types.EXT_ARRAY"
            "(custom_dialect_types.FOOBARTYPE())",
        )

        eq_(
            self.autogen_context.imports,
            set(
                [
                    "from tests.test_external_dialect 
" "import custom_dialect_types" ] ), ) zzzeek-alembic-bee044a1c187/tests/test_mssql.py000066400000000000000000000262151353106760100215220ustar00rootroot00000000000000"""Test op functions against MSSQL.""" from sqlalchemy import Column from sqlalchemy import Integer from alembic import command from alembic import op from alembic import util from alembic.testing import assert_raises_message from alembic.testing import eq_ from alembic.testing.env import _no_sql_testing_config from alembic.testing.env import clear_staging_env from alembic.testing.env import staging_env from alembic.testing.env import three_rev_fixture from alembic.testing.fixtures import capture_context_buffer from alembic.testing.fixtures import op_fixture from alembic.testing.fixtures import TestBase class FullEnvironmentTests(TestBase): @classmethod def setup_class(cls): staging_env() directives = "sqlalchemy.legacy_schema_aliasing=false" cls.cfg = cfg = _no_sql_testing_config("mssql", directives) cls.a, cls.b, cls.c = three_rev_fixture(cfg) @classmethod def teardown_class(cls): clear_staging_env() def test_begin_commit(self): with capture_context_buffer(transactional_ddl=True) as buf: command.upgrade(self.cfg, self.a, sql=True) assert "BEGIN TRANSACTION;" in buf.getvalue() # ensure ends in COMMIT; GO eq_( [x for x in buf.getvalue().splitlines() if x][-2:], ["COMMIT;", "GO"], ) def test_batch_separator_default(self): with capture_context_buffer() as buf: command.upgrade(self.cfg, self.a, sql=True) assert "GO" in buf.getvalue() def test_batch_separator_custom(self): with capture_context_buffer(mssql_batch_separator="BYE") as buf: command.upgrade(self.cfg, self.a, sql=True) assert "BYE" in buf.getvalue() class OpTest(TestBase): def test_add_column(self): context = op_fixture("mssql") op.add_column("t1", Column("c1", Integer, nullable=False)) context.assert_("ALTER TABLE t1 ADD c1 INTEGER NOT NULL") def test_add_column_with_default(self): context = op_fixture("mssql") op.add_column( "t1", Column("c1", Integer, nullable=False, server_default="12") ) context.assert_("ALTER TABLE t1 ADD c1 INTEGER NOT NULL DEFAULT '12'") def test_alter_column_rename_mssql(self): context = op_fixture("mssql") op.alter_column("t", "c", new_column_name="x") context.assert_("EXEC sp_rename 't.c', x, 'COLUMN'") def test_alter_column_rename_quoted_mssql(self): context = op_fixture("mssql") op.alter_column("t", "c", new_column_name="SomeFancyName") context.assert_("EXEC sp_rename 't.c', [SomeFancyName], 'COLUMN'") def test_alter_column_new_type(self): context = op_fixture("mssql") op.alter_column("t", "c", type_=Integer) context.assert_("ALTER TABLE t ALTER COLUMN c INTEGER") def test_alter_column_dont_touch_constraints(self): context = op_fixture("mssql") from sqlalchemy import Boolean op.alter_column( "tests", "col", existing_type=Boolean(), nullable=False ) context.assert_("ALTER TABLE tests ALTER COLUMN col BIT NOT NULL") def test_drop_index(self): context = op_fixture("mssql") op.drop_index("my_idx", "my_table") context.assert_contains("DROP INDEX my_idx ON my_table") def test_drop_column_w_default(self): context = op_fixture("mssql") op.drop_column("t1", "c1", mssql_drop_default=True) op.drop_column("t1", "c2", mssql_drop_default=True) context.assert_contains( "exec('alter table t1 drop constraint ' + @const_name)" ) context.assert_contains("ALTER TABLE t1 DROP COLUMN c1") def test_drop_column_w_default_in_batch(self): context = op_fixture("mssql") with op.batch_alter_table("t1", schema=None) as batch_op: batch_op.drop_column("c1", 
mssql_drop_default=True) batch_op.drop_column("c2", mssql_drop_default=True) context.assert_contains( "exec('alter table t1 drop constraint ' + @const_name)" ) context.assert_contains("ALTER TABLE t1 DROP COLUMN c1") def test_alter_column_drop_default(self): context = op_fixture("mssql") op.alter_column("t", "c", server_default=None) context.assert_contains( "exec('alter table t drop constraint ' + @const_name)" ) def test_alter_column_dont_drop_default(self): context = op_fixture("mssql") op.alter_column("t", "c", server_default=False) context.assert_() def test_drop_column_w_schema(self): context = op_fixture("mssql") op.drop_column("t1", "c1", schema="xyz") context.assert_contains("ALTER TABLE xyz.t1 DROP COLUMN c1") def test_drop_column_w_check(self): context = op_fixture("mssql") op.drop_column("t1", "c1", mssql_drop_check=True) op.drop_column("t1", "c2", mssql_drop_check=True) context.assert_contains( "exec('alter table t1 drop constraint ' + @const_name)" ) context.assert_contains("ALTER TABLE t1 DROP COLUMN c1") def test_drop_column_w_check_in_batch(self): context = op_fixture("mssql") with op.batch_alter_table("t1", schema=None) as batch_op: batch_op.drop_column("c1", mssql_drop_check=True) batch_op.drop_column("c2", mssql_drop_check=True) context.assert_contains( "exec('alter table t1 drop constraint ' + @const_name)" ) context.assert_contains("ALTER TABLE t1 DROP COLUMN c1") def test_drop_column_w_check_quoting(self): context = op_fixture("mssql") op.drop_column("table", "column", mssql_drop_check=True) context.assert_contains( "exec('alter table [table] drop constraint ' + @const_name)" ) context.assert_contains("ALTER TABLE [table] DROP COLUMN [column]") def test_alter_column_nullable_w_existing_type(self): context = op_fixture("mssql") op.alter_column("t", "c", nullable=True, existing_type=Integer) context.assert_("ALTER TABLE t ALTER COLUMN c INTEGER NULL") def test_drop_column_w_fk(self): context = op_fixture("mssql") op.drop_column("t1", "c1", mssql_drop_foreign_key=True) context.assert_contains( "exec('alter table t1 drop constraint ' + @const_name)" ) context.assert_contains("ALTER TABLE t1 DROP COLUMN c1") def test_drop_column_w_fk_in_batch(self): context = op_fixture("mssql") with op.batch_alter_table("t1", schema=None) as batch_op: batch_op.drop_column("c1", mssql_drop_foreign_key=True) context.assert_contains( "exec('alter table t1 drop constraint ' + @const_name)" ) context.assert_contains("ALTER TABLE t1 DROP COLUMN c1") def test_alter_column_not_nullable_w_existing_type(self): context = op_fixture("mssql") op.alter_column("t", "c", nullable=False, existing_type=Integer) context.assert_("ALTER TABLE t ALTER COLUMN c INTEGER NOT NULL") def test_alter_column_nullable_w_new_type(self): context = op_fixture("mssql") op.alter_column("t", "c", nullable=True, type_=Integer) context.assert_("ALTER TABLE t ALTER COLUMN c INTEGER NULL") def test_alter_column_not_nullable_w_new_type(self): context = op_fixture("mssql") op.alter_column("t", "c", nullable=False, type_=Integer) context.assert_("ALTER TABLE t ALTER COLUMN c INTEGER NOT NULL") def test_alter_column_nullable_type_required(self): op_fixture("mssql") assert_raises_message( util.CommandError, "MS-SQL ALTER COLUMN operations with NULL or " "NOT NULL require the existing_type or a new " "type_ be passed.", op.alter_column, "t", "c", nullable=False, ) def test_alter_add_server_default(self): context = op_fixture("mssql") op.alter_column("t", "c", server_default="5") context.assert_("ALTER TABLE t ADD DEFAULT '5' FOR c") 
def test_alter_replace_server_default(self): context = op_fixture("mssql") op.alter_column( "t", "c", server_default="5", existing_server_default="6" ) context.assert_contains( "exec('alter table t drop constraint ' + @const_name)" ) context.assert_contains("ALTER TABLE t ADD DEFAULT '5' FOR c") def test_alter_remove_server_default(self): context = op_fixture("mssql") op.alter_column("t", "c", server_default=None) context.assert_contains( "exec('alter table t drop constraint ' + @const_name)" ) def test_alter_do_everything(self): context = op_fixture("mssql") op.alter_column( "t", "c", new_column_name="c2", nullable=True, type_=Integer, server_default="5", ) context.assert_( "ALTER TABLE t ALTER COLUMN c INTEGER NULL", "ALTER TABLE t ADD DEFAULT '5' FOR c", "EXEC sp_rename 't.c', c2, 'COLUMN'", ) def test_rename_table(self): context = op_fixture("mssql") op.rename_table("t1", "t2") context.assert_contains("EXEC sp_rename 't1', t2") def test_rename_table_schema(self): context = op_fixture("mssql") op.rename_table("t1", "t2", schema="foobar") context.assert_contains("EXEC sp_rename 'foobar.t1', t2") def test_rename_table_casesens(self): context = op_fixture("mssql") op.rename_table("TeeOne", "TeeTwo") # yup, ran this in SQL Server 2014, the two levels of quoting # seems to be understood. Can't do the two levels on the # target name though ! context.assert_contains("EXEC sp_rename '[TeeOne]', [TeeTwo]") def test_rename_table_schema_casesens(self): context = op_fixture("mssql") op.rename_table("TeeOne", "TeeTwo", schema="FooBar") # yup, ran this in SQL Server 2014, the two levels of quoting # seems to be understood. Can't do the two levels on the # target name though ! context.assert_contains("EXEC sp_rename '[FooBar].[TeeOne]', [TeeTwo]") def test_alter_column_rename_mssql_schema(self): context = op_fixture("mssql") op.alter_column("t", "c", name="x", schema="y") context.assert_("EXEC sp_rename 'y.t.c', x, 'COLUMN'") def test_create_index_mssql_include(self): context = op_fixture("mssql") op.create_index( op.f("ix_mytable_a_b"), "mytable", ["col_a", "col_b"], unique=False, mssql_include=["col_c"], ) context.assert_contains( "CREATE INDEX ix_mytable_a_b ON mytable " "(col_a, col_b) INCLUDE (col_c)" ) def test_create_index_mssql_include_is_none(self): context = op_fixture("mssql") op.create_index( op.f("ix_mytable_a_b"), "mytable", ["col_a", "col_b"], unique=False ) context.assert_contains( "CREATE INDEX ix_mytable_a_b ON mytable " "(col_a, col_b)" ) zzzeek-alembic-bee044a1c187/tests/test_mysql.py000066400000000000000000000457141353106760100215350ustar00rootroot00000000000000from sqlalchemy import Boolean from sqlalchemy import Column from sqlalchemy import DATETIME from sqlalchemy import func from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import Table from sqlalchemy import text from sqlalchemy import TIMESTAMP from sqlalchemy.engine.reflection import Inspector from alembic import op from alembic import util from alembic.migration import MigrationContext from alembic.testing import assert_raises_message from alembic.testing import config from alembic.testing.env import clear_staging_env from alembic.testing.env import staging_env from alembic.testing.fixtures import AlterColRoundTripFixture from alembic.testing.fixtures import op_fixture from alembic.testing.fixtures import TestBase class MySQLOpTest(TestBase): @config.requirements.comments_api def test_create_table_with_comment(self): context = op_fixture("mysql") op.create_table( "t2", Column("c1", Integer, 
primary_key=True), comment="This is a table comment", ) context.assert_contains("COMMENT='This is a table comment'") @config.requirements.comments_api def test_create_table_with_column_comments(self): context = op_fixture("mysql") op.create_table( "t2", Column("c1", Integer, primary_key=True, comment="c1 comment"), Column("c2", Integer, comment="c2 comment"), comment="This is a table comment", ) context.assert_( "CREATE TABLE t2 " "(c1 INTEGER NOT NULL COMMENT 'c1 comment' AUTO_INCREMENT, " # TODO: why is there no space at the end here? is that on the # SQLA side? "c2 INTEGER COMMENT 'c2 comment', PRIMARY KEY (c1))" "COMMENT='This is a table comment'" ) @config.requirements.comments_api def test_add_column_with_comment(self): context = op_fixture("mysql") op.add_column("t", Column("q", Integer, comment="This is a comment")) context.assert_( "ALTER TABLE t ADD COLUMN q INTEGER COMMENT 'This is a comment'" ) def test_rename_column(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", new_column_name="c2", existing_type=Integer ) context.assert_("ALTER TABLE t1 CHANGE c1 c2 INTEGER NULL") def test_rename_column_quotes_needed_one(self): context = op_fixture("mysql") op.alter_column( "MyTable", "ColumnOne", new_column_name="ColumnTwo", existing_type=Integer, ) context.assert_( "ALTER TABLE `MyTable` CHANGE `ColumnOne` `ColumnTwo` INTEGER NULL" ) def test_rename_column_quotes_needed_two(self): context = op_fixture("mysql") op.alter_column( "my table", "column one", new_column_name="column two", existing_type=Integer, ) context.assert_( "ALTER TABLE `my table` CHANGE `column one` " "`column two` INTEGER NULL" ) def test_rename_column_serv_default(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", new_column_name="c2", existing_type=Integer, existing_server_default="q", ) context.assert_("ALTER TABLE t1 CHANGE c1 c2 INTEGER NULL DEFAULT 'q'") def test_rename_column_serv_compiled_default(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", existing_type=Integer, server_default=func.utc_thing(func.current_timestamp()), ) # this is not a valid MySQL default but the point is to just # test SQL expression rendering context.assert_( "ALTER TABLE t1 ALTER COLUMN c1 " "SET DEFAULT utc_thing(CURRENT_TIMESTAMP)" ) def test_rename_column_autoincrement(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", new_column_name="c2", existing_type=Integer, existing_autoincrement=True, ) context.assert_( "ALTER TABLE t1 CHANGE c1 c2 INTEGER NULL AUTO_INCREMENT" ) def test_col_add_autoincrement(self): context = op_fixture("mysql") op.alter_column("t1", "c1", existing_type=Integer, autoincrement=True) context.assert_("ALTER TABLE t1 MODIFY c1 INTEGER NULL AUTO_INCREMENT") def test_col_remove_autoincrement(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", existing_type=Integer, existing_autoincrement=True, autoincrement=False, ) context.assert_("ALTER TABLE t1 MODIFY c1 INTEGER NULL") def test_col_dont_remove_server_default(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", existing_type=Integer, existing_server_default="1", server_default=False, ) context.assert_() def test_alter_column_drop_default(self): context = op_fixture("mysql") op.alter_column("t", "c", existing_type=Integer, server_default=None) context.assert_("ALTER TABLE t ALTER COLUMN c DROP DEFAULT") def test_alter_column_remove_schematype(self): context = op_fixture("mysql") op.alter_column( "t", "c", type_=Integer, existing_type=Boolean(create_constraint=True, 
name="ck1"), server_default=None, ) context.assert_("ALTER TABLE t MODIFY c INTEGER NULL") def test_alter_column_modify_default(self): context = op_fixture("mysql") # notice we dont need the existing type on this one... op.alter_column("t", "c", server_default="1") context.assert_("ALTER TABLE t ALTER COLUMN c SET DEFAULT '1'") def test_alter_column_modify_datetime_default(self): # use CHANGE format when the datatype is DATETIME or TIMESTAMP, # as this is needed for a functional default which is what you'd # get with a DATETIME/TIMESTAMP. Will also work in the very unlikely # case the default is a fixed timestamp value. context = op_fixture("mysql") op.alter_column( "t", "c", existing_type=DATETIME(), server_default=text("CURRENT_TIMESTAMP"), ) context.assert_( "ALTER TABLE t CHANGE c c DATETIME NULL DEFAULT CURRENT_TIMESTAMP" ) def test_col_not_nullable(self): context = op_fixture("mysql") op.alter_column("t1", "c1", nullable=False, existing_type=Integer) context.assert_("ALTER TABLE t1 MODIFY c1 INTEGER NOT NULL") def test_col_not_nullable_existing_serv_default(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", nullable=False, existing_type=Integer, existing_server_default="5", ) context.assert_( "ALTER TABLE t1 MODIFY c1 INTEGER NOT NULL DEFAULT '5'" ) def test_col_nullable(self): context = op_fixture("mysql") op.alter_column("t1", "c1", nullable=True, existing_type=Integer) context.assert_("ALTER TABLE t1 MODIFY c1 INTEGER NULL") def test_col_multi_alter(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", nullable=False, server_default="q", type_=Integer ) context.assert_( "ALTER TABLE t1 MODIFY c1 INTEGER NOT NULL DEFAULT 'q'" ) def test_alter_column_multi_alter_w_drop_default(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", nullable=False, server_default=None, type_=Integer ) context.assert_("ALTER TABLE t1 MODIFY c1 INTEGER NOT NULL") def test_col_alter_type_required(self): op_fixture("mysql") assert_raises_message( util.CommandError, "MySQL CHANGE/MODIFY COLUMN operations require the existing type.", op.alter_column, "t1", "c1", nullable=False, server_default="q", ) @config.requirements.comments_api def test_alter_column_add_comment(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", comment="This is a column comment", existing_type=Boolean(), schema="foo", ) context.assert_( "ALTER TABLE foo.t1 MODIFY c1 BOOL NULL " "COMMENT 'This is a column comment'" ) @config.requirements.comments_api def test_alter_column_add_comment_quoting(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", comment="This is a 'column' comment", existing_type=Boolean(), schema="foo", ) context.assert_( "ALTER TABLE foo.t1 MODIFY c1 BOOL NULL " "COMMENT 'This is a ''column'' comment'" ) @config.requirements.comments_api def test_alter_column_drop_comment(self): context = op_fixture("mysql") op.alter_column( "t", "c", existing_type=Boolean(), schema="foo", comment=None, existing_comment="This is a column comment", ) context.assert_("ALTER TABLE foo.t MODIFY c BOOL NULL") @config.requirements.comments_api def test_alter_column_existing_comment(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", nullable=False, existing_comment="existing column comment", existing_type=Integer, ) context.assert_( "ALTER TABLE t1 MODIFY c1 INTEGER NOT NULL " "COMMENT 'existing column comment'" ) @config.requirements.comments_api def test_rename_column_existing_comment(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", 
new_column_name="newc1", existing_nullable=False, existing_comment="existing column comment", existing_type=Integer, ) context.assert_( "ALTER TABLE t1 CHANGE c1 newc1 INTEGER NOT NULL " "COMMENT 'existing column comment'" ) @config.requirements.comments_api def test_alter_column_new_comment_replaces_existing(self): context = op_fixture("mysql") op.alter_column( "t1", "c1", nullable=False, comment="This is a column comment", existing_comment="existing column comment", existing_type=Integer, ) context.assert_( "ALTER TABLE t1 MODIFY c1 INTEGER NOT NULL " "COMMENT 'This is a column comment'" ) @config.requirements.comments_api def test_create_table_comment(self): # this is handled by SQLAlchemy's compilers context = op_fixture("mysql") op.create_table_comment("t2", comment="t2 table", schema="foo") context.assert_("ALTER TABLE foo.t2 COMMENT 't2 table'") @config.requirements.comments_api @config.requirements.sqlalchemy_issue_4436 def test_drop_table_comment(self): # this is handled by SQLAlchemy's compilers context = op_fixture("mysql") op.drop_table_comment("t2", existing_comment="t2 table", schema="foo") context.assert_("ALTER TABLE foo.t2 COMMENT ''") def test_drop_fk(self): context = op_fixture("mysql") op.drop_constraint("f1", "t1", "foreignkey") context.assert_("ALTER TABLE t1 DROP FOREIGN KEY f1") def test_drop_fk_quoted(self): context = op_fixture("mysql") op.drop_constraint("MyFk", "MyTable", "foreignkey") context.assert_("ALTER TABLE `MyTable` DROP FOREIGN KEY `MyFk`") def test_drop_constraint_primary(self): context = op_fixture("mysql") op.drop_constraint("primary", "t1", type_="primary") context.assert_("ALTER TABLE t1 DROP PRIMARY KEY") def test_drop_unique(self): context = op_fixture("mysql") op.drop_constraint("f1", "t1", "unique") context.assert_("ALTER TABLE t1 DROP INDEX f1") def test_drop_unique_quoted(self): context = op_fixture("mysql") op.drop_constraint("MyUnique", "MyTable", "unique") context.assert_("ALTER TABLE `MyTable` DROP INDEX `MyUnique`") def test_drop_check_mariadb(self): context = op_fixture("mariadb") op.drop_constraint("f1", "t1", "check") context.assert_("ALTER TABLE t1 DROP CONSTRAINT f1") def test_drop_check_quoted_mariadb(self): context = op_fixture("mariadb") op.drop_constraint("MyCheck", "MyTable", "check") context.assert_("ALTER TABLE `MyTable` DROP CONSTRAINT `MyCheck`") def test_drop_check_mysql(self): context = op_fixture("mysql") op.drop_constraint("f1", "t1", "check") context.assert_("ALTER TABLE t1 DROP CHECK f1") def test_drop_check_quoted_mysql(self): context = op_fixture("mysql") op.drop_constraint("MyCheck", "MyTable", "check") context.assert_("ALTER TABLE `MyTable` DROP CHECK `MyCheck`") def test_drop_unknown(self): op_fixture("mysql") assert_raises_message( TypeError, "'type' can be one of 'check', 'foreignkey', " "'primary', 'unique', None", op.drop_constraint, "f1", "t1", "typo", ) def test_drop_generic_constraint(self): op_fixture("mysql") assert_raises_message( NotImplementedError, "No generic 'DROP CONSTRAINT' in MySQL - please " "specify constraint type", op.drop_constraint, "f1", "t1", ) class MySQLBackendOpTest(AlterColRoundTripFixture, TestBase): __only_on__ = "mysql" __backend__ = True def test_add_timestamp_server_default_current_timestamp(self): self._run_alter_col( {"type": TIMESTAMP()}, {"server_default": text("CURRENT_TIMESTAMP")}, ) def test_add_datetime_server_default_current_timestamp(self): self._run_alter_col( {"type": DATETIME()}, {"server_default": text("CURRENT_TIMESTAMP")} ) def 
test_add_timestamp_server_default_now(self): self._run_alter_col( {"type": TIMESTAMP()}, {"server_default": text("NOW()")}, compare={"server_default": text("CURRENT_TIMESTAMP")}, ) def test_add_datetime_server_default_now(self): self._run_alter_col( {"type": DATETIME()}, {"server_default": text("NOW()")}, compare={"server_default": text("CURRENT_TIMESTAMP")}, ) def test_add_timestamp_server_default_current_timestamp_bundle_onupdate( self ): # note SQLAlchemy reflection bundles the ON UPDATE part into the # server default reflection see # https://github.com/sqlalchemy/sqlalchemy/issues/4652 self._run_alter_col( {"type": TIMESTAMP()}, { "server_default": text( "CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP" ) }, ) def test_add_datetime_server_default_current_timestamp_bundle_onupdate( self ): # note SQLAlchemy reflection bundles the ON UPDATE part into the # server default reflection see # https://github.com/sqlalchemy/sqlalchemy/issues/4652 self._run_alter_col( {"type": DATETIME()}, { "server_default": text( "CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP" ) }, ) class MySQLDefaultCompareTest(TestBase): __only_on__ = "mysql" __backend__ = True __requires__ = ("mysql_timestamp_reflection",) @classmethod def setup_class(cls): cls.bind = config.db staging_env() context = MigrationContext.configure( connection=cls.bind.connect(), opts={"compare_type": True, "compare_server_default": True}, ) connection = context.bind cls.autogen_context = { "imports": set(), "connection": connection, "dialect": connection.dialect, "context": context, } @classmethod def teardown_class(cls): clear_staging_env() def setUp(self): self.metadata = MetaData(self.bind) def tearDown(self): self.metadata.drop_all() def _compare_default_roundtrip(self, type_, txt, alternate=None): if alternate: expected = True else: alternate = txt expected = False t = Table( "test", self.metadata, Column( "somecol", type_, server_default=text(txt) if txt else None ), ) t2 = Table( "test", MetaData(), Column("somecol", type_, server_default=text(alternate)), ) assert ( self._compare_default(t, t2, t2.c.somecol, alternate) is expected ) def _compare_default(self, t1, t2, col, rendered): t1.create(self.bind) insp = Inspector.from_engine(self.bind) cols = insp.get_columns(t1.name) refl = Table(t1.name, MetaData()) insp.reflecttable(refl, None) ctx = self.autogen_context["context"] return ctx.impl.compare_server_default( refl.c[cols[0]["name"]], col, rendered, cols[0]["default"] ) def test_compare_timestamp_current_timestamp(self): self._compare_default_roundtrip(TIMESTAMP(), "CURRENT_TIMESTAMP") def test_compare_timestamp_current_timestamp_diff(self): self._compare_default_roundtrip(TIMESTAMP(), None, "CURRENT_TIMESTAMP") def test_compare_timestamp_current_timestamp_bundle_onupdate(self): self._compare_default_roundtrip( TIMESTAMP(), "CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP" ) def test_compare_timestamp_current_timestamp_diff_bundle_onupdate(self): self._compare_default_roundtrip( TIMESTAMP(), None, "CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP" ) def test_compare_integer_from_none(self): self._compare_default_roundtrip(Integer(), None, "0") def test_compare_integer_same(self): self._compare_default_roundtrip(Integer(), "5") def test_compare_integer_diff(self): self._compare_default_roundtrip(Integer(), "5", "7") def test_compare_boolean_same(self): self._compare_default_roundtrip(Boolean(), "1") def test_compare_boolean_diff(self): self._compare_default_roundtrip(Boolean(), "1", "0") 
zzzeek-alembic-bee044a1c187/tests/test_offline_environment.py000066400000000000000000000230661353106760100244320ustar00rootroot00000000000000import re from alembic import command from alembic import util from alembic.testing import assert_raises_message from alembic.testing.env import _no_sql_testing_config from alembic.testing.env import clear_staging_env from alembic.testing.env import env_file_fixture from alembic.testing.env import multi_heads_fixture from alembic.testing.env import staging_env from alembic.testing.env import three_rev_fixture from alembic.testing.fixtures import capture_context_buffer from alembic.testing.fixtures import TestBase a = b = c = None class OfflineEnvironmentTest(TestBase): def setUp(self): staging_env() self.cfg = _no_sql_testing_config() global a, b, c a, b, c = three_rev_fixture(self.cfg) def tearDown(self): clear_staging_env() def test_not_requires_connection(self): env_file_fixture( """ assert not context.requires_connection() """ ) command.upgrade(self.cfg, a, sql=True) command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True) def test_requires_connection(self): env_file_fixture( """ assert context.requires_connection() """ ) command.upgrade(self.cfg, a) command.downgrade(self.cfg, a) def test_starting_rev_post_context(self): env_file_fixture( """ context.configure(dialect_name='sqlite', starting_rev='x') assert context.get_starting_revision_argument() == 'x' """ ) command.upgrade(self.cfg, a, sql=True) command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True) command.current(self.cfg) command.stamp(self.cfg, a) def test_starting_rev_pre_context(self): env_file_fixture( """ assert context.get_starting_revision_argument() == 'x' """ ) command.upgrade(self.cfg, "x:y", sql=True) command.downgrade(self.cfg, "x:y", sql=True) def test_starting_rev_pre_context_cmd_w_no_startrev(self): env_file_fixture( """ assert context.get_starting_revision_argument() == 'x' """ ) assert_raises_message( util.CommandError, "No starting revision argument is available.", command.current, self.cfg, ) def test_starting_rev_current_pre_context(self): env_file_fixture( """ assert context.get_starting_revision_argument() is None """ ) assert_raises_message( util.CommandError, "No starting revision argument is available.", command.current, self.cfg, ) def test_destination_rev_pre_context(self): env_file_fixture( """ assert context.get_revision_argument() == '%s' """ % b ) command.upgrade(self.cfg, b, sql=True) command.stamp(self.cfg, b, sql=True) command.downgrade(self.cfg, "%s:%s" % (c, b), sql=True) def test_destination_rev_pre_context_multihead(self): d, e, f = multi_heads_fixture(self.cfg, a, b, c) env_file_fixture( """ assert set(context.get_revision_argument()) == set(('%s', '%s', '%s', )) """ % (f, e, c) ) command.upgrade(self.cfg, "heads", sql=True) def test_destination_rev_post_context(self): env_file_fixture( """ context.configure(dialect_name='sqlite') assert context.get_revision_argument() == '%s' """ % b ) command.upgrade(self.cfg, b, sql=True) command.downgrade(self.cfg, "%s:%s" % (c, b), sql=True) command.stamp(self.cfg, b, sql=True) def test_destination_rev_post_context_multihead(self): d, e, f = multi_heads_fixture(self.cfg, a, b, c) env_file_fixture( """ context.configure(dialect_name='sqlite') assert set(context.get_revision_argument()) == set(('%s', '%s', '%s', )) """ % (f, e, c) ) command.upgrade(self.cfg, "heads", sql=True) def test_head_rev_pre_context(self): env_file_fixture( """ assert context.get_head_revision() == '%s' assert context.get_head_revisions() == 
('%s', )
"""
            % (c, c)
        )
        command.upgrade(self.cfg, b, sql=True)
        command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True)
        command.stamp(self.cfg, b, sql=True)
        command.current(self.cfg)

    def test_head_rev_pre_context_multihead(self):
        d, e, f = multi_heads_fixture(self.cfg, a, b, c)
        env_file_fixture(
            """
assert set(context.get_head_revisions()) == set(('%s', '%s', '%s', ))
"""
            % (e, f, c)
        )
        command.upgrade(self.cfg, e, sql=True)
        command.downgrade(self.cfg, "%s:%s" % (e, b), sql=True)
        command.stamp(self.cfg, c, sql=True)
        command.current(self.cfg)

    def test_head_rev_post_context(self):
        env_file_fixture(
            """
context.configure(dialect_name='sqlite')
assert context.get_head_revision() == '%s'
assert context.get_head_revisions() == ('%s', )
"""
            % (c, c)
        )
        command.upgrade(self.cfg, b, sql=True)
        command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True)
        command.stamp(self.cfg, b, sql=True)
        command.current(self.cfg)

    def test_head_rev_post_context_multihead(self):
        d, e, f = multi_heads_fixture(self.cfg, a, b, c)
        env_file_fixture(
            """
context.configure(dialect_name='sqlite')
assert set(context.get_head_revisions()) == set(('%s', '%s', '%s', ))
"""
            % (e, f, c)
        )
        command.upgrade(self.cfg, e, sql=True)
        command.downgrade(self.cfg, "%s:%s" % (e, b), sql=True)
        command.stamp(self.cfg, c, sql=True)
        command.current(self.cfg)

    def test_tag_pre_context(self):
        env_file_fixture(
            """
assert context.get_tag_argument() == 'hi'
"""
        )
        command.upgrade(self.cfg, b, sql=True, tag="hi")
        command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True, tag="hi")

    def test_tag_pre_context_None(self):
        env_file_fixture(
            """
assert context.get_tag_argument() is None
"""
        )
        command.upgrade(self.cfg, b, sql=True)
        command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True)

    def test_tag_cmd_arg(self):
        env_file_fixture(
            """
context.configure(dialect_name='sqlite')
assert context.get_tag_argument() == 'hi'
"""
        )
        command.upgrade(self.cfg, b, sql=True, tag="hi")
        command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True, tag="hi")

    def test_tag_cfg_arg(self):
        env_file_fixture(
            """
context.configure(dialect_name='sqlite', tag='there')
assert context.get_tag_argument() == 'there'
"""
        )
        command.upgrade(self.cfg, b, sql=True, tag="hi")
        command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True, tag="hi")

    def test_tag_None(self):
        env_file_fixture(
            """
context.configure(dialect_name='sqlite')
assert context.get_tag_argument() is None
"""
        )
        command.upgrade(self.cfg, b, sql=True)
        command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True)

    def test_downgrade_wo_colon(self):
        env_file_fixture(
            """
context.configure(dialect_name='sqlite')
"""
        )
        assert_raises_message(
            util.CommandError,
            "downgrade with --sql requires <fromrev>:<torev>",
            command.downgrade,
            self.cfg,
            b,
            sql=True,
        )

    def test_upgrade_with_output_encoding(self):
        env_file_fixture(
            """
url = config.get_main_option('sqlalchemy.url')
context.configure(url=url, output_encoding='utf-8')
assert not context.requires_connection()
"""
        )
        command.upgrade(self.cfg, a, sql=True)
        command.downgrade(self.cfg, "%s:%s" % (b, a), sql=True)

    def test_running_comments_not_in_sql(self):

        message = "this is a very long \nand multiline\nmessage"

        d = command.revision(self.cfg, message=message)

        with capture_context_buffer(transactional_ddl=True) as buf:
            command.upgrade(self.cfg, "%s:%s" % (a, d.revision), sql=True)

        assert not re.match(
            r".*-- .*and multiline", buf.getvalue(), re.S | re.M
        )

    def test_starting_rev_pre_context_abbreviated(self):
        env_file_fixture(
            """
assert context.get_starting_revision_argument() == '%s'
"""
            % b[0:4]
        )
        command.upgrade(self.cfg, "%s:%s" %
(b[0:4], c), sql=True) command.stamp(self.cfg, "%s:%s" % (b[0:4], c), sql=True) command.downgrade(self.cfg, "%s:%s" % (b[0:4], a), sql=True) def test_destination_rev_pre_context_abbreviated(self): env_file_fixture( """ assert context.get_revision_argument() == '%s' """ % b[0:4] ) command.upgrade(self.cfg, "%s:%s" % (a, b[0:4]), sql=True) command.stamp(self.cfg, b[0:4], sql=True) command.downgrade(self.cfg, "%s:%s" % (c, b[0:4]), sql=True) def test_starting_rev_context_runs_abbreviated(self): env_file_fixture( """ context.configure(dialect_name='sqlite') context.run_migrations() """ ) command.upgrade(self.cfg, "%s:%s" % (b[0:4], c), sql=True) command.downgrade(self.cfg, "%s:%s" % (b[0:4], a), sql=True) def test_destination_rev_context_runs_abbreviated(self): env_file_fixture( """ context.configure(dialect_name='sqlite') context.run_migrations() """ ) command.upgrade(self.cfg, "%s:%s" % (a, b[0:4]), sql=True) command.stamp(self.cfg, b[0:4], sql=True) command.downgrade(self.cfg, "%s:%s" % (c, b[0:4]), sql=True) zzzeek-alembic-bee044a1c187/tests/test_op.py000066400000000000000000001060321353106760100207750ustar00rootroot00000000000000"""Test against the builders in the op.* module.""" from sqlalchemy import Boolean from sqlalchemy import Column from sqlalchemy import event from sqlalchemy import ForeignKey from sqlalchemy import Integer from sqlalchemy import String from sqlalchemy import Table from sqlalchemy.sql import column from sqlalchemy.sql import func from sqlalchemy.sql import text from sqlalchemy.sql.schema import quoted_name from alembic import op from alembic.operations import ops from alembic.operations import schemaobj from alembic.testing import assert_raises_message from alembic.testing import config from alembic.testing import eq_ from alembic.testing import is_ from alembic.testing import mock from alembic.testing.fixtures import AlterColRoundTripFixture from alembic.testing.fixtures import op_fixture from alembic.testing.fixtures import TestBase @event.listens_for(Table, "after_parent_attach") def _add_cols(table, metadata): if table.name == "tbl_with_auto_appended_column": table.append_column(Column("bat", Integer)) class OpTest(TestBase): def test_rename_table(self): context = op_fixture() op.rename_table("t1", "t2") context.assert_("ALTER TABLE t1 RENAME TO t2") def test_rename_table_schema(self): context = op_fixture() op.rename_table("t1", "t2", schema="foo") context.assert_("ALTER TABLE foo.t1 RENAME TO foo.t2") def test_create_index_no_expr_allowed(self): op_fixture() assert_raises_message( ValueError, r"String or text\(\) construct expected", op.create_index, "name", "tname", [func.foo(column("x"))], ) def test_add_column_schema_hard_quoting(self): context = op_fixture("postgresql") op.add_column( "somename", Column("colname", String), schema=quoted_name("some.schema", quote=True), ) context.assert_( 'ALTER TABLE "some.schema".somename ADD COLUMN colname VARCHAR' ) def test_rename_table_schema_hard_quoting(self): context = op_fixture("postgresql") op.rename_table( "t1", "t2", schema=quoted_name("some.schema", quote=True) ) context.assert_('ALTER TABLE "some.schema".t1 RENAME TO t2') def test_add_constraint_schema_hard_quoting(self): context = op_fixture("postgresql") op.create_check_constraint( "ck_user_name_len", "user_table", func.len(column("name")) > 5, schema=quoted_name("some.schema", quote=True), ) context.assert_( 'ALTER TABLE "some.schema".user_table ADD ' "CONSTRAINT ck_user_name_len CHECK (len(name) > 5)" ) def test_create_index_quoting(self): context = 
op_fixture("postgresql") op.create_index("geocoded", "locations", ["IShouldBeQuoted"]) context.assert_( 'CREATE INDEX geocoded ON locations ("IShouldBeQuoted")' ) def test_create_index_expressions(self): context = op_fixture() op.create_index("geocoded", "locations", [text("lower(coordinates)")]) context.assert_( "CREATE INDEX geocoded ON locations (lower(coordinates))" ) def test_add_column(self): context = op_fixture() op.add_column("t1", Column("c1", Integer, nullable=False)) context.assert_("ALTER TABLE t1 ADD COLUMN c1 INTEGER NOT NULL") def test_add_column_schema(self): context = op_fixture() op.add_column( "t1", Column("c1", Integer, nullable=False), schema="foo" ) context.assert_("ALTER TABLE foo.t1 ADD COLUMN c1 INTEGER NOT NULL") def test_add_column_with_default(self): context = op_fixture() op.add_column( "t1", Column("c1", Integer, nullable=False, server_default="12") ) context.assert_( "ALTER TABLE t1 ADD COLUMN c1 INTEGER DEFAULT '12' NOT NULL" ) def test_add_column_with_index(self): context = op_fixture() op.add_column("t1", Column("c1", Integer, nullable=False, index=True)) context.assert_( "ALTER TABLE t1 ADD COLUMN c1 INTEGER NOT NULL", "CREATE INDEX ix_t1_c1 ON t1 (c1)", ) def test_add_column_schema_with_default(self): context = op_fixture() op.add_column( "t1", Column("c1", Integer, nullable=False, server_default="12"), schema="foo", ) context.assert_( "ALTER TABLE foo.t1 ADD COLUMN c1 INTEGER DEFAULT '12' NOT NULL" ) def test_add_column_fk(self): context = op_fixture() op.add_column( "t1", Column("c1", Integer, ForeignKey("c2.id"), nullable=False) ) context.assert_( "ALTER TABLE t1 ADD COLUMN c1 INTEGER NOT NULL", "ALTER TABLE t1 ADD FOREIGN KEY(c1) REFERENCES c2 (id)", ) def test_add_column_schema_fk(self): context = op_fixture() op.add_column( "t1", Column("c1", Integer, ForeignKey("c2.id"), nullable=False), schema="foo", ) context.assert_( "ALTER TABLE foo.t1 ADD COLUMN c1 INTEGER NOT NULL", "ALTER TABLE foo.t1 ADD FOREIGN KEY(c1) REFERENCES c2 (id)", ) def test_add_column_schema_type(self): """Test that a schema type generates its constraints....""" context = op_fixture() op.add_column("t1", Column("c1", Boolean, nullable=False)) context.assert_( "ALTER TABLE t1 ADD COLUMN c1 BOOLEAN NOT NULL", "ALTER TABLE t1 ADD CHECK (c1 IN (0, 1))", ) def test_add_column_schema_schema_type(self): """Test that a schema type generates its constraints....""" context = op_fixture() op.add_column( "t1", Column("c1", Boolean, nullable=False), schema="foo" ) context.assert_( "ALTER TABLE foo.t1 ADD COLUMN c1 BOOLEAN NOT NULL", "ALTER TABLE foo.t1 ADD CHECK (c1 IN (0, 1))", ) def test_add_column_schema_type_checks_rule(self): """Test that a schema type doesn't generate a constraint based on check rule.""" context = op_fixture("postgresql") op.add_column("t1", Column("c1", Boolean, nullable=False)) context.assert_("ALTER TABLE t1 ADD COLUMN c1 BOOLEAN NOT NULL") def test_add_column_fk_self_referential(self): context = op_fixture() op.add_column( "t1", Column("c1", Integer, ForeignKey("t1.c2"), nullable=False) ) context.assert_( "ALTER TABLE t1 ADD COLUMN c1 INTEGER NOT NULL", "ALTER TABLE t1 ADD FOREIGN KEY(c1) REFERENCES t1 (c2)", ) def test_add_column_schema_fk_self_referential(self): context = op_fixture() op.add_column( "t1", Column("c1", Integer, ForeignKey("foo.t1.c2"), nullable=False), schema="foo", ) context.assert_( "ALTER TABLE foo.t1 ADD COLUMN c1 INTEGER NOT NULL", "ALTER TABLE foo.t1 ADD FOREIGN KEY(c1) REFERENCES foo.t1 (c2)", ) def test_add_column_fk_schema(self): context 
= op_fixture() op.add_column( "t1", Column("c1", Integer, ForeignKey("remote.t2.c2"), nullable=False), ) context.assert_( "ALTER TABLE t1 ADD COLUMN c1 INTEGER NOT NULL", "ALTER TABLE t1 ADD FOREIGN KEY(c1) REFERENCES remote.t2 (c2)", ) def test_add_column_schema_fk_schema(self): context = op_fixture() op.add_column( "t1", Column("c1", Integer, ForeignKey("remote.t2.c2"), nullable=False), schema="foo", ) context.assert_( "ALTER TABLE foo.t1 ADD COLUMN c1 INTEGER NOT NULL", "ALTER TABLE foo.t1 ADD FOREIGN KEY(c1) REFERENCES remote.t2 (c2)", ) def test_drop_column(self): context = op_fixture() op.drop_column("t1", "c1") context.assert_("ALTER TABLE t1 DROP COLUMN c1") def test_drop_column_schema(self): context = op_fixture() op.drop_column("t1", "c1", schema="foo") context.assert_("ALTER TABLE foo.t1 DROP COLUMN c1") def test_alter_column_nullable(self): context = op_fixture() op.alter_column("t", "c", nullable=True) context.assert_( # TODO: not sure if this is PG only or standard # SQL "ALTER TABLE t ALTER COLUMN c DROP NOT NULL" ) def test_alter_column_schema_nullable(self): context = op_fixture() op.alter_column("t", "c", nullable=True, schema="foo") context.assert_( # TODO: not sure if this is PG only or standard # SQL "ALTER TABLE foo.t ALTER COLUMN c DROP NOT NULL" ) def test_alter_column_not_nullable(self): context = op_fixture() op.alter_column("t", "c", nullable=False) context.assert_( # TODO: not sure if this is PG only or standard # SQL "ALTER TABLE t ALTER COLUMN c SET NOT NULL" ) def test_alter_column_schema_not_nullable(self): context = op_fixture() op.alter_column("t", "c", nullable=False, schema="foo") context.assert_( # TODO: not sure if this is PG only or standard # SQL "ALTER TABLE foo.t ALTER COLUMN c SET NOT NULL" ) def test_alter_column_rename(self): context = op_fixture() op.alter_column("t", "c", new_column_name="x") context.assert_("ALTER TABLE t RENAME c TO x") def test_alter_column_schema_rename(self): context = op_fixture() op.alter_column("t", "c", new_column_name="x", schema="foo") context.assert_("ALTER TABLE foo.t RENAME c TO x") def test_alter_column_type(self): context = op_fixture() op.alter_column("t", "c", type_=String(50)) context.assert_("ALTER TABLE t ALTER COLUMN c TYPE VARCHAR(50)") def test_alter_column_schema_type(self): context = op_fixture() op.alter_column("t", "c", type_=String(50), schema="foo") context.assert_("ALTER TABLE foo.t ALTER COLUMN c TYPE VARCHAR(50)") def test_alter_column_set_default(self): context = op_fixture() op.alter_column("t", "c", server_default="q") context.assert_("ALTER TABLE t ALTER COLUMN c SET DEFAULT 'q'") def test_alter_column_schema_set_default(self): context = op_fixture() op.alter_column("t", "c", server_default="q", schema="foo") context.assert_("ALTER TABLE foo.t ALTER COLUMN c SET DEFAULT 'q'") def test_alter_column_set_compiled_default(self): context = op_fixture() op.alter_column( "t", "c", server_default=func.utc_thing(func.current_timestamp()) ) context.assert_( "ALTER TABLE t ALTER COLUMN c " "SET DEFAULT utc_thing(CURRENT_TIMESTAMP)" ) def test_alter_column_schema_set_compiled_default(self): context = op_fixture() op.alter_column( "t", "c", server_default=func.utc_thing(func.current_timestamp()), schema="foo", ) context.assert_( "ALTER TABLE foo.t ALTER COLUMN c " "SET DEFAULT utc_thing(CURRENT_TIMESTAMP)" ) def test_alter_column_drop_default(self): context = op_fixture() op.alter_column("t", "c", server_default=None) context.assert_("ALTER TABLE t ALTER COLUMN c DROP DEFAULT") def 
test_alter_column_schema_drop_default(self): context = op_fixture() op.alter_column("t", "c", server_default=None, schema="foo") context.assert_("ALTER TABLE foo.t ALTER COLUMN c DROP DEFAULT") def test_alter_column_schema_type_unnamed(self): context = op_fixture("mssql", native_boolean=False) op.alter_column("t", "c", type_=Boolean()) context.assert_( "ALTER TABLE t ALTER COLUMN c BIT", "ALTER TABLE t ADD CHECK (c IN (0, 1))", ) def test_alter_column_schema_schema_type_unnamed(self): context = op_fixture("mssql", native_boolean=False) op.alter_column("t", "c", type_=Boolean(), schema="foo") context.assert_( "ALTER TABLE foo.t ALTER COLUMN c BIT", "ALTER TABLE foo.t ADD CHECK (c IN (0, 1))", ) def test_alter_column_schema_type_named(self): context = op_fixture("mssql", native_boolean=False) op.alter_column("t", "c", type_=Boolean(name="xyz")) context.assert_( "ALTER TABLE t ALTER COLUMN c BIT", "ALTER TABLE t ADD CONSTRAINT xyz CHECK (c IN (0, 1))", ) def test_alter_column_schema_schema_type_named(self): context = op_fixture("mssql", native_boolean=False) op.alter_column("t", "c", type_=Boolean(name="xyz"), schema="foo") context.assert_( "ALTER TABLE foo.t ALTER COLUMN c BIT", "ALTER TABLE foo.t ADD CONSTRAINT xyz CHECK (c IN (0, 1))", ) def test_alter_column_schema_type_existing_type(self): context = op_fixture("mssql", native_boolean=False) op.alter_column( "t", "c", type_=String(10), existing_type=Boolean(name="xyz") ) context.assert_( "ALTER TABLE t DROP CONSTRAINT xyz", "ALTER TABLE t ALTER COLUMN c VARCHAR(10)", ) def test_alter_column_schema_schema_type_existing_type(self): context = op_fixture("mssql", native_boolean=False) op.alter_column( "t", "c", type_=String(10), existing_type=Boolean(name="xyz"), schema="foo", ) context.assert_( "ALTER TABLE foo.t DROP CONSTRAINT xyz", "ALTER TABLE foo.t ALTER COLUMN c VARCHAR(10)", ) def test_alter_column_schema_type_existing_type_no_const(self): context = op_fixture("postgresql") op.alter_column("t", "c", type_=String(10), existing_type=Boolean()) context.assert_("ALTER TABLE t ALTER COLUMN c TYPE VARCHAR(10)") def test_alter_column_schema_schema_type_existing_type_no_const(self): context = op_fixture("postgresql") op.alter_column( "t", "c", type_=String(10), existing_type=Boolean(), schema="foo" ) context.assert_("ALTER TABLE foo.t ALTER COLUMN c TYPE VARCHAR(10)") def test_alter_column_schema_type_existing_type_no_new_type(self): context = op_fixture("postgresql") op.alter_column("t", "c", nullable=False, existing_type=Boolean()) context.assert_("ALTER TABLE t ALTER COLUMN c SET NOT NULL") def test_alter_column_schema_schema_type_existing_type_no_new_type(self): context = op_fixture("postgresql") op.alter_column( "t", "c", nullable=False, existing_type=Boolean(), schema="foo" ) context.assert_("ALTER TABLE foo.t ALTER COLUMN c SET NOT NULL") def test_add_foreign_key(self): context = op_fixture() op.create_foreign_key( "fk_test", "t1", "t2", ["foo", "bar"], ["bat", "hoho"] ) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT fk_test FOREIGN KEY(foo, bar) " "REFERENCES t2 (bat, hoho)" ) def test_add_foreign_key_schema(self): context = op_fixture() op.create_foreign_key( "fk_test", "t1", "t2", ["foo", "bar"], ["bat", "hoho"], source_schema="foo2", referent_schema="bar2", ) context.assert_( "ALTER TABLE foo2.t1 ADD CONSTRAINT fk_test FOREIGN KEY(foo, bar) " "REFERENCES bar2.t2 (bat, hoho)" ) def test_add_foreign_key_schema_same_tablename(self): context = op_fixture() op.create_foreign_key( "fk_test", "t1", "t1", ["foo", "bar"], ["bat", "hoho"], 
source_schema="foo2", referent_schema="bar2", ) context.assert_( "ALTER TABLE foo2.t1 ADD CONSTRAINT fk_test FOREIGN KEY(foo, bar) " "REFERENCES bar2.t1 (bat, hoho)" ) def test_add_foreign_key_onupdate(self): context = op_fixture() op.create_foreign_key( "fk_test", "t1", "t2", ["foo", "bar"], ["bat", "hoho"], onupdate="CASCADE", ) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT fk_test FOREIGN KEY(foo, bar) " "REFERENCES t2 (bat, hoho) ON UPDATE CASCADE" ) def test_add_foreign_key_ondelete(self): context = op_fixture() op.create_foreign_key( "fk_test", "t1", "t2", ["foo", "bar"], ["bat", "hoho"], ondelete="CASCADE", ) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT fk_test FOREIGN KEY(foo, bar) " "REFERENCES t2 (bat, hoho) ON DELETE CASCADE" ) def test_add_foreign_key_deferrable(self): context = op_fixture() op.create_foreign_key( "fk_test", "t1", "t2", ["foo", "bar"], ["bat", "hoho"], deferrable=True, ) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT fk_test FOREIGN KEY(foo, bar) " "REFERENCES t2 (bat, hoho) DEFERRABLE" ) def test_add_foreign_key_initially(self): context = op_fixture() op.create_foreign_key( "fk_test", "t1", "t2", ["foo", "bar"], ["bat", "hoho"], initially="deferred", ) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT fk_test FOREIGN KEY(foo, bar) " "REFERENCES t2 (bat, hoho) INITIALLY deferred" ) @config.requirements.foreign_key_match def test_add_foreign_key_match(self): context = op_fixture() op.create_foreign_key( "fk_test", "t1", "t2", ["foo", "bar"], ["bat", "hoho"], match="SIMPLE", ) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT fk_test FOREIGN KEY(foo, bar) " "REFERENCES t2 (bat, hoho) MATCH SIMPLE" ) def test_add_foreign_key_dialect_kw(self): op_fixture() with mock.patch("sqlalchemy.schema.ForeignKeyConstraint") as fkc: op.create_foreign_key( "fk_test", "t1", "t2", ["foo", "bar"], ["bat", "hoho"], foobar_arg="xyz", ) if config.requirements.foreign_key_match.enabled: eq_( fkc.mock_calls[0], mock.call( ["foo", "bar"], ["t2.bat", "t2.hoho"], onupdate=None, ondelete=None, name="fk_test", foobar_arg="xyz", deferrable=None, initially=None, match=None, ), ) else: eq_( fkc.mock_calls[0], mock.call( ["foo", "bar"], ["t2.bat", "t2.hoho"], onupdate=None, ondelete=None, name="fk_test", foobar_arg="xyz", deferrable=None, initially=None, ), ) def test_add_foreign_key_self_referential(self): context = op_fixture() op.create_foreign_key("fk_test", "t1", "t1", ["foo"], ["bar"]) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT fk_test " "FOREIGN KEY(foo) REFERENCES t1 (bar)" ) def test_add_primary_key_constraint(self): context = op_fixture() op.create_primary_key("pk_test", "t1", ["foo", "bar"]) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT pk_test PRIMARY KEY (foo, bar)" ) def test_add_primary_key_constraint_schema(self): context = op_fixture() op.create_primary_key("pk_test", "t1", ["foo"], schema="bar") context.assert_( "ALTER TABLE bar.t1 ADD CONSTRAINT pk_test PRIMARY KEY (foo)" ) def test_add_check_constraint(self): context = op_fixture() op.create_check_constraint( "ck_user_name_len", "user_table", func.len(column("name")) > 5 ) context.assert_( "ALTER TABLE user_table ADD CONSTRAINT ck_user_name_len " "CHECK (len(name) > 5)" ) def test_add_check_constraint_schema(self): context = op_fixture() op.create_check_constraint( "ck_user_name_len", "user_table", func.len(column("name")) > 5, schema="foo", ) context.assert_( "ALTER TABLE foo.user_table ADD CONSTRAINT ck_user_name_len " "CHECK (len(name) > 5)" ) def test_add_unique_constraint(self): context = op_fixture() 
op.create_unique_constraint("uk_test", "t1", ["foo", "bar"]) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT uk_test UNIQUE (foo, bar)" ) def test_add_foreign_key_legacy_kwarg(self): context = op_fixture() op.create_foreign_key( name="some_fk", source="some_table", referent="referred_table", local_cols=["a", "b"], remote_cols=["c", "d"], ondelete="CASCADE", ) context.assert_( "ALTER TABLE some_table ADD CONSTRAINT some_fk " "FOREIGN KEY(a, b) REFERENCES referred_table (c, d) " "ON DELETE CASCADE" ) def test_add_unique_constraint_legacy_kwarg(self): context = op_fixture() op.create_unique_constraint( name="uk_test", source="t1", local_cols=["foo", "bar"] ) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT uk_test UNIQUE (foo, bar)" ) def test_drop_constraint_legacy_kwarg(self): context = op_fixture() op.drop_constraint( name="pk_name", table_name="sometable", type_="primary" ) context.assert_("ALTER TABLE sometable DROP CONSTRAINT pk_name") def test_create_pk_legacy_kwarg(self): context = op_fixture() op.create_primary_key( name=None, table_name="sometable", cols=["router_id", "l3_agent_id"], ) context.assert_( "ALTER TABLE sometable ADD PRIMARY KEY (router_id, l3_agent_id)" ) def test_legacy_kwarg_catches_arg_missing(self): op_fixture() assert_raises_message( TypeError, "missing required positional argument: columns", op.create_primary_key, name=None, table_name="sometable", wrong_cols=["router_id", "l3_agent_id"], ) def test_add_unique_constraint_schema(self): context = op_fixture() op.create_unique_constraint( "uk_test", "t1", ["foo", "bar"], schema="foo" ) context.assert_( "ALTER TABLE foo.t1 ADD CONSTRAINT uk_test UNIQUE (foo, bar)" ) def test_drop_constraint(self): context = op_fixture() op.drop_constraint("foo_bar_bat", "t1") context.assert_("ALTER TABLE t1 DROP CONSTRAINT foo_bar_bat") def test_drop_constraint_schema(self): context = op_fixture() op.drop_constraint("foo_bar_bat", "t1", schema="foo") context.assert_("ALTER TABLE foo.t1 DROP CONSTRAINT foo_bar_bat") def test_create_index(self): context = op_fixture() op.create_index("ik_test", "t1", ["foo", "bar"]) context.assert_("CREATE INDEX ik_test ON t1 (foo, bar)") def test_create_unique_index(self): context = op_fixture() op.create_index("ik_test", "t1", ["foo", "bar"], unique=True) context.assert_("CREATE UNIQUE INDEX ik_test ON t1 (foo, bar)") def test_create_index_quote_flag(self): context = op_fixture() op.create_index("ik_test", "t1", ["foo", "bar"], quote=True) context.assert_('CREATE INDEX "ik_test" ON t1 (foo, bar)') def test_create_index_table_col_event(self): context = op_fixture() op.create_index( "ik_test", "tbl_with_auto_appended_column", ["foo", "bar"] ) context.assert_( "CREATE INDEX ik_test ON tbl_with_auto_appended_column (foo, bar)" ) def test_add_unique_constraint_col_event(self): context = op_fixture() op.create_unique_constraint( "ik_test", "tbl_with_auto_appended_column", ["foo", "bar"] ) context.assert_( "ALTER TABLE tbl_with_auto_appended_column " "ADD CONSTRAINT ik_test UNIQUE (foo, bar)" ) def test_create_index_schema(self): context = op_fixture() op.create_index("ik_test", "t1", ["foo", "bar"], schema="foo") context.assert_("CREATE INDEX ik_test ON foo.t1 (foo, bar)") def test_drop_index(self): context = op_fixture() op.drop_index("ik_test") context.assert_("DROP INDEX ik_test") def test_drop_index_schema(self): context = op_fixture() op.drop_index("ik_test", schema="foo") context.assert_("DROP INDEX foo.ik_test") def test_drop_table(self): context = op_fixture() op.drop_table("tb_test") 
context.assert_("DROP TABLE tb_test") def test_drop_table_schema(self): context = op_fixture() op.drop_table("tb_test", schema="foo") context.assert_("DROP TABLE foo.tb_test") def test_create_table_selfref(self): context = op_fixture() op.create_table( "some_table", Column("id", Integer, primary_key=True), Column("st_id", Integer, ForeignKey("some_table.id")), ) context.assert_( "CREATE TABLE some_table (" "id INTEGER NOT NULL, " "st_id INTEGER, " "PRIMARY KEY (id), " "FOREIGN KEY(st_id) REFERENCES some_table (id))" ) def test_create_table_fk_and_schema(self): context = op_fixture() t1 = op.create_table( "some_table", Column("id", Integer, primary_key=True), Column("foo_id", Integer, ForeignKey("foo.id")), schema="schema", ) context.assert_( "CREATE TABLE schema.some_table (" "id INTEGER NOT NULL, " "foo_id INTEGER, " "PRIMARY KEY (id), " "FOREIGN KEY(foo_id) REFERENCES foo (id))" ) eq_(t1.c.id.name, "id") eq_(t1.schema, "schema") def test_create_table_no_pk(self): context = op_fixture() t1 = op.create_table( "some_table", Column("x", Integer), Column("y", Integer), Column("z", Integer), ) context.assert_( "CREATE TABLE some_table (x INTEGER, y INTEGER, z INTEGER)" ) assert not t1.primary_key def test_create_table_two_fk(self): context = op_fixture() op.create_table( "some_table", Column("id", Integer, primary_key=True), Column("foo_id", Integer, ForeignKey("foo.id")), Column("foo_bar", Integer, ForeignKey("foo.bar")), ) context.assert_( "CREATE TABLE some_table (" "id INTEGER NOT NULL, " "foo_id INTEGER, " "foo_bar INTEGER, " "PRIMARY KEY (id), " "FOREIGN KEY(foo_id) REFERENCES foo (id), " "FOREIGN KEY(foo_bar) REFERENCES foo (bar))" ) def test_inline_literal(self): context = op_fixture() from sqlalchemy.sql import table, column from sqlalchemy import String, Integer account = table( "account", column("name", String), column("id", Integer) ) op.execute( account.update() .where(account.c.name == op.inline_literal("account 1")) .values({"name": op.inline_literal("account 2")}) ) op.execute( account.update() .where(account.c.id == op.inline_literal(1)) .values({"id": op.inline_literal(2)}) ) context.assert_( "UPDATE account SET name='account 2' " "WHERE account.name = 'account 1'", "UPDATE account SET id=2 WHERE account.id = 1", ) def test_cant_op(self): if hasattr(op, "_proxy"): del op._proxy assert_raises_message( NameError, "Can't invoke function 'inline_literal', as the " "proxy object has not yet been established " "for the Alembic 'Operations' class. 
" "Try placing this code inside a callable.", op.inline_literal, "asdf", ) def test_naming_changes(self): context = op_fixture() op.alter_column("t", "c", name="x") context.assert_("ALTER TABLE t RENAME c TO x") context = op_fixture() op.alter_column("t", "c", new_column_name="x") context.assert_("ALTER TABLE t RENAME c TO x") context = op_fixture("mysql") op.drop_constraint("f1", "t1", type="foreignkey") context.assert_("ALTER TABLE t1 DROP FOREIGN KEY f1") context = op_fixture("mysql") op.drop_constraint("f1", "t1", type_="foreignkey") context.assert_("ALTER TABLE t1 DROP FOREIGN KEY f1") def test_naming_changes_drop_idx(self): context = op_fixture("mssql") op.drop_index("ik_test", tablename="t1") context.assert_("DROP INDEX ik_test ON t1") @config.requirements.comments def test_create_table_comment_op(self): context = op_fixture() op.create_table_comment("some_table", "table comment") context.assert_("COMMENT ON TABLE some_table IS 'table comment'") @config.requirements.comments def test_drop_table_comment_op(self): context = op_fixture() op.drop_table_comment("some_table") context.assert_("COMMENT ON TABLE some_table IS NULL") class SQLModeOpTest(TestBase): def test_auto_literals(self): context = op_fixture(as_sql=True, literal_binds=True) from sqlalchemy.sql import table, column from sqlalchemy import String, Integer account = table( "account", column("name", String), column("id", Integer) ) op.execute( account.update() .where(account.c.name == op.inline_literal("account 1")) .values({"name": op.inline_literal("account 2")}) ) op.execute(text("update table set foo=:bar").bindparams(bar="bat")) context.assert_( "UPDATE account SET name='account 2' " "WHERE account.name = 'account 1'", "update table set foo='bat'", ) def test_create_table_literal_binds(self): context = op_fixture(as_sql=True, literal_binds=True) op.create_table( "some_table", Column("id", Integer, primary_key=True), Column("st_id", Integer, ForeignKey("some_table.id")), ) context.assert_( "CREATE TABLE some_table (id INTEGER NOT NULL, st_id INTEGER, " "PRIMARY KEY (id), FOREIGN KEY(st_id) REFERENCES some_table (id))" ) class CustomOpTest(TestBase): def test_custom_op(self): from alembic.operations import Operations, MigrateOperation @Operations.register_operation("create_sequence") class CreateSequenceOp(MigrateOperation): """Create a SEQUENCE.""" def __init__(self, sequence_name, **kw): self.sequence_name = sequence_name self.kw = kw @classmethod def create_sequence(cls, operations, sequence_name, **kw): """Issue a "CREATE SEQUENCE" instruction.""" op = CreateSequenceOp(sequence_name, **kw) return operations.invoke(op) @Operations.implementation_for(CreateSequenceOp) def create_sequence(operations, operation): operations.execute("CREATE SEQUENCE %s" % operation.sequence_name) context = op_fixture() op.create_sequence("foob") context.assert_("CREATE SEQUENCE foob") class EnsureOrigObjectFromToTest(TestBase): """the to_XYZ and from_XYZ methods are used heavily in autogenerate. It's critical that these methods, at least the "drop" form, always return the *same* object if available so that all the info passed into to_XYZ is maintained in the from_XYZ. 
""" def test_drop_index(self): schema_obj = schemaobj.SchemaObjects() idx = schema_obj.index("x", "y", ["z"]) op = ops.DropIndexOp.from_index(idx) is_(op.to_index(), idx) def test_create_index(self): schema_obj = schemaobj.SchemaObjects() idx = schema_obj.index("x", "y", ["z"]) op = ops.CreateIndexOp.from_index(idx) is_(op.to_index(), idx) def test_drop_table(self): schema_obj = schemaobj.SchemaObjects() table = schema_obj.table("x", Column("q", Integer)) op = ops.DropTableOp.from_table(table) is_(op.to_table(), table) def test_create_table(self): schema_obj = schemaobj.SchemaObjects() table = schema_obj.table("x", Column("q", Integer)) op = ops.CreateTableOp.from_table(table) is_(op.to_table(), table) def test_drop_unique_constraint(self): schema_obj = schemaobj.SchemaObjects() const = schema_obj.unique_constraint("x", "foobar", ["a"]) op = ops.DropConstraintOp.from_constraint(const) is_(op.to_constraint(), const) def test_drop_constraint_not_available(self): op = ops.DropConstraintOp("x", "y", type_="unique") assert_raises_message( ValueError, "constraint cannot be produced", op.to_constraint ) class BackendAlterColumnTest(AlterColRoundTripFixture, TestBase): __backend__ = True def test_rename_column(self): self._run_alter_col({}, {"name": "newname"}) def test_modify_type_int_str(self): self._run_alter_col({"type": Integer()}, {"type": String(50)}) def test_add_server_default_int(self): self._run_alter_col({"type": Integer}, {"server_default": text("5")}) def test_modify_server_default_int(self): self._run_alter_col( {"type": Integer, "server_default": text("2")}, {"server_default": text("5")}, ) def test_modify_nullable_to_non(self): self._run_alter_col({}, {"nullable": False}) def test_modify_non_nullable_to_nullable(self): self._run_alter_col({"nullable": False}, {"nullable": True}) zzzeek-alembic-bee044a1c187/tests/test_op_naming_convention.py000066400000000000000000000122501353106760100245660ustar00rootroot00000000000000from sqlalchemy import Boolean from sqlalchemy import CheckConstraint from sqlalchemy import Column from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import Table from sqlalchemy.sql import column from sqlalchemy.sql import func from alembic import op from alembic.testing.fixtures import op_fixture from alembic.testing.fixtures import TestBase class AutoNamingConventionTest(TestBase): def test_add_check_constraint(self): context = op_fixture( naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) op.create_check_constraint( "foo", "user_table", func.len(column("name")) > 5 ) context.assert_( "ALTER TABLE user_table ADD CONSTRAINT ck_user_table_foo " "CHECK (len(name) > 5)" ) def test_add_check_constraint_name_is_none(self): context = op_fixture(naming_convention={"ck": "ck_%(table_name)s_foo"}) op.create_check_constraint( None, "user_table", func.len(column("name")) > 5 ) context.assert_( "ALTER TABLE user_table ADD CONSTRAINT ck_user_table_foo " "CHECK (len(name) > 5)" ) def test_add_unique_constraint_name_is_none(self): context = op_fixture(naming_convention={"uq": "uq_%(table_name)s_foo"}) op.create_unique_constraint(None, "user_table", "x") context.assert_( "ALTER TABLE user_table " "ADD CONSTRAINT uq_user_table_foo UNIQUE (x)" ) def test_add_index_name_is_none(self): context = op_fixture(naming_convention={"ix": "ix_%(table_name)s_foo"}) op.create_index(None, "user_table", "x") context.assert_("CREATE INDEX ix_user_table_foo ON user_table (x)") def test_add_check_constraint_already_named_from_schema(self): m1 = MetaData( 
naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) ck = CheckConstraint("im a constraint", name="cc1") Table("t", m1, Column("x"), ck) context = op_fixture( naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) op.create_table("some_table", Column("x", Integer, ck)) context.assert_( "CREATE TABLE some_table " "(x INTEGER CONSTRAINT ck_t_cc1 CHECK (im a constraint))" ) def test_add_check_constraint_inline_on_table(self): context = op_fixture( naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) op.create_table( "some_table", Column("x", Integer), CheckConstraint("im a constraint", name="cc1"), ) context.assert_( "CREATE TABLE some_table " "(x INTEGER, CONSTRAINT ck_some_table_cc1 CHECK (im a constraint))" ) def test_add_check_constraint_inline_on_table_w_f(self): context = op_fixture( naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) op.create_table( "some_table", Column("x", Integer), CheckConstraint("im a constraint", name=op.f("ck_some_table_cc1")), ) context.assert_( "CREATE TABLE some_table " "(x INTEGER, CONSTRAINT ck_some_table_cc1 CHECK (im a constraint))" ) def test_add_check_constraint_inline_on_column(self): context = op_fixture( naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) op.create_table( "some_table", Column( "x", Integer, CheckConstraint("im a constraint", name="cc1") ), ) context.assert_( "CREATE TABLE some_table " "(x INTEGER CONSTRAINT ck_some_table_cc1 CHECK (im a constraint))" ) def test_add_check_constraint_inline_on_column_w_f(self): context = op_fixture( naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) op.create_table( "some_table", Column( "x", Integer, CheckConstraint("im a constraint", name=op.f("ck_q_cc1")), ), ) context.assert_( "CREATE TABLE some_table " "(x INTEGER CONSTRAINT ck_q_cc1 CHECK (im a constraint))" ) def test_add_column_schema_type(self): context = op_fixture( naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) op.add_column("t1", Column("c1", Boolean(name="foo"), nullable=False)) context.assert_( "ALTER TABLE t1 ADD COLUMN c1 BOOLEAN NOT NULL", "ALTER TABLE t1 ADD CONSTRAINT ck_t1_foo CHECK (c1 IN (0, 1))", ) def test_add_column_schema_type_w_f(self): context = op_fixture( naming_convention={"ck": "ck_%(table_name)s_%(constraint_name)s"} ) op.add_column( "t1", Column("c1", Boolean(name=op.f("foo")), nullable=False) ) context.assert_( "ALTER TABLE t1 ADD COLUMN c1 BOOLEAN NOT NULL", "ALTER TABLE t1 ADD CONSTRAINT foo CHECK (c1 IN (0, 1))", ) zzzeek-alembic-bee044a1c187/tests/test_oracle.py000066400000000000000000000171061353106760100216270ustar00rootroot00000000000000from sqlalchemy import Column from sqlalchemy import Integer from alembic import command from alembic import op from alembic.testing import config from alembic.testing.env import _no_sql_testing_config from alembic.testing.env import clear_staging_env from alembic.testing.env import staging_env from alembic.testing.env import three_rev_fixture from alembic.testing.fixtures import capture_context_buffer from alembic.testing.fixtures import op_fixture from alembic.testing.fixtures import TestBase class FullEnvironmentTests(TestBase): @classmethod def setup_class(cls): staging_env() cls.cfg = cfg = _no_sql_testing_config("oracle") cls.a, cls.b, cls.c = three_rev_fixture(cfg) @classmethod def teardown_class(cls): clear_staging_env() def test_begin_comit(self): with capture_context_buffer(transactional_ddl=True) as buf: command.upgrade(self.cfg, self.a, sql=True) assert 
"SET TRANSACTION READ WRITE\n\n/" in buf.getvalue() assert "COMMIT\n\n/" in buf.getvalue() def test_batch_separator_default(self): with capture_context_buffer() as buf: command.upgrade(self.cfg, self.a, sql=True) assert "/" in buf.getvalue() assert ";" not in buf.getvalue() def test_batch_separator_custom(self): with capture_context_buffer(oracle_batch_separator="BYE") as buf: command.upgrade(self.cfg, self.a, sql=True) assert "BYE" in buf.getvalue() class OpTest(TestBase): def test_add_column(self): context = op_fixture("oracle") op.add_column("t1", Column("c1", Integer, nullable=False)) context.assert_("ALTER TABLE t1 ADD c1 INTEGER NOT NULL") def test_add_column_with_default(self): context = op_fixture("oracle") op.add_column( "t1", Column("c1", Integer, nullable=False, server_default="12") ) context.assert_("ALTER TABLE t1 ADD c1 INTEGER DEFAULT '12' NOT NULL") @config.requirements.comments def test_add_column_with_comment(self): context = op_fixture("oracle") op.add_column( "t1", Column("c1", Integer, nullable=False, comment="c1 comment") ) context.assert_( "ALTER TABLE t1 ADD c1 INTEGER NOT NULL", "COMMENT ON COLUMN t1.c1 IS 'c1 comment'", ) def test_alter_column_rename_oracle(self): context = op_fixture("oracle") op.alter_column("t", "c", name="x") context.assert_("ALTER TABLE t RENAME COLUMN c TO x") def test_alter_column_new_type(self): context = op_fixture("oracle") op.alter_column("t", "c", type_=Integer) context.assert_("ALTER TABLE t MODIFY c INTEGER") def test_alter_column_add_comment(self): context = op_fixture("oracle") op.alter_column("t", "c", type_=Integer, comment="c comment") context.assert_( "ALTER TABLE t MODIFY c INTEGER", "COMMENT ON COLUMN t.c IS 'c comment'", ) def test_alter_column_add_comment_quotes(self): context = op_fixture("oracle") op.alter_column("t", "c", type_=Integer, comment="c 'comment'") context.assert_( "ALTER TABLE t MODIFY c INTEGER", "COMMENT ON COLUMN t.c IS 'c ''comment'''", ) def test_alter_column_drop_comment(self): context = op_fixture("oracle") op.alter_column("t", "c", type_=Integer, comment=None) context.assert_( "ALTER TABLE t MODIFY c INTEGER", "COMMENT ON COLUMN t.c IS ''" ) @config.requirements.comments_api def test_create_table_comment(self): # this is handled by SQLAlchemy's compilers context = op_fixture("oracle") op.create_table_comment( 't2', comment='t2 table', schema='foo' ) context.assert_( "COMMENT ON TABLE foo.t2 IS 't2 table'" ) @config.requirements.comments_api @config.requirements.sqlalchemy_issue_4436 def test_drop_table_comment(self): # this is handled by SQLAlchemy's compilers context = op_fixture("oracle") op.drop_table_comment( 't2', existing_comment='t2 table', schema='foo' ) context.assert_( "COMMENT ON TABLE foo.t2 IS ''" ) def test_drop_index(self): context = op_fixture("oracle") op.drop_index("my_idx", "my_table") context.assert_contains("DROP INDEX my_idx") def test_drop_column_w_default(self): context = op_fixture("oracle") op.drop_column("t1", "c1") context.assert_("ALTER TABLE t1 DROP COLUMN c1") def test_drop_column_w_check(self): context = op_fixture("oracle") op.drop_column("t1", "c1") context.assert_("ALTER TABLE t1 DROP COLUMN c1") def test_alter_column_nullable_w_existing_type(self): context = op_fixture("oracle") op.alter_column("t", "c", nullable=True, existing_type=Integer) context.assert_("ALTER TABLE t MODIFY c NULL") def test_alter_column_not_nullable_w_existing_type(self): context = op_fixture("oracle") op.alter_column("t", "c", nullable=False, existing_type=Integer) context.assert_("ALTER 
TABLE t MODIFY c NOT NULL") def test_alter_column_nullable_w_new_type(self): context = op_fixture("oracle") op.alter_column("t", "c", nullable=True, type_=Integer) context.assert_( "ALTER TABLE t MODIFY c NULL", "ALTER TABLE t MODIFY c INTEGER" ) def test_alter_column_not_nullable_w_new_type(self): context = op_fixture("oracle") op.alter_column("t", "c", nullable=False, type_=Integer) context.assert_( "ALTER TABLE t MODIFY c NOT NULL", "ALTER TABLE t MODIFY c INTEGER" ) def test_alter_add_server_default(self): context = op_fixture("oracle") op.alter_column("t", "c", server_default="5") context.assert_("ALTER TABLE t MODIFY c DEFAULT '5'") def test_alter_replace_server_default(self): context = op_fixture("oracle") op.alter_column( "t", "c", server_default="5", existing_server_default="6" ) context.assert_("ALTER TABLE t MODIFY c DEFAULT '5'") def test_alter_remove_server_default(self): context = op_fixture("oracle") op.alter_column("t", "c", server_default=None) context.assert_("ALTER TABLE t MODIFY c DEFAULT NULL") def test_alter_do_everything(self): context = op_fixture("oracle") op.alter_column( "t", "c", name="c2", nullable=True, type_=Integer, server_default="5", ) context.assert_( "ALTER TABLE t MODIFY c NULL", "ALTER TABLE t MODIFY c DEFAULT '5'", "ALTER TABLE t MODIFY c INTEGER", "ALTER TABLE t RENAME COLUMN c TO c2", ) @config.requirements.comments def test_create_table_with_column_comments(self): context = op_fixture("oracle") op.create_table( "t2", Column("c1", Integer, primary_key=True), comment="t2 comment" ) context.assert_( "CREATE TABLE t2 (c1 INTEGER NOT NULL, PRIMARY KEY (c1))", "COMMENT ON TABLE t2 IS 't2 comment'", ) # TODO: when we add schema support # def test_alter_column_rename_oracle_schema(self): # context = op_fixture('oracle') # op.alter_column("t", "c", name="x", schema="y") # context.assert_( # 'ALTER TABLE y.t RENAME COLUMN c TO c2' # ) zzzeek-alembic-bee044a1c187/tests/test_postgresql.py000066400000000000000000000750241353106760100225700ustar00rootroot00000000000000from sqlalchemy import BigInteger from sqlalchemy import Boolean from sqlalchemy import Column from sqlalchemy import DateTime from sqlalchemy import Float from sqlalchemy import func from sqlalchemy import Index from sqlalchemy import Integer from sqlalchemy import Interval from sqlalchemy import MetaData from sqlalchemy import Numeric from sqlalchemy import Sequence from sqlalchemy import String from sqlalchemy import Table from sqlalchemy import text from sqlalchemy import types from sqlalchemy.dialects.postgresql import ARRAY from sqlalchemy.dialects.postgresql import BYTEA from sqlalchemy.dialects.postgresql import HSTORE from sqlalchemy.dialects.postgresql import JSON from sqlalchemy.dialects.postgresql import JSONB from sqlalchemy.dialects.postgresql import UUID from sqlalchemy.engine.reflection import Inspector from sqlalchemy.sql import column from sqlalchemy.sql import false from sqlalchemy.sql import table from alembic import autogenerate from alembic import command from alembic import op from alembic import util from alembic.autogenerate import api from alembic.autogenerate.compare import _compare_server_default from alembic.autogenerate.compare import _compare_tables from alembic.autogenerate.compare import _render_server_default_for_compare from alembic.migration import MigrationContext from alembic.operations import Operations from alembic.operations import ops from alembic.script import ScriptDirectory from alembic.testing import config from alembic.testing import eq_ from 
alembic.testing import eq_ignore_whitespace from alembic.testing import provide_metadata from alembic.testing.env import _no_sql_testing_config from alembic.testing.env import clear_staging_env from alembic.testing.env import staging_env from alembic.testing.env import write_script from alembic.testing.fixtures import capture_context_buffer from alembic.testing.fixtures import op_fixture from alembic.testing.fixtures import TestBase class PostgresqlOpTest(TestBase): def test_rename_table_postgresql(self): context = op_fixture("postgresql") op.rename_table("t1", "t2") context.assert_("ALTER TABLE t1 RENAME TO t2") def test_rename_table_schema_postgresql(self): context = op_fixture("postgresql") op.rename_table("t1", "t2", schema="foo") context.assert_("ALTER TABLE foo.t1 RENAME TO t2") def test_create_index_postgresql_expressions(self): context = op_fixture("postgresql") op.create_index( "geocoded", "locations", [text("lower(coordinates)")], postgresql_where=text("locations.coordinates != Null"), ) context.assert_( "CREATE INDEX geocoded ON locations (lower(coordinates)) " "WHERE locations.coordinates != Null" ) def test_create_index_postgresql_where(self): context = op_fixture("postgresql") op.create_index( "geocoded", "locations", ["coordinates"], postgresql_where=text("locations.coordinates != Null"), ) context.assert_( "CREATE INDEX geocoded ON locations (coordinates) " "WHERE locations.coordinates != Null" ) def test_create_index_postgresql_concurrently(self): context = op_fixture("postgresql") op.create_index( "geocoded", "locations", ["coordinates"], postgresql_concurrently=True, ) context.assert_( "CREATE INDEX CONCURRENTLY geocoded ON locations (coordinates)" ) def test_drop_index_postgresql_concurrently(self): context = op_fixture("postgresql") op.drop_index("geocoded", "locations", postgresql_concurrently=True) context.assert_("DROP INDEX CONCURRENTLY geocoded") def test_alter_column_type_using(self): context = op_fixture("postgresql") op.alter_column("t", "c", type_=Integer, postgresql_using="c::integer") context.assert_( "ALTER TABLE t ALTER COLUMN c TYPE INTEGER USING c::integer" ) def test_col_w_pk_is_serial(self): context = op_fixture("postgresql") op.add_column("some_table", Column("q", Integer, primary_key=True)) context.assert_("ALTER TABLE some_table ADD COLUMN q SERIAL NOT NULL") def test_create_exclude_constraint(self): context = op_fixture("postgresql") op.create_exclude_constraint( "ex1", "t1", ("x", ">"), where="x > 5", using="gist" ) context.assert_( "ALTER TABLE t1 ADD CONSTRAINT ex1 EXCLUDE USING gist (x WITH >) " "WHERE (x > 5)" ) def test_create_exclude_constraint_quoted_literal(self): context = op_fixture("postgresql") op.create_exclude_constraint( "ex1", "SomeTable", (column("SomeColumn"), ">"), where='"SomeColumn" > 5', using="gist", ) context.assert_( 'ALTER TABLE "SomeTable" ADD CONSTRAINT ex1 EXCLUDE USING gist ' '("SomeColumn" WITH >) WHERE ("SomeColumn" > 5)' ) def test_create_exclude_constraint_quoted_column(self): context = op_fixture("postgresql") op.create_exclude_constraint( "ex1", "SomeTable", (column("SomeColumn"), ">"), where=column("SomeColumn") > 5, using="gist", ) context.assert_( 'ALTER TABLE "SomeTable" ADD CONSTRAINT ex1 EXCLUDE ' 'USING gist ("SomeColumn" WITH >) WHERE ("SomeColumn" > 5)' ) @config.requirements.comments_api def test_add_column_with_comment(self): context = op_fixture("postgresql") op.add_column("t", Column("q", Integer, comment="This is a comment")) context.assert_( "ALTER TABLE t ADD COLUMN q INTEGER", "COMMENT ON COLUMN 
t.q IS 'This is a comment'", ) @config.requirements.comments_api def test_alter_column_with_comment(self): context = op_fixture("postgresql") op.alter_column( "t", "c", nullable=False, existing_type=Boolean(), schema="foo", comment="This is a column comment", ) context.assert_( "ALTER TABLE foo.t ALTER COLUMN c SET NOT NULL", "COMMENT ON COLUMN t.c IS 'This is a column comment'", ) @config.requirements.comments_api def test_alter_column_add_comment(self): context = op_fixture("postgresql") op.alter_column( "t", "c", existing_type=Boolean(), schema="foo", comment="This is a column comment", ) context.assert_("COMMENT ON COLUMN t.c IS 'This is a column comment'") @config.requirements.comments_api def test_alter_column_add_comment_quoting(self): context = op_fixture("postgresql") op.alter_column( "t", "c", existing_type=Boolean(), schema="foo", comment="This is a column 'comment'", ) context.assert_( "COMMENT ON COLUMN t.c IS 'This is a column ''comment'''" ) @config.requirements.comments_api def test_alter_column_drop_comment(self): context = op_fixture("postgresql") op.alter_column( "t", "c", existing_type=Boolean(), schema="foo", comment=None, existing_comment="This is a column comment", ) context.assert_("COMMENT ON COLUMN t.c IS NULL") @config.requirements.comments_api def test_create_table_with_comment(self): context = op_fixture("postgresql") op.create_table( "t2", Column("c1", Integer, primary_key=True), Column("c2", Integer), comment="t2 comment", ) context.assert_( "CREATE TABLE t2 (c1 SERIAL NOT NULL, " "c2 INTEGER, PRIMARY KEY (c1))", "COMMENT ON TABLE t2 IS 't2 comment'", ) @config.requirements.comments_api def test_create_table_with_column_comments(self): context = op_fixture("postgresql") op.create_table( "t2", Column("c1", Integer, primary_key=True, comment="c1 comment"), Column("c2", Integer, comment="c2 comment"), comment="t2 comment", ) context.assert_( "CREATE TABLE t2 (c1 SERIAL NOT NULL, " "c2 INTEGER, PRIMARY KEY (c1))", "COMMENT ON TABLE t2 IS 't2 comment'", "COMMENT ON COLUMN t2.c1 IS 'c1 comment'", "COMMENT ON COLUMN t2.c2 IS 'c2 comment'", ) @config.requirements.comments_api def test_create_table_comment(self): # this is handled by SQLAlchemy's compilers context = op_fixture("postgresql") op.create_table_comment("t2", comment="t2 table", schema="foo") context.assert_("COMMENT ON TABLE foo.t2 IS 't2 table'") @config.requirements.comments_api def test_drop_table_comment(self): # this is handled by SQLAlchemy's compilers context = op_fixture("postgresql") op.drop_table_comment("t2", existing_comment="t2 table", schema="foo") context.assert_("COMMENT ON TABLE foo.t2 IS NULL") class PGOfflineEnumTest(TestBase): def setUp(self): staging_env() self.cfg = cfg = _no_sql_testing_config() self.rid = rid = util.rev_id() self.script = script = ScriptDirectory.from_config(cfg) script.generate_revision(rid, None, refresh=True) def tearDown(self): clear_staging_env() def _inline_enum_script(self): write_script( self.script, self.rid, """ revision = '%s' down_revision = None from alembic import op from sqlalchemy.dialects.postgresql import ENUM from sqlalchemy import Column def upgrade(): op.create_table("sometable", Column("data", ENUM("one", "two", "three", name="pgenum")) ) def downgrade(): op.drop_table("sometable") """ % self.rid, ) def _distinct_enum_script(self): write_script( self.script, self.rid, """ revision = '%s' down_revision = None from alembic import op from sqlalchemy.dialects.postgresql import ENUM from sqlalchemy import Column def upgrade(): enum = ENUM("one", 
"two", "three", name="pgenum", create_type=False) enum.create(op.get_bind(), checkfirst=False) op.create_table("sometable", Column("data", enum) ) def downgrade(): op.drop_table("sometable") ENUM(name="pgenum").drop(op.get_bind(), checkfirst=False) """ % self.rid, ) def test_offline_inline_enum_create(self): self._inline_enum_script() with capture_context_buffer() as buf: command.upgrade(self.cfg, self.rid, sql=True) assert ( "CREATE TYPE pgenum AS " "ENUM ('one', 'two', 'three')" in buf.getvalue() ) assert "CREATE TABLE sometable (\n data pgenum\n)" in buf.getvalue() def test_offline_inline_enum_drop(self): self._inline_enum_script() with capture_context_buffer() as buf: command.downgrade(self.cfg, "%s:base" % self.rid, sql=True) assert "DROP TABLE sometable" in buf.getvalue() # no drop since we didn't emit events assert "DROP TYPE pgenum" not in buf.getvalue() def test_offline_distinct_enum_create(self): self._distinct_enum_script() with capture_context_buffer() as buf: command.upgrade(self.cfg, self.rid, sql=True) assert ( "CREATE TYPE pgenum AS ENUM " "('one', 'two', 'three')" in buf.getvalue() ) assert "CREATE TABLE sometable (\n data pgenum\n)" in buf.getvalue() def test_offline_distinct_enum_drop(self): self._distinct_enum_script() with capture_context_buffer() as buf: command.downgrade(self.cfg, "%s:base" % self.rid, sql=True) assert "DROP TABLE sometable" in buf.getvalue() assert "DROP TYPE pgenum" in buf.getvalue() class PostgresqlInlineLiteralTest(TestBase): __only_on__ = "postgresql" __backend__ = True @classmethod def setup_class(cls): cls.bind = config.db cls.bind.execute( """ create table tab ( col varchar(50) ) """ ) cls.bind.execute( """ insert into tab (col) values ('old data 1'), ('old data 2.1'), ('old data 3') """ ) @classmethod def teardown_class(cls): cls.bind.execute("drop table tab") def setUp(self): self.conn = self.bind.connect() ctx = MigrationContext.configure(self.conn) self.op = Operations(ctx) def tearDown(self): self.conn.close() def test_inline_percent(self): # TODO: here's the issue, you need to escape this. 
tab = table("tab", column("col")) self.op.execute( tab.update() .where(tab.c.col.like(self.op.inline_literal("%.%"))) .values(col=self.op.inline_literal("new data")), execution_options={"no_parameters": True}, ) eq_( self.conn.execute( "select count(*) from tab where col='new data'" ).scalar(), 1, ) class PostgresqlDefaultCompareTest(TestBase): __only_on__ = "postgresql" __backend__ = True @classmethod def setup_class(cls): cls.bind = config.db staging_env() cls.migration_context = MigrationContext.configure( connection=cls.bind.connect(), opts={"compare_type": True, "compare_server_default": True}, ) def setUp(self): self.metadata = MetaData(self.bind) self.autogen_context = api.AutogenContext(self.migration_context) @classmethod def teardown_class(cls): clear_staging_env() def tearDown(self): self.metadata.drop_all() def _compare_default_roundtrip( self, type_, orig_default, alternate=None, diff_expected=None ): diff_expected = ( diff_expected if diff_expected is not None else alternate is not None ) if alternate is None: alternate = orig_default t1 = Table( "test", self.metadata, Column("somecol", type_, server_default=orig_default), ) t2 = Table( "test", MetaData(), Column("somecol", type_, server_default=alternate), ) t1.create(self.bind) insp = Inspector.from_engine(self.bind) cols = insp.get_columns(t1.name) insp_col = Column( "somecol", cols[0]["type"], server_default=text(cols[0]["default"]) ) op = ops.AlterColumnOp("test", "somecol") _compare_server_default( self.autogen_context, op, None, "test", "somecol", insp_col, t2.c.somecol, ) diffs = op.to_diff_tuple() eq_(bool(diffs), diff_expected) def _compare_default(self, t1, t2, col, rendered): t1.create(self.bind, checkfirst=True) insp = Inspector.from_engine(self.bind) cols = insp.get_columns(t1.name) ctx = self.autogen_context.migration_context return ctx.impl.compare_server_default( None, col, rendered, cols[0]["default"] ) def test_compare_string_blank_default(self): self._compare_default_roundtrip(String(8), "") def test_compare_string_nonblank_default(self): self._compare_default_roundtrip(String(8), "hi") def test_compare_interval_str(self): # this form shouldn't be used but testing here # for compatibility self._compare_default_roundtrip(Interval, "14 days") @config.requirements.postgresql_uuid_ossp def test_compare_uuid_text(self): self._compare_default_roundtrip(UUID, text("uuid_generate_v4()")) def test_compare_interval_text(self): self._compare_default_roundtrip(Interval, text("'14 days'")) def test_compare_array_of_integer_text(self): self._compare_default_roundtrip( ARRAY(Integer), text("(ARRAY[]::integer[])") ) def test_compare_current_timestamp_text(self): self._compare_default_roundtrip( DateTime(), text("TIMEZONE('utc', CURRENT_TIMESTAMP)") ) def test_compare_current_timestamp_fn_w_binds(self): self._compare_default_roundtrip( DateTime(), func.timezone("utc", func.current_timestamp()) ) def test_compare_integer_str(self): self._compare_default_roundtrip(Integer(), "5") def test_compare_integer_text(self): self._compare_default_roundtrip(Integer(), text("5")) def test_compare_integer_text_diff(self): self._compare_default_roundtrip(Integer(), text("5"), "7") def test_compare_float_str(self): self._compare_default_roundtrip(Float(), "5.2") def test_compare_float_text(self): self._compare_default_roundtrip(Float(), text("5.2")) def test_compare_float_no_diff1(self): self._compare_default_roundtrip( Float(), text("5.2"), "5.2", diff_expected=False ) def test_compare_float_no_diff2(self): 
self._compare_default_roundtrip( Float(), "5.2", text("5.2"), diff_expected=False ) def test_compare_float_no_diff3(self): self._compare_default_roundtrip( Float(), text("5"), text("5.0"), diff_expected=False ) def test_compare_float_no_diff4(self): self._compare_default_roundtrip( Float(), "5", "5.0", diff_expected=False ) def test_compare_float_no_diff5(self): self._compare_default_roundtrip( Float(), text("5"), "5.0", diff_expected=False ) def test_compare_float_no_diff6(self): self._compare_default_roundtrip( Float(), "5", text("5.0"), diff_expected=False ) def test_compare_numeric_no_diff(self): self._compare_default_roundtrip( Numeric(), text("5"), "5.0", diff_expected=False ) def test_compare_unicode_literal(self): self._compare_default_roundtrip(String(), u"im a default") # TOOD: will need to actually eval() the repr() and # spend more effort figuring out exactly the kind of expression # to use def _TODO_test_compare_character_str_w_singlequote(self): self._compare_default_roundtrip(String(), "hel''lo") def test_compare_character_str(self): self._compare_default_roundtrip(String(), "hello") def test_compare_character_text(self): self._compare_default_roundtrip(String(), text("'hello'")) def test_compare_character_str_diff(self): self._compare_default_roundtrip(String(), "hello", "there") def test_compare_character_text_diff(self): self._compare_default_roundtrip( String(), text("'hello'"), text("'there'") ) def test_primary_key_skip(self): """Test that SERIAL cols are just skipped""" t1 = Table( "sometable", self.metadata, Column("id", Integer, primary_key=True) ) t2 = Table( "sometable", MetaData(), Column("id", Integer, primary_key=True) ) assert not self._compare_default(t1, t2, t2.c.id, "") class PostgresqlDetectSerialTest(TestBase): __only_on__ = "postgresql" __backend__ = True @classmethod def setup_class(cls): cls.bind = config.db staging_env() def setUp(self): self.conn = self.bind.connect() self.migration_context = MigrationContext.configure( connection=self.conn, opts={"compare_type": True, "compare_server_default": True}, ) self.autogen_context = api.AutogenContext(self.migration_context) def tearDown(self): self.conn.close() @classmethod def teardown_class(cls): clear_staging_env() @provide_metadata def _expect_default(self, c_expected, col, seq=None): Table("t", self.metadata, col) self.autogen_context.metadata = self.metadata if seq: seq._set_metadata(self.metadata) self.metadata.create_all(config.db) insp = Inspector.from_engine(config.db) uo = ops.UpgradeOps(ops=[]) _compare_tables( set([(None, "t")]), set([]), insp, uo, self.autogen_context ) diffs = uo.as_diffs() tab = diffs[0][1] eq_( _render_server_default_for_compare( tab.c.x.server_default, tab.c.x, self.autogen_context ), c_expected, ) insp = Inspector.from_engine(config.db) uo = ops.UpgradeOps(ops=[]) m2 = MetaData() Table("t", m2, Column("x", BigInteger())) self.autogen_context.metadata = m2 _compare_tables( set([(None, "t")]), set([(None, "t")]), insp, uo, self.autogen_context, ) diffs = uo.as_diffs() server_default = diffs[0][0][4]["existing_server_default"] eq_( _render_server_default_for_compare( server_default, tab.c.x, self.autogen_context ), c_expected, ) def test_serial(self): self._expect_default(None, Column("x", Integer, primary_key=True)) def test_separate_seq(self): seq = Sequence("x_id_seq") self._expect_default( "nextval('x_id_seq'::regclass)", Column( "x", Integer, server_default=seq.next_value(), primary_key=True ), seq, ) def test_numeric(self): seq = Sequence("x_id_seq") 
self._expect_default( "nextval('x_id_seq'::regclass)", Column( "x", Numeric(8, 2), server_default=seq.next_value(), primary_key=True, ), seq, ) def test_no_default(self): self._expect_default( None, Column("x", Integer, autoincrement=False, primary_key=True) ) class PostgresqlAutogenRenderTest(TestBase): def setUp(self): ctx_opts = { "sqlalchemy_module_prefix": "sa.", "alembic_module_prefix": "op.", "target_metadata": MetaData(), } context = MigrationContext.configure( dialect_name="postgresql", opts=ctx_opts ) self.autogen_context = api.AutogenContext(context) def test_render_add_index_pg_where(self): autogen_context = self.autogen_context m = MetaData() t = Table("t", m, Column("x", String), Column("y", String)) idx = Index( "foo_idx", t.c.x, t.c.y, postgresql_where=(t.c.y == "something") ) op_obj = ops.CreateIndexOp.from_index(idx) eq_ignore_whitespace( autogenerate.render_op_text(autogen_context, op_obj), """op.create_index('foo_idx', 't', \ ['x', 'y'], unique=False, """ """postgresql_where=sa.text(!U"y = 'something'"))""", ) def test_render_server_default_native_boolean(self): c = Column( "updated_at", Boolean(), server_default=false(), nullable=False ) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('updated_at', sa.Boolean(), " "server_default=sa.text(!U'false'), " "nullable=False)", ) def test_postgresql_array_type(self): eq_ignore_whitespace( autogenerate.render._repr_type( ARRAY(Integer), self.autogen_context ), "postgresql.ARRAY(sa.Integer())", ) eq_ignore_whitespace( autogenerate.render._repr_type( ARRAY(DateTime(timezone=True)), self.autogen_context ), "postgresql.ARRAY(sa.DateTime(timezone=True))", ) eq_ignore_whitespace( autogenerate.render._repr_type( ARRAY(BYTEA, as_tuple=True, dimensions=2), self.autogen_context ), "postgresql.ARRAY(postgresql.BYTEA(), " "as_tuple=True, dimensions=2)", ) assert ( "from sqlalchemy.dialects import postgresql" in self.autogen_context.imports ) def test_postgresql_hstore_subtypes(self): eq_ignore_whitespace( autogenerate.render._repr_type(HSTORE(), self.autogen_context), "postgresql.HSTORE(text_type=sa.Text())", ) eq_ignore_whitespace( autogenerate.render._repr_type( HSTORE(text_type=String()), self.autogen_context ), "postgresql.HSTORE(text_type=sa.String())", ) eq_ignore_whitespace( autogenerate.render._repr_type( HSTORE(text_type=BYTEA()), self.autogen_context ), "postgresql.HSTORE(text_type=postgresql.BYTEA())", ) assert ( "from sqlalchemy.dialects import postgresql" in self.autogen_context.imports ) def test_generic_array_type(self): eq_ignore_whitespace( autogenerate.render._repr_type( types.ARRAY(Integer), self.autogen_context ), "sa.ARRAY(sa.Integer())", ) eq_ignore_whitespace( autogenerate.render._repr_type( types.ARRAY(DateTime(timezone=True)), self.autogen_context ), "sa.ARRAY(sa.DateTime(timezone=True))", ) assert ( "from sqlalchemy.dialects import postgresql" not in self.autogen_context.imports ) eq_ignore_whitespace( autogenerate.render._repr_type( types.ARRAY(BYTEA, as_tuple=True, dimensions=2), self.autogen_context, ), "sa.ARRAY(postgresql.BYTEA(), as_tuple=True, dimensions=2)", ) assert ( "from sqlalchemy.dialects import postgresql" in self.autogen_context.imports ) def test_array_type_user_defined_inner(self): def repr_type(typestring, object_, autogen_context): if typestring == "type" and isinstance(object_, String): return "foobar.MYVARCHAR" else: return False self.autogen_context.opts.update(render_item=repr_type) eq_ignore_whitespace( 
autogenerate.render._repr_type( ARRAY(String), self.autogen_context ), "postgresql.ARRAY(foobar.MYVARCHAR)", ) def test_add_exclude_constraint(self): from sqlalchemy.dialects.postgresql import ExcludeConstraint autogen_context = self.autogen_context m = MetaData() t = Table("t", m, Column("x", String), Column("y", String)) op_obj = ops.AddConstraintOp.from_constraint( ExcludeConstraint( (t.c.x, ">"), where=t.c.x != 2, using="gist", name="t_excl_x" ) ) eq_ignore_whitespace( autogenerate.render_op_text(autogen_context, op_obj), "op.create_exclude_constraint('t_excl_x', " "'t', (sa.column('x'), '>'), " "where=sa.text(!U'x != 2'), using='gist')", ) def test_add_exclude_constraint_case_sensitive(self): from sqlalchemy.dialects.postgresql import ExcludeConstraint autogen_context = self.autogen_context m = MetaData() t = Table( "TTAble", m, Column("XColumn", String), Column("YColumn", String) ) op_obj = ops.AddConstraintOp.from_constraint( ExcludeConstraint( (t.c.XColumn, ">"), where=t.c.XColumn != 2, using="gist", name="t_excl_x", ) ) eq_ignore_whitespace( autogenerate.render_op_text(autogen_context, op_obj), "op.create_exclude_constraint('t_excl_x', 'TTAble', " "(sa.column('XColumn'), '>'), " "where=sa.text(!U'\"XColumn\" != 2'), using='gist')", ) def test_inline_exclude_constraint(self): from sqlalchemy.dialects.postgresql import ExcludeConstraint autogen_context = self.autogen_context m = MetaData() t = Table( "t", m, Column("x", String), Column("y", String), ExcludeConstraint( (column("x"), ">"), using="gist", where="x != 2", name="t_excl_x", ), ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(autogen_context, op_obj), "op.create_table('t',sa.Column('x', sa.String(), nullable=True)," "sa.Column('y', sa.String(), nullable=True)," "postgresql.ExcludeConstraint((sa.column('x'), '>'), " "where=sa.text(!U'x != 2'), using='gist', name='t_excl_x')" ")", ) def test_inline_exclude_constraint_case_sensitive(self): from sqlalchemy.dialects.postgresql import ExcludeConstraint autogen_context = self.autogen_context m = MetaData() t = Table( "TTable", m, Column("XColumn", String), Column("YColumn", String) ) ExcludeConstraint( (t.c.XColumn, ">"), using="gist", where='"XColumn" != 2', name="TExclX", ) op_obj = ops.CreateTableOp.from_table(t) eq_ignore_whitespace( autogenerate.render_op_text(autogen_context, op_obj), "op.create_table('TTable',sa.Column('XColumn', sa.String(), " "nullable=True)," "sa.Column('YColumn', sa.String(), nullable=True)," "postgresql.ExcludeConstraint((sa.column('XColumn'), '>'), " "where=sa.text(!U'\"XColumn\" != 2'), using='gist', " "name='TExclX'))", ) def test_json_type(self): eq_ignore_whitespace( autogenerate.render._repr_type(JSON(), self.autogen_context), "postgresql.JSON(astext_type=sa.Text())", ) def test_jsonb_type(self): eq_ignore_whitespace( autogenerate.render._repr_type(JSONB(), self.autogen_context), "postgresql.JSONB(astext_type=sa.Text())", ) zzzeek-alembic-bee044a1c187/tests/test_revision.py000066400000000000000000001043721353106760100222220ustar00rootroot00000000000000from alembic.script.revision import MultipleHeads from alembic.script.revision import Revision from alembic.script.revision import RevisionError from alembic.script.revision import RevisionMap from alembic.testing import assert_raises_message from alembic.testing import config from alembic.testing import eq_ from alembic.testing.fixtures import TestBase from . 
import _large_map class APITest(TestBase): @config.requirements.python3 def test_invalid_datatype(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c", ("b",)), ] ) assert_raises_message( RevisionError, "revision identifier b'12345' is not a string; " "ensure database driver settings are correct", map_.get_revisions, b'12345' ) assert_raises_message( RevisionError, "revision identifier b'12345' is not a string; " "ensure database driver settings are correct", map_.get_revision, b'12345' ) assert_raises_message( RevisionError, r"revision identifier \(b'12345',\) is not a string; " "ensure database driver settings are correct", map_.get_revision, (b'12345', ) ) map_.get_revision(("a", )) map_.get_revision("a") def test_add_revision_one_head(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c", ("b",)), ] ) eq_(map_.heads, ("c",)) map_.add_revision(Revision("d", ("c",))) eq_(map_.heads, ("d",)) def test_add_revision_two_head(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c1", ("b",)), Revision("c2", ("b",)), ] ) eq_(map_.heads, ("c1", "c2")) map_.add_revision(Revision("d1", ("c1",))) eq_(map_.heads, ("c2", "d1")) def test_get_revision_head_single(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c", ("b",)), ] ) eq_(map_.get_revision("head"), map_._revision_map["c"]) def test_get_revision_base_single(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c", ("b",)), ] ) eq_(map_.get_revision("base"), None) def test_get_revision_head_multiple(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c1", ("b",)), Revision("c2", ("b",)), ] ) assert_raises_message( MultipleHeads, "Multiple heads are present", map_.get_revision, "head", ) def test_get_revision_heads_multiple(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c1", ("b",)), Revision("c2", ("b",)), ] ) assert_raises_message( MultipleHeads, "Multiple heads are present", map_.get_revision, "heads", ) def test_get_revision_base_multiple(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c", ()), Revision("d", ("c",)), ] ) eq_(map_.get_revision("base"), None) def test_iterate_tolerates_dupe_targets(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c", ("b",)), ] ) eq_( [r.revision for r in map_._iterate_revisions(("c", "c"), "a")], ["c", "b", "a"], ) def test_repr_revs(self): map_ = RevisionMap( lambda: [ Revision("a", ()), Revision("b", ("a",)), Revision("c", (), dependencies=("a", "b")), ] ) c = map_._revision_map["c"] eq_(repr(c), "Revision('c', None, dependencies=('a', 'b'))") class DownIterateTest(TestBase): def _assert_iteration( self, upper, lower, assertion, inclusive=True, map_=None, implicit_base=False, select_for_downgrade=False, ): if map_ is None: map_ = self.map eq_( [ rev.revision for rev in map_.iterate_revisions( upper, lower, inclusive=inclusive, implicit_base=implicit_base, select_for_downgrade=select_for_downgrade, ) ], assertion, ) class DiamondTest(DownIterateTest): def setUp(self): self.map = RevisionMap( lambda: [ Revision("a", ()), Revision("b1", ("a",)), Revision("b2", ("a",)), Revision("c", ("b1", "b2")), Revision("d", ("c",)), ] ) def test_iterate_simple_diamond(self): self._assert_iteration("d", "a", ["d", "c", "b1", "b2", "a"]) class EmptyMapTest(DownIterateTest): # see 
issue #258 def setUp(self): self.map = RevisionMap(lambda: []) def test_iterate(self): self._assert_iteration("head", "base", []) class LabeledBranchTest(DownIterateTest): def test_dupe_branch_collection(self): def fn(): return [ Revision("a", ()), Revision("b", ("a",)), Revision("c", ("b",), branch_labels=["xy1"]), Revision("d", ()), Revision("e", ("d",), branch_labels=["xy1"]), Revision("f", ("e",)), ] assert_raises_message( RevisionError, r"Branch name 'xy1' in revision (?:e|c) already " "used by revision (?:e|c)", getattr, RevisionMap(fn), "_revision_map", ) def test_filter_for_lineage_labeled_head_across_merge(self): def fn(): return [ Revision("a", ()), Revision("b", ("a",)), Revision("c1", ("b",), branch_labels="c1branch"), Revision("c2", ("b",)), Revision("d", ("c1", "c2")), ] map_ = RevisionMap(fn) c1 = map_.get_revision("c1") c2 = map_.get_revision("c2") d = map_.get_revision("d") eq_(map_.filter_for_lineage([c1, c2, d], "c1branch@head"), [c1, c2, d]) def test_filter_for_lineage_heads(self): eq_( self.map.filter_for_lineage([self.map.get_revision("f")], "heads"), [self.map.get_revision("f")], ) def setUp(self): self.map = RevisionMap( lambda: [ Revision("a", (), branch_labels="abranch"), Revision("b", ("a",)), Revision("somelongername", ("b",)), Revision("c", ("somelongername",)), Revision("d", ()), Revision("e", ("d",), branch_labels=["ebranch"]), Revision("someothername", ("e",)), Revision("f", ("someothername",)), ] ) def test_get_base_revisions_labeled(self): eq_(self.map._get_base_revisions("somelongername@base"), ["a"]) def test_get_current_named_rev(self): eq_(self.map.get_revision("ebranch@head"), self.map.get_revision("f")) def test_get_base_revisions(self): eq_(self.map._get_base_revisions("base"), ["a", "d"]) def test_iterate_head_to_named_base(self): self._assert_iteration( "heads", "ebranch@base", ["f", "someothername", "e", "d"] ) self._assert_iteration( "heads", "abranch@base", ["c", "somelongername", "b", "a"] ) def test_iterate_named_head_to_base(self): self._assert_iteration( "ebranch@head", "base", ["f", "someothername", "e", "d"] ) self._assert_iteration( "abranch@head", "base", ["c", "somelongername", "b", "a"] ) def test_iterate_named_head_to_heads(self): self._assert_iteration("heads", "ebranch@head", ["f"], inclusive=True) def test_iterate_named_rev_to_heads(self): self._assert_iteration( "heads", "ebranch@d", ["f", "someothername", "e", "d"], inclusive=True, ) def test_iterate_head_to_version_specific_base(self): self._assert_iteration( "heads", "e@base", ["f", "someothername", "e", "d"] ) self._assert_iteration( "heads", "c@base", ["c", "somelongername", "b", "a"] ) def test_iterate_to_branch_at_rev(self): self._assert_iteration( "heads", "ebranch@d", ["f", "someothername", "e", "d"] ) def test_branch_w_down_relative(self): self._assert_iteration( "heads", "ebranch@-2", ["f", "someothername", "e"] ) def test_branch_w_up_relative(self): self._assert_iteration( "ebranch@+2", "base", ["someothername", "e", "d"] ) def test_partial_id_resolve(self): eq_(self.map.get_revision("ebranch@some").revision, "someothername") eq_(self.map.get_revision("abranch@some").revision, "somelongername") def test_branch_at_heads(self): eq_(self.map.get_revision("abranch@heads").revision, "c") def test_branch_at_syntax(self): eq_(self.map.get_revision("abranch@head").revision, "c") eq_(self.map.get_revision("abranch@base"), None) eq_(self.map.get_revision("ebranch@head").revision, "f") eq_(self.map.get_revision("abranch@base"), None) 
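# "<branch>@<rev>" also resolves a plain revision id, provided that revision
# is part of the named branch's lineage; test_rev_not_in_branch below covers
# the non-member failure case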
eq_(self.map.get_revision("ebranch@d").revision, "d") def test_branch_at_self(self): eq_(self.map.get_revision("ebranch@ebranch").revision, "e") def test_retrieve_branch_revision(self): eq_(self.map.get_revision("abranch").revision, "a") eq_(self.map.get_revision("ebranch").revision, "e") def test_rev_not_in_branch(self): assert_raises_message( RevisionError, "Revision b is not a member of branch 'ebranch'", self.map.get_revision, "ebranch@b", ) assert_raises_message( RevisionError, "Revision d is not a member of branch 'abranch'", self.map.get_revision, "abranch@d", ) def test_no_revision_exists(self): assert_raises_message( RevisionError, "No such revision or branch 'q'", self.map.get_revision, "abranch@q", ) def test_not_actually_a_branch(self): eq_(self.map.get_revision("e@d").revision, "d") def test_not_actually_a_branch_partial_resolution(self): eq_(self.map.get_revision("someoth@d").revision, "d") def test_no_such_branch(self): assert_raises_message( RevisionError, "No such branch: 'x'", self.map.get_revision, "x@d" ) class LongShortBranchTest(DownIterateTest): def setUp(self): self.map = RevisionMap( lambda: [ Revision("a", ()), Revision("b1", ("a",)), Revision("b2", ("a",)), Revision("c1", ("b1",)), Revision("d11", ("c1",)), Revision("d12", ("c1",)), ] ) def test_iterate_full(self): self._assert_iteration( "heads", "base", ["b2", "d11", "d12", "c1", "b1", "a"] ) class MultipleBranchTest(DownIterateTest): def setUp(self): self.map = RevisionMap( lambda: [ Revision("a", ()), Revision("b1", ("a",)), Revision("b2", ("a",)), Revision("cb1", ("b1",)), Revision("cb2", ("b2",)), Revision("d1cb1", ("cb1",)), # head Revision("d2cb1", ("cb1",)), # head Revision("d1cb2", ("cb2",)), Revision("d2cb2", ("cb2",)), Revision("d3cb2", ("cb2",)), # head Revision("d1d2cb2", ("d1cb2", "d2cb2")), # head + merge point ] ) def test_iterate_from_merge_point(self): self._assert_iteration( "d1d2cb2", "a", ["d1d2cb2", "d1cb2", "d2cb2", "cb2", "b2", "a"] ) def test_iterate_multiple_heads(self): self._assert_iteration( ["d2cb2", "d3cb2"], "a", ["d2cb2", "d3cb2", "cb2", "b2", "a"] ) def test_iterate_single_branch(self): self._assert_iteration("d3cb2", "a", ["d3cb2", "cb2", "b2", "a"]) def test_iterate_single_branch_to_base(self): self._assert_iteration("d3cb2", "base", ["d3cb2", "cb2", "b2", "a"]) def test_iterate_multiple_branch_to_base(self): self._assert_iteration( ["d3cb2", "cb1"], "base", ["d3cb2", "cb2", "b2", "cb1", "b1", "a"] ) def test_iterate_multiple_heads_single_base(self): # head d1cb1 is omitted as it is not # a descendant of b2 self._assert_iteration( ["d1cb1", "d2cb2", "d3cb2"], "b2", ["d2cb2", "d3cb2", "cb2", "b2"] ) def test_same_branch_wrong_direction(self): # nodes b1 and d1cb1 are connected, but # db1cb1 is the descendant of b1 assert_raises_message( RevisionError, r"Revision d1cb1 is not an ancestor of revision b1", list, self.map._iterate_revisions("b1", "d1cb1"), ) def test_distinct_branches(self): # nodes db2cb2 and b1 have no path to each other assert_raises_message( RevisionError, r"Revision b1 is not an ancestor of revision d2cb2", list, self.map._iterate_revisions("d2cb2", "b1"), ) def test_wrong_direction_to_base_as_none(self): # this needs to raise and not just return empty iteration # as added by #258 assert_raises_message( RevisionError, r"Revision d1cb1 is not an ancestor of revision base", list, self.map._iterate_revisions(None, "d1cb1"), ) def test_wrong_direction_to_base_as_empty(self): # this needs to raise and not just return empty iteration # as added by #258 
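# here the "upper" target is given as an empty tuple rather than None; it
# should resolve the same way as "base" and therefore raise, since iterating
# from base down to d1cb1 runs against the graph direction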
assert_raises_message( RevisionError, r"Revision d1cb1 is not an ancestor of revision base", list, self.map._iterate_revisions((), "d1cb1"), ) class BranchTravellingTest(DownIterateTest): """test the order of revs when going along multiple branches. We want depth-first along branches, but then we want to terminate all branches at their branch point before continuing to the nodes preceding that branch. """ def setUp(self): self.map = RevisionMap( lambda: [ Revision("a1", ()), Revision("a2", ("a1",)), Revision("a3", ("a2",)), Revision("b1", ("a3",)), Revision("b2", ("a3",)), Revision("cb1", ("b1",)), Revision("cb2", ("b2",)), Revision("db1", ("cb1",)), Revision("db2", ("cb2",)), Revision("e1b1", ("db1",)), Revision("fe1b1", ("e1b1",)), Revision("e2b1", ("db1",)), Revision("e2b2", ("db2",)), Revision("merge", ("e2b1", "e2b2")), ] ) def test_iterate_one_branch_both_to_merge(self): # test that when we hit a merge point, implicit base will # ensure all branches that supply the merge point are filled in self._assert_iteration( "merge", "db1", ["merge", "e2b1", "db1", "e2b2", "db2", "cb2", "b2"], implicit_base=True, ) def test_three_branches_end_in_single_branch(self): self._assert_iteration( ["merge", "fe1b1"], "a3", [ "merge", "e2b1", "e2b2", "db2", "cb2", "b2", "fe1b1", "e1b1", "db1", "cb1", "b1", "a3", ], ) def test_two_branches_to_root(self): # here we want 'a3' as a "stop" branch point, but *not* # 'db1', as we don't have multiple traversals on db1 self._assert_iteration( "merge", "a1", [ "merge", "e2b1", "db1", "cb1", "b1", # e2b1 branch "e2b2", "db2", "cb2", "b2", # e2b2 branch "a3", # both terminate at a3 "a2", "a1", # finish out ], # noqa ) def test_two_branches_end_in_branch(self): self._assert_iteration( "merge", "b1", # 'b1' is local to 'e2b1' # branch so that is all we get ["merge", "e2b1", "db1", "cb1", "b1"], # noqa ) def test_two_branches_end_behind_branch(self): self._assert_iteration( "merge", "a2", [ "merge", "e2b1", "db1", "cb1", "b1", # e2b1 branch "e2b2", "db2", "cb2", "b2", # e2b2 branch "a3", # both terminate at a3 "a2", ], # noqa ) def test_three_branches_to_root(self): # in this case, both "a3" and "db1" are stop points self._assert_iteration( ["merge", "fe1b1"], "a1", [ "merge", "e2b1", # e2b1 branch "e2b2", "db2", "cb2", "b2", # e2b2 branch "fe1b1", "e1b1", # fe1b1 branch "db1", # fe1b1 and e2b1 branches terminate at db1 "cb1", "b1", # e2b1 branch continued....might be nicer # if this was before the e2b2 branch... 
"a3", # e2b1 and e2b2 branches terminate at a3 "a2", "a1", # finish out ], # noqa ) def test_three_branches_end_multiple_bases(self): # in this case, both "a3" and "db1" are stop points self._assert_iteration( ["merge", "fe1b1"], ["cb1", "cb2"], [ "merge", "e2b1", "e2b2", "db2", "cb2", "fe1b1", "e1b1", "db1", "cb1", ], ) def test_three_branches_end_multiple_bases_exclusive(self): self._assert_iteration( ["merge", "fe1b1"], ["cb1", "cb2"], ["merge", "e2b1", "e2b2", "db2", "fe1b1", "e1b1", "db1"], inclusive=False, ) def test_detect_invalid_head_selection(self): # db1 is an ancestor of fe1b1 assert_raises_message( RevisionError, "Requested revision fe1b1 overlaps " "with other requested revisions", list, self.map._iterate_revisions(["db1", "b2", "fe1b1"], ()), ) def test_three_branches_end_multiple_bases_exclusive_blank(self): self._assert_iteration( ["e2b1", "b2", "fe1b1"], (), [ "e2b1", "b2", "fe1b1", "e1b1", "db1", "cb1", "b1", "a3", "a2", "a1", ], inclusive=False, ) def test_iterate_to_symbolic_base(self): self._assert_iteration( ["fe1b1"], "base", ["fe1b1", "e1b1", "db1", "cb1", "b1", "a3", "a2", "a1"], inclusive=False, ) def test_ancestor_nodes(self): merge = self.map.get_revision("merge") eq_( set( rev.revision for rev in self.map._get_ancestor_nodes([merge], check=True) ), set( [ "a1", "e2b2", "e2b1", "cb2", "merge", "a3", "a2", "b1", "b2", "db1", "db2", "cb1", ] ), ) class MultipleBaseTest(DownIterateTest): def setUp(self): self.map = RevisionMap( lambda: [ Revision("base1", ()), Revision("base2", ()), Revision("base3", ()), Revision("a1a", ("base1",)), Revision("a1b", ("base1",)), Revision("a2", ("base2",)), Revision("a3", ("base3",)), Revision("b1a", ("a1a",)), Revision("b1b", ("a1b",)), Revision("b2", ("a2",)), Revision("b3", ("a3",)), Revision("c2", ("b2",)), Revision("d2", ("c2",)), Revision("mergeb3d2", ("b3", "d2")), ] ) def test_heads_to_base(self): self._assert_iteration( "heads", "base", [ "b1a", "a1a", "b1b", "a1b", "mergeb3d2", "b3", "a3", "base3", "d2", "c2", "b2", "a2", "base2", "base1", ], ) def test_heads_to_base_exclusive(self): self._assert_iteration( "heads", "base", [ "b1a", "a1a", "b1b", "a1b", "mergeb3d2", "b3", "a3", "base3", "d2", "c2", "b2", "a2", "base2", "base1", ], inclusive=False, ) def test_heads_to_blank(self): self._assert_iteration( "heads", None, [ "b1a", "a1a", "b1b", "a1b", "mergeb3d2", "b3", "a3", "base3", "d2", "c2", "b2", "a2", "base2", "base1", ], ) def test_detect_invalid_base_selection(self): assert_raises_message( RevisionError, "Requested revision a2 overlaps with " "other requested revisions", list, self.map._iterate_revisions(["c2"], ["a2", "b2"]), ) def test_heads_to_revs_plus_implicit_base_exclusive(self): self._assert_iteration( "heads", ["c2"], [ "b1a", "a1a", "b1b", "a1b", "mergeb3d2", "b3", "a3", "base3", "d2", "base1", ], inclusive=False, implicit_base=True, ) def test_heads_to_revs_base_exclusive(self): self._assert_iteration( "heads", ["c2"], ["mergeb3d2", "d2"], inclusive=False ) def test_heads_to_revs_plus_implicit_base_inclusive(self): self._assert_iteration( "heads", ["c2"], [ "b1a", "a1a", "b1b", "a1b", "mergeb3d2", "b3", "a3", "base3", "d2", "c2", "base1", ], implicit_base=True, ) def test_specific_path_one(self): self._assert_iteration("b3", "base3", ["b3", "a3", "base3"]) def test_specific_path_two_implicit_base(self): self._assert_iteration( ["b3", "b2"], "base3", ["b3", "a3", "b2", "a2", "base2"], inclusive=False, implicit_base=True, ) class MultipleBaseCrossDependencyTestOne(DownIterateTest): def setUp(self): """ 
Structure:: base1 -----> a1a -> b1a +----> a1b -> b1b | +-----------+ | v base3 -> a3 -> b3 ^ | +-----------+ | base2 -> a2 -> b2 -> c2 -> d2 """ self.map = RevisionMap( lambda: [ Revision("base1", (), branch_labels="b_1"), Revision("a1a", ("base1",)), Revision("a1b", ("base1",)), Revision("b1a", ("a1a",)), Revision("b1b", ("a1b",), dependencies="a3"), Revision("base2", (), branch_labels="b_2"), Revision("a2", ("base2",)), Revision("b2", ("a2",)), Revision("c2", ("b2",), dependencies="a3"), Revision("d2", ("c2",)), Revision("base3", (), branch_labels="b_3"), Revision("a3", ("base3",)), Revision("b3", ("a3",)), ] ) def test_what_are_the_heads(self): eq_(self.map.heads, ("b1a", "b1b", "d2", "b3")) def test_heads_to_base(self): self._assert_iteration( "heads", "base", [ "b1a", "a1a", "b1b", "a1b", "d2", "c2", "b2", "a2", "base2", "b3", "a3", "base3", "base1", ], ) def test_heads_to_base_downgrade(self): self._assert_iteration( "heads", "base", [ "b1a", "a1a", "b1b", "a1b", "d2", "c2", "b2", "a2", "base2", "b3", "a3", "base3", "base1", ], select_for_downgrade=True, ) def test_same_branch_wrong_direction(self): assert_raises_message( RevisionError, r"Revision d2 is not an ancestor of revision b2", list, self.map._iterate_revisions("b2", "d2"), ) def test_different_branch_not_wrong_direction(self): self._assert_iteration("b3", "d2", []) def test_we_need_head2_upgrade(self): # the 2 branch relies on the 3 branch self._assert_iteration( "b_2@head", "base", ["d2", "c2", "b2", "a2", "base2", "a3", "base3"], ) def test_we_need_head2_downgrade(self): # the 2 branch relies on the 3 branch, but # on the downgrade side, don't need to touch the 3 branch self._assert_iteration( "b_2@head", "b_2@base", ["d2", "c2", "b2", "a2", "base2"], select_for_downgrade=True, ) def test_we_need_head3_upgrade(self): # the 3 branch can be upgraded alone. self._assert_iteration("b_3@head", "base", ["b3", "a3", "base3"]) def test_we_need_head3_downgrade(self): # the 3 branch can be upgraded alone. self._assert_iteration( "b_3@head", "base", ["b3", "a3", "base3"], select_for_downgrade=True, ) def test_we_need_head1_upgrade(self): # the 1 branch relies on the 3 branch self._assert_iteration( "b1b@head", "base", ["b1b", "a1b", "base1", "a3", "base3"] ) def test_we_need_head1_downgrade(self): # going down we don't need a3-> base3, as long # as we are limiting the base target self._assert_iteration( "b1b@head", "b1b@base", ["b1b", "a1b", "base1"], select_for_downgrade=True, ) def test_we_need_base2_upgrade(self): # consider a downgrade to b_2@base - we # want to run through all the "2"s alone, and we're done. self._assert_iteration( "heads", "b_2@base", ["d2", "c2", "b2", "a2", "base2"] ) def test_we_need_base2_downgrade(self): # consider a downgrade to b_2@base - we # want to run through all the "2"s alone, and we're done. self._assert_iteration( "heads", "b_2@base", ["d2", "c2", "b2", "a2", "base2"], select_for_downgrade=True, ) def test_we_need_base3_upgrade(self): self._assert_iteration( "heads", "b_3@base", ["b1b", "d2", "c2", "b3", "a3", "base3"] ) def test_we_need_base3_downgrade(self): # consider a downgrade to b_3@base - due to the a3 dependency, we # need to downgrade everything dependent on a3 # as well, which means b1b and c2. Then we can downgrade # the 3s. 
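# d2 is expected as well: it sits downstream of c2, so it has to come off
# before c2 itself can be downgraded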
self._assert_iteration( "heads", "b_3@base", ["b1b", "d2", "c2", "b3", "a3", "base3"], select_for_downgrade=True, ) class MultipleBaseCrossDependencyTestTwo(DownIterateTest): def setUp(self): self.map = RevisionMap( lambda: [ Revision("base1", (), branch_labels="b_1"), Revision("a1", "base1"), Revision("b1", "a1"), Revision("c1", "b1"), Revision("base2", (), dependencies="b_1", branch_labels="b_2"), Revision("a2", "base2"), Revision("b2", "a2"), Revision("c2", "b2"), Revision("d2", "c2"), Revision("base3", (), branch_labels="b_3"), Revision("a3", "base3"), Revision("b3", "a3"), Revision("c3", "b3", dependencies="b2"), Revision("d3", "c3"), ] ) def test_what_are_the_heads(self): eq_(self.map.heads, ("c1", "d2", "d3")) def test_heads_to_base(self): self._assert_iteration( "heads", "base", [ "c1", "b1", "a1", "d2", "c2", "d3", "c3", "b3", "a3", "base3", "b2", "a2", "base2", "base1", ], ) def test_we_need_head2(self): self._assert_iteration( "b_2@head", "base", ["d2", "c2", "b2", "a2", "base2", "base1"] ) def test_we_need_head3(self): self._assert_iteration( "b_3@head", "base", ["d3", "c3", "b3", "a3", "base3", "b2", "a2", "base2", "base1"], ) def test_we_need_head1(self): self._assert_iteration("b_1@head", "base", ["c1", "b1", "a1", "base1"]) def test_we_need_base1(self): self._assert_iteration( "heads", "b_1@base", [ "c1", "b1", "a1", "d2", "c2", "d3", "c3", "b2", "a2", "base2", "base1", ], ) def test_we_need_base2(self): self._assert_iteration( "heads", "b_2@base", ["d2", "c2", "d3", "c3", "b2", "a2", "base2"] ) def test_we_need_base3(self): self._assert_iteration( "heads", "b_3@base", ["d3", "c3", "b3", "a3", "base3"] ) class LargeMapTest(DownIterateTest): def setUp(self): self.map = _large_map.map_ def test_all(self): raw = [r for r in self.map._revision_map.values() if r is not None] revs = [rev for rev in self.map.iterate_revisions("heads", "base")] eq_(set(raw), set(revs)) for idx, rev in enumerate(revs): ancestors = set(self.map._get_ancestor_nodes([rev])).difference( [rev] ) descendants = set( self.map._get_descendant_nodes([rev]) ).difference([rev]) assert not ancestors.intersection(descendants) remaining = set(revs[idx + 1 :]) if remaining: assert remaining.intersection(ancestors) class DepResolutionFailedTest(DownIterateTest): def setUp(self): self.map = RevisionMap( lambda: [ Revision("base1", ()), Revision("a1", "base1"), Revision("a2", "base1"), Revision("b1", "a1"), Revision("c1", "b1"), ] ) # intentionally make a broken map self.map._revision_map["fake"] = self.map._revision_map["a2"] self.map._revision_map["b1"].dependencies = "fake" self.map._revision_map["b1"]._resolved_dependencies = ("fake",) def test_failure_message(self): iter_ = self.map.iterate_revisions("c1", "base1") assert_raises_message( RevisionError, "Dependency resolution failed;", list, iter_ ) zzzeek-alembic-bee044a1c187/tests/test_script_consumption.py000066400000000000000000000501231353106760100243200ustar00rootroot00000000000000# coding: utf-8 from contextlib import contextmanager import os import re import textwrap from alembic import command from alembic import util from alembic.environment import EnvironmentContext from alembic.script import Script from alembic.script import ScriptDirectory from alembic.testing import assert_raises_message from alembic.testing import eq_ from alembic.testing import mock from alembic.testing.env import _no_sql_testing_config from alembic.testing.env import _sqlite_file_db from alembic.testing.env import _sqlite_testing_config from alembic.testing.env import 
clear_staging_env from alembic.testing.env import env_file_fixture from alembic.testing.env import staging_env from alembic.testing.env import three_rev_fixture from alembic.testing.env import write_script from alembic.testing.fixtures import capture_context_buffer from alembic.testing.fixtures import TestBase from alembic.util import compat class ApplyVersionsFunctionalTest(TestBase): __only_on__ = "sqlite" sourceless = False def setUp(self): self.bind = _sqlite_file_db() self.env = staging_env(sourceless=self.sourceless) self.cfg = _sqlite_testing_config(sourceless=self.sourceless) def tearDown(self): clear_staging_env() def test_steps(self): self._test_001_revisions() self._test_002_upgrade() self._test_003_downgrade() self._test_004_downgrade() self._test_005_upgrade() self._test_006_upgrade_again() self._test_007_stamp_upgrade() def _test_001_revisions(self): self.a = a = util.rev_id() self.b = b = util.rev_id() self.c = c = util.rev_id() script = ScriptDirectory.from_config(self.cfg) script.generate_revision(a, None, refresh=True) write_script( script, a, """ revision = '%s' down_revision = None from alembic import op def upgrade(): op.execute("CREATE TABLE foo(id integer)") def downgrade(): op.execute("DROP TABLE foo") """ % a, sourceless=self.sourceless, ) script.generate_revision(b, None, refresh=True) write_script( script, b, """ revision = '%s' down_revision = '%s' from alembic import op def upgrade(): op.execute("CREATE TABLE bar(id integer)") def downgrade(): op.execute("DROP TABLE bar") """ % (b, a), sourceless=self.sourceless, ) script.generate_revision(c, None, refresh=True) write_script( script, c, """ revision = '%s' down_revision = '%s' from alembic import op def upgrade(): op.execute("CREATE TABLE bat(id integer)") def downgrade(): op.execute("DROP TABLE bat") """ % (c, b), sourceless=self.sourceless, ) def _test_002_upgrade(self): command.upgrade(self.cfg, self.c) db = self.bind assert db.dialect.has_table(db.connect(), "foo") assert db.dialect.has_table(db.connect(), "bar") assert db.dialect.has_table(db.connect(), "bat") def _test_003_downgrade(self): command.downgrade(self.cfg, self.a) db = self.bind assert db.dialect.has_table(db.connect(), "foo") assert not db.dialect.has_table(db.connect(), "bar") assert not db.dialect.has_table(db.connect(), "bat") def _test_004_downgrade(self): command.downgrade(self.cfg, "base") db = self.bind assert not db.dialect.has_table(db.connect(), "foo") assert not db.dialect.has_table(db.connect(), "bar") assert not db.dialect.has_table(db.connect(), "bat") def _test_005_upgrade(self): command.upgrade(self.cfg, self.b) db = self.bind assert db.dialect.has_table(db.connect(), "foo") assert db.dialect.has_table(db.connect(), "bar") assert not db.dialect.has_table(db.connect(), "bat") def _test_006_upgrade_again(self): command.upgrade(self.cfg, self.b) db = self.bind assert db.dialect.has_table(db.connect(), "foo") assert db.dialect.has_table(db.connect(), "bar") assert not db.dialect.has_table(db.connect(), "bat") def _test_007_stamp_upgrade(self): command.stamp(self.cfg, self.c) db = self.bind assert db.dialect.has_table(db.connect(), "foo") assert db.dialect.has_table(db.connect(), "bar") assert not db.dialect.has_table(db.connect(), "bat") class SimpleSourcelessApplyVersionsTest(ApplyVersionsFunctionalTest): sourceless = "simple" class NewFangledSourcelessEnvOnlyApplyVersionsTest( ApplyVersionsFunctionalTest ): sourceless = "pep3147_envonly" __requires__ = ("pep3147",) class NewFangledSourcelessEverythingApplyVersionsTest( 
ApplyVersionsFunctionalTest ): sourceless = "pep3147_everything" __requires__ = ("pep3147",) class CallbackEnvironmentTest(ApplyVersionsFunctionalTest): exp_kwargs = frozenset(("ctx", "heads", "run_args", "step")) @staticmethod def _env_file_fixture(): env_file_fixture( textwrap.dedent( """\ import alembic from alembic import context from sqlalchemy import engine_from_config, pool config = context.config target_metadata = None def run_migrations_offline(): url = config.get_main_option('sqlalchemy.url') context.configure( url=url, target_metadata=target_metadata, on_version_apply=alembic.mock_event_listener, literal_binds=True) with context.begin_transaction(): context.run_migrations() def run_migrations_online(): connectable = engine_from_config( config.get_section(config.config_ini_section), prefix='sqlalchemy.', poolclass=pool.NullPool) with connectable.connect() as connection: context.configure( connection=connection, on_version_apply=alembic.mock_event_listener, target_metadata=target_metadata, ) with context.begin_transaction(): context.run_migrations() if context.is_offline_mode(): run_migrations_offline() else: run_migrations_online() """ ) ) def test_steps(self): import alembic alembic.mock_event_listener = None self._env_file_fixture() with mock.patch("alembic.mock_event_listener", mock.Mock()) as mymock: super(CallbackEnvironmentTest, self).test_steps() calls = mymock.call_args_list assert calls for call in calls: args, kw = call assert not args assert set(kw.keys()) >= self.exp_kwargs assert kw["run_args"] == {} assert hasattr(kw["ctx"], "get_current_revision") step = kw["step"] assert isinstance(step.is_upgrade, bool) assert isinstance(step.is_stamp, bool) assert isinstance(step.is_migration, bool) assert isinstance(step.up_revision_id, compat.string_types) assert isinstance(step.up_revision, Script) for revtype in "up", "down", "source", "destination": revs = getattr(step, "%s_revisions" % revtype) assert isinstance(revs, tuple) for rev in revs: assert isinstance(rev, Script) revids = getattr(step, "%s_revision_ids" % revtype) for revid in revids: assert isinstance(revid, compat.string_types) heads = kw["heads"] assert hasattr(heads, "__iter__") for h in heads: assert h is None or isinstance(h, compat.string_types) class OfflineTransactionalDDLTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = cfg = _no_sql_testing_config() cfg.set_main_option("dialect_name", "sqlite") cfg.remove_main_option("url") self.a, self.b, self.c = three_rev_fixture(cfg) def tearDown(self): clear_staging_env() def test_begin_commit_transactional_ddl(self): with capture_context_buffer(transactional_ddl=True) as buf: command.upgrade(self.cfg, self.c, sql=True) assert re.match( (r"^BEGIN;\s+CREATE TABLE.*?%s.*" % self.a) + (r".*%s" % self.b) + (r".*%s.*?COMMIT;.*$" % self.c), buf.getvalue(), re.S, ) def test_begin_commit_nontransactional_ddl(self): with capture_context_buffer(transactional_ddl=False) as buf: command.upgrade(self.cfg, self.a, sql=True) assert re.match(r"^CREATE TABLE.*?\n+$", buf.getvalue(), re.S) assert "COMMIT;" not in buf.getvalue() def test_begin_commit_per_rev_ddl(self): with capture_context_buffer(transaction_per_migration=True) as buf: command.upgrade(self.cfg, self.c, sql=True) assert re.match( (r"^BEGIN;\s+CREATE TABLE.*%s.*?COMMIT;.*" % self.a) + (r"BEGIN;.*?%s.*?COMMIT;.*" % self.b) + (r"BEGIN;.*?%s.*?COMMIT;.*$" % self.c), buf.getvalue(), re.S, ) class OnlineTransactionalDDLTest(TestBase): def tearDown(self): clear_staging_env() def 
_opened_transaction_fixture(self): self.env = staging_env() self.cfg = _sqlite_testing_config() script = ScriptDirectory.from_config(self.cfg) a = util.rev_id() b = util.rev_id() c = util.rev_id() script.generate_revision(a, "revision a", refresh=True) write_script( script, a, """ "rev a" revision = '%s' down_revision = None def upgrade(): pass def downgrade(): pass """ % (a,), ) script.generate_revision(b, "revision b", refresh=True) write_script( script, b, """ "rev b" revision = '%s' down_revision = '%s' from alembic import op def upgrade(): conn = op.get_bind() trans = conn.begin() def downgrade(): pass """ % (b, a), ) script.generate_revision(c, "revision c", refresh=True) write_script( script, c, """ "rev c" revision = '%s' down_revision = '%s' from alembic import op def upgrade(): pass def downgrade(): pass """ % (c, b), ) return a, b, c @contextmanager def _patch_environment(self, transactional_ddl, transaction_per_migration): conf = EnvironmentContext.configure def configure(*arg, **opt): opt.update( transactional_ddl=transactional_ddl, transaction_per_migration=transaction_per_migration, ) return conf(*arg, **opt) with mock.patch.object(EnvironmentContext, "configure", configure): yield def test_raise_when_rev_leaves_open_transaction(self): a, b, c = self._opened_transaction_fixture() with self._patch_environment( transactional_ddl=False, transaction_per_migration=False ): assert_raises_message( util.CommandError, r'Migration "upgrade .*, rev b" has left an uncommitted ' r"transaction opened; transactional_ddl is False so Alembic " r"is not committing transactions", command.upgrade, self.cfg, c, ) def test_raise_when_rev_leaves_open_transaction_tpm(self): a, b, c = self._opened_transaction_fixture() with self._patch_environment( transactional_ddl=False, transaction_per_migration=True ): assert_raises_message( util.CommandError, r'Migration "upgrade .*, rev b" has left an uncommitted ' r"transaction opened; transactional_ddl is False so Alembic " r"is not committing transactions", command.upgrade, self.cfg, c, ) def test_noerr_rev_leaves_open_transaction_transactional_ddl(self): a, b, c = self._opened_transaction_fixture() with self._patch_environment( transactional_ddl=True, transaction_per_migration=False ): command.upgrade(self.cfg, c) def test_noerr_transaction_opened_externally(self): a, b, c = self._opened_transaction_fixture() env_file_fixture( """ from sqlalchemy import engine_from_config, pool def run_migrations_online(): connectable = engine_from_config( config.get_section(config.config_ini_section), prefix='sqlalchemy.', poolclass=pool.NullPool) with connectable.connect() as connection: with connection.begin() as real_trans: context.configure( connection=connection, transactional_ddl=False, transaction_per_migration=False ) with context.begin_transaction(): context.run_migrations() run_migrations_online() """ ) command.stamp(self.cfg, c) class EncodingTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = cfg = _no_sql_testing_config() cfg.set_main_option("dialect_name", "sqlite") cfg.remove_main_option("url") self.a = util.rev_id() script = ScriptDirectory.from_config(cfg) script.generate_revision(self.a, "revision a", refresh=True) write_script( script, self.a, ( compat.u( """# coding: utf-8 from __future__ import unicode_literals revision = '%s' down_revision = None from alembic import op def upgrade(): op.execute("« S’il vous plaît…") def downgrade(): op.execute("drôle de petite voix m’a réveillé") """ ) % self.a ), encoding="utf-8", ) def 
tearDown(self): clear_staging_env() def test_encode(self): with capture_context_buffer( bytes_io=True, output_encoding="utf-8" ) as buf: command.upgrade(self.cfg, self.a, sql=True) assert compat.u("« S’il vous plaît…").encode("utf-8") in buf.getvalue() class VersionNameTemplateTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = _sqlite_testing_config() def tearDown(self): clear_staging_env() def test_option(self): self.cfg.set_main_option("file_template", "myfile_%%(slug)s") script = ScriptDirectory.from_config(self.cfg) a = util.rev_id() script.generate_revision(a, "some message", refresh=True) write_script( script, a, """ revision = '%s' down_revision = None from alembic import op def upgrade(): op.execute("CREATE TABLE foo(id integer)") def downgrade(): op.execute("DROP TABLE foo") """ % a, ) script = ScriptDirectory.from_config(self.cfg) rev = script.get_revision(a) eq_(rev.revision, a) eq_(os.path.basename(rev.path), "myfile_some_message.py") def test_lookup_legacy(self): self.cfg.set_main_option("file_template", "%%(rev)s") script = ScriptDirectory.from_config(self.cfg) a = util.rev_id() script.generate_revision(a, None, refresh=True) write_script( script, a, """ down_revision = None from alembic import op def upgrade(): op.execute("CREATE TABLE foo(id integer)") def downgrade(): op.execute("DROP TABLE foo") """, ) script = ScriptDirectory.from_config(self.cfg) rev = script.get_revision(a) eq_(rev.revision, a) eq_(os.path.basename(rev.path), "%s.py" % a) def test_error_on_new_with_missing_revision(self): self.cfg.set_main_option("file_template", "%%(slug)s_%%(rev)s") script = ScriptDirectory.from_config(self.cfg) a = util.rev_id() script.generate_revision(a, "foobar", refresh=True) path = script.get_revision(a).path with open(path, "w") as fp: fp.write( """ down_revision = None from alembic import op def upgrade(): op.execute("CREATE TABLE foo(id integer)") def downgrade(): op.execute("DROP TABLE foo") """ ) pyc_path = util.pyc_file_from_path(path) if pyc_path is not None and os.access(pyc_path, os.F_OK): os.unlink(pyc_path) assert_raises_message( util.CommandError, "Could not determine revision id from filename foobar_%s.py. " "Be sure the 'revision' variable is declared " "inside the script." 
% a, Script._from_path, script, path, ) class IgnoreFilesTest(TestBase): sourceless = False def setUp(self): self.bind = _sqlite_file_db() self.env = staging_env(sourceless=self.sourceless) self.cfg = _sqlite_testing_config(sourceless=self.sourceless) def tearDown(self): clear_staging_env() def _test_ignore_file_py(self, fname): command.revision(self.cfg, message="some rev") script = ScriptDirectory.from_config(self.cfg) path = os.path.join(script.versions, fname) with open(path, "w") as f: f.write("crap, crap -> crap") command.revision(self.cfg, message="another rev") script.get_revision("head") def _test_ignore_init_py(self, ext): """test that __init__.py is ignored.""" self._test_ignore_file_py("__init__.%s" % ext) def _test_ignore_dot_hash_py(self, ext): """test that .#test.py is ignored.""" self._test_ignore_file_py(".#test.%s" % ext) def test_ignore_init_py(self): self._test_ignore_init_py("py") def test_ignore_init_pyc(self): self._test_ignore_init_py("pyc") def test_ignore_init_pyx(self): self._test_ignore_init_py("pyx") def test_ignore_init_pyo(self): self._test_ignore_init_py("pyo") def test_ignore_dot_hash_py(self): self._test_ignore_dot_hash_py("py") def test_ignore_dot_hash_pyc(self): self._test_ignore_dot_hash_py("pyc") def test_ignore_dot_hash_pyx(self): self._test_ignore_dot_hash_py("pyx") def test_ignore_dot_hash_pyo(self): self._test_ignore_dot_hash_py("pyo") class SimpleSourcelessIgnoreFilesTest(IgnoreFilesTest): sourceless = "simple" class NewFangledEnvOnlySourcelessIgnoreFilesTest(IgnoreFilesTest): sourceless = "pep3147_envonly" __requires__ = ("pep3147",) class NewFangledEverythingSourcelessIgnoreFilesTest(IgnoreFilesTest): sourceless = "pep3147_everything" __requires__ = ("pep3147",) class SourcelessNeedsFlagTest(TestBase): def setUp(self): self.env = staging_env(sourceless=False) self.cfg = _sqlite_testing_config() def tearDown(self): clear_staging_env() def test_needs_flag(self): a = util.rev_id() script = ScriptDirectory.from_config(self.cfg) script.generate_revision(a, None, refresh=True) write_script( script, a, """ revision = '%s' down_revision = None from alembic import op def upgrade(): op.execute("CREATE TABLE foo(id integer)") def downgrade(): op.execute("DROP TABLE foo") """ % a, sourceless=True, ) script = ScriptDirectory.from_config(self.cfg) eq_(script.get_heads(), []) self.cfg.set_main_option("sourceless", "true") script = ScriptDirectory.from_config(self.cfg) eq_(script.get_heads(), [a]) zzzeek-alembic-bee044a1c187/tests/test_script_production.py000066400000000000000000001053241353106760100241340ustar00rootroot00000000000000import datetime import os import re from dateutil import tz import sqlalchemy as sa from sqlalchemy.engine.reflection import Inspector from alembic import autogenerate from alembic import command from alembic import util from alembic.environment import EnvironmentContext from alembic.operations import ops from alembic.script import ScriptDirectory from alembic.testing import assert_raises_message from alembic.testing import assertions from alembic.testing import eq_ from alembic.testing import is_ from alembic.testing import mock from alembic.testing import ne_ from alembic.testing.env import _get_staging_directory from alembic.testing.env import _multi_dir_testing_config from alembic.testing.env import _multidb_testing_config from alembic.testing.env import _no_sql_testing_config from alembic.testing.env import _sqlite_file_db from alembic.testing.env import _sqlite_testing_config from alembic.testing.env import _testing_config 
from alembic.testing.env import clear_staging_env from alembic.testing.env import env_file_fixture from alembic.testing.env import script_file_fixture from alembic.testing.env import staging_env from alembic.testing.env import three_rev_fixture from alembic.testing.env import write_script from alembic.testing.fixtures import TestBase from alembic.util import CommandError env, abc, def_ = None, None, None class GeneralOrderedTests(TestBase): def setUp(self): global env env = staging_env() def tearDown(self): clear_staging_env() def test_steps(self): self._test_001_environment() self._test_002_rev_ids() self._test_003_api_methods_clean() self._test_004_rev() self._test_005_nextrev() self._test_006_from_clean_env() self._test_007_long_name() self._test_008_long_name_configurable() def _test_001_environment(self): assert_set = set(["env.py", "script.py.mako", "README"]) eq_(assert_set.intersection(os.listdir(env.dir)), assert_set) def _test_002_rev_ids(self): global abc, def_ abc = util.rev_id() def_ = util.rev_id() ne_(abc, def_) def _test_003_api_methods_clean(self): eq_(env.get_heads(), []) eq_(env.get_base(), None) def _test_004_rev(self): script = env.generate_revision(abc, "this is a message", refresh=True) eq_(script.doc, "this is a message") eq_(script.revision, abc) eq_(script.down_revision, None) assert os.access( os.path.join(env.dir, "versions", "%s_this_is_a_message.py" % abc), os.F_OK, ) assert callable(script.module.upgrade) eq_(env.get_heads(), [abc]) eq_(env.get_base(), abc) def _test_005_nextrev(self): script = env.generate_revision( def_, "this is the next rev", refresh=True ) assert os.access( os.path.join( env.dir, "versions", "%s_this_is_the_next_rev.py" % def_ ), os.F_OK, ) eq_(script.revision, def_) eq_(script.down_revision, abc) eq_(env.get_revision(abc).nextrev, set([def_])) assert script.module.down_revision == abc assert callable(script.module.upgrade) assert callable(script.module.downgrade) eq_(env.get_heads(), [def_]) eq_(env.get_base(), abc) def _test_006_from_clean_env(self): # test the environment so far with a # new ScriptDirectory instance. 
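# (create=False here means the staging environment written by the previous
# steps is read as-is rather than being regenerated)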
env = staging_env(create=False) abc_rev = env.get_revision(abc) def_rev = env.get_revision(def_) eq_(abc_rev.nextrev, set([def_])) eq_(abc_rev.revision, abc) eq_(def_rev.down_revision, abc) eq_(env.get_heads(), [def_]) eq_(env.get_base(), abc) def _test_007_long_name(self): rid = util.rev_id() env.generate_revision( rid, "this is a really long name with " "lots of characters and also " "I'd like it to\nhave\nnewlines", ) assert os.access( os.path.join( env.dir, "versions", "%s_this_is_a_really_long_name_with_lots_of_.py" % rid, ), os.F_OK, ) def _test_008_long_name_configurable(self): env.truncate_slug_length = 60 rid = util.rev_id() env.generate_revision( rid, "this is a really long name with " "lots of characters and also " "I'd like it to\nhave\nnewlines", ) assert os.access( os.path.join( env.dir, "versions", "%s_this_is_a_really_long_name_with_lots_" "of_characters_and_also_.py" % rid, ), os.F_OK, ) class ScriptNamingTest(TestBase): @classmethod def setup_class(cls): _testing_config() @classmethod def teardown_class(cls): clear_staging_env() def test_args(self): script = ScriptDirectory( _get_staging_directory(), file_template="%(rev)s_%(slug)s_" "%(year)s_%(month)s_" "%(day)s_%(hour)s_" "%(minute)s_%(second)s", ) create_date = datetime.datetime(2012, 7, 25, 15, 8, 5) eq_( script._rev_path( script.versions, "12345", "this is a message", create_date ), os.path.abspath( "%s/versions/12345_this_is_a_" "message_2012_7_25_15_8_5.py" % _get_staging_directory() ), ) def _test_tz(self, timezone_arg, given, expected): script = ScriptDirectory( _get_staging_directory(), file_template="%(rev)s_%(slug)s_" "%(year)s_%(month)s_" "%(day)s_%(hour)s_" "%(minute)s_%(second)s", timezone=timezone_arg, ) with mock.patch( "alembic.script.base.datetime", mock.Mock( datetime=mock.Mock(utcnow=lambda: given, now=lambda: given) ), ): create_date = script._generate_create_date() eq_(create_date, expected) def test_custom_tz(self): self._test_tz( "EST5EDT", datetime.datetime(2012, 7, 25, 15, 8, 5), datetime.datetime( 2012, 7, 25, 11, 8, 5, tzinfo=tz.gettz("EST5EDT") ), ) def test_custom_tz_lowercase(self): self._test_tz( "est5edt", datetime.datetime(2012, 7, 25, 15, 8, 5), datetime.datetime( 2012, 7, 25, 11, 8, 5, tzinfo=tz.gettz("EST5EDT") ), ) def test_custom_tz_utc(self): self._test_tz( "utc", datetime.datetime(2012, 7, 25, 15, 8, 5), datetime.datetime(2012, 7, 25, 15, 8, 5, tzinfo=tz.gettz("UTC")), ) def test_custom_tzdata_tz(self): self._test_tz( "Europe/Berlin", datetime.datetime(2012, 7, 25, 15, 8, 5), datetime.datetime( 2012, 7, 25, 17, 8, 5, tzinfo=tz.gettz("Europe/Berlin") ), ) def test_default_tz(self): self._test_tz( None, datetime.datetime(2012, 7, 25, 15, 8, 5), datetime.datetime(2012, 7, 25, 15, 8, 5), ) def test_tz_cant_locate(self): assert_raises_message( CommandError, "Can't locate timezone: fake", self._test_tz, "fake", datetime.datetime(2012, 7, 25, 15, 8, 5), datetime.datetime(2012, 7, 25, 15, 8, 5), ) class RevisionCommandTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = _sqlite_testing_config() self.a, self.b, self.c = three_rev_fixture(self.cfg) def tearDown(self): clear_staging_env() def test_create_script_basic(self): rev = command.revision(self.cfg, message="some message") script = ScriptDirectory.from_config(self.cfg) rev = script.get_revision(rev.revision) eq_(rev.down_revision, self.c) assert "some message" in rev.doc def test_create_script_splice(self): rev = command.revision( self.cfg, message="some message", head=self.b, splice=True ) script = 
ScriptDirectory.from_config(self.cfg) rev = script.get_revision(rev.revision) eq_(rev.down_revision, self.b) assert "some message" in rev.doc eq_(set(script.get_heads()), set([rev.revision, self.c])) def test_create_script_missing_splice(self): assert_raises_message( util.CommandError, "Revision %s is not a head revision; please specify --splice " "to create a new branch from this revision" % self.b, command.revision, self.cfg, message="some message", head=self.b, ) def test_illegal_revision_chars(self): assert_raises_message( util.CommandError, r"Character\(s\) '-' not allowed in " "revision identifier 'no-dashes'", command.revision, self.cfg, message="some message", rev_id="no-dashes", ) assert not os.path.exists( os.path.join(self.env.dir, "versions", "no-dashes_some_message.py") ) assert_raises_message( util.CommandError, r"Character\(s\) '@' not allowed in " "revision identifier 'no@atsigns'", command.revision, self.cfg, message="some message", rev_id="no@atsigns", ) assert_raises_message( util.CommandError, r"Character\(s\) '-, @' not allowed in revision " "identifier 'no@atsigns-ordashes'", command.revision, self.cfg, message="some message", rev_id="no@atsigns-ordashes", ) assert_raises_message( util.CommandError, r"Character\(s\) '\+' not allowed in revision " r"identifier 'no\+plussignseither'", command.revision, self.cfg, message="some message", rev_id="no+plussignseither", ) def test_create_script_branches(self): rev = command.revision( self.cfg, message="some message", branch_label="foobar" ) script = ScriptDirectory.from_config(self.cfg) rev = script.get_revision(rev.revision) eq_(script.get_revision("foobar"), rev) def test_create_script_branches_old_template(self): script = ScriptDirectory.from_config(self.cfg) with open(os.path.join(script.dir, "script.py.mako"), "w") as file_: file_.write( "<%text># ${message}\n" "revision = ${repr(up_revision)}\n" "down_revision = ${repr(down_revision)}\n\n" "def upgrade():\n" " ${upgrades if upgrades else 'pass'}\n\n" "def downgrade():\n" " ${downgrade if downgrades else 'pass'}\n\n" ) # works OK if no branch names command.revision(self.cfg, message="some message") assert_raises_message( util.CommandError, r"Version \w+ specified branch_labels foobar, " r"however the migration file .+?\b does not have them; have you " "upgraded your script.py.mako to include the 'branch_labels' " r"section\?", command.revision, self.cfg, message="some message", branch_label="foobar", ) class CustomizeRevisionTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = _multi_dir_testing_config() self.cfg.set_main_option("revision_environment", "true") script = ScriptDirectory.from_config(self.cfg) self.model1 = util.rev_id() self.model2 = util.rev_id() self.model3 = util.rev_id() for model, name in [ (self.model1, "model1"), (self.model2, "model2"), (self.model3, "model3"), ]: script.generate_revision( model, name, refresh=True, version_path=os.path.join(_get_staging_directory(), name), head="base", ) write_script( script, model, """\ "%s" revision = '%s' down_revision = None branch_labels = ['%s'] from alembic import op def upgrade(): pass def downgrade(): pass """ % (name, model, name), ) def tearDown(self): clear_staging_env() def _env_fixture(self, fn, target_metadata): self.engine = engine = _sqlite_file_db() def run_env(self): from alembic import context with engine.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata, process_revision_directives=fn, ) with context.begin_transaction(): 
context.run_migrations() return mock.patch( "alembic.script.base.ScriptDirectory.run_env", run_env ) def test_new_locations_no_autogen(self): m = sa.MetaData() def process_revision_directives(context, rev, generate_revisions): generate_revisions[:] = [ ops.MigrationScript( util.rev_id(), ops.UpgradeOps(), ops.DowngradeOps(), version_path=os.path.join( _get_staging_directory(), "model1" ), head="model1@head", ), ops.MigrationScript( util.rev_id(), ops.UpgradeOps(), ops.DowngradeOps(), version_path=os.path.join( _get_staging_directory(), "model2" ), head="model2@head", ), ops.MigrationScript( util.rev_id(), ops.UpgradeOps(), ops.DowngradeOps(), version_path=os.path.join( _get_staging_directory(), "model3" ), head="model3@head", ), ] with self._env_fixture(process_revision_directives, m): revs = command.revision(self.cfg, message="some message") script = ScriptDirectory.from_config(self.cfg) for rev, model in [ (revs[0], "model1"), (revs[1], "model2"), (revs[2], "model3"), ]: rev_script = script.get_revision(rev.revision) eq_( rev_script.path, os.path.abspath( os.path.join( _get_staging_directory(), model, "%s_.py" % (rev_script.revision,), ) ), ) assert os.path.exists(rev_script.path) def test_renders_added_directives_no_autogen(self): m = sa.MetaData() def process_revision_directives(context, rev, generate_revisions): generate_revisions[0].upgrade_ops.ops.append( ops.CreateIndexOp("some_index", "some_table", ["a", "b"]) ) with self._env_fixture(process_revision_directives, m): rev = command.revision( self.cfg, message="some message", head="model1@head", sql=True ) with mock.patch.object(rev.module, "op") as op_mock: rev.module.upgrade() eq_( op_mock.mock_calls, [ mock.call.create_index( "some_index", "some_table", ["a", "b"], unique=False ) ], ) def test_autogen(self): m = sa.MetaData() sa.Table("t", m, sa.Column("x", sa.Integer)) def process_revision_directives(context, rev, generate_revisions): existing_upgrades = generate_revisions[0].upgrade_ops existing_downgrades = generate_revisions[0].downgrade_ops # model1 will run the upgrades, e.g. create the table, # model2 will run the downgrades as upgrades, e.g. 
drop # the table again generate_revisions[:] = [ ops.MigrationScript( util.rev_id(), existing_upgrades, ops.DowngradeOps(), version_path=os.path.join( _get_staging_directory(), "model1" ), head="model1@head", ), ops.MigrationScript( util.rev_id(), ops.UpgradeOps(ops=existing_downgrades.ops), ops.DowngradeOps(), version_path=os.path.join( _get_staging_directory(), "model2" ), head="model2@head", ), ] with self._env_fixture(process_revision_directives, m): command.upgrade(self.cfg, "heads") eq_( Inspector.from_engine(self.engine).get_table_names(), ["alembic_version"], ) command.revision( self.cfg, message="some message", autogenerate=True ) command.upgrade(self.cfg, "model1@head") eq_( Inspector.from_engine(self.engine).get_table_names(), ["alembic_version", "t"], ) command.upgrade(self.cfg, "model2@head") eq_( Inspector.from_engine(self.engine).get_table_names(), ["alembic_version"], ) def test_programmatic_command_option(self): def process_revision_directives(context, rev, generate_revisions): generate_revisions[0].message = "test programatic" generate_revisions[0].upgrade_ops = ops.UpgradeOps( ops=[ ops.CreateTableOp( "test_table", [ sa.Column("id", sa.Integer(), primary_key=True), sa.Column("name", sa.String(50), nullable=False), ], ) ] ) generate_revisions[0].downgrade_ops = ops.DowngradeOps( ops=[ops.DropTableOp("test_table")] ) with self._env_fixture(None, None): rev = command.revision( self.cfg, head="model1@head", process_revision_directives=process_revision_directives, ) with open(rev.path) as handle: result = handle.read() assert ( ( """ def upgrade(): # ### commands auto generated by Alembic - please adjust! ### op.create_table('test_table', sa.Column('id', sa.Integer(), nullable=False), sa.Column('name', sa.String(length=50), nullable=False), sa.PrimaryKeyConstraint('id') ) # ### end Alembic commands ### """ ) in result ) class ScriptAccessorTest(TestBase): def test_upgrade_downgrade_ops_list_accessors(self): u1 = ops.UpgradeOps(ops=[]) d1 = ops.DowngradeOps(ops=[]) m1 = ops.MigrationScript("somerev", u1, d1) is_(m1.upgrade_ops, u1) is_(m1.downgrade_ops, d1) u2 = ops.UpgradeOps(ops=[]) d2 = ops.DowngradeOps(ops=[]) m1._upgrade_ops.append(u2) m1._downgrade_ops.append(d2) assert_raises_message( ValueError, "This MigrationScript instance has a multiple-entry list for " "UpgradeOps; please use the upgrade_ops_list attribute.", getattr, m1, "upgrade_ops", ) assert_raises_message( ValueError, "This MigrationScript instance has a multiple-entry list for " "DowngradeOps; please use the downgrade_ops_list attribute.", getattr, m1, "downgrade_ops", ) eq_(m1.upgrade_ops_list, [u1, u2]) eq_(m1.downgrade_ops_list, [d1, d2]) class ImportsTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = _sqlite_testing_config() def tearDown(self): clear_staging_env() def _env_fixture(self, target_metadata, **kw): self.engine = engine = _sqlite_file_db() def run_env(self): from alembic import context with engine.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata, **kw ) with context.begin_transaction(): context.run_migrations() return mock.patch( "alembic.script.base.ScriptDirectory.run_env", run_env ) def test_imports_in_script(self): from sqlalchemy import MetaData, Table, Column from sqlalchemy.dialects.mysql import VARCHAR type_ = VARCHAR(20, charset="utf8", national=True) m = MetaData() Table("t", m, Column("x", type_)) def process_revision_directives(context, rev, generate_revisions): generate_revisions[0].imports.add( "from 
sqlalchemy.dialects.mysql import TINYINT" ) with self._env_fixture( m, process_revision_directives=process_revision_directives ): rev = command.revision( self.cfg, message="some message", autogenerate=True ) with open(rev.path) as file_: contents = file_.read() assert "from sqlalchemy.dialects import mysql" in contents assert "from sqlalchemy.dialects.mysql import TINYINT" in contents class MultiContextTest(TestBase): """test the multidb template for autogenerate front-to-back""" def setUp(self): self.engine1 = _sqlite_file_db(tempname="eng1.db") self.engine2 = _sqlite_file_db(tempname="eng2.db") self.engine3 = _sqlite_file_db(tempname="eng3.db") self.env = staging_env(template="multidb") self.cfg = _multidb_testing_config( { "engine1": self.engine1, "engine2": self.engine2, "engine3": self.engine3, } ) def _write_metadata(self, meta): path = os.path.join(_get_staging_directory(), "scripts", "env.py") with open(path) as env_: existing_env = env_.read() existing_env = existing_env.replace("target_metadata = {}", meta) with open(path, "w") as env_: env_.write(existing_env) def tearDown(self): clear_staging_env() def test_autogen(self): self._write_metadata( """ import sqlalchemy as sa m1 = sa.MetaData() m2 = sa.MetaData() m3 = sa.MetaData() target_metadata = {"engine1": m1, "engine2": m2, "engine3": m3} sa.Table('e1t1', m1, sa.Column('x', sa.Integer)) sa.Table('e2t1', m2, sa.Column('y', sa.Integer)) sa.Table('e3t1', m3, sa.Column('z', sa.Integer)) """ ) rev = command.revision( self.cfg, message="some message", autogenerate=True ) with mock.patch.object(rev.module, "op") as op_mock: rev.module.upgrade_engine1() eq_( op_mock.mock_calls[-1], mock.call.create_table("e1t1", mock.ANY), ) rev.module.upgrade_engine2() eq_( op_mock.mock_calls[-1], mock.call.create_table("e2t1", mock.ANY), ) rev.module.upgrade_engine3() eq_( op_mock.mock_calls[-1], mock.call.create_table("e3t1", mock.ANY), ) rev.module.downgrade_engine1() eq_(op_mock.mock_calls[-1], mock.call.drop_table("e1t1")) rev.module.downgrade_engine2() eq_(op_mock.mock_calls[-1], mock.call.drop_table("e2t1")) rev.module.downgrade_engine3() eq_(op_mock.mock_calls[-1], mock.call.drop_table("e3t1")) class RewriterTest(TestBase): def test_all_traverse(self): writer = autogenerate.Rewriter() mocker = mock.Mock(side_effect=lambda context, revision, op: op) writer.rewrites(ops.MigrateOperation)(mocker) addcolop = ops.AddColumnOp("t1", sa.Column("x", sa.Integer())) directives = [ ops.MigrationScript( util.rev_id(), ops.UpgradeOps(ops=[ops.ModifyTableOps("t1", ops=[addcolop])]), ops.DowngradeOps(ops=[]), ) ] ctx, rev = mock.Mock(), mock.Mock() writer(ctx, rev, directives) eq_( mocker.mock_calls, [ mock.call(ctx, rev, directives[0]), mock.call(ctx, rev, directives[0].upgrade_ops), mock.call(ctx, rev, directives[0].upgrade_ops.ops[0]), mock.call(ctx, rev, addcolop), mock.call(ctx, rev, directives[0].downgrade_ops), ], ) def test_double_migrate_table(self): writer = autogenerate.Rewriter() idx_ops = [] @writer.rewrites(ops.ModifyTableOps) def second_table(context, revision, op): return [ op, ops.ModifyTableOps( "t2", ops=[ops.AddColumnOp("t2", sa.Column("x", sa.Integer()))], ), ] @writer.rewrites(ops.AddColumnOp) def add_column(context, revision, op): idx_op = ops.CreateIndexOp("ixt", op.table_name, [op.column.name]) idx_ops.append(idx_op) return [op, idx_op] directives = [ ops.MigrationScript( util.rev_id(), ops.UpgradeOps( ops=[ ops.ModifyTableOps( "t1", ops=[ ops.AddColumnOp( "t1", sa.Column("x", sa.Integer()) ) ], ) ] ), ops.DowngradeOps(ops=[]), ) ] 
ctx, rev = mock.Mock(), mock.Mock() writer(ctx, rev, directives) eq_( [d.table_name for d in directives[0].upgrade_ops.ops], ["t1", "t2"] ) is_(directives[0].upgrade_ops.ops[0].ops[1], idx_ops[0]) is_(directives[0].upgrade_ops.ops[1].ops[1], idx_ops[1]) def test_chained_ops(self): writer1 = autogenerate.Rewriter() writer2 = autogenerate.Rewriter() @writer1.rewrites(ops.AddColumnOp) def add_column_nullable(context, revision, op): if op.column.nullable: return op else: op.column.nullable = True return [ op, ops.AlterColumnOp( op.table_name, op.column.name, modify_nullable=False, existing_type=op.column.type, ), ] @writer2.rewrites(ops.AddColumnOp) def add_column_idx(context, revision, op): idx_op = ops.CreateIndexOp("ixt", op.table_name, [op.column.name]) return [op, idx_op] directives = [ ops.MigrationScript( util.rev_id(), ops.UpgradeOps( ops=[ ops.ModifyTableOps( "t1", ops=[ ops.AddColumnOp( "t1", sa.Column( "x", sa.Integer(), nullable=False ), ) ], ) ] ), ops.DowngradeOps(ops=[]), ) ] ctx, rev = mock.Mock(), mock.Mock() writer1.chain(writer2)(ctx, rev, directives) eq_( autogenerate.render_python_code(directives[0].upgrade_ops), "# ### commands auto generated by Alembic - please adjust! ###\n" " op.add_column('t1', " "sa.Column('x', sa.Integer(), nullable=True))\n" " op.create_index('ixt', 't1', ['x'], unique=False)\n" " op.alter_column('t1', 'x',\n" " existing_type=sa.Integer(),\n" " nullable=False)\n" " # ### end Alembic commands ###", ) class MultiDirRevisionCommandTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = _multi_dir_testing_config() def tearDown(self): clear_staging_env() def test_multiple_dir_no_bases(self): assert_raises_message( util.CommandError, "Multiple version locations present, please specify " "--version-path", command.revision, self.cfg, message="some message", ) def test_multiple_dir_no_bases_invalid_version_path(self): assert_raises_message( util.CommandError, "Path foo/bar/ is not represented in current version locations", command.revision, self.cfg, message="x", version_path=os.path.join("foo/bar/"), ) def test_multiple_dir_no_bases_version_path(self): script = command.revision( self.cfg, message="x", version_path=os.path.join(_get_staging_directory(), "model1"), ) assert os.access(script.path, os.F_OK) def test_multiple_dir_chooses_base(self): command.revision( self.cfg, message="x", head="base", version_path=os.path.join(_get_staging_directory(), "model1"), ) script2 = command.revision( self.cfg, message="y", head="base", version_path=os.path.join(_get_staging_directory(), "model2"), ) script3 = command.revision( self.cfg, message="y2", head=script2.revision ) eq_( os.path.dirname(script3.path), os.path.abspath(os.path.join(_get_staging_directory(), "model2")), ) assert os.access(script3.path, os.F_OK) class TemplateArgsTest(TestBase): def setUp(self): staging_env() self.cfg = _no_sql_testing_config( directives="\nrevision_environment=true\n" ) def tearDown(self): clear_staging_env() def test_args_propagate(self): config = _no_sql_testing_config() script = ScriptDirectory.from_config(config) template_args = {"x": "x1", "y": "y1", "z": "z1"} env = EnvironmentContext(config, script, template_args=template_args) env.configure( dialect_name="sqlite", template_args={"y": "y2", "q": "q1"} ) eq_(template_args, {"x": "x1", "y": "y2", "z": "z1", "q": "q1"}) def test_tmpl_args_revision(self): env_file_fixture( """ context.configure(dialect_name='sqlite', template_args={"somearg":"somevalue"}) """ ) script_file_fixture( """ # somearg: ${somearg} 
revision = ${repr(up_revision)} down_revision = ${repr(down_revision)} """ ) command.revision(self.cfg, message="some rev") script = ScriptDirectory.from_config(self.cfg) rev = script.get_revision("head") with open(rev.path) as f: text = f.read() assert "somearg: somevalue" in text def test_bad_render(self): env_file_fixture( """ context.configure(dialect_name='sqlite', template_args={"somearg":"somevalue"}) """ ) script_file_fixture( """ <% z = x + y %> """ ) try: command.revision(self.cfg, message="some rev") except CommandError as ce: m = re.match( r"^Template rendering failed; see (.+?) " "for a template-oriented", str(ce), ) assert m, "Command error did not produce a file" with open(m.group(1)) as handle: contents = handle.read() os.remove(m.group(1)) assert "<% z = x + y %>" in contents class DuplicateVersionLocationsTest(TestBase): def setUp(self): self.env = staging_env() self.cfg = _multi_dir_testing_config( # this is a duplicate of one of the paths # already present in this fixture extra_version_location="%(here)s/model1" ) script = ScriptDirectory.from_config(self.cfg) self.model1 = util.rev_id() self.model2 = util.rev_id() self.model3 = util.rev_id() for model, name in [ (self.model1, "model1"), (self.model2, "model2"), (self.model3, "model3"), ]: script.generate_revision( model, name, refresh=True, version_path=os.path.join(_get_staging_directory(), name), head="base", ) write_script( script, model, """\ "%s" revision = '%s' down_revision = None branch_labels = ['%s'] from alembic import op def upgrade(): pass def downgrade(): pass """ % (name, model, name), ) def tearDown(self): clear_staging_env() def test_env_emits_warning(self): with assertions.expect_warnings( "File %s loaded twice! ignoring. " "Please ensure version_locations is unique" % ( os.path.realpath( os.path.join( _get_staging_directory(), "model1", "%s_model1.py" % self.model1, ) ) ) ): script = ScriptDirectory.from_config(self.cfg) script.revision_map.heads eq_( [rev.revision for rev in script.walk_revisions()], [self.model1, self.model2, self.model3], ) zzzeek-alembic-bee044a1c187/tests/test_sqlite.py000066400000000000000000000201721353106760100216600ustar00rootroot00000000000000from sqlalchemy import Boolean from sqlalchemy import Column from sqlalchemy import DateTime from sqlalchemy import Float from sqlalchemy import func from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import String from sqlalchemy import Table from sqlalchemy import text from sqlalchemy.engine.reflection import Inspector from sqlalchemy.sql import column from alembic import autogenerate from alembic import op from alembic.autogenerate import api from alembic.autogenerate.compare import _compare_server_default from alembic.migration import MigrationContext from alembic.operations import ops from alembic.testing import assert_raises_message from alembic.testing import config from alembic.testing import eq_ from alembic.testing import eq_ignore_whitespace from alembic.testing.env import clear_staging_env from alembic.testing.env import staging_env from alembic.testing.fixtures import op_fixture from alembic.testing.fixtures import TestBase class SQLiteTest(TestBase): def test_add_column(self): context = op_fixture("sqlite") op.add_column("t1", Column("c1", Integer)) context.assert_("ALTER TABLE t1 ADD COLUMN c1 INTEGER") def test_add_column_implicit_constraint(self): context = op_fixture("sqlite") op.add_column("t1", Column("c1", Boolean)) context.assert_("ALTER TABLE t1 ADD COLUMN c1 BOOLEAN") def 
test_add_explicit_constraint(self): op_fixture("sqlite") assert_raises_message( NotImplementedError, "No support for ALTER of constraints in SQLite dialect", op.create_check_constraint, "foo", "sometable", column("name") > 5, ) def test_drop_explicit_constraint(self): op_fixture("sqlite") assert_raises_message( NotImplementedError, "No support for ALTER of constraints in SQLite dialect", op.drop_constraint, "foo", "sometable", ) @config.requirements.comments def test_create_table_with_comment_ignored(self): context = op_fixture("sqlite") op.create_table( "t2", Column("c1", Integer, primary_key=True), Column("c2", Integer), comment="This is a table comment", ) context.assert_( "CREATE TABLE t2 (c1 INTEGER NOT NULL, " "c2 INTEGER, PRIMARY KEY (c1))" ) @config.requirements.comments def test_add_column_with_comment_ignored(self): context = op_fixture("sqlite") op.add_column("t1", Column("c1", Integer, comment="c1 comment")) context.assert_("ALTER TABLE t1 ADD COLUMN c1 INTEGER") class SQLiteDefaultCompareTest(TestBase): __only_on__ = "sqlite" __backend__ = True @classmethod def setup_class(cls): cls.bind = config.db staging_env() cls.migration_context = MigrationContext.configure( connection=cls.bind.connect(), opts={"compare_type": True, "compare_server_default": True}, ) def setUp(self): self.metadata = MetaData(self.bind) self.autogen_context = api.AutogenContext(self.migration_context) @classmethod def teardown_class(cls): clear_staging_env() def tearDown(self): self.metadata.drop_all() def _compare_default_roundtrip( self, type_, orig_default, alternate=None, diff_expected=None ): diff_expected = ( diff_expected if diff_expected is not None else alternate is not None ) if alternate is None: alternate = orig_default t1 = Table( "test", self.metadata, Column("somecol", type_, server_default=orig_default), ) t2 = Table( "test", MetaData(), Column("somecol", type_, server_default=alternate), ) t1.create(self.bind) insp = Inspector.from_engine(self.bind) cols = insp.get_columns(t1.name) insp_col = Column( "somecol", cols[0]["type"], server_default=text(cols[0]["default"]) ) op = ops.AlterColumnOp("test", "somecol") _compare_server_default( self.autogen_context, op, None, "test", "somecol", insp_col, t2.c.somecol, ) diffs = op.to_diff_tuple() eq_(bool(diffs), diff_expected) def _compare_default(self, t1, t2, col, rendered): t1.create(self.bind, checkfirst=True) insp = Inspector.from_engine(self.bind) cols = insp.get_columns(t1.name) ctx = self.autogen_context.migration_context return ctx.impl.compare_server_default( None, col, rendered, cols[0]["default"] ) @config.requirements.sqlalchemy_12 def test_compare_current_timestamp_func(self): self._compare_default_roundtrip( DateTime(), func.datetime("now", "localtime") ) def test_compare_current_timestamp_text(self): # SQLAlchemy doesn't render the parenthesis for a # SQLite server default specified as text(), so users will be doing # this; sqlite comparison needs to accommodate for these. 
self._compare_default_roundtrip( DateTime(), text("(datetime('now', 'localtime'))") ) def test_compare_integer_str(self): self._compare_default_roundtrip(Integer(), "5") def test_compare_integer_str_diff(self): self._compare_default_roundtrip(Integer(), "5", "7") def test_compare_integer_text(self): self._compare_default_roundtrip(Integer(), text("5")) def test_compare_integer_text_diff(self): self._compare_default_roundtrip(Integer(), text("5"), "7") def test_compare_float_str(self): self._compare_default_roundtrip(Float(), "5.2") def test_compare_float_str_diff(self): self._compare_default_roundtrip(Float(), "5.2", "5.3") def test_compare_float_text(self): self._compare_default_roundtrip(Float(), text("5.2")) def test_compare_float_text_diff(self): self._compare_default_roundtrip(Float(), text("5.2"), "5.3") def test_compare_string_literal(self): self._compare_default_roundtrip(String(), "im a default") def test_compare_string_literal_diff(self): self._compare_default_roundtrip(String(), "im a default", "me too") class SQLiteAutogenRenderTest(TestBase): def setUp(self): ctx_opts = { "sqlalchemy_module_prefix": "sa.", "alembic_module_prefix": "op.", "target_metadata": MetaData(), } context = MigrationContext.configure( dialect_name="sqlite", opts=ctx_opts ) self.autogen_context = api.AutogenContext(context) def test_render_server_default_expr_needs_parens(self): c = Column( "date_value", DateTime(), server_default=func.datetime("now", "localtime"), ) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('date_value', sa.DateTime(), " "server_default=sa.text(!U\"(datetime('now', 'localtime'))\"), " "nullable=True)", ) def test_render_server_default_text_expr_needs_parens(self): c = Column( "date_value", DateTime(), server_default=text("(datetime('now', 'localtime'))"), ) result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('date_value', sa.DateTime(), " "server_default=sa.text(!U\"(datetime('now', 'localtime'))\"), " "nullable=True)", ) def test_render_server_default_const(self): c = Column("int_value", Integer, server_default="5") result = autogenerate.render._render_column(c, self.autogen_context) eq_ignore_whitespace( result, "sa.Column('int_value', sa.Integer(), server_default='5', " "nullable=True)", ) zzzeek-alembic-bee044a1c187/tests/test_version_table.py000066400000000000000000000247311353106760100232200ustar00rootroot00000000000000from sqlalchemy import Column from sqlalchemy import MetaData from sqlalchemy import String from sqlalchemy import Table from sqlalchemy.engine.reflection import Inspector from alembic import migration from alembic.testing import assert_raises from alembic.testing import assert_raises_message from alembic.testing import config from alembic.testing import eq_ from alembic.testing import mock from alembic.testing.fixtures import TestBase from alembic.util import CommandError version_table = Table( "version_table", MetaData(), Column("version_num", String(32), nullable=False), ) def _up(from_, to_, branch_presence_changed=False): return migration.StampStep(from_, to_, True, branch_presence_changed) def _down(from_, to_, branch_presence_changed=False): return migration.StampStep(from_, to_, False, branch_presence_changed) class TestMigrationContext(TestBase): @classmethod def setup_class(cls): cls.bind = config.db def setUp(self): self.connection = self.bind.connect() self.transaction = self.connection.begin() def tearDown(self): 
self.transaction.rollback() version_table.drop(self.connection, checkfirst=True) self.connection.close() def make_one(self, **kwargs): return migration.MigrationContext.configure(**kwargs) def get_revision(self): result = self.connection.execute(version_table.select()) rows = result.fetchall() if len(rows) == 0: return None eq_(len(rows), 1) return rows[0]["version_num"] def test_config_default_version_table_name(self): context = self.make_one(dialect_name="sqlite") eq_(context._version.name, "alembic_version") def test_config_explicit_version_table_name(self): context = self.make_one( dialect_name="sqlite", opts={"version_table": "explicit"} ) eq_(context._version.name, "explicit") eq_(context._version.primary_key.name, "explicit_pkc") def test_config_explicit_version_table_schema(self): context = self.make_one( dialect_name="sqlite", opts={"version_table_schema": "explicit"} ) eq_(context._version.schema, "explicit") def test_config_explicit_no_pk(self): context = self.make_one( dialect_name="sqlite", opts={"version_table_pk": False} ) eq_(len(context._version.primary_key), 0) def test_config_explicit_w_pk(self): context = self.make_one( dialect_name="sqlite", opts={"version_table_pk": True} ) eq_(len(context._version.primary_key), 1) eq_(context._version.primary_key.name, "alembic_version_pkc") def test_get_current_revision_doesnt_create_version_table(self): context = self.make_one( connection=self.connection, opts={"version_table": "version_table"} ) eq_(context.get_current_revision(), None) insp = Inspector(self.connection) assert "version_table" not in insp.get_table_names() def test_get_current_revision(self): context = self.make_one( connection=self.connection, opts={"version_table": "version_table"} ) version_table.create(self.connection) eq_(context.get_current_revision(), None) self.connection.execute( version_table.insert().values(version_num="revid") ) eq_(context.get_current_revision(), "revid") def test_get_current_revision_error_if_starting_rev_given_online(self): context = self.make_one( connection=self.connection, opts={"starting_rev": "boo"} ) assert_raises(CommandError, context.get_current_revision) def test_get_current_revision_offline(self): context = self.make_one( dialect_name="sqlite", opts={"starting_rev": "startrev", "as_sql": True}, ) eq_(context.get_current_revision(), "startrev") def test_get_current_revision_multiple_heads(self): version_table.create(self.connection) context = self.make_one( connection=self.connection, opts={"version_table": "version_table"} ) updater = migration.HeadMaintainer(context, ()) updater.update_to_step(_up(None, "a", True)) updater.update_to_step(_up(None, "b", True)) assert_raises_message( CommandError, "Version table 'version_table' has more than one head present; " "please use get_current_heads()", context.get_current_revision, ) def test_get_heads(self): version_table.create(self.connection) context = self.make_one( connection=self.connection, opts={"version_table": "version_table"} ) updater = migration.HeadMaintainer(context, ()) updater.update_to_step(_up(None, "a", True)) updater.update_to_step(_up(None, "b", True)) eq_(context.get_current_heads(), ("a", "b")) def test_get_heads_offline(self): version_table.create(self.connection) context = self.make_one( connection=self.connection, opts={ "starting_rev": "q", "version_table": "version_table", "as_sql": True, }, ) eq_(context.get_current_heads(), ("q",)) def test_stamp_api_creates_table(self): context = self.make_one(connection=self.connection) assert ( "alembic_version" 
not in Inspector(self.connection).get_table_names() ) script = mock.Mock( _stamp_revs=lambda revision, heads: [ _up(None, "a", True), _up(None, "b", True), ] ) context.stamp(script, "b") eq_(context.get_current_heads(), ("a", "b")) assert ( "alembic_version" in Inspector(self.connection).get_table_names() ) class UpdateRevTest(TestBase): __backend__ = True @classmethod def setup_class(cls): cls.bind = config.db def setUp(self): self.connection = self.bind.connect() self.context = migration.MigrationContext.configure( connection=self.connection, opts={"version_table": "version_table"} ) version_table.create(self.connection) self.updater = migration.HeadMaintainer(self.context, ()) def tearDown(self): version_table.drop(self.connection, checkfirst=True) self.connection.close() def _assert_heads(self, heads): eq_(set(self.context.get_current_heads()), set(heads)) eq_(self.updater.heads, set(heads)) def test_update_none_to_single(self): self.updater.update_to_step(_up(None, "a", True)) self._assert_heads(("a",)) def test_update_single_to_single(self): self.updater.update_to_step(_up(None, "a", True)) self.updater.update_to_step(_up("a", "b")) self._assert_heads(("b",)) def test_update_single_to_none(self): self.updater.update_to_step(_up(None, "a", True)) self.updater.update_to_step(_down("a", None, True)) self._assert_heads(()) def test_add_branches(self): self.updater.update_to_step(_up(None, "a", True)) self.updater.update_to_step(_up("a", "b")) self.updater.update_to_step(_up(None, "c", True)) self._assert_heads(("b", "c")) self.updater.update_to_step(_up("c", "d")) self.updater.update_to_step(_up("d", "e1")) self.updater.update_to_step(_up("d", "e2", True)) self._assert_heads(("b", "e1", "e2")) def test_teardown_branches(self): self.updater.update_to_step(_up(None, "d1", True)) self.updater.update_to_step(_up(None, "d2", True)) self._assert_heads(("d1", "d2")) self.updater.update_to_step(_down("d1", "c")) self._assert_heads(("c", "d2")) self.updater.update_to_step(_down("d2", "c", True)) self._assert_heads(("c",)) self.updater.update_to_step(_down("c", "b")) self._assert_heads(("b",)) def test_resolve_merges(self): self.updater.update_to_step(_up(None, "a", True)) self.updater.update_to_step(_up("a", "b")) self.updater.update_to_step(_up("b", "c1")) self.updater.update_to_step(_up("b", "c2", True)) self.updater.update_to_step(_up("c1", "d1")) self.updater.update_to_step(_up("c2", "d2")) self._assert_heads(("d1", "d2")) self.updater.update_to_step(_up(("d1", "d2"), "e")) self._assert_heads(("e",)) def test_unresolve_merges(self): self.updater.update_to_step(_up(None, "e", True)) self.updater.update_to_step(_down("e", ("d1", "d2"))) self._assert_heads(("d2", "d1")) self.updater.update_to_step(_down("d2", "c2")) self._assert_heads(("c2", "d1")) def test_update_no_match(self): self.updater.update_to_step(_up(None, "a", True)) self.updater.heads.add("x") assert_raises_message( CommandError, "Online migration expected to match one row when updating " "'x' to 'b' in 'version_table'; 0 found", self.updater.update_to_step, _up("x", "b"), ) def test_update_multi_match(self): self.connection.execute(version_table.insert(), version_num="a") self.connection.execute(version_table.insert(), version_num="a") self.updater.heads.add("a") assert_raises_message( CommandError, "Online migration expected to match one row when updating " "'a' to 'b' in 'version_table'; 2 found", self.updater.update_to_step, _up("a", "b"), ) def test_delete_no_match(self): self.updater.update_to_step(_up(None, "a", True)) 
self.updater.heads.add("x") assert_raises_message( CommandError, "Online migration expected to match one row when " "deleting 'x' in 'version_table'; 0 found", self.updater.update_to_step, _down("x", None, True), ) def test_delete_multi_match(self): self.connection.execute(version_table.insert(), version_num="a") self.connection.execute(version_table.insert(), version_num="a") self.updater.heads.add("a") assert_raises_message( CommandError, "Online migration expected to match one row when " "deleting 'a' in 'version_table'; 2 found", self.updater.update_to_step, _down("a", None, True), ) zzzeek-alembic-bee044a1c187/tests/test_version_traversal.py000066400000000000000000001014461353106760100241330ustar00rootroot00000000000000from alembic import util from alembic.migration import HeadMaintainer from alembic.migration import MigrationStep from alembic.testing import assert_raises_message from alembic.testing import eq_ from alembic.testing import mock from alembic.testing.env import clear_staging_env from alembic.testing.env import staging_env from alembic.testing.fixtures import TestBase class MigrationTest(TestBase): def up_(self, rev): return MigrationStep.upgrade_from_script(self.env.revision_map, rev) def down_(self, rev): return MigrationStep.downgrade_from_script(self.env.revision_map, rev) def _assert_downgrade(self, destination, source, expected, expected_heads): revs = self.env._downgrade_revs(destination, source) eq_(revs, expected) heads = set(util.to_tuple(source, default=())) head = HeadMaintainer(mock.Mock(), heads) for rev in revs: head.update_to_step(rev) eq_(head.heads, expected_heads) def _assert_upgrade(self, destination, source, expected, expected_heads): revs = self.env._upgrade_revs(destination, source) eq_(revs, expected) heads = set(util.to_tuple(source, default=())) head = HeadMaintainer(mock.Mock(), heads) for rev in revs: head.update_to_step(rev) eq_(head.heads, expected_heads) class RevisionPathTest(MigrationTest): @classmethod def setup_class(cls): cls.env = env = staging_env() cls.a = env.generate_revision(util.rev_id(), "->a") cls.b = env.generate_revision(util.rev_id(), "a->b") cls.c = env.generate_revision(util.rev_id(), "b->c") cls.d = env.generate_revision(util.rev_id(), "c->d") cls.e = env.generate_revision(util.rev_id(), "d->e") @classmethod def teardown_class(cls): clear_staging_env() def test_upgrade_path(self): self._assert_upgrade( self.e.revision, self.c.revision, [self.up_(self.d), self.up_(self.e)], set([self.e.revision]), ) self._assert_upgrade( self.c.revision, None, [self.up_(self.a), self.up_(self.b), self.up_(self.c)], set([self.c.revision]), ) def test_relative_upgrade_path(self): self._assert_upgrade( "+2", self.a.revision, [self.up_(self.b), self.up_(self.c)], set([self.c.revision]), ) self._assert_upgrade( "+1", self.a.revision, [self.up_(self.b)], set([self.b.revision]) ) self._assert_upgrade( "+3", self.b.revision, [self.up_(self.c), self.up_(self.d), self.up_(self.e)], set([self.e.revision]), ) self._assert_upgrade( "%s+2" % self.b.revision, self.a.revision, [self.up_(self.b), self.up_(self.c), self.up_(self.d)], set([self.d.revision]), ) self._assert_upgrade( "%s-2" % self.d.revision, self.a.revision, [self.up_(self.b)], set([self.b.revision]), ) def test_invalid_relative_upgrade_path(self): assert_raises_message( util.CommandError, "Relative revision -2 didn't produce 2 migrations", self.env._upgrade_revs, "-2", self.b.revision, ) assert_raises_message( util.CommandError, r"Relative revision \+5 didn't produce 5 migrations", 
self.env._upgrade_revs, "+5", self.b.revision, ) def test_downgrade_path(self): self._assert_downgrade( self.c.revision, self.e.revision, [self.down_(self.e), self.down_(self.d)], set([self.c.revision]), ) self._assert_downgrade( None, self.c.revision, [self.down_(self.c), self.down_(self.b), self.down_(self.a)], set(), ) def test_relative_downgrade_path(self): self._assert_downgrade( "-1", self.c.revision, [self.down_(self.c)], set([self.b.revision]) ) self._assert_downgrade( "-3", self.e.revision, [self.down_(self.e), self.down_(self.d), self.down_(self.c)], set([self.b.revision]), ) self._assert_downgrade( "%s+2" % self.a.revision, self.d.revision, [self.down_(self.d)], set([self.c.revision]), ) self._assert_downgrade( "%s-2" % self.c.revision, self.d.revision, [self.down_(self.d), self.down_(self.c), self.down_(self.b)], set([self.a.revision]), ) def test_invalid_relative_downgrade_path(self): assert_raises_message( util.CommandError, "Relative revision -5 didn't produce 5 migrations", self.env._downgrade_revs, "-5", self.b.revision, ) assert_raises_message( util.CommandError, r"Relative revision \+2 didn't produce 2 migrations", self.env._downgrade_revs, "+2", self.b.revision, ) def test_invalid_move_rev_to_none(self): assert_raises_message( util.CommandError, r"Destination %s is not a valid downgrade " r"target from current head\(s\)" % self.b.revision[0:3], self.env._downgrade_revs, self.b.revision[0:3], None, ) def test_invalid_move_higher_to_lower(self): assert_raises_message( util.CommandError, r"Destination %s is not a valid downgrade " r"target from current head\(s\)" % self.c.revision[0:4], self.env._downgrade_revs, self.c.revision[0:4], self.b.revision, ) def test_stamp_to_base(self): revs = self.env._stamp_revs("base", self.d.revision) eq_(len(revs), 1) assert revs[0].should_delete_branch eq_(revs[0].delete_version_num, self.d.revision) class BranchedPathTest(MigrationTest): @classmethod def setup_class(cls): cls.env = env = staging_env() cls.a = env.generate_revision(util.rev_id(), "->a") cls.b = env.generate_revision(util.rev_id(), "a->b") cls.c1 = env.generate_revision( util.rev_id(), "b->c1", branch_labels="c1branch", refresh=True ) cls.d1 = env.generate_revision(util.rev_id(), "c1->d1") cls.c2 = env.generate_revision( util.rev_id(), "b->c2", branch_labels="c2branch", head=cls.b.revision, splice=True, ) cls.d2 = env.generate_revision( util.rev_id(), "c2->d2", head=cls.c2.revision ) @classmethod def teardown_class(cls): clear_staging_env() def test_stamp_down_across_multiple_branch_to_branchpoint(self): heads = [self.d1.revision, self.c2.revision] revs = self.env._stamp_revs(self.b.revision, heads) eq_(len(revs), 1) eq_( revs[0].merge_branch_idents(heads), # DELETE d1 revision, UPDATE c2 to b ([self.d1.revision], self.c2.revision, self.b.revision), ) def test_stamp_to_labeled_base_multiple_heads(self): revs = self.env._stamp_revs( "c1branch@base", [self.d1.revision, self.c2.revision] ) eq_(len(revs), 1) assert revs[0].should_delete_branch eq_(revs[0].delete_version_num, self.d1.revision) def test_stamp_to_labeled_head_multiple_heads(self): heads = [self.d1.revision, self.c2.revision] revs = self.env._stamp_revs("c2branch@head", heads) eq_(len(revs), 1) eq_( revs[0].merge_branch_idents(heads), # the c1branch remains unchanged ([], self.c2.revision, self.d2.revision), ) def test_upgrade_single_branch(self): self._assert_upgrade( self.d1.revision, self.b.revision, [self.up_(self.c1), self.up_(self.d1)], set([self.d1.revision]), ) def test_upgrade_multiple_branch(self): # 
move from a single head to multiple heads self._assert_upgrade( (self.d1.revision, self.d2.revision), self.a.revision, [ self.up_(self.b), self.up_(self.c2), self.up_(self.d2), self.up_(self.c1), self.up_(self.d1), ], set([self.d1.revision, self.d2.revision]), ) def test_downgrade_multiple_branch(self): self._assert_downgrade( self.a.revision, (self.d1.revision, self.d2.revision), [ self.down_(self.d1), self.down_(self.c1), self.down_(self.d2), self.down_(self.c2), self.down_(self.b), ], set([self.a.revision]), ) def test_relative_upgrade(self): self._assert_upgrade( "c2branch@head-1", self.b.revision, [self.up_(self.c2)], set([self.c2.revision]), ) def test_relative_downgrade(self): self._assert_downgrade( "c2branch@base+2", [self.d2.revision, self.d1.revision], [self.down_(self.d2), self.down_(self.c2), self.down_(self.d1)], set([self.c1.revision]), ) class BranchFromMergepointTest(MigrationTest): """this is a form that will come up frequently in the "many independent roots with cross-dependencies" case. """ @classmethod def setup_class(cls): cls.env = env = staging_env() cls.a1 = env.generate_revision(util.rev_id(), "->a1") cls.b1 = env.generate_revision(util.rev_id(), "a1->b1") cls.c1 = env.generate_revision(util.rev_id(), "b1->c1") cls.a2 = env.generate_revision( util.rev_id(), "->a2", head=(), refresh=True ) cls.b2 = env.generate_revision( util.rev_id(), "a2->b2", head=cls.a2.revision ) cls.c2 = env.generate_revision( util.rev_id(), "b2->c2", head=cls.b2.revision ) # mergepoint between c1, c2 # d1 dependent on c2 cls.d1 = env.generate_revision( util.rev_id(), "d1", head=(cls.c1.revision, cls.c2.revision), refresh=True, ) # but then c2 keeps going into d2 cls.d2 = env.generate_revision( util.rev_id(), "d2", head=cls.c2.revision, refresh=True, splice=True, ) @classmethod def teardown_class(cls): clear_staging_env() def test_mergepoint_to_only_one_side_upgrade(self): self._assert_upgrade( self.d1.revision, (self.d2.revision, self.b1.revision), [self.up_(self.c1), self.up_(self.d1)], set([self.d2.revision, self.d1.revision]), ) def test_mergepoint_to_only_one_side_downgrade(self): self._assert_downgrade( self.b1.revision, (self.d2.revision, self.d1.revision), [self.down_(self.d1), self.down_(self.c1)], set([self.d2.revision, self.b1.revision]), ) class BranchFrom3WayMergepointTest(MigrationTest): """this is a form that will come up frequently in the "many independent roots with cross-dependencies" case. 
""" @classmethod def setup_class(cls): cls.env = env = staging_env() cls.a1 = env.generate_revision(util.rev_id(), "->a1") cls.b1 = env.generate_revision(util.rev_id(), "a1->b1") cls.c1 = env.generate_revision(util.rev_id(), "b1->c1") cls.a2 = env.generate_revision( util.rev_id(), "->a2", head=(), refresh=True ) cls.b2 = env.generate_revision( util.rev_id(), "a2->b2", head=cls.a2.revision ) cls.c2 = env.generate_revision( util.rev_id(), "b2->c2", head=cls.b2.revision ) cls.a3 = env.generate_revision( util.rev_id(), "->a3", head=(), refresh=True ) cls.b3 = env.generate_revision( util.rev_id(), "a3->b3", head=cls.a3.revision ) cls.c3 = env.generate_revision( util.rev_id(), "b3->c3", head=cls.b3.revision ) # mergepoint between c1, c2, c3 # d1 dependent on c2, c3 cls.d1 = env.generate_revision( util.rev_id(), "d1", head=(cls.c1.revision, cls.c2.revision, cls.c3.revision), refresh=True, ) # but then c2 keeps going into d2 cls.d2 = env.generate_revision( util.rev_id(), "d2", head=cls.c2.revision, refresh=True, splice=True, ) # c3 keeps going into d3 cls.d3 = env.generate_revision( util.rev_id(), "d3", head=cls.c3.revision, refresh=True, splice=True, ) @classmethod def teardown_class(cls): clear_staging_env() def test_mergepoint_to_only_one_side_upgrade(self): self._assert_upgrade( self.d1.revision, (self.d3.revision, self.d2.revision, self.b1.revision), [self.up_(self.c1), self.up_(self.d1)], set([self.d3.revision, self.d2.revision, self.d1.revision]), ) def test_mergepoint_to_only_one_side_downgrade(self): self._assert_downgrade( self.b1.revision, (self.d3.revision, self.d2.revision, self.d1.revision), [self.down_(self.d1), self.down_(self.c1)], set([self.d3.revision, self.d2.revision, self.b1.revision]), ) def test_mergepoint_to_two_sides_upgrade(self): self._assert_upgrade( self.d1.revision, (self.d3.revision, self.b2.revision, self.b1.revision), [self.up_(self.c2), self.up_(self.c1), self.up_(self.d1)], # this will merge b2 and b1 into d1 set([self.d3.revision, self.d1.revision]), ) # but then! b2 will break out again if we keep going with it self._assert_upgrade( self.d2.revision, (self.d3.revision, self.d1.revision), [self.up_(self.d2)], set([self.d3.revision, self.d2.revision, self.d1.revision]), ) class TwinMergeTest(MigrationTest): """Test #297, where we have two mergepoints from the same set of originating branches. 
""" @classmethod def setup_class(cls): """ 33e21c000cfe -> 178d4e761bbd (head), 2bef33cb3a58, 3904558db1c6, 968330f320d -> 33e21c000cfe (mergepoint) 46c99f866004 -> 18f46b42410d (head), 2bef33cb3a58, 3904558db1c6, 968330f320d -> 46c99f866004 (mergepoint) f0fa4315825 -> 3904558db1c6 (branchpoint), -------------------------- A -> B2 (branchpoint), B1, B2, B3 -> C1 (mergepoint) B1, B2, B3 -> C2 (mergepoint) C1 -> D1 (head), C2 -> D2 (head), """ cls.env = env = staging_env() cls.a = env.generate_revision("a", "a") cls.b1 = env.generate_revision("b1", "b1", head=cls.a.revision) cls.b2 = env.generate_revision( "b2", "b2", splice=True, head=cls.a.revision ) cls.b3 = env.generate_revision( "b3", "b3", splice=True, head=cls.a.revision ) cls.c1 = env.generate_revision( "c1", "c1", head=(cls.b1.revision, cls.b2.revision, cls.b3.revision), ) cls.c2 = env.generate_revision( "c2", "c2", splice=True, head=(cls.b1.revision, cls.b2.revision, cls.b3.revision), ) cls.d1 = env.generate_revision("d1", "d1", head=cls.c1.revision) cls.d2 = env.generate_revision("d2", "d2", head=cls.c2.revision) @classmethod def teardown_class(cls): clear_staging_env() def test_upgrade(self): head = HeadMaintainer(mock.Mock(), [self.a.revision]) steps = [ (self.up_(self.b3), ("b3",)), (self.up_(self.b1), ("b1", "b3")), (self.up_(self.b2), ("b1", "b2", "b3")), (self.up_(self.c2), ("c2",)), (self.up_(self.d2), ("d2",)), (self.up_(self.c1), ("c1", "d2")), (self.up_(self.d1), ("d1", "d2")), ] for step, assert_ in steps: head.update_to_step(step) eq_(head.heads, set(assert_)) class NotQuiteTwinMergeTest(MigrationTest): """Test a variant of #297. """ @classmethod def setup_class(cls): """ A -> B2 (branchpoint), B1, B2 -> C1 (mergepoint) B2, B3 -> C2 (mergepoint) C1 -> D1 (head), C2 -> D2 (head), """ cls.env = env = staging_env() cls.a = env.generate_revision("a", "a") cls.b1 = env.generate_revision("b1", "b1", head=cls.a.revision) cls.b2 = env.generate_revision( "b2", "b2", splice=True, head=cls.a.revision ) cls.b3 = env.generate_revision( "b3", "b3", splice=True, head=cls.a.revision ) cls.c1 = env.generate_revision( "c1", "c1", head=(cls.b1.revision, cls.b2.revision) ) cls.c2 = env.generate_revision( "c2", "c2", splice=True, head=(cls.b2.revision, cls.b3.revision) ) cls.d1 = env.generate_revision("d1", "d1", head=cls.c1.revision) cls.d2 = env.generate_revision("d2", "d2", head=cls.c2.revision) @classmethod def teardown_class(cls): clear_staging_env() def test_upgrade(self): head = HeadMaintainer(mock.Mock(), [self.a.revision]) """ upgrade a -> b2, b2 upgrade a -> b3, b3 upgrade b2, b3 -> c2, c2 upgrade c2 -> d2, d2 upgrade a -> b1, b1 upgrade b1, b2 -> c1, c1 upgrade c1 -> d1, d1 """ steps = [ (self.up_(self.b2), ("b2",)), (self.up_(self.b3), ("b2", "b3")), (self.up_(self.c2), ("c2",)), (self.up_(self.d2), ("d2",)), (self.up_(self.b1), ("b1", "d2")), (self.up_(self.c1), ("c1", "d2")), (self.up_(self.d1), ("d1", "d2")), ] for step, assert_ in steps: head.update_to_step(step) eq_(head.heads, set(assert_)) class DependsOnBranchTestOne(MigrationTest): @classmethod def setup_class(cls): cls.env = env = staging_env() cls.a1 = env.generate_revision( util.rev_id(), "->a1", branch_labels=["lib1"] ) cls.b1 = env.generate_revision(util.rev_id(), "a1->b1") cls.c1 = env.generate_revision(util.rev_id(), "b1->c1") cls.a2 = env.generate_revision(util.rev_id(), "->a2", head=()) cls.b2 = env.generate_revision( util.rev_id(), "a2->b2", head=cls.a2.revision ) cls.c2 = env.generate_revision( util.rev_id(), "b2->c2", head=cls.b2.revision, 
depends_on=cls.c1.revision, ) cls.d1 = env.generate_revision( util.rev_id(), "c1->d1", head=cls.c1.revision ) cls.e1 = env.generate_revision( util.rev_id(), "d1->e1", head=cls.d1.revision ) cls.f1 = env.generate_revision( util.rev_id(), "e1->f1", head=cls.e1.revision ) @classmethod def teardown_class(cls): clear_staging_env() def test_downgrade_to_dependency(self): heads = [self.c2.revision, self.d1.revision] head = HeadMaintainer(mock.Mock(), heads) head.update_to_step(self.down_(self.d1)) eq_(head.heads, set([self.c2.revision])) def test_stamp_across_dependency(self): heads = [self.e1.revision, self.c2.revision] head = HeadMaintainer(mock.Mock(), heads) for step in self.env._stamp_revs(self.b1.revision, heads): head.update_to_step(step) eq_(head.heads, set([self.b1.revision])) class DependsOnBranchTestTwo(MigrationTest): @classmethod def setup_class(cls): """ Structure:: a1 ---+ | a2 ---+--> amerge | a3 ---+ ^ | +---------------------------+ | b1 ---+ | +--> bmerge overmerge / d1 b2 ---+ | | ^ | | | | | +--------------------------+ | | +-----------------------------+ | v c1 ---+ | c2 ---+--> cmerge | c3 ---+ """ cls.env = env = staging_env() cls.a1 = env.generate_revision("a1", "->a1", head="base") cls.a2 = env.generate_revision("a2", "->a2", head="base") cls.a3 = env.generate_revision("a3", "->a3", head="base") cls.amerge = env.generate_revision( "amerge", "amerge", head=[cls.a1.revision, cls.a2.revision, cls.a3.revision], ) cls.b1 = env.generate_revision("b1", "->b1", head="base") cls.b2 = env.generate_revision("b2", "->b2", head="base") cls.bmerge = env.generate_revision( "bmerge", "bmerge", head=[cls.b1.revision, cls.b2.revision] ) cls.c1 = env.generate_revision("c1", "->c1", head="base") cls.c2 = env.generate_revision("c2", "->c2", head="base") cls.c3 = env.generate_revision("c3", "->c3", head="base") cls.cmerge = env.generate_revision( "cmerge", "cmerge", head=[cls.c1.revision, cls.c2.revision, cls.c3.revision], ) cls.d1 = env.generate_revision( "d1", "o", head="base", depends_on=[cls.a3.revision, cls.b2.revision, cls.c1.revision], ) @classmethod def teardown_class(cls): clear_staging_env() def test_kaboom(self): # here's the upgrade path: # ['->c1', '->b2', '->a3', 'overmerge', '->c3', '->c2', 'cmerge', # '->b1', 'bmerge', '->a2', '->a1', 'amerge'], heads = [ self.amerge.revision, self.bmerge.revision, self.cmerge.revision, self.d1.revision, ] self._assert_downgrade( self.b2.revision, heads, [self.down_(self.bmerge)], set( [ self.amerge.revision, self.b1.revision, self.cmerge.revision, self.d1.revision, ] ), ) # start with those heads.. heads = [ self.amerge.revision, self.d1.revision, self.b1.revision, self.cmerge.revision, ] # downgrade d1... self._assert_downgrade( "d1@base", heads, [self.down_(self.d1)], # b2 has to be INSERTed, because it was implied by d1 set( [ self.amerge.revision, self.b1.revision, self.b2.revision, self.cmerge.revision, ] ), ) # start with those heads ... 
heads = [ self.amerge.revision, self.b1.revision, self.b2.revision, self.cmerge.revision, ] self._assert_downgrade( "base", heads, [ self.down_(self.amerge), self.down_(self.a1), self.down_(self.a2), self.down_(self.a3), self.down_(self.b1), self.down_(self.b2), self.down_(self.cmerge), self.down_(self.c1), self.down_(self.c2), self.down_(self.c3), ], set([]), ) class DependsOnBranchTestThree(MigrationTest): @classmethod def setup_class(cls): """ issue #377 Structure:: -> a1 --+--> a2 -------> a3 | ^ | | | +------+ | | | | +---|------+ | | | | v | +-------> b1 --> b2 --> b3 """ cls.env = env = staging_env() cls.a1 = env.generate_revision("a1", "->a1", head="base") cls.a2 = env.generate_revision("a2", "->a2") cls.b1 = env.generate_revision("b1", "->b1", head="base") cls.b2 = env.generate_revision( "b2", "->b2", depends_on="a2", head="b1" ) cls.b3 = env.generate_revision("b3", "->b3", head="b2") cls.a3 = env.generate_revision( "a3", "->a3", head="a2", depends_on="b1" ) @classmethod def teardown_class(cls): clear_staging_env() def test_downgrade_over_crisscross(self): # this state was not possible prior to # #377. a3 would be considered half of a merge point # between a3 and b2, and the head would be forced down # to b1. In this test however, we're not allowed to remove # b2 because a2 is dependent on it, hence we add the ability # to remove half of a merge point. self._assert_downgrade( "b1", ["a3", "b2"], [self.down_(self.b2)], set(["a3"]), # we have b1 also, which is implied by a3 ) class DependsOnBranchLabelTest(MigrationTest): @classmethod def setup_class(cls): cls.env = env = staging_env() cls.a1 = env.generate_revision( util.rev_id(), "->a1", branch_labels=["lib1"] ) cls.b1 = env.generate_revision(util.rev_id(), "a1->b1") cls.c1 = env.generate_revision( util.rev_id(), "b1->c1", branch_labels=["c1lib"] ) cls.a2 = env.generate_revision(util.rev_id(), "->a2", head=()) cls.b2 = env.generate_revision( util.rev_id(), "a2->b2", head=cls.a2.revision ) cls.c2 = env.generate_revision( util.rev_id(), "b2->c2", head=cls.b2.revision, depends_on=["c1lib"] ) cls.d1 = env.generate_revision( util.rev_id(), "c1->d1", head=cls.c1.revision ) cls.e1 = env.generate_revision( util.rev_id(), "d1->e1", head=cls.d1.revision ) cls.f1 = env.generate_revision( util.rev_id(), "e1->f1", head=cls.e1.revision ) @classmethod def teardown_class(cls): clear_staging_env() def test_upgrade_path(self): self._assert_upgrade( self.c2.revision, self.a2.revision, [ self.up_(self.a1), self.up_(self.b1), self.up_(self.c1), self.up_(self.b2), self.up_(self.c2), ], set([self.c2.revision]), ) class ForestTest(MigrationTest): @classmethod def setup_class(cls): cls.env = env = staging_env() cls.a1 = env.generate_revision(util.rev_id(), "->a1") cls.b1 = env.generate_revision(util.rev_id(), "a1->b1") cls.a2 = env.generate_revision( util.rev_id(), "->a2", head=(), refresh=True ) cls.b2 = env.generate_revision( util.rev_id(), "a2->b2", head=cls.a2.revision ) @classmethod def teardown_class(cls): clear_staging_env() def test_base_to_heads(self): eq_( self.env._upgrade_revs("heads", "base"), [ self.up_(self.a2), self.up_(self.b2), self.up_(self.a1), self.up_(self.b1), ], ) def test_stamp_to_heads(self): revs = self.env._stamp_revs("heads", ()) eq_(len(revs), 2) eq_( set(r.to_revisions for r in revs), set([(self.b1.revision,), (self.b2.revision,)]), ) def test_stamp_to_heads_no_moves_needed(self): revs = self.env._stamp_revs( "heads", (self.b1.revision, self.b2.revision) ) eq_(len(revs), 0) class MergedPathTest(MigrationTest): @classmethod 
def setup_class(cls): cls.env = env = staging_env() cls.a = env.generate_revision(util.rev_id(), "->a") cls.b = env.generate_revision(util.rev_id(), "a->b") cls.c1 = env.generate_revision(util.rev_id(), "b->c1") cls.d1 = env.generate_revision(util.rev_id(), "c1->d1") cls.c2 = env.generate_revision( util.rev_id(), "b->c2", branch_labels="c2branch", head=cls.b.revision, splice=True, ) cls.d2 = env.generate_revision( util.rev_id(), "c2->d2", head=cls.c2.revision ) cls.e = env.generate_revision( util.rev_id(), "merge d1 and d2", head=(cls.d1.revision, cls.d2.revision), ) cls.f = env.generate_revision(util.rev_id(), "e->f") @classmethod def teardown_class(cls): clear_staging_env() def test_stamp_down_across_merge_point_branch(self): heads = [self.e.revision] revs = self.env._stamp_revs(self.c2.revision, heads) eq_(len(revs), 1) eq_( revs[0].merge_branch_idents(heads), # no deletes, UPDATE e to c2 ([], self.e.revision, self.c2.revision), ) def test_stamp_down_across_merge_prior_branching(self): heads = [self.e.revision] revs = self.env._stamp_revs(self.a.revision, heads) eq_(len(revs), 1) eq_( revs[0].merge_branch_idents(heads), # no deletes, UPDATE e to c2 ([], self.e.revision, self.a.revision), ) def test_stamp_up_across_merge_from_single_branch(self): revs = self.env._stamp_revs(self.e.revision, [self.c2.revision]) eq_(len(revs), 1) eq_( revs[0].merge_branch_idents([self.c2.revision]), # no deletes, UPDATE e to c2 ([], self.c2.revision, self.e.revision), ) def test_stamp_labled_head_across_merge_from_multiple_branch(self): # this is testing that filter_for_lineage() checks for # d1 both in terms of "c2branch" as well as that the "head" # revision "f" is the head of both d1 and d2 revs = self.env._stamp_revs( "c2branch@head", [self.d1.revision, self.c2.revision] ) eq_(len(revs), 1) eq_( revs[0].merge_branch_idents([self.d1.revision, self.c2.revision]), # DELETE d1 revision, UPDATE c2 to e ([self.d1.revision], self.c2.revision, self.f.revision), ) def test_stamp_up_across_merge_from_multiple_branch(self): heads = [self.d1.revision, self.c2.revision] revs = self.env._stamp_revs(self.e.revision, heads) eq_(len(revs), 1) eq_( revs[0].merge_branch_idents(heads), # DELETE d1 revision, UPDATE c2 to e ([self.d1.revision], self.c2.revision, self.e.revision), ) def test_stamp_up_across_merge_prior_branching(self): heads = [self.b.revision] revs = self.env._stamp_revs(self.e.revision, heads) eq_(len(revs), 1) eq_( revs[0].merge_branch_idents(heads), # no deletes, UPDATE e to c2 ([], self.b.revision, self.e.revision), ) def test_upgrade_across_merge_point(self): eq_( self.env._upgrade_revs(self.f.revision, self.b.revision), [ self.up_(self.c2), self.up_(self.d2), self.up_(self.c1), # b->c1, create new branch self.up_(self.d1), self.up_(self.e), # d1/d2 -> e, merge branches # (DELETE d2, UPDATE d1->e) self.up_(self.f), ], ) def test_downgrade_across_merge_point(self): eq_( self.env._downgrade_revs(self.b.revision, self.f.revision), [ self.down_(self.f), self.down_(self.e), # e -> d1 and d2, unmerge branches # (UPDATE e->d1, INSERT d2) self.down_(self.d1), self.down_(self.c1), self.down_(self.d2), self.down_(self.c2), # c2->b, delete branch ], ) zzzeek-alembic-bee044a1c187/tox.ini000066400000000000000000000036451353106760100171250ustar00rootroot00000000000000[tox] envlist = py SQLA_REPO = {env:SQLA_REPO:git+https://github.com/sqlalchemy/sqlalchemy.git} [testenv] cov_args=--cov=alembic --cov-report term --cov-report xml deps=pytest!=3.9.1,!=3.9.2 pytest-xdist mock sqla11: {[tox]SQLA_REPO}@rel_1_1 sqla12: 
{[tox]SQLA_REPO}@rel_1_2 sqla13: {[tox]SQLA_REPO}@rel_1_3 sqlamaster: {[tox]SQLA_REPO}@master postgresql: psycopg2 mysql: mysqlclient mysql: pymysql oracle: cx_oracle>=6.0,!=6.4 mssql: pymssql cov: pytest-cov usedevelop= cov: True # only use --dropfirst option if we're *not* using -n; # if -n is used, we're working in brand new DBs anyway setenv= BASECOMMAND=python -m pytest WORKERS=-n4 sqla079: WORKERS=--dropfirst cov: COVERAGE={[testenv]cov_args} sqlite: SQLITE={env:TOX_SQLITE:--db sqlite} postgresql: POSTGRESQL={env:TOX_POSTGRESQL:--db postgresql} mysql: MYSQL={env:TOX_MYSQL:--db mysql} oracle: ORACLE={env:TOX_ORACLE:--db oracle} --low-connections --write-idents db_idents.txt mssql: MSSQL={env:TOX_MSSQL:--db pymssql} pyoptimize: PYTHONOPTIMIZE=1 pyoptimize: LIMITTESTS="tests/test_script_consumption.py" # tox as of 2.0 blocks all environment variables from the # outside, unless they are here (or in TOX_TESTENV_PASSENV, # wildcards OK). Need at least these passenv=ORACLE_HOME NLS_LANG TOX_SQLITE TOX_POSTGRESQL TOX_MYSQL TOX_ORACLE TOX_MSSQL commands= {env:BASECOMMAND} {env:WORKERS} {env:SQLITE:} {env:POSTGRESQL:} {env:MYSQL:} {env:ORACLE:} {env:MSSQL:} {env:BACKENDONLY:} {env:COVERAGE:} {env:LIMITTESTS:} {posargs} {oracle,mssql}: python reap_dbs.py db_idents.txt [testenv:pep8] basepython = python3 deps= flake8 flake8-import-order flake8-builtins flake8-docstrings flake8-rst-docstrings pydocstyle<4.0.0 # used by flake8-rst-docstrings pygments commands = flake8 ./alembic/ ./tests/ setup.py docs/build/conf.py