
README.dialects.rst

========================
Developing new Dialects
========================

.. note::

   When studying this file, it's probably a good idea to also
   familiarize yourself with the README.unittests.rst file, which discusses
   SQLAlchemy's usage and extension of the pytest test runner.

While SQLAlchemy includes many dialects within the core distribution, the
trend for new dialects should be that they are published as external
projects.  SQLAlchemy has since version 0.5 featured a "plugin" system
which allows external dialects to be integrated into SQLAlchemy using
standard setuptools entry points.  As of version 0.8, this system has
been enhanced, so that a dialect can also be "plugged in" at runtime.

On the testing side, SQLAlchemy includes a "dialect compliance
suite" that is usable by third party libraries, in the source tree
at ``lib/sqlalchemy/testing/suite``.  There's no need for a third party
dialect to run through SQLAlchemy's full testing suite, as a large portion of
these tests do not have dialect-sensitive functionality.  The "dialect
compliance suite" should be viewed as the primary target for new dialects.


Dialect Layout
===============

The file structure of a dialect is typically similar to the following::

    sqlalchemy-<dialect>/
                         setup.py
                         setup.cfg
                         sqlalchemy_<dialect>/
                                              __init__.py
                                              base.py
                                              <dbapi>.py
                                              requirements.py
                         test/
                                              __init__.py
                                              conftest.py
                                              test_suite.py
                                              test_<dialect_specific_test>.py
                                              ...

An example of this structure can be seen in the MS Access dialect at
https://github.com/gordthompson/sqlalchemy-access .

Key aspects of this file layout include:

* setup.py - should specify setuptools entrypoints, allowing the
  dialect to be usable from create_engine(), e.g.::

        entry_points = {
            'sqlalchemy.dialects': [
                'access.pyodbc = sqlalchemy_access.pyodbc:AccessDialect_pyodbc',
            ]
        }

  Above, the entrypoint ``access.pyodbc`` allows URLs to be used such as::

    create_engine("access+pyodbc://user:pw@dsn")

* setup.cfg - this file contains the traditional contents such as
  [tool:pytest] directives, but also contains new directives that are used
  by SQLAlchemy's testing framework.  E.g. for Access::

    [tool:pytest]
    addopts= --tb native -v -r fxX --maxfail=25 -p no:warnings
    python_files=test/*test_*.py

    [sqla_testing]
    requirement_cls=sqlalchemy_access.requirements:Requirements
    profile_file=test/profiles.txt

    [db]
    default=access+pyodbc://admin@access_test
    sqlite=sqlite:///:memory:

  Above, the ``[sqla_testing]`` section contains configuration used by
  SQLAlchemy's test plugin.  The ``[tool:pytest]`` section
  includes directives to help with these runners.  When using pytest
  the test/conftest.py file will bootstrap SQLAlchemy's plugin.

* test/conftest.py - This script bootstraps SQLAlchemy's pytest plugin
  into the pytest runner.  This
  script can also be used to install your third party dialect into
  SQLAlchemy without using the setuptools entrypoint system; this allows
  your dialect to be present without any explicit setup.py step needed.
  The other portion invokes SQLAlchemy's pytest plugin::

    from sqlalchemy.dialects import registry
    import pytest

    registry.register("access.pyodbc", "sqlalchemy_access.pyodbc", "AccessDialect_pyodbc")

    pytest.register_assert_rewrite("sqlalchemy.testing.assertions")

    from sqlalchemy.testing.plugin.pytestplugin import *

  Above, the ``registry`` module, introduced in SQLAlchemy 0.8, provides
  an in-Python means of installing the dialect entrypoint(s) without the use
  of setuptools, using the ``registry.register()`` function in a way that
  is similar to the ``entry_points`` directive we placed in our ``setup.py``.
  (The ``pytest.register_assert_rewrite`` call is there just to suppress a
  spurious warning from pytest.)
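
  As a simplified model of how such a registry works (hypothetical names;
  this is not SQLAlchemy's actual implementation), the mechanism amounts to
  a mapping of dialect names to "module:ClassName" strings that are only
  imported lazily on first lookup::

    import importlib

    _registry = {}

    def register(name, modulepath, objname):
        # record strings only; nothing is imported yet
        _registry[name] = (modulepath, objname)

    def load(name):
        # import the module and fetch the class on demand
        modulepath, objname = _registry[name]
        module = importlib.import_module(modulepath)
        return getattr(module, objname)

    # stand-in example using a stdlib class rather than a real dialect
    register("decimal.ctx", "decimal", "Context")
    assert load("decimal.ctx").__name__ == "Context"

  Because only strings are stored at registration time, registering a
  dialect is cheap and the dialect's module is not imported until a URL
  actually refers to it.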

* requirements.py - The ``requirements.py`` file is where directives
  regarding database and dialect capabilities are set up.
  SQLAlchemy's tests are often annotated with decorators that mark
  tests as "skip" or "fail" for particular backends.  Over time, this
  system has been refined such that specific database and DBAPI names
  are mentioned less and less, in favor of @requires directives which
  state a particular capability.  The requirement directive is linked
  to target dialects using a ``Requirements`` subclass.  The custom
  ``Requirements`` subclass is specified in the ``requirements.py`` file
  and is made available to SQLAlchemy's test runner using the
  ``requirement_cls`` directive inside the ``[sqla_testing]`` section.

  For a third-party dialect, the custom ``Requirements`` class can
  usually specify a simple yes/no answer for a particular system. For
  example, a requirements file that specifies a database that supports
  the RETURNING construct but does not support nullable boolean
  columns might look like this::

      # sqlalchemy_access/requirements.py

      from sqlalchemy.testing.requirements import SuiteRequirements

      from sqlalchemy.testing import exclusions

      class Requirements(SuiteRequirements):
          @property
          def nullable_booleans(self):
              """Target database allows boolean columns to store NULL."""
              # Access Yes/No doesn't allow null
              return exclusions.closed()

          @property
          def returning(self):
              return exclusions.open()

  The ``SuiteRequirements`` class in
  ``sqlalchemy.testing.requirements`` contains a large number of
  requirements rules, which attempt to have reasonable defaults. The
  tests will report on those requirements found as they are run.
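
  One way to picture ``exclusions.open()`` and ``exclusions.closed()`` (an
  illustrative model only, not SQLAlchemy's real implementation) is as
  predicate objects that the test runner consults to decide whether a
  requirement-guarded test runs against the current backend::

      class _Exclusion:
          # enabled=True behaves like open(); False behaves like closed()
          def __init__(self, enabled):
              self.enabled = enabled

      def open_():
          # requirement met: dependent tests run
          return _Exclusion(True)

      def closed_():
          # requirement unmet: dependent tests are skipped
          return _Exclusion(False)

      def should_run(requirement):
          return requirement.enabled

      assert should_run(open_()) is True
      assert should_run(closed_()) is False

  The real objects are richer (they can be conditional on backend name or
  version), but the yes/no case above is all most third-party dialects need.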

  The requirements system can also be used when running SQLAlchemy's
  primary test suite against the external dialect.  In this use case,
  a ``--dburi`` as well as a ``--requirements`` flag are passed to SQLAlchemy's
  test runner so that exclusions specific to the dialect take place::

    cd /path/to/sqlalchemy
    pytest -v \
      --requirements sqlalchemy_access.requirements:Requirements \
      --dburi access+pyodbc://admin@access_test

* test_suite.py - Finally, the ``test_suite.py`` module represents a
  stub test suite, which pulls in the actual SQLAlchemy test suite.
  To pull in the suite as a whole, it can be imported in one step::

      # test/test_suite.py

      from sqlalchemy.testing.suite import *

  That's all that's needed - the ``sqlalchemy.testing.suite`` package
  contains an ever-expanding series of tests, most of which should be
  annotated with specific requirement decorators so that they can be
  fully controlled.  In the case that the decorators are not covering
  a particular test, a test can also be directly modified or bypassed.
  In the example below, the Access dialect test suite overrides the
  ``test_huge_int()`` test::

      from sqlalchemy import testing
      from sqlalchemy.testing.suite import *

      from sqlalchemy.testing.suite import IntegerTest as _IntegerTest

      class IntegerTest(_IntegerTest):

          @testing.skip("access")
          def test_huge_int(self):
              # bypass this test because Access ODBC fails with
              # [ODBC Microsoft Access Driver] Optional feature not implemented.
              return

AsyncIO dialects
----------------

As of version 1.4, SQLAlchemy also supports dialects that use
asyncio drivers to interface with the database backend.

SQLAlchemy's approach to asyncio drivers is that the connection and cursor
objects of the driver (if any) are adapted into a PEP 249 compliant interface,
using the ``AdaptedConnection`` interface class. Refer to the internal asyncio
driver implementations such as those of ``asyncpg``, ``asyncmy`` and
``aiosqlite`` for examples.
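
The adaptation pattern can be sketched in plain Python (an illustrative
model only; names such as ``FakeAsyncConnection`` are hypothetical and this
is not SQLAlchemy's actual ``AdaptedConnection`` implementation).  The core
idea is that each sync-style method drives the underlying driver coroutine
to completion on an event loop::

    import asyncio

    class FakeAsyncConnection:
        # stand-in for an asyncio driver connection
        async def execute(self, statement):
            await asyncio.sleep(0)  # pretend to do I/O
            return [("row1",), ("row2",)]

    class SyncAdapter:
        """Wrap an asyncio connection in a blocking, DBAPI-style interface."""

        def __init__(self, connection):
            self._connection = connection
            self._loop = asyncio.new_event_loop()

        def execute(self, statement):
            # run the driver coroutine to completion, synchronously
            return self._loop.run_until_complete(
                self._connection.execute(statement)
            )

    conn = SyncAdapter(FakeAsyncConnection())
    rows = conn.execute("SELECT x FROM t")
    assert rows == [("row1",), ("row2",)]

SQLAlchemy's real implementation additionally integrates with greenlets so
that the ORM and Core can call into async drivers transparently; the sketch
above only illustrates the sync-facade concept.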

Going Forward
==============

The third-party dialect can be distributed like any other Python
module on PyPI. Links to prominent dialects can be featured within
SQLAlchemy's own documentation; contact the developers (see AUTHORS)
for help with this.

While SQLAlchemy includes many dialects built in, it remains to be
seen if the project as a whole might move towards a "plugin" model for
all dialects, including all those currently built in.  Now that
SQLAlchemy's dialect API is mature and the test suite is not far
behind, it may be that a better maintenance experience can be
delivered by having all dialects separately maintained and released.

As new versions of SQLAlchemy are released, the test suite and
requirements file will receive new tests and changes.  The dialect
maintainer would normally keep track of these changes and make
adjustments as needed.


README.rst

SQLAlchemy
==========

|PyPI| |Python| |Downloads|

.. |PyPI| image:: https://img.shields.io/pypi/v/sqlalchemy
    :target: https://pypi.org/project/sqlalchemy
    :alt: PyPI

.. |Python| image:: https://img.shields.io/pypi/pyversions/sqlalchemy
    :target: https://pypi.org/project/sqlalchemy
    :alt: PyPI - Python Version

.. |Downloads| image:: https://img.shields.io/pypi/dm/sqlalchemy
    :target: https://pypi.org/project/sqlalchemy
    :alt: PyPI - Downloads


The Python SQL Toolkit and Object Relational Mapper

Introduction
-------------

SQLAlchemy is the Python SQL toolkit and Object Relational Mapper
that gives application developers the full power and
flexibility of SQL. SQLAlchemy provides a full suite
of well known enterprise-level persistence patterns,
designed for efficient and high-performing database
access, adapted into a simple and Pythonic domain
language.

Major SQLAlchemy features include:

* An industrial strength ORM, built
  from the core on the identity map, unit of work,
  and data mapper patterns.  These patterns
  allow transparent persistence of objects
  using a declarative configuration system.
  Domain models
  can be constructed and manipulated naturally,
  and changes are synchronized with the
  current transaction automatically.
* A relationally-oriented query system, exposing
  the full range of SQL's capabilities
  explicitly, including joins, subqueries,
  correlation, and most everything else,
  in terms of the object model.
  Writing queries with the ORM uses the same
  techniques of relational composition you use
  when writing SQL.  While you can drop into
  literal SQL at any time, it's virtually never
  needed.
* A comprehensive and flexible system
  of eager loading for related collections and objects.
  Collections are cached within a session,
  and can be loaded on individual access, all
  at once using joins, or by query per collection
  across the full result set.
* A Core SQL construction system and DBAPI
  interaction layer.  The SQLAlchemy Core is
  separate from the ORM and is a full database
  abstraction layer in its own right, and includes
  an extensible Python-based SQL expression
  language, schema metadata, connection pooling,
  type coercion, and custom types.
* All primary and foreign key constraints are
  assumed to be composite and natural.  Surrogate
  integer primary keys are of course still the
  norm, but SQLAlchemy never assumes or hardcodes
  to this model.
* Database introspection and generation.  Database
  schemas can be "reflected" in one step into
  Python structures representing database metadata;
  those same structures can then generate
  CREATE statements right back out - all within
  the Core, independent of the ORM.
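
The reflection round trip can be seen in a minimal Core-only example (shown
here against an in-memory SQLite database; the table name is illustrative)::

    from sqlalchemy import MetaData, create_engine, text
    from sqlalchemy.schema import CreateTable

    engine = create_engine("sqlite://")
    with engine.begin() as conn:
        conn.execute(text(
            "CREATE TABLE user_account "
            "(id INTEGER PRIMARY KEY, name VARCHAR(30))"
        ))

    # reflect the whole schema in one step...
    metadata = MetaData()
    metadata.reflect(bind=engine)

    # ...then generate CREATE DDL right back out
    ddl = str(CreateTable(metadata.tables["user_account"]))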

SQLAlchemy's philosophy:

* SQL databases behave less and less like object
  collections the more size and performance start to
  matter; object collections behave less and less like
  tables and rows the more abstraction starts to matter.
  SQLAlchemy aims to accommodate both of these
  principles.
* An ORM doesn't need to hide the "R".  A relational
  database provides rich, set-based functionality
  that should be fully exposed.  SQLAlchemy's
  ORM provides an open-ended set of patterns
  that allow a developer to construct a custom
  mediation layer between a domain model and
  a relational schema, turning the so-called
  "object relational impedance" issue into
  a distant memory.
* The developer, in all cases, makes all decisions
  regarding the design, structure, and naming conventions
  of both the object model as well as the relational
  schema.  SQLAlchemy only provides the means
  to automate the execution of these decisions.
* With SQLAlchemy, there's no such thing as
  "the ORM generated a bad query" - you
  retain full control over the structure of
  queries, including how joins are organized,
  how subqueries and correlation are used, and what
  columns are requested.  Everything SQLAlchemy
  does is ultimately the result of a
  developer-initiated decision.
* Don't use an ORM if the problem doesn't need one.
  SQLAlchemy consists of a Core and separate ORM
  component.  The Core offers a full SQL expression
  language that allows Pythonic construction
  of SQL constructs that render directly to SQL
  strings for a target database, returning
  result sets that are essentially enhanced DBAPI
  cursors.
* Transactions should be the norm.  With SQLAlchemy's
  ORM, nothing goes to permanent storage until
  commit() is called.  SQLAlchemy encourages applications
  to create a consistent means of delineating
  the start and end of a series of operations.
* Never render a literal value in a SQL statement.
  Bound parameters are used to the greatest degree
  possible, allowing query optimizers to cache
  query plans effectively and making SQL injection
  attacks a non-issue.
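
The bound-parameter behavior is visible in even the most minimal Core
expression (table and column names here are illustrative)::

    from sqlalchemy import column, select, table

    user = table("user_account", column("id"), column("name"))
    stmt = select(user).where(user.c.name == "spongebob")

    # the rendered SQL carries a bound parameter, never the literal value:
    #   SELECT user_account.id, user_account.name
    #   FROM user_account
    #   WHERE user_account.name = :name_1
    sql = str(stmt)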

Documentation
-------------

Latest documentation is at:

https://www.sqlalchemy.org/docs/

Installation / Requirements
---------------------------

Full documentation for installation is at
`Installation <https://www.sqlalchemy.org/docs/intro.html#installation>`_.

Getting Help / Development / Bug reporting
------------------------------------------

Please refer to the `SQLAlchemy Community Guide <https://www.sqlalchemy.org/support.html>`_.

Code of Conduct
---------------

Above all, SQLAlchemy places great emphasis on polite, thoughtful, and
constructive communication between users and developers.
Please see our current Code of Conduct at
`Code of Conduct <https://www.sqlalchemy.org/codeofconduct.html>`_.

License
-------

SQLAlchemy is distributed under the `MIT license
<https://www.opensource.org/licenses/mit-license.php>`_.


README.unittests.rst

=====================
SQLALCHEMY UNIT TESTS
=====================

Basic Test Running
==================

Tox is used to run the test suite fully.  For basic test runs against
a single Python interpreter::

    tox


Advanced Tox Options
====================

For more elaborate CI-style test running, the tox script provided will
run against various Python / database targets.  For a basic run against
Python 3.8 using an in-memory SQLite database::

    tox -e py38-sqlite

The tox runner contains a series of target combinations that can run
against various combinations of databases.  The test suite can be
run against SQLite with "backend" tests also running against a PostgreSQL
database::

    tox -e py38-sqlite-postgresql

Or to run just "backend" tests against a MySQL database::

    tox -e py38-mysql-backendonly

Running against backends other than SQLite requires that a database of that
vendor be available at a specific URL.  See "Setting Up Databases" below
for details.
The pytest Engine
=================

The tox runner uses pytest to invoke the test suite.  Within the realm of
pytest, SQLAlchemy itself adds a large series of options and
customizations to the pytest runner using plugin points, to allow for
SQLAlchemy's multiple database support, database setup/teardown and
connectivity, multi process support, as well as lots of skip / database
selection rules.

Running tests with pytest directly grants more immediate control over
database options and test selection.

A generic pytest run looks like::

    pytest -n4

Above, the full test suite will run against SQLite, using four processes.
If the "-n" flag is not used, pytest-xdist is skipped and the tests will
run linearly, which will take a pretty long time.

The pytest command line is more handy for running subsets of tests and to
quickly allow for custom database connections.  Example::

    pytest --dburi=postgresql+psycopg2://scott:tiger@localhost/test  test/sql/test_query.py

The above will run the tests in the test/sql/test_query.py file (a good
file to start with for basic "does this database work at all?" testing)
against a running PostgreSQL database at the given URL.

The pytest frontend can also run tests against multiple kinds of databases at
once - a large subset of tests are marked as "backend" tests, which will be run
against each available backend, and additionally lots of tests are targeted at
specific backends only, which only run if a matching backend is made available.
For example, to run the test suite against both PostgreSQL and MySQL at the
same time::

    pytest -n4 --db postgresql --db mysql

Setting Up Databases
====================

The test suite identifies several built-in database tags that run against
a pre-set URL.  These can be seen using --dbs::

    $ pytest --dbs
    Available --db options (use --dburi to override)
                 default    sqlite:///:memory:
                firebird    firebird://sysdba:masterkey@localhost//Users/classic/foo.fdb
                 mariadb    mariadb://scott:tiger@192.168.0.199:3307/test
                   mssql    mssql+pyodbc://scott:tiger^5HHH@mssql2017:1433/test?driver=ODBC+Driver+13+for+SQL+Server
           mssql_pymssql    mssql+pymssql://scott:tiger@ms_2008
                   mysql    mysql://scott:tiger@127.0.0.1:3306/test?charset=utf8mb4
                  oracle    oracle://scott:tiger@127.0.0.1:1521
                 oracle8    oracle://scott:tiger@127.0.0.1:1521/?use_ansi=0
                  pg8000    postgresql+pg8000://scott:tiger@127.0.0.1:5432/test
              postgresql    postgresql://scott:tiger@127.0.0.1:5432/test
    postgresql_psycopg2cffi postgresql+psycopg2cffi://scott:tiger@127.0.0.1:5432/test
                 pymysql    mysql+pymysql://scott:tiger@127.0.0.1:3306/test?charset=utf8mb4
                  sqlite    sqlite:///:memory:
             sqlite_file    sqlite:///querytest.db

Note that a pyodbc URL **must be against a hostname / database name
combination, not a DSN name** when using the multiprocessing option; this is
because the test suite needs to generate new URLs to refer to per-process
databases that are created on the fly.

This means that if you have a database running that can be accessed
by one of the above URLs, you can run the test suite against it using ``--db <name>``.

The URLs are present in the ``setup.cfg`` file.  You can make your own URLs by
creating a new file called ``test.cfg`` and adding your own ``[db]`` section::

    # test.cfg file
    [db]
    my_postgresql=postgresql://username:pass@hostname/dbname

Above, we can now run the tests with ``my_postgresql``::

    pytest --db my_postgresql

We can also override the existing names in our ``test.cfg`` file, so that they
take effect when running with the tox runner as well::

    # test.cfg file
    [db]
    postgresql=postgresql://username:pass@hostname/dbname

Now when we run ``tox -e py38-postgresql``, it will use our custom URL instead
of the fixed one in ``setup.cfg``.

Database Configuration
======================

Step one, the **database chosen for tests must be entirely empty**.  A lot
of what SQLAlchemy tests is creating and dropping lots of tables
as well as running database introspection to see what is there.  If there
are pre-existing tables or other objects in the target database already,
these will get in the way.  A failed test run can also be followed by
a run that includes the "--dropfirst" option, which will try to drop
all existing tables in the target database.

The above paragraph changes somewhat when the multiprocessing option
is used, in that separate databases will be created instead; however
in the case of PostgreSQL, the starting database is used as a template,
so the starting database must still be empty.  See below for example
configurations using Docker.

The test runner will by default create and drop tables within the default
database that's in the database URL, *unless* the multiprocessing option is in
use via the pytest "-n" flag, which invokes pytest-xdist.  The
multiprocessing option is **enabled by default** when using the tox runner.
When multiprocessing is used, the SQLAlchemy testing framework will create a
new database for each process, and then tear it down after the test run is
complete.  So it will be necessary for the database user to have access to
CREATE DATABASE in order for this to work.  Additionally, as mentioned
earlier, the database URL must be formatted such that it can be rewritten on
the fly to refer to these other databases, which means for pyodbc it must refer
to a hostname/database name combination, not a DSN name.

Several tests require alternate usernames or schemas to be present, which
are used to test dotted-name access scenarios.  On some databases such
as Oracle these are usernames, and on others such as PostgreSQL
and MySQL they are schemas.  The requirement applies to all backends
except SQLite and Firebird.  The names are::

    test_schema
    test_schema_2 (only used on PostgreSQL and mssql)

Please refer to your vendor documentation for the proper syntax to create
these namespaces - the database user must have permission to create and drop
tables within these schemas.  It's perfectly fine to run the test suite
without these namespaces present; it only means that a handful of tests which
expect them to be present will fail.

Additional steps specific to individual databases are as follows::

    POSTGRESQL: To enable unicode testing with JSONB, create the
    database with UTF8 encoding::

        postgres=# create database test with owner=scott encoding='utf8' template=template0;

    To include tests for HSTORE, install the HSTORE extension::

        postgres=# \c test;
        You are now connected to database "test" as user "postgresql".
        test=# create extension hstore;
        CREATE EXTENSION

    Full-text search configuration should be set to English, else
    several tests of ``.match()`` will fail. This can be set (if it isn't so
    already) with:

     ALTER DATABASE test SET default_text_search_config = 'pg_catalog.english'

    For two-phase transaction support, the max_prepared_transactions
    configuration variable must be set to a non-zero value in postgresql.conf.
    See
    https://www.postgresql.org/docs/current/runtime-config-resource.html#GUC-MAX-PREPARED-TRANSACTIONS
    for further background.

    ORACLE: a user named "test_schema" is created in addition to the default
    user.

    The primary database user needs to be able to create and drop tables,
    synonyms, and constraints within the "test_schema" user.  For this
    to work fully, including that the user has the "REFERENCES" role
    in a remote schema for tables not yet defined (REFERENCES is per-table),
    it is required that the test user be present in the "DBA" role:

        grant dba to scott;

    MSSQL: Tests that involve multiple connections require Snapshot Isolation
    ability implemented on the test database in order to prevent deadlocks that
    will occur with record locking isolation. This feature is only available
    with MSSQL 2005 and greater. You must enable snapshot isolation at the
    database level and set the default cursor isolation with two SQL commands:

     ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON

     ALTER DATABASE MyDatabase SET READ_COMMITTED_SNAPSHOT ON

Docker Configurations
---------------------

The SQLAlchemy test suite can run against databases running in Docker
containers.  This ensures that they are empty and that their configuration
is not influenced by any local usage.

The following configurations are just examples that developers can use to
quickly set up a local environment for SQLAlchemy development. They are **NOT**
intended for production use!

**PostgreSQL configuration**::

    # create the container with the proper configuration for sqlalchemy
    docker run --rm -e POSTGRES_USER='scott' -e POSTGRES_PASSWORD='tiger' -e POSTGRES_DB='test' -p 127.0.0.1:5432:5432 -d --name postgres postgres

    # configure the database
    sleep 10
    docker exec -ti postgres psql -U scott -c 'CREATE SCHEMA test_schema; CREATE SCHEMA test_schema_2;' test
    # this last command is optional
    docker exec -ti postgres sed -i 's/#max_prepared_transactions = 0/max_prepared_transactions = 10/g' /var/lib/postgresql/data/postgresql.conf

    # stop the container; because of --rm it will also be removed
    docker stop postgres

**MySQL configuration**::

    # create the container with the proper configuration for sqlalchemy
    docker run --rm -e MYSQL_USER='scott' -e MYSQL_PASSWORD='tiger' -e MYSQL_DATABASE='test' -e MYSQL_ROOT_PASSWORD='password' -p 127.0.0.1:3306:3306 -d --name mysql mysql --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci

    # configure the database
    sleep 20
    docker exec -ti mysql mysql -u root -ppassword -w -e "CREATE DATABASE test_schema CHARSET utf8mb4; GRANT ALL ON test_schema.* TO scott;"

    # stop the container; because of --rm it will also be removed
    docker stop mysql

**MariaDB configuration**::

    # create the container with the proper configuration for sqlalchemy
    docker run --rm -e MARIADB_USER='scott' -e MARIADB_PASSWORD='tiger' -e MARIADB_DATABASE='test' -e MARIADB_ROOT_PASSWORD='password' -p 127.0.0.1:3306:3306 -d --name mariadb mariadb --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci

    # configure the database
    sleep 20
    docker exec -ti mariadb mysql -u root -ppassword -w -e "CREATE DATABASE test_schema CHARSET utf8mb4; GRANT ALL ON test_schema.* TO scott;"

    # stop the container; because of --rm it will also be removed
    docker stop mariadb

**MSSQL configuration**::

    # create the container with the proper configuration for sqlalchemy
    # it will use the Developer edition
    docker run --rm -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -p 127.0.0.1:1433:1433 -d --name mssql mcr.microsoft.com/mssql/server

    # configure the database
    sleep 20
    docker exec -it mssql /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'yourStrong(!)Password' -Q "sp_configure 'contained database authentication', 1; RECONFIGURE; CREATE DATABASE test CONTAINMENT = PARTIAL; ALTER DATABASE test SET ALLOW_SNAPSHOT_ISOLATION ON; ALTER DATABASE test SET READ_COMMITTED_SNAPSHOT ON; CREATE LOGIN scott WITH PASSWORD = 'tiger^5HHH'; ALTER SERVER ROLE sysadmin ADD MEMBER scott;"
    docker exec -it mssql /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'yourStrong(!)Password' -d test -Q "CREATE SCHEMA test_schema"
    docker exec -it mssql /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'yourStrong(!)Password' -d test -Q "CREATE SCHEMA test_schema_2"

    # stop the container; because of --rm it will also be removed
    docker stop mssql

NOTE: with this configuration the URL to use is not the default one configured
in ``setup.cfg``, but ``mssql+pymssql://scott:tiger^5HHH@127.0.0.1:1433/test``.  It can
be used with pytest by using ``--db docker_mssql``.

CONFIGURING LOGGING
-------------------

SQLAlchemy logs its activity and debugging through Python's logging package.
Any log target can be directed to the console with command line options, such
as::

    $ pytest test/orm/test_unitofwork.py -s \
      --log-debug=sqlalchemy.pool --log-info=sqlalchemy.engine

Above we add the pytest "-s" flag so that standard out is not suppressed.


DEVELOPING AND TESTING NEW DIALECTS
-----------------------------------

See the file README.dialects.rst for details on dialects.
