.. _playhouse:

Playhouse, extensions to Peewee
===============================

Peewee comes with numerous extension modules which are collected under the
``playhouse`` namespace. Despite the silly name, there are some very useful
extensions, particularly those that expose vendor-specific database features
like the :ref:`sqlite_ext` and :ref:`postgres_ext` extensions.

Below you will find a loosely organized listing of the various modules that
make up the ``playhouse``.

**Database drivers / vendor-specific database functionality**

* :ref:`sqlite_ext` (on its own page)
* :ref:`sqliteq`
* :ref:`sqlite_udf`
* :ref:`apsw`
* :ref:`sqlcipher_ext`
* :ref:`postgres_ext`
* :ref:`crdb`
* :ref:`mysql_ext`

**High-level features**

* :ref:`extra-fields`
* :ref:`shortcuts`
* :ref:`hybrid`
* :ref:`kv`
* :ref:`signals`
* :ref:`dataset`

**Database management and framework integration**

* :ref:`pwiz`
* :ref:`migrate`
* :ref:`pool`
* :ref:`reflection`
* :ref:`db_url`
* :ref:`test_utils`
* :ref:`flask_utils`

Sqlite Extensions
-----------------

The Sqlite extensions have been moved to :ref:`their own page <sqlite_ext>`.

.. _sqliteq:

SqliteQ
-------

The ``playhouse.sqliteq`` module provides a subclass of
:py:class:`SqliteExtDatabase` that will serialize concurrent writes to a
SQLite database. :py:class:`SqliteQueueDatabase` can be used as a drop-in
replacement for the regular :py:class:`SqliteDatabase` if you want simple
**read and write** access to a SQLite database from **multiple threads**.

SQLite only allows one connection to write to the database at any given time.
As a result, if you have a multi-threaded application (like a web-server, for
example) that needs to write to the database, you may see occasional errors
when one or more of the threads attempting to write cannot acquire the lock.

:py:class:`SqliteQueueDatabase` is designed to simplify things by sending all
write queries through a single, long-lived connection. The benefit is that you
get the appearance of multiple threads writing to the database without
conflicts or timeouts. The downside, however, is that you cannot issue
write transactions that encompass multiple queries -- all writes run in
autocommit mode, essentially.

.. note::
    The module gets its name from the fact that all write queries get put into
    a thread-safe queue. A single worker thread listens to the queue and
    executes all queries that are sent to it.
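
The queue-and-single-writer idea can be sketched with just the standard
library. The snippet below is a simplified illustration of the pattern (the
table name and the ``None`` shutdown sentinel are invented for the example),
not ``sqliteq``'s actual implementation:

.. code-block:: python

    import queue
    import sqlite3
    import threading

    write_queue = queue.Queue()

    def writer(db_path, q):
        # A single long-lived connection; every write is serialized through it.
        conn = sqlite3.connect(db_path)
        conn.execute('CREATE TABLE IF NOT EXISTS log (msg TEXT)')
        while True:
            item = q.get()
            if item is None:  # Sentinel value shuts the worker down.
                break
            sql, params = item
            conn.execute(sql, params)
            conn.commit()  # Each write commits on its own, autocommit-style.
        conn.close()

    worker = threading.Thread(target=writer, args=('app.db', write_queue))
    worker.start()

    # Any thread can enqueue a write without contending for the SQLite lock.
    write_queue.put(('INSERT INTO log (msg) VALUES (?)', ('hello',)))

    write_queue.put(None)  # Ask the worker to stop...
    worker.join()          # ...and wait for the backlog to drain.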

Transactions
^^^^^^^^^^^^

Because all queries are serialized and executed by a single worker thread, it
is possible for transactional SQL from separate threads to be executed
out-of-order. In the example below, the transaction started by thread "B" is
rolled back by thread "A" (with bad consequences!):

* Thread A: UPDATE transplants SET organ='liver', ...;
* Thread B: BEGIN TRANSACTION;
* Thread B: UPDATE life_support_system SET timer += 60 ...;
* Thread A: ROLLBACK; -- Oh no....

Since there is a potential for queries from separate transactions to be
interleaved, the :py:meth:`~SqliteQueueDatabase.transaction` and
:py:meth:`~SqliteQueueDatabase.atomic` methods are disabled on :py:class:`SqliteQueueDatabase`.

For cases when you wish to temporarily write to the database from a different
thread, you can use the :py:meth:`~SqliteQueueDatabase.pause` and
:py:meth:`~SqliteQueueDatabase.unpause` methods. These methods block the
caller until the writer thread is finished with its current workload. The
writer then disconnects and the caller takes over until ``unpause`` is called.
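
One way to keep that dance safe is to wrap it in a small context manager (a
hypothetical helper, not part of the module), so the writer is always resumed
even if the caller's writes raise an exception:

.. code-block:: python

    from contextlib import contextmanager

    @contextmanager
    def paused(db):
        db.pause()  # Blocks until the writer finishes its current workload.
        try:
            yield db  # The calling thread may now write directly.
        finally:
            db.unpause()  # Hand control back to the writer thread.

Used as ``with paused(db): ...``, the calling thread owns the database for the
duration of the block.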

The :py:meth:`~SqliteQueueDatabase.stop`, :py:meth:`~SqliteQueueDatabase.start`,
and :py:meth:`~SqliteQueueDatabase.is_stopped` methods can also be used to
control the writer thread.

.. note::
    Take a look at SQLite's `isolation <https://www.sqlite.org/isolation.html>`_
    documentation for more information about how SQLite handles concurrent
    connections.

Code sample
^^^^^^^^^^^

Creating a database instance does not require any special handling. The
:py:class:`SqliteQueueDatabase` accepts some special parameters which you
should be aware of, though. If you are using `gevent <http://gevent.org>`_, you
must specify ``use_gevent=True`` when instantiating your database -- this way
Peewee will know to use the appropriate objects for handling queueing, thread
creation, and locking.

.. code-block:: python

    from playhouse.sqliteq import SqliteQueueDatabase

    db = SqliteQueueDatabase(
        'my_app.db',
        use_gevent=False,  # Use the standard library "threading" module.
        autostart=False,  # The worker thread now must be started manually.
        queue_max_size=64,  # Max. # of pending writes that can accumulate.
        results_timeout=5.0)  # Max. time to wait for query to be executed.


If ``autostart=False``, as in the above example, you will need to call
:py:meth:`~SqliteQueueDatabase.start` to bring up the worker thread that will
do the actual write query execution.

.. code-block:: python

    @app.before_first_request
    def _start_worker_thread():
        db.start()

If you plan on performing SELECT queries or otherwise need to access the
database, you will need to call :py:meth:`~Database.connect` and
:py:meth:`~Database.close` as you would with any other database instance.

When your application is ready to terminate, use the :py:meth:`~SqliteQueueDatabase.stop`
method to shut down the worker thread. If there was a backlog of work, then
this method will block until all pending work is finished (though no new work
is allowed).

.. code-block:: python

    import atexit

    @atexit.register
    def _stop_worker_thread():
        db.stop()


Lastly, the :py:meth:`~SqliteQueueDatabase.is_stopped` method can be used to
determine whether the writer thread has been stopped.

.. _sqlite_udf:

Sqlite User-Defined Functions
-----------------------------

The ``sqlite_udf`` playhouse module contains a number of user-defined
functions, aggregates, and table-valued functions, which you may find useful.
The functions are grouped into collections, and you can register these
user-defined extensions individually, by collection, or all at once.

Scalar functions are functions which take a number of parameters and return a
single value. For example, converting a string to upper-case, or calculating
the MD5 hex digest.

Aggregate functions are like scalar functions that operate on multiple rows of
data, producing a single result. For example, calculating the sum of a list of
integers, or finding the smallest value in a particular column.

Table-valued functions are simply functions that can return multiple rows of
data. For example, a regular-expression search function that returns all the
matches in a given string, or a function that accepts two dates and generates
all the intervening days.
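
These flavors correspond to SQLite's own user-defined function hooks, which
peewee wraps. As a sketch of the first two, here is how a scalar MD5 function
and a toy "minimum difference" aggregate look using only the standard library
``sqlite3`` module (the function names here are invented for the
illustration):

.. code-block:: python

    import hashlib
    import sqlite3

    class MinDiff:
        # Aggregates are classes with step() and finalize() methods.
        def __init__(self):
            self.values = []

        def step(self, value):
            self.values.append(value)

        def finalize(self):
            vals = sorted(self.values)
            return min(b - a for a, b in zip(vals, vals[1:]))

    conn = sqlite3.connect(':memory:')

    # Scalar function: one value in, one value out.
    conn.create_function(
        'md5', 1, lambda s: hashlib.md5(s.encode()).hexdigest())

    # Aggregate function: consumes many rows, produces a single result.
    conn.create_aggregate('mindiff', 1, MinDiff)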

.. note::
    To use table-valued functions, you will need to build the
    ``playhouse._sqlite_ext`` C extension.

Registering user-defined functions:

.. code-block:: python

    from playhouse.sqlite_udf import (register_all, register_groups,
                                      register_aggregate_groups)

    db = SqliteDatabase('my_app.db')

    # Register *all* functions.
    register_all(db)

    # Alternatively, you can register individual groups. This will just
    # register the DATE and MATH groups of functions.
    register_groups(db, 'DATE', 'MATH')

    # If you only wish to register, say, the aggregate functions for a
    # particular group or groups, you can:
    register_aggregate_groups(db, 'DATE')

    # If you only wish to register a single function, then you can:
    from playhouse.sqlite_udf import gzip, gunzip
    db.register_function(gzip, 'gzip')
    db.register_function(gunzip, 'gunzip')

Using a library function ("hostname"):

.. code-block:: python

    # Assume we have a model, Link, that contains lots of arbitrary URLs.
    # We want to discover the most common hosts that have been linked.
    query = (Link
             .select(fn.hostname(Link.url).alias('host'), fn.COUNT(Link.id))
             .group_by(fn.hostname(Link.url))
             .order_by(fn.COUNT(Link.id).desc())
             .tuples())

    # Print the hostname along with number of links associated with it.
    for host, count in query:
        print('%s: %s' % (host, count))


Functions, listed by collection name
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Scalar functions are indicated by ``(f)``, aggregate functions by ``(a)``, and
table-valued functions by ``(t)``.

**CONTROL_FLOW**

.. py:function:: if_then_else(cond, truthy[, falsey=None])

    Simple ternary-type operator, where, depending on the truthiness of the
    ``cond`` parameter, either the ``truthy`` or ``falsey`` value will be
    returned.

**DATE**

.. py:function:: strip_tz(date_str)

    :param date_str: A datetime, encoded as a string.
    :returns: The datetime with any timezone info stripped off.

    The time is not adjusted in any way; the timezone is simply removed.

.. py:function:: humandelta(nseconds[, glue=', '])

    :param int nseconds: Number of seconds, total, in timedelta.
    :param str glue: Fragment to join values.
    :returns: Easy-to-read description of timedelta.

    Example: 86471 -> "1 day, 1 minute, 11 seconds"
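
    A pure-Python sketch of the conversion (a hypothetical helper shown for
    illustration, not the module's implementation):

    .. code-block:: python

        def humandelta_py(nseconds, glue=', '):
            periods = (('day', 86400), ('hour', 3600),
                       ('minute', 60), ('second', 1))
            parts = []
            for name, length in periods:
                value, nseconds = divmod(nseconds, length)
                if value:  # Skip zero-valued units entirely.
                    parts.append('%d %s%s' % (
                        value, name, '' if value == 1 else 's'))
            return glue.join(parts) or '0 seconds'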

.. py:function:: mintdiff(datetime_value)

    :param datetime_value: A date-time.
    :returns: Minimum difference between any two values in list.

    Aggregate function that computes the minimum difference between any two
    datetimes.

.. py:function:: avgtdiff(datetime_value)

    :param datetime_value: A date-time.
    :returns: Average difference between values in list.

    Aggregate function that computes the average difference between consecutive
    values in the list.

.. py:function:: duration(datetime_value)

    :param datetime_value: A date-time.
    :returns: Duration from smallest to largest value in list, in seconds.

    Aggregate function that computes the duration from the smallest to the
    largest value in the list, returned in seconds.

.. py:function:: date_series(start, stop[, step_seconds=86400])

    :param datetime start: Start datetime
    :param datetime stop: Stop datetime
    :param int step_seconds: Number of seconds comprising a step.

    Table-valued function that returns rows consisting of the date/time values
    encountered iterating from start to stop, ``step_seconds`` at a time.

    Additionally, if start does not have a time component and step_seconds is
    greater-than-or-equal-to one day (86400 seconds), the values returned will
    be dates. Conversely, if start does not have a date component, values will
    be returned as times. Otherwise values are returned as datetimes.

    Example:

    .. code-block:: sql

        SELECT * FROM date_series('2017-01-28', '2017-02-02');

        value
        -----
        2017-01-28
        2017-01-29
        2017-01-30
        2017-01-31
        2017-02-01
        2017-02-02
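
    The iteration is easy to picture in pure Python; a date-only sketch (a
    hypothetical analogue of the table-valued function, stepping in whole
    days):

    .. code-block:: python

        from datetime import date, timedelta

        def date_series_py(start, stop, step_days=1):
            # Yield each date from start through stop, inclusive.
            step = timedelta(days=step_days)
            while start <= stop:
                yield start
                start += step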

**FILE**

.. py:function:: file_ext(filename)

    :param str filename: Filename to extract extension from.
    :returns: Returns the file extension, including the leading ".".

.. py:function:: file_read(filename)

    :param str filename: Filename to read.
    :returns: Contents of the file.

**HELPER**

.. py:function:: gzip(data[, compression=9])

    :param bytes data: Data to compress.
    :param int compression: Compression level (9 is max).
    :returns: Compressed binary data.

.. py:function:: gunzip(data)

    :param bytes data: Compressed data.
    :returns: Uncompressed binary data.

.. py:function:: hostname(url)

    :param str url: URL to extract hostname from.
    :returns: Hostname portion of the URL.

.. py:function:: toggle(key)

    :param key: Key to toggle.

    Toggle a key between True/False state. Example:

    .. code-block:: pycon

        >>> toggle('my-key')
        True
        >>> toggle('my-key')
        False
        >>> toggle('my-key')
        True
.. py:function:: setting(key[, value=None])

    :param key: Key to set/retrieve.
    :param value: Value to set.
    :returns: Value associated with key.

    Store/retrieve a setting in memory, persisting it for the lifetime of the
    application. To get the current value, only specify the key. To set a new
    value, call with key and new value.
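
    The semantics can be sketched with a module-level dictionary (a
    hypothetical pure-Python analogue, not the module's actual
    implementation):

    .. code-block:: python

        _settings = {}

        def setting_py(key, value=None):
            if value is not None:
                _settings[key] = value  # Set the new value.
            return _settings.get(key)   # Return the current value.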

.. py:function:: clear_toggles()

    Clears all state associated with the :py:func:`toggle` function.

.. py:function:: clear_settings()

    Clears all state associated with the :py:func:`setting` function.

**MATH**

.. py:function:: randomrange(start[, stop=None[, step=None]])

    :param int start: Start of range (inclusive)
    :param int stop: End of range (not inclusive)
    :param int step: Interval at which to return a value.

    Return a random integer in ``[start, stop)``.

.. py:function:: gauss_distribution(mean, sigma)

    :param float mean: Mean value
    :param float sigma: Standard deviation

    Return a random number drawn from a Gaussian distribution with the given
    mean and standard deviation.

.. py:function:: sqrt(n)

    Calculate the square root of ``n``.

.. py:function:: tonumber(s)

    :param str s: String to convert to number.
    :returns: Integer, floating-point or NULL on failure.

.. py:function:: mode(val)

    :param val: Numbers in list.
    :returns: The mode, or most-common, number observed.

    Aggregate function which calculates *mode* of values.

.. py:function:: minrange(val)

    :param val: Value
    :returns: Min difference between two values.

    Aggregate function which calculates the minimal distance between two
    numbers in the sequence.

.. py:function:: avgrange(val)

    :param val: Value
    :returns: Average difference between values.

    Aggregate function which calculates the average distance between two
    consecutive numbers in the sequence.

.. py:function:: range(val)

    :param val: Value
    :returns: The range from the smallest to largest value in sequence.

    Aggregate function which returns range of values observed.

.. py:function:: median(val)

    :param val: Value
    :returns: The median, or middle, value in a sequence.

    Aggregate function which calculates the middle value in a sequence.

    .. note:: Only available if you compiled the ``_sqlite_udf`` extension.

**STRING**

.. py:function:: substr_count(haystack, needle)

    Returns number of times ``needle`` appears in ``haystack``.

.. py:function:: strip_chars(haystack, chars)

    Strips any characters in ``chars`` from beginning and end of ``haystack``.

.. py:function:: damerau_levenshtein_dist(s1, s2)

    Computes the edit distance from s1 to s2 using the Damerau variant of the
    Levenshtein algorithm.

    .. note:: Only available if you compiled the ``_sqlite_udf`` extension.

.. py:function:: levenshtein_dist(s1, s2)

    Computes the edit distance from s1 to s2 using the Levenshtein algorithm.

    .. note:: Only available if you compiled the ``_sqlite_udf`` extension.
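
    The classic dynamic-programming formulation is short enough to sketch in
    pure Python (a hypothetical fallback, not the C extension's
    implementation):

    .. code-block:: python

        def levenshtein_py(s1, s2):
            # prev[j] holds the edit distance between the current prefix of
            # s1 and s2[:j].
            prev = list(range(len(s2) + 1))
            for i, c1 in enumerate(s1, 1):
                cur = [i]
                for j, c2 in enumerate(s2, 1):
                    cur.append(min(
                        prev[j] + 1,                # deletion
                        cur[j - 1] + 1,             # insertion
                        prev[j - 1] + (c1 != c2)))  # substitution
                prev = cur
            return prev[-1]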

.. py:function:: str_dist(s1, s2)

    Computes the edit distance from s1 to s2 using the standard library's
    ``difflib.SequenceMatcher`` algorithm.

    .. note:: Only available if you compiled the ``_sqlite_udf`` extension.

.. py:function:: regex_search(regex, search_string)

    :param str regex: Regular expression
    :param str search_string: String to search for instances of regex.

    Table-valued function that searches a string for substrings that match
    the provided ``regex``. Returns rows for each match found.

    Example:

    .. code-block:: sql

        SELECT * FROM regex_search('\w+', 'extract words, ignore! symbols');

        value
        -----
        extract
        words
        ignore
        symbols

.. _apsw:

apsw, an advanced sqlite driver
-------------------------------

The ``apsw_ext`` module contains a database class suitable for use with
the apsw sqlite driver.

APSW Project page: https://github.com/rogerbinns/apsw

APSW is a really neat library that provides a thin wrapper on top of SQLite's
C interface, making it possible to use all of SQLite's advanced features.

Here are just a few reasons to use APSW, taken from the documentation:

* APSW gives all functionality of SQLite, including virtual tables, virtual
  file system, blob i/o, backups and file control.
* Connections can be shared across threads without any additional locking.
* Transactions are managed explicitly by your code.
* APSW can handle nested transactions.
* Unicode is handled correctly.
* APSW is faster.

For more information on the differences between apsw and pysqlite,
check `the apsw docs <http://rogerbinns.github.io/apsw/>`_.

How to use the APSWDatabase
^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. code-block:: python

    from playhouse.apsw_ext import *

    db = APSWDatabase(':memory:')

    class BaseModel(Model):
        class Meta:
            database = db

    class SomeModel(BaseModel):
        col1 = CharField()
        col2 = DateTimeField()


apsw_ext API notes
^^^^^^^^^^^^^^^^^^

:py:class:`APSWDatabase` extends the :py:class:`SqliteExtDatabase` and inherits
its advanced features.

.. py:class:: APSWDatabase(database, **connect_kwargs)

    :param string database: filename of sqlite database
    :param connect_kwargs: keyword arguments passed to apsw when opening a connection

    .. py:method:: register_module(mod_name, mod_inst)

        Provides a way of globally registering a module. For more information,
        see the `documentation on virtual tables <http://rogerbinns.github.io/apsw/vtable.html>`_.

        :param string mod_name: name to use for module
        :param object mod_inst: an object implementing the `Virtual Table <http://rogerbinns.github.io/apsw/vtable.html#vttable-class>`_ interface

    .. py:method:: unregister_module(mod_name)

        Unregister a module.

        :param string mod_name: name of the module to unregister

.. note::
    Be sure to use the ``Field`` subclasses defined in the ``apsw_ext``
    module, as they will properly handle adapting the data types for storage.

    For example, instead of using ``peewee.DateTimeField``, be sure you are importing
    and using ``playhouse.apsw_ext.DateTimeField``.


.. _sqlcipher_ext:

Sqlcipher backend
-----------------

* Although this extension's code is short, it has not been properly
  peer-reviewed yet and may have introduced vulnerabilities.

Also note that this code relies on pysqlcipher_ and sqlcipher_, and
the code there might have vulnerabilities as well, but since these
are widely used crypto modules, we can expect "short zero days" there.

..  _pysqlcipher: https://pypi.python.org/pypi/pysqlcipher3
..  _sqlcipher: http://sqlcipher.net

sqlcipher_ext API notes
^^^^^^^^^^^^^^^^^^^^^^^

.. py:class:: SqlCipherDatabase(database, passphrase, **kwargs)

    Subclass of :py:class:`SqliteDatabase` that stores the database
    encrypted. Instead of the standard ``sqlite3`` backend, it uses pysqlcipher_:
    a python wrapper for sqlcipher_, which -- in turn -- is an encrypted wrapper
    around ``sqlite3``, so the API is *identical* to :py:class:`SqliteDatabase`'s,
    except for object construction parameters:

    :param database: Path to encrypted database filename to open [or create].
    :param passphrase: Database encryption passphrase: should be at least 8 characters
        long, but it is *strongly advised* to enforce better `passphrase strength`_
        criteria in your implementation.

    * If the ``database`` file doesn't exist, it will be *created* with
      encryption by a key derived from ``passphrase``.
    * When trying to open an existing database, ``passphrase`` should be
      identical to the one used when it was created. If the passphrase is
      incorrect, an error will be raised when first attempting to access the
      database.

    .. py:method:: rekey(passphrase)

        :param str passphrase: New passphrase for database.

        Change the passphrase for database.

.. _passphrase strength: https://en.wikipedia.org/wiki/Password_strength

.. note::
    SQLCipher can be configured using a number of extension PRAGMAs. The list
    of PRAGMAs and their descriptions can be found in the `SQLCipher documentation <https://www.zetetic.net/sqlcipher/sqlcipher-api/>`_.

    For example to specify the number of PBKDF2 iterations for the key
    derivation (64K in SQLCipher 3.x, 256K in SQLCipher 4.x by default):

    .. code-block:: python

        # Use 1,000,000 iterations.
        db = SqlCipherDatabase('my_app.db', pragmas={'kdf_iter': 1000000})

    To use a cipher page-size of 16KB and a cache-size of 10,000 pages:

    .. code-block:: python

        db = SqlCipherDatabase('my_app.db', passphrase='secret!!!', pragmas={
            'cipher_page_size': 1024 * 16,
            'cache_size': 10000})  # 10,000 16KB pages, or 160MB.


Example of prompting the user for a passphrase:

.. code-block:: python

    db = SqlCipherDatabase(None)

    class BaseModel(Model):
        """Parent for all app's models"""
        class Meta:
            # We won't have a valid db until user enters passphrase.
            database = db

    # Derive our model subclasses
    class Person(BaseModel):
        name = TextField(primary_key=True)

    right_passphrase = False
    while not right_passphrase:
        db.init(
            'testsqlcipher.db',
            passphrase=get_passphrase_from_user())

        try:  # Actually execute a query against the db to test passphrase.
            db.get_tables()
        except DatabaseError as exc:
            # This error indicates the password was wrong.
            if exc.args[0] == 'file is encrypted or is not a database':
                tell_user_the_passphrase_was_wrong()
                db.init(None)  # Reset the db.
            else:
                raise exc
        else:
            # The password was correct.
            right_passphrase = True

See also: a slightly more elaborate `example <https://gist.github.com/thedod/11048875#file-testpeeweesqlcipher-py>`_.

.. _postgres_ext:

Postgresql Extensions
---------------------

The postgresql extensions module provides a number of "postgres-only" functions,
currently:

* :ref:`json support <pgjson>`, including *jsonb* for Postgres 9.4.
* :ref:`hstore support <hstore>`
* :ref:`server-side cursors <server_side_cursors>`
* :ref:`full-text search <pg_fts>`
* :py:class:`ArrayField` field type, for storing arrays.
* :py:class:`HStoreField` field type, for storing key/value pairs.
* :py:class:`IntervalField` field type, for storing ``timedelta`` objects.
* :py:class:`JSONField` field type, for storing JSON data.
* :py:class:`BinaryJSONField` field type for the ``jsonb`` JSON data type.
* :py:class:`TSVectorField` field type, for storing full-text search data.
* :py:class:`DateTimeTZField` field type, a timezone-aware datetime field.

In the future I would like to add support for more of postgresql's features.
If there is a particular feature you would like to see added, please
`open a Github issue <https://github.com/coleifer/peewee/issues>`_.

.. warning:: In order to start using the features described below, you will need to use the extension :py:class:`PostgresqlExtDatabase` class instead of :py:class:`PostgresqlDatabase`.

The code below will assume you are using the following database and base model:

.. code-block:: python

    from playhouse.postgres_ext import *

    ext_db = PostgresqlExtDatabase('peewee_test', user='postgres')

    class BaseExtModel(Model):
        class Meta:
            database = ext_db

.. _pgjson:

JSON Support
^^^^^^^^^^^^

peewee has basic support for Postgres' native JSON data type, in the form of
:py:class:`JSONField`. As of version 2.4.7, peewee also supports the Postgres
9.4 binary json ``jsonb`` type, via :py:class:`BinaryJSONField`.

.. warning::
  Postgres supports a JSON data type natively as of 9.2 (full support in 9.3).
  In order to use this functionality you must be using the correct version of
  Postgres with `psycopg2` version 2.5 or greater.

  To use :py:class:`BinaryJSONField`, which has many performance and querying
  advantages, you must have Postgres 9.4 or later.

.. note::
  You must be sure your database is an instance of
  :py:class:`PostgresqlExtDatabase` in order to use the :py:class:`JSONField`.

Here is an example of how you might declare a model with a JSON field:

.. code-block:: python

    import json
    from urllib.request import urlopen

    from playhouse.postgres_ext import *

    db = PostgresqlExtDatabase('my_database')

    class APIResponse(Model):
        url = CharField()
        response = JSONField()

        class Meta:
            database = db

        @classmethod
        def request(cls, url):
            with urlopen(url) as fh:
                return cls.create(url=url, response=json.loads(fh.read()))

    APIResponse.create_table()

    # Store a JSON response.
    offense = APIResponse.request('http://crime-api.com/api/offense/')
    booking = APIResponse.request('http://crime-api.com/api/booking/')

    # Query a JSON data structure using a nested key lookup:
    offense_responses = APIResponse.select().where(
        APIResponse.response['meta']['model'] == 'offense')

    # Retrieve a sub-key for each APIResponse. By calling .as_json(), the
    # data at the sub-key will be returned as Python objects (dicts, lists,
    # etc) instead of serialized JSON.
    q = (APIResponse
         .select(
           APIResponse.response['booking']['person'].as_json().alias('person'))
         .where(APIResponse.response['meta']['model'] == 'booking'))

    for result in q:
        print(result.person['name'], result.person['dob'])

The :py:class:`BinaryJSONField` works the same and supports the same operations
as the regular :py:class:`JSONField`, but provides several additional
operations for testing **containment**. Using the binary json field, you can
test whether your JSON data contains other partial JSON structures
(:py:meth:`~BinaryJSONField.contains`, :py:meth:`~BinaryJSONField.contains_any`,
:py:meth:`~BinaryJSONField.contains_all`), or whether it is a subset of a
larger JSON document (:py:meth:`~BinaryJSONField.contained_by`).

For more examples, see the :py:class:`JSONField` and
:py:class:`BinaryJSONField` API documents below.

.. _hstore:

hstore support
^^^^^^^^^^^^^^

`Postgresql hstore <http://www.postgresql.org/docs/current/static/hstore.html>`_
is an embedded key/value store. With hstore, you can store arbitrary key/value
pairs in your database alongside structured relational data.

To use ``hstore``, you need to specify an additional parameter when
instantiating your :py:class:`PostgresqlExtDatabase`:

.. code-block:: python

    # Specify "register_hstore=True":
    db = PostgresqlExtDatabase('my_db', register_hstore=True)

Currently the ``postgres_ext`` module supports the following operations:

* Store and retrieve arbitrary dictionaries
* Filter by key(s) or partial dictionary
* Update/add one or more keys to an existing dictionary
* Delete one or more keys from an existing dictionary
* Select keys, values, or zip keys and values
* Retrieve a slice of keys/values
* Test for the existence of a key
* Test that a key has a non-NULL value

Using hstore
^^^^^^^^^^^^

To start with, you will need to import the custom database class and the hstore
functions from ``playhouse.postgres_ext`` (see above code snippet). Then, it
is as simple as adding a :py:class:`HStoreField` to your model:

.. code-block:: python

    class House(BaseExtModel):
        address = CharField()
        features = HStoreField()

You can now store arbitrary key/value pairs on ``House`` instances:

.. code-block:: pycon

    >>> h = House.create(
    ...     address='123 Main St',
    ...     features={'garage': '2 cars', 'bath': '2 bath'})
    ...
    >>> h_from_db = House.get(House.id == h.id)
    >>> h_from_db.features
    {'bath': '2 bath', 'garage': '2 cars'}

You can filter by individual key, multiple keys or partial dictionary:

.. code-block:: pycon

    >>> query = House.select()
    >>> garage = query.where(House.features.contains('garage'))
    >>> garage_and_bath = query.where(House.features.contains(['garage', 'bath']))
    >>> twocar = query.where(House.features.contains({'garage': '2 cars'}))

Suppose you want to do an atomic update to the house:

.. code-block:: pycon

    >>> new_features = House.features.update({'bath': '2.5 bath', 'sqft': '1100'})
    >>> query = House.update(features=new_features)
    >>> query.where(House.id == h.id).execute()
    1
    >>> h = House.get(House.id == h.id)
    >>> h.features
    {'bath': '2.5 bath', 'garage': '2 cars', 'sqft': '1100'}

Or, alternatively an atomic delete:

.. code-block:: pycon

    >>> query = House.update(features=House.features.delete('bath'))
    >>> query.where(House.id == h.id).execute()
    1
    >>> h = House.get(House.id == h.id)
    >>> h.features
    {'garage': '2 cars', 'sqft': '1100'}

Multiple keys can be deleted at the same time:

.. code-block:: pycon

    >>> query = House.update(features=House.features.delete('garage', 'sqft'))
873
You can select just keys, just values, or zip the two:

.. code-block:: pycon

    >>> for h in House.select(House.address, House.features.keys().alias('keys')):
    ...     print(h.address, h.keys)

    123 Main St ['bath', 'garage']

    >>> for h in House.select(House.address, House.features.values().alias('vals')):
    ...     print(h.address, h.vals)

    123 Main St ['2 bath', '2 cars']

    >>> for h in House.select(House.address, House.features.items().alias('mtx')):
    ...     print(h.address, h.mtx)

    123 Main St [['bath', '2 bath'], ['garage', '2 cars']]

You can retrieve a slice of data, for example, all the garage data:

.. code-block:: pycon

    >>> query = House.select(House.address, House.features.slice('garage').alias('garage_data'))
    >>> for house in query:
    ...     print(house.address, house.garage_data)

    123 Main St {'garage': '2 cars'}

You can check for the existence of a key and filter rows accordingly:

.. code-block:: pycon

    >>> has_garage = House.features.exists('garage')
    >>> for house in House.select(House.address, has_garage.alias('has_garage')):
    ...     print(house.address, house.has_garage)

    123 Main St True

    >>> for house in House.select().where(House.features.exists('garage')):
    ...     print(house.address, house.features['garage'])  # <-- just houses w/garage data

    123 Main St 2 cars


Interval support
^^^^^^^^^^^^^^^^

Postgres supports durations through the ``INTERVAL`` data-type (`docs <https://www.postgresql.org/docs/current/static/datatype-datetime.html>`_).

.. py:class:: IntervalField([null=False, [...]])

    Field class capable of storing Python ``datetime.timedelta`` instances.

    Example:

    .. code-block:: python

        from datetime import timedelta

        from playhouse.postgres_ext import *

        db = PostgresqlExtDatabase('my_db')

        class Event(Model):
            location = CharField()
            duration = IntervalField()
            start_time = DateTimeField()

            class Meta:
                database = db

            @classmethod
            def get_long_meetings(cls):
                return cls.select().where(cls.duration > timedelta(hours=1))

.. _server_side_cursors:

Server-side cursors
^^^^^^^^^^^^^^^^^^^

When psycopg2 executes a query, normally all results are fetched and returned
to the client by the backend. This can cause your application to use a lot of
memory when making large queries. Using server-side cursors, results are
returned a little at a time (by default 2000 records). For the definitive
reference, please see the `psycopg2 documentation <http://initd.org/psycopg/docs/usage.html#server-side-cursors>`_.

.. note:: To use server-side (or named) cursors, you must be using :py:class:`PostgresqlExtDatabase`.
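
Conceptually, a named cursor lets the client pull rows in batches rather than
materializing the entire result set up front. The flow resembles the following
generator sketch (an illustration of the idea only, not psycopg2's actual
implementation):

.. code-block:: python

    def iter_batches(fetchmany, batch_size=2000):
        # Repeatedly ask the cursor for the next batch until exhausted.
        while True:
            batch = fetchmany(batch_size)
            if not batch:
                break
            yield from batch

    # Simulate a cursor over five rows with a small batch size.
    batches = iter([[1, 2], [3, 4], [5], []])
    assert list(iter_batches(lambda n: next(batches))) == [1, 2, 3, 4, 5]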

To execute a query using a server-side cursor, simply wrap your select query
using the :py:func:`ServerSide` helper:

.. code-block:: python

    large_query = PageView.select()  # Build query normally.

    # Iterate over large query inside a transaction.
    for page_view in ServerSide(large_query):
        # do some interesting analysis here.
        pass

    # Server-side resources are released.

If you would like all ``SELECT`` queries to automatically use a server-side
cursor, you can specify this when creating your :py:class:`PostgresqlExtDatabase`:

.. code-block:: python

    from playhouse.postgres_ext import PostgresqlExtDatabase

    ss_db = PostgresqlExtDatabase('my_db', server_side_cursors=True)

.. note::
    Server-side cursors live only as long as the transaction, so for this reason
    peewee will not automatically call ``commit()`` after executing a ``SELECT``
    query. If you do not ``commit`` after you are done iterating, you will not
    release the server-side resources until the connection is closed (or the
    transaction is committed later). Furthermore, since peewee will by default
    cache rows returned by the cursor, you should always call ``.iterator()``
    when iterating over a large query.

    If you are using the :py:func:`ServerSide` helper, the transaction and
    call to ``iterator()`` will be handled transparently.


.. _pg_fts:

Full-text search
^^^^^^^^^^^^^^^^

Postgresql provides `sophisticated full-text search
<http://www.postgresql.org/docs/9.3/static/textsearch.html>`_ using special
data-types (``tsvector`` and ``tsquery``). Documents should be stored or
converted to the ``tsvector`` type, and search queries should be converted to
``tsquery``.

For simple cases, you can use the :py:func:`Match` function, which will
automatically perform the appropriate conversions and requires no schema
changes:

.. code-block:: python

    def blog_search(search_term):
        return Blog.select().where(
            (Blog.status == Blog.STATUS_PUBLISHED) &
            Match(Blog.content, search_term))

The :py:func:`Match` function will automatically convert the left-hand operand
to a ``tsvector``, and the right-hand operand to a ``tsquery``. For better
performance, it is recommended you create a ``GIN`` index on the column you
plan to search:

.. code-block:: sql

    CREATE INDEX blog_full_text_search ON blog USING gin(to_tsvector(content));

Alternatively, you can use the :py:class:`TSVectorField` to maintain a
dedicated column for storing ``tsvector`` data:

.. code-block:: python

    class Blog(Model):
        content = TextField()
        search_content = TSVectorField()

.. note::
    A :py:class:`TSVectorField` will automatically be created with a GIN index.

You will need to explicitly convert the incoming text data to ``tsvector`` when
inserting or updating the ``search_content`` field:

.. code-block:: python

    content = 'Excellent blog post about peewee ORM.'
    blog_entry = Blog.create(
        content=content,
        search_content=fn.to_tsvector(content))

To perform a full-text search, use :py:meth:`TSVectorField.match`:

.. code-block:: python

    terms = 'python & (sqlite | postgres)'
    results = Blog.select().where(Blog.search_content.match(terms))

For more information, see the `Postgres full-text search docs <https://www.postgresql.org/docs/current/textsearch.html>`_.


postgres_ext API notes
^^^^^^^^^^^^^^^^^^^^^^

.. py:class:: PostgresqlExtDatabase(database[, server_side_cursors=False[, register_hstore=False[, ...]]])

    :param str database: Name of database to connect to.
    :param bool server_side_cursors: Whether ``SELECT`` queries should utilize
        server-side cursors.
    :param bool register_hstore: Register the HStore extension with the connection.

    Identical to :py:class:`PostgresqlDatabase`, but required in order to support:

    * :ref:`server_side_cursors`
    * :py:class:`ArrayField`
    * :py:class:`DateTimeTZField`
    * :py:class:`JSONField`
    * :py:class:`BinaryJSONField`
    * :py:class:`HStoreField`
    * :py:class:`TSVectorField`

    If you wish to use the HStore extension, you must specify ``register_hstore=True``.

    If using ``server_side_cursors``, also be sure to wrap your queries with
    :py:func:`ServerSide`.

.. py:function:: ServerSide(select_query)

    :param select_query: a :py:class:`SelectQuery` instance.
    :rtype: generator

    Wrap the given select query in a transaction, and call its
    :py:meth:`~SelectQuery.iterator` method to avoid caching row instances. In
    order for the server-side resources to be released, be sure to exhaust the
    generator (iterate over all the rows).

    Usage:

    .. code-block:: python

        large_query = PageView.select()
        for page_view in ServerSide(large_query):
            # Do something interesting.
            pass

        # At this point server side resources are released.

.. _pgarrays:

.. py:class:: ArrayField([field_class=IntegerField[, field_kwargs=None[, dimensions=1[, convert_values=False]]]])

    :param field_class: a subclass of :py:class:`Field`, e.g. :py:class:`IntegerField`.
    :param dict field_kwargs: arguments to initialize ``field_class``.
    :param int dimensions: dimensions of array.
    :param bool convert_values: apply ``field_class`` value conversion to array data.

    Field capable of storing arrays of the provided ``field_class``.

    .. note::
        By default ArrayField will use a GIN index. To disable this, initialize
        the field with ``index=False``.

    You can store and retrieve lists (or lists-of-lists):

    .. code-block:: python

        class BlogPost(BaseModel):
            content = TextField()
            tags = ArrayField(CharField)


        post = BlogPost(content='awesome', tags=['foo', 'bar', 'baz'])

    Additionally, you can use the ``__getitem__`` API to query values or slices
    in the database:

    .. code-block:: python

        # Get the first tag on a given blog post.
        first_tag = (BlogPost
                     .select(BlogPost.tags[0].alias('first_tag'))
                     .where(BlogPost.id == 1)
                     .dicts()
                     .get())

        # first_tag = {'first_tag': 'foo'}

    Get a slice of values:

    .. code-block:: python

        # Get the first two tags.
        two_tags = (BlogPost
                    .select(BlogPost.tags[:2].alias('two'))
                    .dicts()
                    .get())
        # two_tags = {'two': ['foo', 'bar']}

    .. py:method:: contains(*items)

        :param items: One or more items that must be in the given array field.

        .. code-block:: python

            # Get all blog posts that are tagged with both "python" and "django".
            Blog.select().where(Blog.tags.contains('python', 'django'))

    .. py:method:: contains_any(*items)

        :param items: One or more items to search for in the given array field.

        Like :py:meth:`~ArrayField.contains`, except will match rows where the
        array contains *any* of the given items.

        .. code-block:: python

            # Get all blog posts that are tagged with "flask" and/or "django".
            Blog.select().where(Blog.tags.contains_any('flask', 'django'))

.. py:class:: DateTimeTZField(*args, **kwargs)

    A timezone-aware subclass of :py:class:`DateTimeField`.

.. py:class:: HStoreField(*args, **kwargs)

    A field for storing and retrieving arbitrary key/value pairs. For details
    on usage, see :ref:`hstore`.

    .. attention::
        To use the :py:class:`HStoreField` you will need to be sure the
        *hstore* extension is registered with the connection. To accomplish
        this, instantiate the :py:class:`PostgresqlExtDatabase` with
        ``register_hstore=True``.

    .. note::
        By default ``HStoreField`` will use a *GiST* index. To disable this,
        initialize the field with ``index=False``.

    .. py:method:: keys()

        Returns the keys for a given row.

        .. code-block:: pycon

            >>> for h in House.select(House.address, House.features.keys().alias('keys')):
            ...     print(h.address, h.keys)

            123 Main St ['bath', 'garage']

    .. py:method:: values()

        Return the values for a given row.

        .. code-block:: pycon

            >>> for h in House.select(House.address, House.features.values().alias('vals')):
            ...     print(h.address, h.vals)

            123 Main St ['2 bath', '2 cars']

    .. py:method:: items()

        Like Python's ``dict``, return the keys and values in a list-of-lists:

        .. code-block:: pycon

            >>> for h in House.select(House.address, House.features.items().alias('mtx')):
            ...     print(h.address, h.mtx)

            123 Main St [['bath', '2 bath'], ['garage', '2 cars']]

    .. py:method:: slice(*args)

        Return a slice of data given a list of keys.

        .. code-block:: pycon

            >>> for h in House.select(House.address, House.features.slice('garage').alias('garage_data')):
            ...     print(h.address, h.garage_data)

            123 Main St {'garage': '2 cars'}

    .. py:method:: exists(key)

        Query for whether the given key exists.

        .. code-block:: pycon

            >>> for h in House.select(House.address, House.features.exists('garage').alias('has_garage')):
            ...     print(h.address, h.has_garage)

            123 Main St True

            >>> for h in House.select().where(House.features.exists('garage')):
            ...     print(h.address, h.features['garage'])  # <-- just houses w/garage data

            123 Main St 2 cars

    .. py:method:: defined(key)

        Query for whether the given key has a value associated with it.

    .. py:method:: update(**data)

        Perform an atomic update to the keys/values for a given row or rows.

        .. code-block:: pycon

            >>> query = House.update(features=House.features.update(
            ...     sqft=2000,
            ...     year_built=2012))
            >>> query.where(House.id == 1).execute()

    .. py:method:: delete(*keys)

        Delete the provided keys for a given row or rows.

        .. note:: This is performed using an ``UPDATE`` query.

        .. code-block:: pycon

            >>> query = House.update(features=House.features.delete(
            ...     'sqft', 'year_built'))
            >>> query.where(House.id == 1).execute()

    .. py:method:: contains(value)

        :param value: Either a ``dict``, a ``list`` of keys, or a single key.

        Query rows for the existence of either:

        * a partial dictionary.
        * a list of keys.
        * a single key.

        .. code-block:: pycon

            >>> query = House.select()
            >>> has_garage = query.where(House.features.contains('garage'))
            >>> garage_bath = query.where(House.features.contains(['garage', 'bath']))
            >>> twocar = query.where(House.features.contains({'garage': '2 cars'}))

    .. py:method:: contains_any(*keys)

        :param keys: One or more keys to search for.

        Query rows for the existence of *any* key.

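
        The difference from :py:meth:`~HStoreField.exists` is that an
        ``hstore`` key may be present with a ``NULL`` value. A plain-Python
        sketch of the two checks (an illustration of the semantics, not
        peewee code):

        .. code-block:: python

            def hstore_exists(data, key):
                # exists(): is the key present at all?
                return key in data

            def hstore_defined(data, key):
                # defined(): is the key present with a non-NULL value?
                return data.get(key) is not None

            features = {'garage': '2 cars', 'pool': None}
            assert hstore_exists(features, 'pool')
            assert not hstore_defined(features, 'pool')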
.. py:class:: JSONField(dumps=None, *args, **kwargs)

    :param dumps: A function used to serialize data to JSON; defaults to
        ``json.dumps()``. Provide a custom callable to control how values are
        serialized.

    Field class suitable for storing and querying arbitrary JSON. When using
    this on a model, set the field's value to a Python object (either a
    ``dict`` or a ``list``). When you retrieve your value from the database it
    will be returned as a Python data structure.

    .. note:: You must be using Postgres 9.2 / psycopg2 2.5 or greater.

    .. note::
        If you are using Postgres 9.4, strongly consider using the
        :py:class:`BinaryJSONField` instead as it offers better performance and
        more powerful querying options.

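
    For example, you might pass a ``dumps`` callable that produces compact
    JSON. The sketch below assumes only that the callable is invoked like
    ``json.dumps``:

    .. code-block:: python

        import functools
        import json

        # Compact serializer: no spaces after separators.
        compact_dumps = functools.partial(json.dumps, separators=(',', ':'))
        assert compact_dumps({'a': [1, 2]}) == '{"a":[1,2]}'

        # Hypothetical usage: response = JSONField(dumps=compact_dumps)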
    Example model declaration:

    .. code-block:: python

        db = PostgresqlExtDatabase('my_db')

        class APIResponse(Model):
            url = CharField()
            response = JSONField()

            class Meta:
                database = db

    Example of storing JSON data:

    .. code-block:: python

        import json
        from urllib.request import urlopen

        url = 'http://foo.com/api/resource/'
        resp = json.loads(urlopen(url).read())
        APIResponse.create(url=url, response=resp)

        APIResponse.create(url='http://foo.com/baz/', response={'key': 'value'})
    To query, use Python's ``[]`` operators to specify nested key or array lookups:

    .. code-block:: python

        APIResponse.select().where(
            APIResponse.response['key1']['nested-key'] == 'some-value')

    To illustrate the use of the ``[]`` operators, imagine we have the
    following data stored in an ``APIResponse``:

    .. code-block:: javascript

        {
          "foo": {
            "bar": ["i1", "i2", "i3"],
            "baz": {
              "huey": "mickey",
              "peewee": "nugget"
            }
          }
        }

    Here are the results of a few queries:

    .. code-block:: python

        def get_data(expression):
            # Helper function to just retrieve the results of a
            # particular expression.
            query = (APIResponse
                     .select(expression.alias('my_data'))
                     .dicts()
                     .get())
            return query['my_data']

        # Accessing the foo -> bar subkey will return a JSON
        # representation of the list.
        get_data(APIResponse.response['foo']['bar'])
        # '["i1", "i2", "i3"]'

        # In order to retrieve this list as a Python list,
        # we will call .as_json() on the expression.
        get_data(APIResponse.response['foo']['bar'].as_json())
        # ['i1', 'i2', 'i3']

        # Similarly, accessing the foo -> baz subkey will
        # return a JSON representation of the dictionary.
        get_data(APIResponse.response['foo']['baz'])
        # '{"huey": "mickey", "peewee": "nugget"}'

        # Again, calling .as_json() will return an actual
        # python dictionary.
        get_data(APIResponse.response['foo']['baz'].as_json())
        # {'huey': 'mickey', 'peewee': 'nugget'}

        # When dealing with simple values, either way works as
        # you expect.
        get_data(APIResponse.response['foo']['bar'][0])
        # 'i1'

        # Calling .as_json() when the result is a simple value
        # will return the same thing as the previous example.
        get_data(APIResponse.response['foo']['bar'][0].as_json())
        # 'i1'

.. py:class:: BinaryJSONField(dumps=None, *args, **kwargs)

    :param dumps: A function used to serialize data to JSON; defaults to
        ``json.dumps()``. Provide a custom callable to control how values are
        serialized.

    Store and query arbitrary JSON documents. Data should be stored using
    normal Python ``dict`` and ``list`` objects, and when data is returned from
    the database, it will be returned using ``dict`` and ``list`` as well.

    For examples of basic query operations, see the above code samples for
    :py:class:`JSONField`. The example queries below will use the same
    ``APIResponse`` model described above.

    .. note::
        By default BinaryJSONField will use a GIN index. To disable this,
        initialize the field with ``index=False``.

    .. note:: You must be using Postgres 9.4 / psycopg2 2.5 or newer. If you are using Postgres 9.2 or 9.3, you can use the regular :py:class:`JSONField` instead.

    .. py:method:: contains(other)

        Test whether the given JSON data contains the given JSON fragment or key.

        Example:

        .. code-block:: python

            search_fragment = {
                'foo': {'bar': ['i2']}
            }
            query = (APIResponse
                     .select()
                     .where(APIResponse.response.contains(search_fragment)))

            # If we're searching for a list, the list items do not need to
            # be ordered in a particular way:
            query = (APIResponse
                     .select()
                     .where(APIResponse.response.contains({
                         'foo': {'bar': ['i2', 'i1']}})))

        We can pass in simple keys as well. To find APIResponses that contain the key ``foo`` at the top-level:

        .. code-block:: python

            APIResponse.select().where(APIResponse.response.contains('foo'))

        We can also search sub-keys using square-brackets:

        .. code-block:: python

            APIResponse.select().where(
                APIResponse.response['foo']['bar'].contains(['i2', 'i1']))

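
        The containment semantics (Postgres' ``jsonb`` ``@>`` operator) can be
        approximated in plain Python. This rough sketch is for illustration
        only and glosses over some edge cases:

        .. code-block:: python

            def jsonb_contains(doc, fragment):
                # Objects: every key/value in the fragment must be contained.
                if isinstance(fragment, dict):
                    return (isinstance(doc, dict) and
                            all(k in doc and jsonb_contains(doc[k], v)
                                for k, v in fragment.items()))
                # Arrays: each fragment element must match some element,
                # in any order.
                if isinstance(fragment, list):
                    return (isinstance(doc, list) and
                            all(any(jsonb_contains(d, f) for d in doc)
                                for f in fragment))
                # Scalars: exact equality.
                return doc == fragment

            doc = {'foo': {'bar': ['i1', 'i2', 'i3']}}
            assert jsonb_contains(doc, {'foo': {'bar': ['i2', 'i1']}})
            assert not jsonb_contains(doc, {'foo': {'bar': ['ix']}})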
    .. py:method:: contains_any(*items)

        Search for the presence of one or more of the given items.

        .. code-block:: python

            APIResponse.select().where(
                APIResponse.response.contains_any('foo', 'baz', 'nugget'))

        Like :py:meth:`~BinaryJSONField.contains`, we can also search sub-keys:

        .. code-block:: python

            APIResponse.select().where(
                APIResponse.response['foo']['bar'].contains_any('i2', 'ix'))

    .. py:method:: contains_all(*items)

        Search for the presence of all of the given items.

        .. code-block:: python

            APIResponse.select().where(
                APIResponse.response.contains_all('foo'))

        Like :py:meth:`~BinaryJSONField.contains_any`, we can also search sub-keys:

        .. code-block:: python

            APIResponse.select().where(
                APIResponse.response['foo']['bar'].contains_all('i1', 'i2', 'i3'))

    .. py:method:: contained_by(other)

        Test whether the field's JSON data is contained by (is a subset of)
        the given JSON document. This method is the inverse of
        :py:meth:`~BinaryJSONField.contains`.

        .. code-block:: python

            big_doc = {
                'foo': {
                    'bar': ['i1', 'i2', 'i3'],
                    'baz': {
                        'huey': 'mickey',
                        'peewee': 'nugget',
                    }
                },
                'other_key': ['nugget', 'bear', 'kitten'],
            }
            APIResponse.select().where(
                APIResponse.response.contained_by(big_doc))

    .. py:method:: concat(data)

        Concatenate the field's data with the provided data. Note that this
        operation does not merge nested structures or perform a "deep"
        concatenation.

    .. py:method:: has_key(key)

        Test whether the key exists at the top-level of the JSON object.

    .. py:method:: remove(*keys)

        Remove one or more keys from the top-level of the JSON object.


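
        The shallow behavior (``jsonb``'s ``||`` operator) can be sketched in
        plain Python; this illustrates the semantics only, not peewee's
        implementation:

        .. code-block:: python

            def shallow_concat(left, right):
                # Top-level keys from the right replace those on the left;
                # nested dictionaries are NOT merged recursively.
                merged = dict(left)
                merged.update(right)
                return merged

            a = {'k1': {'x': 1}, 'k2': 'v2'}
            b = {'k1': {'y': 2}}
            assert shallow_concat(a, b) == {'k1': {'y': 2}, 'k2': 'v2'}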
.. py:function:: Match(field, query)

    Generate a full-text search expression, automatically converting the
    left-hand operand to a ``tsvector``, and the right-hand operand to a
    ``tsquery``.

    Example:

    .. code-block:: python

        def blog_search(search_term):
            return Blog.select().where(
                (Blog.status == Blog.STATUS_PUBLISHED) &
                Match(Blog.content, search_term))

.. py:class:: TSVectorField

    Field type suitable for storing ``tsvector`` data. This field will
    automatically be created with a ``GIN`` index for improved search
    performance.

    .. note::
        Data stored in this field will still need to be manually converted to
        the ``tsvector`` type.

    .. note::
        By default TSVectorField will use a GIN index. To disable this,
        initialize the field with ``index=False``.

    Example usage:

    .. code-block:: python

        class Blog(Model):
            content = TextField()
            search_content = TSVectorField()

        content = 'this is a sample blog entry.'
        blog_entry = Blog.create(
            content=content,
            search_content=fn.to_tsvector(content))  # Note `to_tsvector()`.

    .. py:method:: match(query[, language=None[, plain=False]])

        :param str query: the full-text search query.
        :param str language: language name (optional).
        :param bool plain: parse search query using plain (simple) parser.
        :returns: an expression representing full-text search/match.

        Example:

        .. code-block:: python

            # Perform a search using the "match" method.
            terms = 'python & (sqlite | postgres)'
            results = Blog.select().where(Blog.search_content.match(terms))


.. include:: crdb.rst


.. _mysql_ext:

MySQL Extensions
----------------

Peewee provides an alternate database implementation for using the
`mysql-connector <https://dev.mysql.com/doc/connector-python/en/>`_ driver. The
implementation can be found in ``playhouse.mysql_ext``.

Example usage:

.. code-block:: python

    from playhouse.mysql_ext import MySQLConnectorDatabase

    # MySQL database implementation that utilizes mysql-connector driver.
    db = MySQLConnectorDatabase('my_database', host='1.2.3.4', user='mysql')

Additional MySQL-specific helpers:

.. py:class:: JSONField()

    Extends :py:class:`TextField` and implements transparent JSON encoding and
    decoding in Python.

.. py:function:: Match(columns, expr[, modifier=None])

    :param columns: a single :py:class:`Field` or a tuple of multiple fields.
    :param str expr: the full-text search expression.
    :param str modifier: optional modifiers for the search, e.g. *'in boolean mode'*.

    Helper for constructing MySQL full-text search queries of the form:

    .. code-block:: sql

        MATCH (columns, ...) AGAINST (expr[ modifier])

.. _dataset:

DataSet
-------

The *dataset* module contains a high-level API for working with databases
modeled after the popular `project of the same name <https://dataset.readthedocs.io/en/latest/index.html>`_.
The aims of the *dataset* module are to provide:

* A simplified API for working with relational data, along the lines of working with JSON.
* An easy way to export relational data as JSON or CSV.
* An easy way to import JSON or CSV data into a relational database.

A minimal data-loading script might look like this:

.. code-block:: python

    from playhouse.dataset import DataSet

    db = DataSet('sqlite:///:memory:')

    table = db['sometable']
    table.insert(name='Huey', age=3)
    table.insert(name='Mickey', age=5, gender='male')

    huey = table.find_one(name='Huey')
    print(huey)
    # {'age': 3, 'gender': None, 'id': 1, 'name': 'Huey'}

    for obj in table:
        print(obj)
    # {'age': 3, 'gender': None, 'id': 1, 'name': 'Huey'}
    # {'age': 5, 'gender': 'male', 'id': 2, 'name': 'Mickey'}

You can insert, update or delete using the dictionary APIs as well:

.. code-block:: python

    huey = table.find_one(name='Huey')
    # {'age': 3, 'gender': None, 'id': 1, 'name': 'Huey'}

    # Perform an update by supplying a partial record of changes.
    table[1] = {'gender': 'male', 'age': 4}
    print(table[1])
    # {'age': 4, 'gender': 'male', 'id': 1, 'name': 'Huey'}

    # Or insert a new record:
    table[3] = {'name': 'Zaizee', 'age': 2}
    print(table[3])
    # {'age': 2, 'gender': None, 'id': 3, 'name': 'Zaizee'}

    # Or delete a record:
    del table[3]  # Remove the row we just added.

You can export or import data using :py:meth:`~DataSet.freeze` and
:py:meth:`~DataSet.thaw`:

.. code-block:: python

    # Export table content to the `users.json` file.
    db.freeze(table.all(), format='json', filename='users.json')

    # Import data from a CSV file into a new table. Columns will be automatically
    # created for each field in the CSV file.
    new_table = db['stats']
    new_table.thaw(format='csv', filename='monthly_stats.csv')

Getting started
^^^^^^^^^^^^^^^

:py:class:`DataSet` objects are initialized by passing in a database URL of the
format ``dialect://user:password@host/dbname``. See the :ref:`db_url` section
for examples of connecting to various databases.

.. code-block:: python

    # Create an in-memory SQLite database.
    db = DataSet('sqlite:///:memory:')

Storing data
^^^^^^^^^^^^

To store data, we must first obtain a reference to a table. If the table does
not exist, it will be created automatically:

.. code-block:: python

    # Get a table reference, creating the table if it does not exist.
    table = db['users']

We can now :py:meth:`~Table.insert` new rows into the table. If the columns do
not exist, they will be created automatically:

.. code-block:: python

    table.insert(name='Huey', age=3, color='white')
    table.insert(name='Mickey', age=5, gender='male')

To update existing entries in the table, pass in a dictionary containing the
new values and filter conditions. The list of columns to use as filters is
specified in the *columns* argument. If no filter columns are specified, then
all rows will be updated.

.. code-block:: python

    # Update the gender for "Huey".
    table.update(name='Huey', gender='male', columns=['name'])

    # Update all records. If the column does not exist, it will be created.
    table.update(favorite_orm='peewee')

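
For reference, such a URL decomposes into the usual components. Here is a
standard-library sketch (simplified; the :ref:`db_url` module handles
additional schemes and query-string options):

.. code-block:: python

    from urllib.parse import urlparse

    parts = urlparse('postgresql://user:secret@localhost:5432/mydb')
    assert parts.scheme == 'postgresql'
    assert parts.username == 'user'
    assert parts.password == 'secret'
    assert parts.hostname == 'localhost'
    assert parts.port == 5432
    assert parts.path.lstrip('/') == 'mydb'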
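
The split between filter columns and update values can be sketched in plain
Python (an illustration of the semantics only; ``Table.update`` actually
issues a SQL ``UPDATE``):

.. code-block:: python

    def sketch_update(rows, columns=None, **values):
        # Values named in `columns` act as the WHERE clause;
        # the remaining values are applied as the SET clause.
        filters = {c: values.pop(c) for c in (columns or [])}
        matched = [r for r in rows
                   if all(r.get(k) == v for k, v in filters.items())]
        for row in matched:
            row.update(values)
        return len(matched)

    rows = [{'id': 1, 'name': 'Huey', 'gender': None},
            {'id': 2, 'name': 'Mickey', 'gender': 'male'}]
    assert sketch_update(rows, name='Huey', gender='male', columns=['name']) == 1
    assert sketch_update(rows, favorite_orm='peewee') == 2  # No filters: all rows.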
Importing data
^^^^^^^^^^^^^^

To import data from an external source, such as a JSON or CSV file, you can use
the :py:meth:`~Table.thaw` method. By default, new columns will be created for
any attributes encountered. If you wish to only populate columns that are
already defined on a table, you can pass in ``strict=True``.

.. code-block:: python

    # Load data from a JSON file containing a list of objects.
    table = db['stock_prices']
    table.thaw(filename='stocks.json', format='json')
    table.all()[:3]

    # Might print...
    [{'id': 1, 'ticker': 'GOOG', 'price': 703},
     {'id': 2, 'ticker': 'AAPL', 'price': 109},
     {'id': 3, 'ticker': 'AMZN', 'price': 300}]


Using transactions
^^^^^^^^^^^^^^^^^^

DataSet supports nesting transactions using a simple context manager.

.. code-block:: python

    table = db['users']
    with db.transaction() as txn:
        table.insert(name='Charlie')

        with db.transaction() as nested_txn:
            # Set Charlie's favorite ORM to Django.
            table.update(name='Charlie', favorite_orm='django', columns=['name'])

            # jk/lol
            nested_txn.rollback()

1781Inspecting the database
1782^^^^^^^^^^^^^^^^^^^^^^^
1783
1784You can use the :py:meth:`tables` method to list the tables in the current
1785database:
1786
1787.. code-block:: pycon
1788
    >>> print(db.tables)
1790    ['sometable', 'user']
1791
1792And for a given table, you can print the columns:
1793
1794.. code-block:: pycon
1795
1796    >>> table = db['user']
    >>> print(table.columns)
1798    ['id', 'age', 'name', 'gender', 'favorite_orm']
1799
1800We can also find out how many rows are in a table:
1801
1802.. code-block:: pycon
1803
    >>> print(len(db['user']))
1805    3
1806
1807Reading data
1808^^^^^^^^^^^^
1809
1810To retrieve all rows, you can use the :py:meth:`~Table.all` method:
1811
1812.. code-block:: python
1813
1814    # Retrieve all the users.
1815    users = db['user'].all()
1816
    # We can iterate over all rows without calling `.all()`:
    for user in db['user']:
        print(user['name'])
1820
1821Specific objects can be retrieved using :py:meth:`~Table.find` and
1822:py:meth:`~Table.find_one`.
1823
1824.. code-block:: python
1825
1826    # Find all the users who like peewee.
1827    peewee_users = db['user'].find(favorite_orm='peewee')
1828
1829    # Find Huey.
1830    huey = db['user'].find_one(name='Huey')
1831
1832Exporting data
1833^^^^^^^^^^^^^^
1834
1835To export data, use the :py:meth:`~DataSet.freeze` method, passing in the query
1836you wish to export:
1837
1838.. code-block:: python
1839
1840    peewee_users = db['user'].find(favorite_orm='peewee')
1841    db.freeze(peewee_users, format='json', filename='peewee_users.json')
1842
1843API
1844^^^
1845
1846.. py:class:: DataSet(url, **kwargs)
1847
1848    :param url: A database URL or a :py:class:`Database` instance. For
1849        details on using a URL, see :ref:`db_url` for examples.
    :param kwargs: additional keyword arguments passed to
        :py:meth:`Introspector.generate_models` when introspecting the
        database.
1852
1853    The *DataSet* class provides a high-level API for working with relational
1854    databases.
1855
1856    .. py:attribute:: tables
1857
1858        Return a list of tables stored in the database. This list is computed
1859        dynamically each time it is accessed.
1860
1861    .. py:method:: __getitem__(table_name)
1862
1863        Provide a :py:class:`Table` reference to the specified table. If the
1864        table does not exist, it will be created.
1865
1866    .. py:method:: query(sql[, params=None[, commit=True]])
1867
1868        :param str sql: A SQL query.
1869        :param list params: Optional parameters for the query.
1870        :param bool commit: Whether the query should be committed upon execution.
1871        :return: A database cursor.
1872
1873        Execute the provided query against the database.
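
        For instance, arbitrary SQL can be run against a ``DataSet`` backed by
        an in-memory SQLite database (a minimal sketch; the table and data
        below are illustrative):

        .. code-block:: python

            from playhouse.dataset import DataSet

            # Connect to an in-memory SQLite database using a database URL.
            db = DataSet('sqlite:///:memory:')

            # Execute arbitrary SQL; a database cursor is returned.
            db.query('CREATE TABLE color (id INTEGER PRIMARY KEY, name TEXT)')
            db.query('INSERT INTO color (name) VALUES (?)', ['green'])

            cursor = db.query('SELECT name FROM color')
            rows = [row for row in cursor]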
1874
1875    .. py:method:: transaction()
1876
1877        Create a context manager representing a new transaction (or savepoint).
1878
1879    .. py:method:: freeze(query[, format='csv'[, filename=None[, file_obj=None[, **kwargs]]]])
1880
        :param query: A :py:class:`SelectQuery`, generated using :py:meth:`~Table.all` or :py:meth:`~Table.find`.
1882        :param format: Output format. By default, *csv* and *json* are supported.
1883        :param filename: Filename to write output to.
1884        :param file_obj: File-like object to write output to.
1885        :param kwargs: Arbitrary parameters for export-specific functionality.
1886
1887    .. py:method:: thaw(table[, format='csv'[, filename=None[, file_obj=None[, strict=False[, **kwargs]]]]])
1888
1889        :param str table: The name of the table to load data into.
1890        :param format: Input format. By default, *csv* and *json* are supported.
1891        :param filename: Filename to read data from.
1892        :param file_obj: File-like object to read data from.
        :param bool strict: Whether to ignore values for columns that do not already exist on the table.
1894        :param kwargs: Arbitrary parameters for import-specific functionality.
1895
1896    .. py:method:: connect()
1897
1898        Open a connection to the underlying database. If a connection is not
1899        opened explicitly, one will be opened the first time a query is
1900        executed.
1901
1902    .. py:method:: close()
1903
1904        Close the connection to the underlying database.
1905
1906.. py:class:: Table(dataset, name, model_class)
1907
1908    :noindex:
1909
1910    Provides a high-level API for working with rows in a given table.
1911
1912    .. py:attribute:: columns
1913
1914        Return a list of columns in the given table.
1915
1916    .. py:attribute:: model_class
1917
1918        A dynamically-created :py:class:`Model` class.
1919
1920    .. py:method:: create_index(columns[, unique=False])
1921
1922        Create an index on the given columns:
1923
1924        .. code-block:: python
1925
1926            # Create a unique index on the `username` column.
1927            db['users'].create_index(['username'], unique=True)
1928
1929    .. py:method:: insert(**data)
1930
1931        Insert the given data dictionary into the table, creating new columns
1932        as needed.
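
        A brief sketch (the table and column names are illustrative):

        .. code-block:: python

            from playhouse.dataset import DataSet

            db = DataSet('sqlite:///:memory:')
            table = db['user']

            # The first insert creates the 'name' and 'age' columns.
            table.insert(name='Huey', age=3)

            # A 'gender' column is added automatically for this row.
            table.insert(name='Mickey', age=6, gender='male')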
1933
1934    .. py:method:: update(columns=None, conjunction=None, **data)
1935
1936        Update the table using the provided data. If one or more columns are
1937        specified in the *columns* parameter, then those columns' values in the
1938        *data* dictionary will be used to determine which rows to update.
1939
1940        .. code-block:: python
1941
1942            # Update all rows.
1943            db['users'].update(favorite_orm='peewee')
1944
1945            # Only update Huey's record, setting his age to 3.
1946            db['users'].update(name='Huey', age=3, columns=['name'])
1947
1948    .. py:method:: find(**query)
1949
1950        Query the table for rows matching the specified equality conditions. If
1951        no query is specified, then all rows are returned.
1952
1953        .. code-block:: python
1954
1955            peewee_users = db['users'].find(favorite_orm='peewee')
1956
1957    .. py:method:: find_one(**query)
1958
1959        Return a single row matching the specified equality conditions. If no
1960        matching row is found then ``None`` will be returned.
1961
1962        .. code-block:: python
1963
1964            huey = db['users'].find_one(name='Huey')
1965
1966    .. py:method:: all()
1967
1968        Return all rows in the given table.
1969
1970    .. py:method:: delete(**query)
1971
1972        Delete all rows matching the given equality conditions. If no query is
1973        provided, then all rows will be deleted.
1974
1975        .. code-block:: python
1976
1977            # Adios, Django!
1978            db['users'].delete(favorite_orm='Django')
1979
1980            # Delete all the secret messages.
1981            db['secret_messages'].delete()
1982
1983    .. py:method:: freeze([format='csv'[, filename=None[, file_obj=None[, **kwargs]]]])
1984
1985        :param format: Output format. By default, *csv* and *json* are supported.
1986        :param filename: Filename to write output to.
1987        :param file_obj: File-like object to write output to.
1988        :param kwargs: Arbitrary parameters for export-specific functionality.
1989
1990    .. py:method:: thaw([format='csv'[, filename=None[, file_obj=None[, strict=False[, **kwargs]]]]])
1991
1992        :param format: Input format. By default, *csv* and *json* are supported.
1993        :param filename: Filename to read data from.
1994        :param file_obj: File-like object to read data from.
        :param bool strict: Whether to ignore values for columns that do not already exist on the table.
1996        :param kwargs: Arbitrary parameters for import-specific functionality.
1997
1998.. _extra-fields:
1999
2000Fields
2001------
2002
2003These fields can be found in the ``playhouse.fields`` module.
2004
2005.. py:class:: CompressedField([compression_level=6[, algorithm='zlib'[, **kwargs]]])
2006
2007    :param int compression_level: A value from 0 to 9.
2008    :param str algorithm: Either ``'zlib'`` or ``'bz2'``.
2009
2010    Stores compressed data using the specified algorithm. This field extends
2011    :py:class:`BlobField`, transparently storing a compressed representation of
2012    the data in the database.
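
    A minimal usage sketch, assuming an in-memory SQLite database (the model
    below is illustrative, not part of the playhouse API):

    .. code-block:: python

        from peewee import Model, SqliteDatabase, TextField
        from playhouse.fields import CompressedField

        db = SqliteDatabase(':memory:')

        class CacheEntry(Model):
            key = TextField()
            # Compressed with zlib before being stored as a blob.
            data = CompressedField(compression_level=6, algorithm='zlib')

            class Meta:
                database = db

        db.create_tables([CacheEntry])

        payload = b'abc' * 1000
        CacheEntry.create(key='k1', data=payload)

        # The data is decompressed transparently when read back.
        entry = CacheEntry.get(CacheEntry.key == 'k1')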
2013
2014.. py:class:: PickleField()
2015
    Stores arbitrary Python data by transparently pickling and un-pickling
    values stored in the field. This field extends :py:class:`BlobField`. On
    Python 2, the faster ``cPickle`` module will be used if available.
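
    A minimal usage sketch (the model below is illustrative):

    .. code-block:: python

        from peewee import Model, SqliteDatabase, TextField
        from playhouse.fields import PickleField

        db = SqliteDatabase(':memory:')

        class Config(Model):
            name = TextField()
            # Arbitrary Python objects, pickled on write, un-pickled on read.
            value = PickleField()

            class Meta:
                database = db

        db.create_tables([Config])

        Config.create(name='opts', value={'debug': True, 'retries': [1, 2, 3]})
        obj = Config.get(Config.name == 'opts')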
2019
2020.. _hybrid:
2021
2022Hybrid Attributes
2023-----------------
2024
2025Hybrid attributes encapsulate functionality that operates at both the Python
2026*and* SQL levels. The idea for hybrid attributes comes from a feature of the
2027`same name in SQLAlchemy <http://docs.sqlalchemy.org/en/improve_toc/orm/extensions/hybrid.html>`_.
2028Consider the following example:
2029
2030.. code-block:: python
2031
2032    class Interval(Model):
2033        start = IntegerField()
2034        end = IntegerField()
2035
2036        @hybrid_property
2037        def length(self):
2038            return self.end - self.start
2039
2040        @hybrid_method
2041        def contains(self, point):
2042            return (self.start <= point) & (point < self.end)
2043
2044The *hybrid attribute* gets its name from the fact that the ``length``
2045attribute will behave differently depending on whether it is accessed via the
2046``Interval`` class or an ``Interval`` instance.
2047
2048If accessed via an instance, then it behaves just as you would expect.
2049
2050If accessed via the ``Interval.length`` class attribute, however, the length
2051calculation will be expressed as a SQL expression. For example:
2052
2053.. code-block:: python
2054
2055    query = Interval.select().where(Interval.length > 5)
2056
2057This query will be equivalent to the following SQL:
2058
2059.. code-block:: sql
2060
2061    SELECT "t1"."id", "t1"."start", "t1"."end"
2062    FROM "interval" AS t1
2063    WHERE (("t1"."end" - "t1"."start") > 5)
2064
The ``playhouse.hybrid`` module also contains a decorator for implementing
hybrid methods which can accept parameters. As with hybrid properties, when
accessed via a model instance the function executes normally, as written. When
the hybrid method is called on the class, however, it will generate a SQL
expression.
2070
2071Example:
2072
2073.. code-block:: python
2074
2075    query = Interval.select().where(Interval.contains(2))
2076
2077This query is equivalent to the following SQL:
2078
2079.. code-block:: sql
2080
2081    SELECT "t1"."id", "t1"."start", "t1"."end"
2082    FROM "interval" AS t1
2083    WHERE (("t1"."start" <= 2) AND (2 < "t1"."end"))
2084
There is an additional API for situations where the Python implementation
differs slightly from the SQL implementation. Let's add a ``radius`` method to
the ``Interval`` model. Because this method calculates an absolute value, we
will use the Python ``abs()`` function for the instance portion and the
``fn.ABS()`` SQL function for the class portion.
2086
2087.. code-block:: python
2088
2089    class Interval(Model):
2090        start = IntegerField()
2091        end = IntegerField()
2092
2093        @hybrid_property
2094        def length(self):
2095            return self.end - self.start
2096
2097        @hybrid_property
2098        def radius(self):
2099            return abs(self.length) / 2
2100
2101        @radius.expression
2102        def radius(cls):
2103            return fn.ABS(cls.length) / 2
2104
What is neat is that both ``radius`` implementations refer to the
2106``length`` hybrid attribute! When accessed via an ``Interval`` instance, the
2107radius calculation will be executed in Python. When invoked via an ``Interval``
2108class, we will get the appropriate SQL.
2109
2110Example:
2111
2112.. code-block:: python
2113
2114    query = Interval.select().where(Interval.radius < 3)
2115
2116This query is equivalent to the following SQL:
2117
2118.. code-block:: sql
2119
2120    SELECT "t1"."id", "t1"."start", "t1"."end"
2121    FROM "interval" AS t1
2122    WHERE ((abs("t1"."end" - "t1"."start") / 2) < 3)
2123
2124Pretty neat, right? Thanks for the cool idea, SQLAlchemy!
2125
2126Hybrid API
2127^^^^^^^^^^
2128
2129.. py:class:: hybrid_method(func[, expr=None])
2130
2131    Method decorator that allows the definition of a Python object method with
2132    both instance-level and class-level behavior.
2133
2134    Example:
2135
2136    .. code-block:: python
2137
2138        class Interval(Model):
2139            start = IntegerField()
2140            end = IntegerField()
2141
2142            @hybrid_method
2143            def contains(self, point):
2144                return (self.start <= point) & (point < self.end)
2145
2146    When called with an ``Interval`` instance, the ``contains`` method will
2147    behave as you would expect. When called as a classmethod, though, a SQL
2148    expression will be generated:
2149
2150    .. code-block:: python
2151
2152        query = Interval.select().where(Interval.contains(2))
2153
2154    Would generate the following SQL:
2155
2156    .. code-block:: sql
2157
2158        SELECT "t1"."id", "t1"."start", "t1"."end"
2159        FROM "interval" AS t1
2160        WHERE (("t1"."start" <= 2) AND (2 < "t1"."end"))
2161
2162    .. py:method:: expression(expr)
2163
2164        Method decorator for specifying the SQL-expression producing method.
2165
2166.. py:class:: hybrid_property(fget[, fset=None[, fdel=None[, expr=None]]])
2167
2168    Method decorator that allows the definition of a Python object property
2169    with both instance-level and class-level behavior.
2170
2171    Examples:
2172
2173    .. code-block:: python
2174
2175        class Interval(Model):
2176            start = IntegerField()
2177            end = IntegerField()
2178
2179            @hybrid_property
2180            def length(self):
2181                return self.end - self.start
2182
2183            @hybrid_property
2184            def radius(self):
2185                return abs(self.length) / 2
2186
2187            @radius.expression
2188            def radius(cls):
2189                return fn.ABS(cls.length) / 2
2190
2191    When accessed on an ``Interval`` instance, the ``length`` and ``radius``
2192    properties will behave as you would expect. When accessed as class
2193    attributes, though, a SQL expression will be generated instead:
2194
2195    .. code-block:: python
2196
2197        query = (Interval
2198                 .select()
2199                 .where(
2200                     (Interval.length > 6) &
2201                     (Interval.radius >= 3)))
2202
2203    Would generate the following SQL:
2204
2205    .. code-block:: sql
2206
2207        SELECT "t1"."id", "t1"."start", "t1"."end"
2208        FROM "interval" AS t1
2209        WHERE (
2210            (("t1"."end" - "t1"."start") > 6) AND
2211            ((abs("t1"."end" - "t1"."start") / 2) >= 3)
2212        )
2213
2214.. _kv:
2215
2216Key/Value Store
2217---------------
2218
2219The ``playhouse.kv`` module contains the implementation of a persistent
2220dictionary.
2221
2222.. py:class:: KeyValue([key_field=None[, value_field=None[, ordered=False[, database=None[, table_name='keyvalue']]]]])
2223
2224    :param Field key_field: field to use for key. Defaults to
2225        :py:class:`CharField`. **Must have** ``primary_key=True``.
2226    :param Field value_field: field to use for value. Defaults to
2227        :py:class:`PickleField`.
2228    :param bool ordered: data should be returned in key-sorted order.
2229    :param Database database: database where key/value data is stored. If not
2230        specified, an in-memory SQLite database will be used.
2231    :param str table_name: table name for data storage.
2232
    Dictionary-like API for storing key/value data. Supports the expected
    dictionary methods, with the added capability of accepting expressions
    for getting, setting and deleting items.

    The table is created automatically (if it does not exist) when the
    ``KeyValue`` is instantiated.

    Uses an efficient upsert implementation for setting and
    updating/overwriting key/value pairs.
2242
2243    Basic examples:
2244
2245    .. code-block:: python
2246
2247        # Create a key/value store, which uses an in-memory SQLite database
2248        # for data storage.
2249        KV = KeyValue()
2250
2251        # Set (or overwrite) the value for "k1".
2252        KV['k1'] = 'v1'
2253
2254        # Set (or update) multiple keys at once (uses an efficient upsert).
2255        KV.update(k2='v2', k3='v3')
2256
2257        # Getting values works as you'd expect.
2258        assert KV['k2'] == 'v2'
2259
2260        # We can also do this:
2261        for value in KV[KV.key > 'k1']:
2262            print(value)
2263
2264        # 'v2'
2265        # 'v3'
2266
        # Update multiple values at once using an expression:
2268        KV[KV.key > 'k1'] = 'vx'
2269
2270        # What's stored in the KV?
2271        print(dict(KV))
2272
2273        # {'k1': 'v1', 'k2': 'vx', 'k3': 'vx'}
2274
2275        # Delete a single item.
2276        del KV['k2']
2277
2278        # How many items are stored in the KV?
2279        print(len(KV))
2280        # 2
2281
2282        # Delete items that match the given condition.
2283        del KV[KV.key > 'k1']
2284
2285    .. py:method:: __contains__(expr)
2286
2287        :param expr: a single key or an expression
2288        :returns: Boolean whether key/expression exists.
2289
2290        Example:
2291
2292        .. code-block:: pycon
2293
2294            >>> kv = KeyValue()
2295            >>> kv.update(k1='v1', k2='v2')
2296
2297            >>> 'k1' in kv
2298            True
2299            >>> 'kx' in kv
2300            False
2301
            >>> (kv.key < 'k2') in kv
            True
            >>> (kv.key > 'k2') in kv
            False
2306
2307    .. py:method:: __len__()
2308
2309        :returns: Count of items stored.
2310
2311    .. py:method:: __getitem__(expr)
2312
2313        :param expr: a single key or an expression.
2314        :returns: value(s) corresponding to key/expression.
2315        :raises: ``KeyError`` if single key given and not found.
2316
2317        Examples:
2318
2319        .. code-block:: pycon
2320
2321            >>> KV = KeyValue()
2322            >>> KV.update(k1='v1', k2='v2', k3='v3')
2323
2324            >>> KV['k1']
2325            'v1'
2326            >>> KV['kx']
2327            KeyError: "kx" not found
2328
2329            >>> KV[KV.key > 'k1']
2330            ['v2', 'v3']
2331            >>> KV[KV.key < 'k1']
2332            []
2333
2334    .. py:method:: __setitem__(expr, value)
2335
2336        :param expr: a single key or an expression.
2337        :param value: value to set for key(s)
2338
2339        Set value for the given key. If ``expr`` is an expression, then any
2340        keys matching the expression will have their value updated.
2341
2342        Example:
2343
2344        .. code-block:: pycon
2345
2346            >>> KV = KeyValue()
2347            >>> KV.update(k1='v1', k2='v2', k3='v3')
2348
2349            >>> KV['k1'] = 'v1-x'
2350            >>> print(KV['k1'])
2351            'v1-x'
2352
2353            >>> KV[KV.key >= 'k2'] = 'v99'
2354            >>> dict(KV)
2355            {'k1': 'v1-x', 'k2': 'v99', 'k3': 'v99'}
2356
2357    .. py:method:: __delitem__(expr)
2358
2359        :param expr: a single key or an expression.
2360
2361        Delete the given key. If an expression is given, delete all keys that
2362        match the expression.
2363
2364        Example:
2365
2366        .. code-block:: pycon
2367
2368            >>> KV = KeyValue()
2369            >>> KV.update(k1=1, k2=2, k3=3)
2370
2371            >>> del KV['k1']  # Deletes "k1".
2372            >>> del KV['k1']
2373            KeyError: "k1" does not exist
2374
2375            >>> del KV[KV.key > 'k2']  # Deletes "k3".
2376            >>> del KV[KV.key > 'k99']  # Nothing deleted, no keys match.
2377
2378    .. py:method:: keys()
2379
2380        :returns: an iterable of all keys in the table.
2381
2382    .. py:method:: values()
2383
2384        :returns: an iterable of all values in the table.
2385
2386    .. py:method:: items()
2387
2388        :returns: an iterable of all key/value pairs in the table.
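
        These accessors behave like their ``dict`` counterparts; a brief
        sketch:

        .. code-block:: python

            from playhouse.kv import KeyValue

            KV = KeyValue()  # In-memory SQLite database by default.
            KV.update(k1=1, k2=2)

            keys = sorted(KV.keys())
            values = sorted(KV.values())
            pairs = dict(KV.items())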
2389
2390    .. py:method:: update([__data=None[, **mapping]])
2391
2392        Efficiently bulk-insert or replace the given key/value pairs.
2393
2394        Example:
2395
2396        .. code-block:: pycon
2397
2398            >>> KV = KeyValue()
2399            >>> KV.update(k1=1, k2=2)  # Sets 'k1'=1, 'k2'=2.
2400
2401            >>> dict(KV)
2402            {'k1': 1, 'k2': 2}
2403
2404            >>> KV.update(k2=22, k3=3)  # Updates 'k2'->22, sets 'k3'=3.
2405
2406            >>> dict(KV)
2407            {'k1': 1, 'k2': 22, 'k3': 3}
2408
2409            >>> KV.update({'k2': -2, 'k4': 4})  # Also can pass a dictionary.
2410
2411            >>> dict(KV)
2412            {'k1': 1, 'k2': -2, 'k3': 3, 'k4': 4}
2413
2414    .. py:method:: get(expr[, default=None])
2415
2416        :param expr: a single key or an expression.
2417        :param default: default value if key not found.
2418        :returns: value of given key/expr or default if single key not found.
2419
2420        Get the value at the given key. If the key does not exist, the default
2421        value is returned, unless the key is an expression in which case an
2422        empty list will be returned.
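
        A brief sketch:

        .. code-block:: python

            from playhouse.kv import KeyValue

            KV = KeyValue()
            KV.update(k1='v1', k2='v2')

            v1 = KV.get('k1')            # Value for an existing key.
            missing = KV.get('kx', '?')  # Default for a missing key.
            matches = KV.get(KV.key > 'k1')  # List of matching values.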
2423
2424    .. py:method:: pop(expr[, default=Sentinel])
2425
2426        :param expr: a single key or an expression.
2427        :param default: default value if key does not exist.
2428        :returns: value of given key/expr or default if single key not found.
2429
2430        Get value and delete the given key. If the key does not exist, the
2431        default value is returned, unless the key is an expression in which
2432        case an empty list is returned.
2433
2434    .. py:method:: clear()
2435
2436        Remove all items from the key-value table.
2437
2438
2439.. _shortcuts:
2440
2441Shortcuts
2442---------
2443
2444This module contains helper functions for expressing things that would
2445otherwise be somewhat verbose or cumbersome using peewee's APIs. There are also
helpers for serializing models to dictionaries and vice versa.
2447
2448.. py:function:: model_to_dict(model[, recurse=True[, backrefs=False[, only=None[, exclude=None[, extra_attrs=None[, fields_from_query=None[, max_depth=None[, manytomany=False]]]]]]]])
2449
2450    :param bool recurse: Whether foreign-keys should be recursed.
2451    :param bool backrefs: Whether lists of related objects should be recursed.
2452    :param only: A list (or set) of field instances which should be included in the result dictionary.
2453    :param exclude: A list (or set) of field instances which should be excluded from the result dictionary.
2454    :param extra_attrs: A list of attribute or method names on the instance which should be included in the dictionary.
2455    :param Select fields_from_query: The :py:class:`SelectQuery` that created this model instance. Only the fields and values explicitly selected by the query will be serialized.
2456    :param int max_depth: Maximum depth when recursing.
2457    :param bool manytomany: Process many-to-many fields.
2458
2459    Convert a model instance (and optionally any related instances) to
2460    a dictionary.
2461
2462    Examples:
2463
2464    .. code-block:: pycon
2465
2466        >>> user = User.create(username='charlie')
2467        >>> model_to_dict(user)
2468        {'id': 1, 'username': 'charlie'}
2469
2470        >>> model_to_dict(user, backrefs=True)
2471        {'id': 1, 'tweets': [], 'username': 'charlie'}
2472
2473        >>> t1 = Tweet.create(user=user, message='tweet-1')
2474        >>> t2 = Tweet.create(user=user, message='tweet-2')
2475        >>> model_to_dict(user, backrefs=True)
2476        {
2477          'id': 1,
2478          'tweets': [
2479            {'id': 1, 'message': 'tweet-1'},
2480            {'id': 2, 'message': 'tweet-2'},
2481          ],
2482          'username': 'charlie'
2483        }
2484
2485        >>> model_to_dict(t1)
2486        {
2487          'id': 1,
2488          'message': 'tweet-1',
2489          'user': {
2490            'id': 1,
2491            'username': 'charlie'
2492          }
2493        }
2494
2495        >>> model_to_dict(t2, recurse=False)
        {'id': 2, 'message': 'tweet-2', 'user': 1}
2497
2498    The implementation of ``model_to_dict`` is fairly complex, owing to the
2499    various usages it attempts to support. If you have a special usage, I
2500    strongly advise that you do **not** attempt to shoe-horn some crazy
2501    combination of parameters into this function. Just write a simple function
2502    that accomplishes exactly what you're attempting to do.
2503
2504.. py:function:: dict_to_model(model_class, data[, ignore_unknown=False])
2505
2506    :param Model model_class: The model class to construct.
2507    :param dict data: A dictionary of data. Foreign keys can be included as nested dictionaries, and back-references as lists of dictionaries.
2508    :param bool ignore_unknown: Whether to allow unrecognized (non-field) attributes.
2509
2510    Convert a dictionary of data to a model instance, creating related
2511    instances where appropriate.
2512
2513    Examples:
2514
2515    .. code-block:: pycon
2516
2517        >>> user_data = {'id': 1, 'username': 'charlie'}
2518        >>> user = dict_to_model(User, user_data)
2519        >>> user
2520        <__main__.User at 0x7fea8fa4d490>
2521
2522        >>> user.username
2523        'charlie'
2524
2525        >>> note_data = {'id': 2, 'text': 'note text', 'user': user_data}
2526        >>> note = dict_to_model(Note, note_data)
2527        >>> note.text
2528        'note text'
2529        >>> note.user.username
2530        'charlie'
2531
2532        >>> user_with_notes = {
2533        ...     'id': 1,
2534        ...     'username': 'charlie',
2535        ...     'notes': [{'id': 1, 'text': 'note-1'}, {'id': 2, 'text': 'note-2'}]}
2536        >>> user = dict_to_model(User, user_with_notes)
2537        >>> user.notes[0].text
2538        'note-1'
2539        >>> user.notes[0].user.username
2540        'charlie'
2541
2542
2543.. py:function:: update_model_from_dict(instance, data[, ignore_unknown=False])
2544
2545    :param Model instance: The model instance to update.
2546    :param dict data: A dictionary of data. Foreign keys can be included as nested dictionaries, and back-references as lists of dictionaries.
2547    :param bool ignore_unknown: Whether to allow unrecognized (non-field) attributes.
2548
2549    Update a model instance with the given data dictionary.
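
    A minimal sketch (the model below is illustrative):

    .. code-block:: python

        from peewee import IntegerField, Model, SqliteDatabase, TextField
        from playhouse.shortcuts import update_model_from_dict

        db = SqliteDatabase(':memory:')

        class User(Model):
            username = TextField()
            age = IntegerField(null=True)

            class Meta:
                database = db

        db.create_tables([User])
        user = User.create(username='charlie', age=3)

        # Apply a dictionary of new values to the existing instance.
        update_model_from_dict(user, {'age': 4})
        user.save()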
2550
2551
2552.. py:function:: resolve_multimodel_query(query[, key='_model_identifier'])
2553
2554    :param query: a compound select query.
    :param str key: key to use for storing the model identifier.
    :return: an iterable cursor that yields the proper model instance for
2557        each row selected in the compound select query.
2558
2559    Helper for resolving rows returned in a compound select query to the
2560    correct model instance type. For example, if you have a union of two
2561    different tables, this helper will resolve each row to the proper model
2562    when iterating over the query results.
2563
2564
2565.. _signals:
2566
2567Signal support
2568--------------
2569
Models with hooks for signals (à la Django) are provided in
``playhouse.signals``. To use the signals, you will need all of your project's
models to subclass ``playhouse.signals.Model``, which overrides the necessary
methods to provide support for the various signals.
2574
2575.. code-block:: python
2576
2577    from playhouse.signals import Model, post_save
2578
2579
2580    class MyModel(Model):
2581        data = IntegerField()
2582
2583    @post_save(sender=MyModel)
2584    def on_save_handler(model_class, instance, created):
2585        put_data_in_cache(instance.data)
2586
2587.. warning::
2588    For what I hope are obvious reasons, Peewee signals do not work when you
2589    use the :py:meth:`Model.insert`, :py:meth:`Model.update`, or
2590    :py:meth:`Model.delete` methods. These methods generate queries that
2591    execute beyond the scope of the ORM, and the ORM does not know about which
2592    model instances might or might not be affected when the query executes.
2593
2594    Signals work by hooking into the higher-level peewee APIs like
2595    :py:meth:`Model.save` and :py:meth:`Model.delete_instance`, where the
2596    affected model instance is known ahead of time.
2597
2598The following signals are provided:
2599
2600``pre_save``
2601    Called immediately before an object is saved to the database. Provides an
2602    additional keyword argument ``created``, indicating whether the model is being
2603    saved for the first time or updated.
2604``post_save``
2605    Called immediately after an object is saved to the database. Provides an
2606    additional keyword argument ``created``, indicating whether the model is being
2607    saved for the first time or updated.
2608``pre_delete``
2609    Called immediately before an object is deleted from the database when :py:meth:`Model.delete_instance`
2610    is used.
2611``post_delete``
2612    Called immediately after an object is deleted from the database when :py:meth:`Model.delete_instance`
2613    is used.
``pre_init``
    Called when a model class is first instantiated.
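
The ``created`` argument can be seen in action with a small sketch (the model
and handler below are illustrative):

.. code-block:: python

    from peewee import IntegerField, SqliteDatabase
    from playhouse.signals import Model, pre_save

    db = SqliteDatabase(':memory:')

    class Event(Model):
        counter = IntegerField(default=0)

        class Meta:
            database = db

    seen = []

    @pre_save(sender=Event)
    def before_save(sender, instance, created):
        seen.append(created)

    db.create_tables([Event])

    evt = Event.create(counter=1)  # First save: created=True.
    evt.counter = 2
    evt.save()                     # Subsequent save: created=False.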
2616
2617
2618Connecting handlers
2619^^^^^^^^^^^^^^^^^^^
2620
2621Whenever a signal is dispatched, it will call any handlers that have been
2622registered. This allows totally separate code to respond to events like model
2623save and delete.
2624
2625The :py:class:`Signal` class provides a :py:meth:`~Signal.connect` method,
2626which takes a callback function and two optional parameters for "sender" and
2627"name". If specified, the "sender" parameter should be a single model class
2628and allows your callback to only receive signals from that one model class.
2629The "name" parameter is used as a convenient alias in the event you wish to
2630unregister your signal handler.
2631
2632Example usage:
2633
2634.. code-block:: python
2635
2636    from playhouse.signals import *
2637
2638    def post_save_handler(sender, instance, created):
        print('%s was just saved' % instance)
2640
2641    # our handler will only be called when we save instances of SomeModel
2642    post_save.connect(post_save_handler, sender=SomeModel)
2643
2644All signal handlers accept as their first two arguments ``sender`` and
2645``instance``, where ``sender`` is the model class and ``instance`` is the
2646actual model being acted upon.
2647
2648If you'd like, you can also use a decorator to connect signal handlers. This
2649is functionally equivalent to the above example:
2650
2651.. code-block:: python
2652
2653    @post_save(sender=SomeModel)
2654    def post_save_handler(sender, instance, created):
        print('%s was just saved' % instance)
2656
2657
Signal API
^^^^^^^^^^

.. py:class:: Signal()

    Stores a list of receivers (callbacks) and calls them when the "send"
    method is invoked.

    .. py:method:: connect(receiver[, sender=None[, name=None]])

        :param callable receiver: a callable that takes at least two parameters,
            a "sender", which is the Model subclass that triggered the signal, and
            an "instance", which is the actual model instance.
        :param Model sender: if specified, only instances of this model class will
            trigger the receiver callback.
        :param string name: a short alias

        Add the receiver to the internal list of receivers, which will be called
        whenever the signal is sent.

        .. code-block:: python

            from playhouse.signals import post_save
            from project.handlers import cache_buster

            post_save.connect(cache_buster, name='project.cache_buster')

    .. py:method:: disconnect([receiver=None[, name=None]])

        :param callable receiver: the callback to disconnect
        :param string name: a short alias

        Disconnect the given receiver (or the receiver with the given name alias)
        so that it is no longer called. Either the receiver or the name must be
        provided.

        .. code-block:: python

            post_save.disconnect(name='project.cache_buster')

    .. py:method:: send(instance, *args, **kwargs)

        :param instance: a model instance

        Iterates over the receivers and calls them in the order in which they
        were connected. If the receiver specified a sender, it will only be
        called if the instance is an instance of the sender.

    .. py:method:: __call__([sender=None[, name=None]])

        Function decorator that is an alias for the signal's connect method:

        .. code-block:: python

            from playhouse.signals import post_save

            @post_save(name='project.cache_buster')
            def cache_bust_handler(sender, instance, *args, **kwargs):
                # Bust the cache for this instance.
                cache.delete(cache_key_for(instance))
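As a concrete illustration of the connect/disconnect/send semantics documented
above, here is a minimal self-contained sketch. It mirrors the documented
behavior only; it is not the actual ``playhouse.signals`` implementation, and
in real use the signals are fired by peewee's model hooks rather than called
manually:

```python
class Signal:
    """Minimal sketch of the documented Signal API (illustrative only)."""

    def __init__(self):
        self._receivers = []  # List of (name, receiver, sender) tuples.

    def connect(self, receiver, sender=None, name=None):
        # The name alias defaults to the receiver's function name.
        name = name or receiver.__name__
        self._receivers.append((name, receiver, sender))

    def disconnect(self, receiver=None, name=None):
        # Either the receiver or the name must be provided.
        for idx, (n, r, _s) in enumerate(self._receivers):
            if r is receiver or n == name:
                del self._receivers[idx]
                return
        raise ValueError('receiver not found')

    def send(self, instance, *args, **kwargs):
        # Call receivers in connection order, honoring the sender filter.
        responses = []
        for _name, receiver, sender in self._receivers:
            if sender is None or isinstance(instance, sender):
                responses.append(
                    receiver(type(instance), instance, *args, **kwargs))
        return responses
```

A receiver connected with ``sender=SomeModel`` is skipped for instances of
other classes, which is exactly the filtering behavior described for
:py:meth:`~Signal.connect` and :py:meth:`~Signal.send`.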
.. _pwiz:

pwiz, a model generator
-----------------------

``pwiz`` is a little script that ships with peewee and is capable of
introspecting an existing database and generating model code suitable for
interacting with the underlying data. If you have a database already, pwiz can
give you a nice boost by generating skeleton code with correct column
affinities and foreign keys.

If you install peewee using ``setup.py install``, pwiz will be installed as a
"script" and you can just run:

.. code-block:: console

    python -m pwiz -e postgresql -u postgres my_postgres_db

This will print a bunch of models to standard output. So you can do this:

.. code-block:: console

    python -m pwiz -e postgresql my_postgres_db > mymodels.py
    python # <-- fire up an interactive shell

.. code-block:: pycon

    >>> from mymodels import Blog, Entry, Tag, Whatever
    >>> print([blog.name for blog in Blog.select()])
Command-line options
^^^^^^^^^^^^^^^^^^^^

pwiz accepts the following command-line options:

======    =================================== ============================================
Option    Meaning                             Example
======    =================================== ============================================
-h        show help
-e        database backend                    -e mysql
-H        host to connect to                  -H remote.db.server
-p        port to connect on                  -p 9001
-u        database user                       -u postgres
-P        database password                   -P (will be prompted for password)
-s        schema                              -s public
-t        tables to generate                  -t tweet,users,relationships
-v        generate models for VIEWs           (no argument)
-i        add info metadata to generated file (no argument)
-o        table column order is preserved     (no argument)
======    =================================== ============================================

The following are valid parameters for the ``engine`` (``-e``):

* sqlite
* mysql
* postgresql

.. warning::
    If a password is required to access your database, you will be prompted to
    enter it using a secure prompt.

    **The password will be included in the output**. Specifically, at the top
    of the file a :py:class:`Database` will be defined along with any required
    parameters -- including the password.

pwiz examples
^^^^^^^^^^^^^

Examples of introspecting various databases:

.. code-block:: console

    # Introspect a Sqlite database.
    python -m pwiz -e sqlite path/to/sqlite_database.db

    # Introspect a MySQL database, logging in as root. You will be prompted
    # for a password ("-P").
    python -m pwiz -e mysql -u root -P mysql_db_name

    # Introspect a Postgresql database on a remote server.
    python -m pwiz -e postgres -u postgres -H 10.1.0.3 pg_db_name

Full example:

.. code-block:: console

    $ sqlite3 example.db << EOM
    CREATE TABLE "user" ("id" INTEGER NOT NULL PRIMARY KEY, "username" TEXT NOT NULL);
    CREATE TABLE "tweet" (
        "id" INTEGER NOT NULL PRIMARY KEY,
        "content" TEXT NOT NULL,
        "timestamp" DATETIME NOT NULL,
        "user_id" INTEGER NOT NULL,
        FOREIGN KEY ("user_id") REFERENCES "user" ("id"));
    CREATE UNIQUE INDEX "user_username" ON "user" ("username");
    EOM

    $ python -m pwiz -e sqlite example.db

Produces the following output:

.. code-block:: python

    from peewee import *

    database = SqliteDatabase('example.db', **{})

    class UnknownField(object):
        def __init__(self, *_, **__): pass

    class BaseModel(Model):
        class Meta:
            database = database

    class User(BaseModel):
        username = TextField(unique=True)

        class Meta:
            table_name = 'user'

    class Tweet(BaseModel):
        content = TextField()
        timestamp = DateTimeField()
        user = ForeignKeyField(column_name='user_id', field='id', model=User)

        class Meta:
            table_name = 'tweet'

Observations:

* The foreign-key ``Tweet.user_id`` is detected and mapped correctly.
* The ``User.username`` UNIQUE constraint is detected.
* Each model explicitly declares its table name, even in cases where it is not
  necessary (as Peewee would automatically translate the class name into the
  appropriate table name).
* All the parameters of the :py:class:`ForeignKeyField` are explicitly
  declared, even though they follow the conventions Peewee uses by default.

.. note::
    The ``UnknownField`` is a placeholder that is used in the event your schema
    contains a column declaration that Peewee doesn't know how to map to a
    field class.
.. _migrate:

Schema Migrations
-----------------

Peewee supports schema migrations, with well-tested support for Postgresql,
SQLite and MySQL. Unlike other schema migration tools, peewee's migrations do
not handle introspection and database "versioning". Rather, peewee provides a
number of helper functions for generating and running schema-altering
statements. This engine provides the basis on which a more sophisticated tool
could some day be built.

Migrations can be written as simple python scripts and executed from the
command-line. Since the migrations only depend on your application's
:py:class:`Database` object, it should be easy to manage changing your model
definitions and maintaining a set of migration scripts without introducing
dependencies.
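To make the "migrations as simple python scripts" pattern concrete, here is a
hedged sketch using only the stdlib ``sqlite3`` module in place of
``playhouse.migrate`` (the table name ``events`` and the statements are
hypothetical). It shows the shape of a script that applies schema-altering
statements sequentially inside a transaction:

```python
import sqlite3

def run_migration(conn, *operations):
    # Apply schema-altering statements one after another, the way the
    # migrate() helper runs its operations sequentially.
    with conn:  # Wrap in a transaction, analogous to db.atomic().
        for sql in operations:
            conn.execute(sql)

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE events (id INTEGER PRIMARY KEY, name TEXT)')

run_migration(
    conn,
    # Add a column with a default, then rename the table.
    "ALTER TABLE events ADD COLUMN status INTEGER DEFAULT 0",
    "ALTER TABLE events RENAME TO events_tbl",
)
```

With the real migrator you would pass ``migrator.add_column(...)`` and
``migrator.rename_table(...)`` operations to :py:func:`migrate` instead of raw
SQL strings.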

Example usage
^^^^^^^^^^^^^

Begin by importing the helpers from the ``migrate`` module:

.. code-block:: python

    from playhouse.migrate import *

Instantiate a ``migrator``. The :py:class:`SchemaMigrator` class is responsible
for generating schema altering operations, which can then be run sequentially
by the :py:func:`migrate` helper.

.. code-block:: python

    # Postgres example:
    my_db = PostgresqlDatabase(...)
    migrator = PostgresqlMigrator(my_db)

    # SQLite example:
    my_db = SqliteDatabase('my_database.db')
    migrator = SqliteMigrator(my_db)

Use :py:func:`migrate` to execute one or more operations:

.. code-block:: python

    title_field = CharField(default='')
    status_field = IntegerField(null=True)

    migrate(
        migrator.add_column('some_table', 'title', title_field),
        migrator.add_column('some_table', 'status', status_field),
        migrator.drop_column('some_table', 'old_column'),
    )

.. warning::
    Migrations are not run inside a transaction. If you wish the migration to
    run in a transaction you will need to wrap the call to ``migrate`` in a
    :py:meth:`~Database.atomic` context-manager, e.g.

    .. code-block:: python

        with my_db.atomic():
            migrate(...)

Supported Operations
^^^^^^^^^^^^^^^^^^^^

Add new field(s) to an existing model:

.. code-block:: python

    # Create your field instances. For non-null fields you must specify a
    # default value.
    pubdate_field = DateTimeField(null=True)
    comment_field = TextField(default='')

    # Run the migration, specifying the database table, field name and field.
    migrate(
        migrator.add_column('comment_tbl', 'pub_date', pubdate_field),
        migrator.add_column('comment_tbl', 'comment', comment_field),
    )

Renaming a field:

.. code-block:: python

    # Specify the table, original name of the column, and its new name.
    migrate(
        migrator.rename_column('story', 'pub_date', 'publish_date'),
        migrator.rename_column('story', 'mod_date', 'modified_date'),
    )

Dropping a field:

.. code-block:: python

    migrate(
        migrator.drop_column('story', 'some_old_field'),
    )

Making a field nullable or not nullable:

.. code-block:: python

    # Note that when making a field not null, the column must not contain
    # any NULL values.
    migrate(
        # Make `pub_date` allow NULL values.
        migrator.drop_not_null('story', 'pub_date'),

        # Prevent `modified_date` from containing NULL values.
        migrator.add_not_null('story', 'modified_date'),
    )

Altering a field's data-type:

.. code-block:: python

    # Change a VARCHAR(50) field to a TEXT field.
    migrate(
        migrator.alter_column_type('person', 'email', TextField())
    )

Renaming a table:

.. code-block:: python

    migrate(
        migrator.rename_table('story', 'stories_tbl'),
    )

Adding an index:

.. code-block:: python

    # Specify the table, column names, and whether the index should be
    # UNIQUE or not.
    migrate(
        # Create an index on the `pub_date` column.
        migrator.add_index('story', ('pub_date',), False),

        # Create a multi-column index on the `pub_date` and `status` fields.
        migrator.add_index('story', ('pub_date', 'status'), False),

        # Create a unique index on the category and title fields.
        migrator.add_index('story', ('category_id', 'title'), True),
    )

Dropping an index:

.. code-block:: python

    # Specify the index name.
    migrate(migrator.drop_index('story', 'story_pub_date_status'))

Adding or dropping table constraints:

.. code-block:: python

    # Add a CHECK() constraint to enforce the price cannot be negative.
    migrate(migrator.add_constraint(
        'products',
        'price_check',
        Check('price >= 0')))

    # Remove the price check constraint.
    migrate(migrator.drop_constraint('products', 'price_check'))

    # Add a UNIQUE constraint on the first and last names.
    migrate(migrator.add_unique('person', 'first_name', 'last_name'))

.. note::
    Postgres users may need to set the search-path when using a non-standard
    schema. This can be done as follows:

    .. code-block:: python

        new_field = TextField(default='', null=False)
        migrator = PostgresqlMigrator(db)
        migrate(migrator.set_search_path('my_schema_name'),
                migrator.add_column('table', 'field_name', new_field))


Migrations API
^^^^^^^^^^^^^^

.. py:function:: migrate(*operations)

    Execute one or more schema altering operations.

    Usage:

    .. code-block:: python

        migrate(
            migrator.add_column('some_table', 'new_column', CharField(default='')),
            migrator.add_index('some_table', ('new_column',)),
        )

.. py:class:: SchemaMigrator(database)

    :param database: a :py:class:`Database` instance.

    The :py:class:`SchemaMigrator` is responsible for generating schema-altering
    statements.

    .. py:method:: add_column(table, column_name, field)

        :param str table: Name of the table to add column to.
        :param str column_name: Name of the new column.
        :param Field field: A :py:class:`Field` instance.

        Add a new column to the provided table. The ``field`` provided will be used
        to generate the appropriate column definition.

        .. note:: If the field is not nullable it must specify a default value.

        .. note::
            For non-null fields, the field will initially be added as a null field,
            then an ``UPDATE`` statement will be executed to populate the column
            with the default value. Finally, the column will be marked as not null.

    .. py:method:: drop_column(table, column_name[, cascade=True])

        :param str table: Name of the table to drop column from.
        :param str column_name: Name of the column to drop.
        :param bool cascade: Whether the column should be dropped with ``CASCADE``.

    .. py:method:: rename_column(table, old_name, new_name)

        :param str table: Name of the table containing the column to rename.
        :param str old_name: Current name of the column.
        :param str new_name: New name for the column.

    .. py:method:: add_not_null(table, column)

        :param str table: Name of the table containing the column.
        :param str column: Name of the column to make not nullable.

    .. py:method:: drop_not_null(table, column)

        :param str table: Name of the table containing the column.
        :param str column: Name of the column to make nullable.

    .. py:method:: alter_column_type(table, column, field[, cast=None])

        :param str table: Name of the table.
        :param str column: Name of the column to modify.
        :param Field field: :py:class:`Field` instance representing the new
            data type.
        :param cast: (postgres-only) specify a cast expression if the
            data-types are incompatible, e.g. ``column_name::int``. Can be
            provided as either a string or a :py:class:`Cast` instance.

        Alter the data-type of a column. This method should be used with care,
        as using incompatible types may not be well-supported by your database.

    .. py:method:: rename_table(old_name, new_name)

        :param str old_name: Current name of the table.
        :param str new_name: New name for the table.

    .. py:method:: add_index(table, columns[, unique=False[, using=None]])

        :param str table: Name of table on which to create the index.
        :param list columns: List of columns which should be indexed.
        :param bool unique: Whether the new index should specify a unique constraint.
        :param str using: Index type (where supported), e.g. GiST or GIN.

    .. py:method:: drop_index(table, index_name)

        :param str table: Name of the table containing the index to be dropped.
        :param str index_name: Name of the index to be dropped.

    .. py:method:: add_constraint(table, name, constraint)

        :param str table: Table to add constraint to.
        :param str name: Name used to identify the constraint.
        :param constraint: either a :py:func:`Check` constraint or, for
            adding an arbitrary constraint, a :py:class:`SQL` object.

    .. py:method:: drop_constraint(table, name)

        :param str table: Table to drop constraint from.
        :param str name: Name of constraint to drop.

    .. py:method:: add_unique(table, *column_names)

        :param str table: Table to add constraint to.
        :param str column_names: One or more columns for UNIQUE constraint.

.. py:class:: PostgresqlMigrator(database)

    Generate migrations for Postgresql databases.

    .. py:method:: set_search_path(schema_name)

        :param str schema_name: Schema to use.

        Set the search path (schema) for the subsequent operations.

.. py:class:: SqliteMigrator(database)

    Generate migrations for SQLite databases.

    SQLite has limited support for ``ALTER TABLE`` queries, so the following
    operations are currently not supported for SQLite:

    * ``add_constraint``
    * ``drop_constraint``
    * ``add_unique``

.. py:class:: MySQLMigrator(database)

    Generate migrations for MySQL databases.

.. _reflection:

Reflection
----------

The reflection module contains helpers for introspecting existing databases.
This module is used internally by several other modules in the playhouse,
including :ref:`dataset` and :ref:`pwiz`.

.. py:function:: generate_models(database[, schema=None[, **options]])

    :param Database database: database instance to introspect.
    :param str schema: optional schema to introspect.
    :param options: arbitrary options, see :py:meth:`Introspector.generate_models` for details.
    :returns: a ``dict`` mapping table names to model classes.

    Generate models for the tables in the given database. For an example of how
    to use this function, see the section :ref:`interactive`.

    Example:

    .. code-block:: pycon

        >>> from peewee import *
        >>> from playhouse.reflection import generate_models
        >>> db = PostgresqlDatabase('my_app')
        >>> models = generate_models(db)
        >>> list(models.keys())
        ['account', 'customer', 'order', 'orderitem', 'product']

        >>> globals().update(models)  # Inject models into namespace.
        >>> for cust in customer.select():  # Query using generated model.
        ...     print(cust.name)
        ...
        Huey Kitty
        Mickey Dog

.. py:function:: print_model(model)

    :param Model model: model class to print
    :returns: no return value

    Print a user-friendly description of a model class, useful for debugging or
    interactive use. Currently this prints the table name, and all fields along
    with their data-types. The :ref:`interactive` section contains an example.

    Example output:

    .. code-block:: pycon

        >>> from playhouse.reflection import print_model
        >>> print_model(User)
        user
          id AUTO PK
          email TEXT
          name TEXT
          dob DATE

        index(es)
          email UNIQUE

        >>> print_model(Tweet)
        tweet
          id AUTO PK
          user INT FK: User.id
          title TEXT
          content TEXT
          timestamp DATETIME
          is_published BOOL

        index(es)
          user_id
          is_published, timestamp

.. py:function:: print_table_sql(model)

    :param Model model: model to print
    :returns: no return value

    Prints the SQL ``CREATE TABLE`` for the given model class, which may be
    useful for debugging or interactive use. See the :ref:`interactive` section
    for example usage. Note that indexes and constraints are not included in
    the output of this function.

    Example output:

    .. code-block:: pycon

        >>> from playhouse.reflection import print_table_sql
        >>> print_table_sql(User)
        CREATE TABLE IF NOT EXISTS "user" (
          "id" INTEGER NOT NULL PRIMARY KEY,
          "email" TEXT NOT NULL,
          "name" TEXT NOT NULL,
          "dob" DATE NOT NULL
        )

        >>> print_table_sql(Tweet)
        CREATE TABLE IF NOT EXISTS "tweet" (
          "id" INTEGER NOT NULL PRIMARY KEY,
          "user_id" INTEGER NOT NULL,
          "title" TEXT NOT NULL,
          "content" TEXT NOT NULL,
          "timestamp" DATETIME NOT NULL,
          "is_published" INTEGER NOT NULL,
          FOREIGN KEY ("user_id") REFERENCES "user" ("id")
        )

.. py:class:: Introspector(metadata[, schema=None])

    Metadata can be extracted from a database by instantiating an
    :py:class:`Introspector`. Rather than instantiating this class directly, it
    is recommended to use the factory method
    :py:meth:`~Introspector.from_database`.

    .. py:classmethod:: from_database(database[, schema=None])

        :param database: a :py:class:`Database` instance.
        :param str schema: an optional schema (supported by some databases).

        Creates an :py:class:`Introspector` instance suitable for use with the
        given database.

        Usage:

        .. code-block:: python

            db = SqliteDatabase('my_app.db')
            introspector = Introspector.from_database(db)
            models = introspector.generate_models()

            # User and Tweet (assumed to exist in the database) are
            # peewee Model classes generated from the database schema.
            User = models['user']
            Tweet = models['tweet']

    .. py:method:: generate_models([skip_invalid=False[, table_names=None[, literal_column_names=False[, bare_fields=False[, include_views=False]]]]])

        :param bool skip_invalid: Skip tables whose names are invalid python
            identifiers.
        :param list table_names: List of table names to generate. If
            unspecified, models are generated for all tables.
        :param bool literal_column_names: Use column-names as-is. By default,
            column names are "python-ized", i.e. mixed-case becomes lower-case.
        :param bool bare_fields: **SQLite-only**. Do not specify data-types for
            introspected columns.
        :param bool include_views: generate models for VIEWs as well.
        :return: A dictionary mapping table-names to model classes.

        Introspect the database, reading in the tables, columns, and foreign
        key constraints, then generate a dictionary mapping each database table
        to a dynamically-generated :py:class:`Model` class.

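Under the hood, introspection amounts to reading table and column metadata from
the database's catalog. A stdlib-only sketch of that first step for SQLite is
shown below; this is an illustration of the idea, not the actual
``Introspector`` implementation, which also resolves foreign keys, python-izes
names, and builds Model classes:

```python
import sqlite3

def introspect_tables(conn):
    """Map each table name to its list of (column, type) pairs."""
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    return {
        table: [(row[1], row[2])  # (column name, declared type)
                for row in conn.execute('PRAGMA table_info("%s")' % table)]
        for table in tables}

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE "user" (id INTEGER PRIMARY KEY, username TEXT)')
schema = introspect_tables(conn)
```

The real :py:meth:`~Introspector.generate_models` takes this kind of metadata
one step further, mapping each declared column type onto a peewee field class.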
.. _db_url:

Database URL
------------

This module contains a helper function to generate a database connection from a
URL connection string.

.. py:function:: connect(url, **connect_params)

    Create a :py:class:`Database` instance from the given connection URL.

    Examples:

    * *sqlite:///my_database.db* will create a :py:class:`SqliteDatabase` instance for the file ``my_database.db`` in the current directory.
    * *sqlite:///:memory:* will create an in-memory :py:class:`SqliteDatabase` instance.
    * *postgresql://postgres:my_password@localhost:5432/my_database* will create a :py:class:`PostgresqlDatabase` instance. A username and password are provided, as well as the host and port to connect to.
    * *mysql://user:passwd@ip:port/my_db* will create a :py:class:`MySQLDatabase` instance for the MySQL database *my_db*.
    * *mysql+pool://user:passwd@ip:port/my_db?max_connections=20&stale_timeout=300* will create a :py:class:`PooledMySQLDatabase` instance for the MySQL database *my_db*, with ``max_connections`` set to 20 and a ``stale_timeout`` setting of 300 seconds.

    Supported schemes:

    * ``apsw``: :py:class:`APSWDatabase`
    * ``mysql``: :py:class:`MySQLDatabase`
    * ``mysql+pool``: :py:class:`PooledMySQLDatabase`
    * ``postgres``: :py:class:`PostgresqlDatabase`
    * ``postgres+pool``: :py:class:`PooledPostgresqlDatabase`
    * ``postgresext``: :py:class:`PostgresqlExtDatabase`
    * ``postgresext+pool``: :py:class:`PooledPostgresqlExtDatabase`
    * ``sqlite``: :py:class:`SqliteDatabase`
    * ``sqliteext``: :py:class:`SqliteExtDatabase`
    * ``sqlite+pool``: :py:class:`PooledSqliteDatabase`
    * ``sqliteext+pool``: :py:class:`PooledSqliteExtDatabase`

    Usage:

    .. code-block:: python

        import os
        from playhouse.db_url import connect

        # Connect to the database URL defined in the environment, falling
        # back to a local Sqlite database if no database URL is specified.
        db = connect(os.environ.get('DATABASE') or 'sqlite:///default.db')

.. py:function:: parse(url)

    Parse the information in the given URL into a dictionary containing
    ``database``, ``host``, ``port``, ``user`` and/or ``password``. Additional
    connection arguments can be passed in the URL query string.

    If you are using a custom database class, you can use the ``parse()``
    function to extract information from a URL which can then be passed in to
    your database object.

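Roughly speaking, ``parse()`` breaks the URL into connection keyword arguments.
A simplified stdlib sketch of that behavior is shown here; it is not the actual
``playhouse.db_url`` code, which additionally handles query-string arguments
and scheme-to-database-class lookup:

```python
from urllib.parse import urlparse

def parse_url(url):
    """Extract connection kwargs from a database URL (simplified sketch)."""
    parsed = urlparse(url)
    connect_kwargs = {'database': parsed.path[1:]}  # Strip leading '/'.
    if parsed.username:
        connect_kwargs['user'] = parsed.username
    if parsed.password:
        connect_kwargs['password'] = parsed.password
    if parsed.hostname:
        connect_kwargs['host'] = parsed.hostname
    if parsed.port:
        connect_kwargs['port'] = parsed.port
    return connect_kwargs
```

A dictionary like this can be splatted directly into a database constructor,
e.g. ``PostgresqlDatabase(**connect_kwargs)``.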
.. py:function:: register_database(db_class, *names)

    :param db_class: A subclass of :py:class:`Database`.
    :param names: A list of names to use as the scheme in the URL, e.g. 'sqlite' or 'firebird'.

    Register an additional database class under the specified names. This
    function can be used to extend the ``connect()`` function to support
    additional schemes. Suppose you have a custom database class for
    ``Firebird`` named ``FirebirdDatabase``.

    .. code-block:: python

        from playhouse.db_url import connect, register_database

        register_database(FirebirdDatabase, 'firebird')
        db = connect('firebird://my-firebird-db')

.. _pool:

Connection pool
---------------

The ``pool`` module contains a number of :py:class:`Database` classes that
provide connection pooling for PostgreSQL, MySQL and SQLite databases. The pool
works by overriding the methods on the :py:class:`Database` class that open and
close connections to the backend. The pool can specify a timeout after which
connections are recycled, as well as an upper bound on the number of open
connections.

In a multi-threaded application, up to ``max_connections`` will be opened. Each
thread (or, if using gevent, greenlet) will have its own connection.

In a single-threaded application, only one connection will be created. It will
be continually recycled until either it exceeds the stale timeout or is closed
explicitly (using ``manual_close()``).

**By default, all your application needs to do is ensure that connections are
closed when you are finished with them, and they will be returned to the
pool**. For web applications, this typically means that at the beginning of a
request, you will open a connection, and when you return a response, you will
close the connection.

Simple Postgres pool example code:

.. code-block:: python

    # Use the special postgresql extensions.
    from playhouse.pool import PooledPostgresqlExtDatabase

    db = PooledPostgresqlExtDatabase(
        'my_app',
        max_connections=32,
        stale_timeout=300,  # 5 minutes.
        user='postgres')

    class BaseModel(Model):
        class Meta:
            database = db

That's it! If you would like finer-grained control over the pool of
connections, check out the :ref:`connection_management` section.

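The recycling behavior described above (connections returned to a pool when
"closed", with stale ones discarded on the next checkout) can be sketched in a
few lines. This is an illustration of the idea only, not the
``playhouse.pool`` implementation, which also tracks per-thread ownership and
the ``max_connections`` bound:

```python
import time

class TinyPool:
    """Illustrative connection pool with a stale timeout."""

    def __init__(self, make_conn, stale_timeout=300):
        self.make_conn = make_conn
        self.stale_timeout = stale_timeout
        self._idle = []  # List of (checkin timestamp, connection) pairs.

    def checkout(self):
        # Reuse an idle connection unless it has exceeded the stale timeout.
        # Note that stale connections are only discarded here, on checkout,
        # mirroring the documented stale_timeout behavior.
        while self._idle:
            ts, conn = self._idle.pop()
            if time.time() - ts < self.stale_timeout:
                return conn
        return self.make_conn()

    def checkin(self, conn):
        # "Closing" a pooled connection returns it to the pool for reuse.
        self._idle.append((time.time(), conn))
```

In the real pooled database classes this checkout/checkin logic is hidden
behind the ordinary ``connect()`` and ``close()`` methods.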
3452Pool APIs
3453^^^^^^^^^
3454
3455.. py:class:: PooledDatabase(database[, max_connections=20[, stale_timeout=None[, timeout=None[, **kwargs]]]])
3456
3457    :param str database: The name of the database or database file.
3458    :param int max_connections: Maximum number of connections. Provide ``None`` for unlimited.
3459    :param int stale_timeout: Number of seconds to allow connections to be used.
3460    :param int timeout: Number of seconds to block when pool is full. By default peewee does not block when the pool is full but simply throws an exception. To block indefinitely set this value to ``0``.
3461    :param kwargs: Arbitrary keyword arguments passed to database class.
3462
3463    Mixin class intended to be used with a subclass of :py:class:`Database`.
3464
3465    .. note:: Connections will not be closed exactly when they exceed their `stale_timeout`. Instead, stale connections are only closed when a new connection is requested.
3466
3467    .. note:: If the number of open connections exceeds `max_connections`, a `ValueError` will be raised.
3468
3469    .. py:method:: manual_close()
3470
3471        Close the currently-open connection without returning it to the pool.
3472
3473    .. py:method:: close_idle()
3474
3475        Close all idle connections. This does not include any connections that
3476        are currently in-use -- only those that were previously created but
3477        have since been returned back to the pool.
3478
3479    .. py:method:: close_stale([age=600])
3480
3481        :param int age: Age at which a connection should be considered stale.
3482        :returns: Number of connections closed.
3483
3484        Close connections which are in-use but exceed the given age. **Use
3485        caution when calling this method!**
3486
3487    .. py:method:: close_all()
3488
3489        Close all connections. This includes any connections that may be in use
3490        at the time. **Use caution when calling this method!**
3491
.. py:class:: PooledPostgresqlDatabase

    Subclass of :py:class:`PostgresqlDatabase` that mixes in the :py:class:`PooledDatabase` helper.

.. py:class:: PooledPostgresqlExtDatabase

    Subclass of :py:class:`PostgresqlExtDatabase` that mixes in the
    :py:class:`PooledDatabase` helper. The :py:class:`PostgresqlExtDatabase` is
    part of the :ref:`postgres_ext` module and provides support for many
    Postgres-specific features.

.. py:class:: PooledMySQLDatabase

    Subclass of :py:class:`MySQLDatabase` that mixes in the :py:class:`PooledDatabase` helper.

.. py:class:: PooledSqliteDatabase

    Persistent connections for SQLite apps.

.. py:class:: PooledSqliteExtDatabase

    Persistent connections for SQLite apps, using the :ref:`sqlite_ext` advanced database driver :py:class:`SqliteExtDatabase`.
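
All of the ``Pooled*`` classes above follow the same composition: the
:py:class:`PooledDatabase` mixin placed ahead of a concrete database class in
the inheritance list. A stdlib-only sketch of that pattern, using stub classes
rather than the real peewee APIs:

.. code-block:: python

    class StubDatabase:
        # Stand-in for a concrete database class (e.g. SqliteDatabase).
        def connect(self):
            return 'new-connection'

    class PoolMixin:
        # Stand-in for the PooledDatabase mixin: reuses idle connections,
        # deferring to the concrete class only when the pool is empty.
        def __init__(self):
            self._idle = []

        def connect(self):
            if self._idle:
                return self._idle.pop()
            return super().connect()

        def release(self, conn):
            self._idle.append(conn)

    # Mixin first, concrete class second, so the mixin's connect() wraps
    # the concrete implementation via super().
    class PooledStubDatabase(PoolMixin, StubDatabase):
        pass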

.. _test_utils:

Test Utils
----------

Contains utilities helpful when testing peewee projects.

.. py:class:: count_queries([only_select=False])

    Context manager that will count the number of queries executed within
    the context.

    :param bool only_select: Only count *SELECT* queries.

    .. code-block:: python

        with count_queries() as counter:
            huey = User.get(User.username == 'huey')
            huey_tweets = [tweet.message for tweet in huey.tweets]

        assert counter.count == 2

    .. py:attribute:: count

        The number of queries executed.

    .. py:method:: get_queries()

        Return a list of 2-tuples consisting of the SQL query and a list of
        parameters.


.. py:function:: assert_query_count(expected[, only_select=False])

    Function or method decorator that will raise an ``AssertionError`` if the
    number of queries executed in the decorated function does not equal the
    expected number.

    .. code-block:: python

        class TestMyApp(unittest.TestCase):
            @assert_query_count(1)
            def test_get_popular_blogs(self):
                popular_blogs = Blog.get_popular()
                self.assertEqual(
                    [blog.title for blog in popular_blogs],
                    ["Peewee's Playhouse!", "All About Huey", "Mickey's Adventures"])

    This function can also be used as a context manager:

    .. code-block:: python

        class TestMyApp(unittest.TestCase):
            def test_expensive_operation(self):
                with assert_query_count(1):
                    perform_expensive_operation()


.. _flask_utils:

Flask Utils
-----------

The ``playhouse.flask_utils`` module contains several helpers for integrating
peewee with the `Flask <http://flask.pocoo.org/>`_ web framework.

Database Wrapper
^^^^^^^^^^^^^^^^

The :py:class:`FlaskDB` class is a wrapper for configuring and referencing a
Peewee database from within a Flask application. Don't let its name fool you:
it is **not the same thing as a peewee database**. ``FlaskDB`` is designed to
remove the following boilerplate from your Flask app:

* Dynamically create a Peewee database instance based on app config data.
* Create a base class from which all your application's models will descend.
* Register hooks at the start and end of a request to handle opening and
  closing a database connection.

Basic usage:

.. code-block:: python

    import datetime
    from flask import Flask
    from peewee import *
    from playhouse.flask_utils import FlaskDB

    DATABASE = 'postgresql://postgres:password@localhost:5432/my_database'

    app = Flask(__name__)
    app.config.from_object(__name__)

    db_wrapper = FlaskDB(app)

    class User(db_wrapper.Model):
        username = CharField(unique=True)

    class Tweet(db_wrapper.Model):
        user = ForeignKeyField(User, backref='tweets')
        content = TextField()
        timestamp = DateTimeField(default=datetime.datetime.now)

The above example will instantiate a peewee :py:class:`PostgresqlDatabase`
using the given database URL. Request hooks will be configured to establish a
connection when a request is received, and to automatically close the
connection when the response is sent. Lastly, the :py:class:`FlaskDB` class
exposes a :py:attr:`FlaskDB.Model` property which can be used as a base for
your application's models.

Here is how you can access the wrapped Peewee database instance that is
configured for you by the ``FlaskDB`` wrapper:

.. code-block:: python

    # Obtain a reference to the Peewee database instance.
    peewee_db = db_wrapper.database

    @app.route('/transfer-funds/', methods=['POST'])
    def transfer_funds():
        with peewee_db.atomic():
            xid = ...  # Transfer the funds inside a transaction.

        return jsonify({'transfer-id': xid})

Here is another way to configure a Peewee database using ``FlaskDB``:

.. code-block:: python

    app = Flask(__name__)
    db_wrapper = FlaskDB(app, 'sqlite:///my_app.db')

While the above examples use a database URL, for more advanced usage you can
specify a dictionary of configuration options, or simply pass in a peewee
:py:class:`Database` instance:

.. code-block:: python

    DATABASE = {
        'name': 'my_app_db',
        'engine': 'playhouse.pool.PooledPostgresqlDatabase',
        'user': 'postgres',
        'max_connections': 32,
        'stale_timeout': 600,
    }

    app = Flask(__name__)
    app.config.from_object(__name__)

    wrapper = FlaskDB(app)
    pooled_postgres_db = wrapper.database

Using a peewee :py:class:`Database` object:

.. code-block:: python

    peewee_db = PostgresqlExtDatabase('my_app')
    app = Flask(__name__)
    db_wrapper = FlaskDB(app, peewee_db)


Database with Application Factory
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

If you prefer to use the `application factory pattern <https://flask.palletsprojects.com/en/1.1.x/patterns/appfactories/>`_,
the :py:class:`FlaskDB` class implements an ``init_app()`` method.

Using as a factory:

.. code-block:: python

    db_wrapper = FlaskDB()

    # Even though the database is not yet initialized, you can still use the
    # `Model` property to create model classes.
    class User(db_wrapper.Model):
        username = CharField(unique=True)


    def create_app():
        app = Flask(__name__)
        app.config['DATABASE'] = 'sqlite:////home/code/apps/my-database.db'
        db_wrapper.init_app(app)
        return app

Query utilities
^^^^^^^^^^^^^^^

The ``flask_utils`` module provides several helpers for managing queries in
your web app. Some common patterns include:

.. py:function:: get_object_or_404(query_or_model, *query)

    :param query_or_model: Either a :py:class:`Model` class or a pre-filtered :py:class:`SelectQuery`.
    :param query: An arbitrarily complex peewee expression.

    Retrieve the object matching the given query, or return a 404 not found
    response. A common use-case might be a detail page for a weblog. You want
    to either retrieve the post matching the given URL, or return a 404.

    Example:

    .. code-block:: python

        @app.route('/blog/<slug>/')
        def post_detail(slug):
            public_posts = Post.select().where(Post.published == True)
            post = get_object_or_404(public_posts, (Post.slug == slug))
            return render_template('post_detail.html', post=post)

.. py:function:: object_list(template_name, query[, context_variable='object_list'[, paginate_by=20[, page_var='page'[, check_bounds=True[, **kwargs]]]]])

    :param template_name: The name of the template to render.
    :param query: A :py:class:`SelectQuery` instance to paginate.
    :param context_variable: The context variable name to use for the paginated object list.
    :param paginate_by: Number of objects per page.
    :param page_var: The name of the ``GET`` argument which contains the page number.
    :param check_bounds: Whether to check that the given page is a valid page. If ``check_bounds`` is ``True`` and an invalid page is specified, then a 404 will be returned.
    :param kwargs: Arbitrary key/value pairs to pass into the template context.

    Retrieve a paginated list of objects specified by the given query. The
    paginated object list will be dropped into the context using the given
    ``context_variable``, as well as metadata about the current page and total
    number of pages, and finally any arbitrary context data passed as
    keyword-arguments.

    The page is specified using the ``page`` ``GET`` argument, e.g.
    ``/my-object-list/?page=3`` would return the third page of objects.

    Example:

    .. code-block:: python

        @app.route('/blog/')
        def post_index():
            public_posts = (Post
                            .select()
                            .where(Post.published == True)
                            .order_by(Post.timestamp.desc()))

            return object_list(
                'post_index.html',
                query=public_posts,
                context_variable='post_list',
                paginate_by=10)

    The template will have the following context:

    * ``post_list``, which contains a list of up to 10 posts.
    * ``page``, which contains the current page based on the value of the ``page`` ``GET`` parameter.
    * ``pagination``, a :py:class:`PaginatedQuery` instance.

.. py:class:: PaginatedQuery(query_or_model, paginate_by[, page_var='page'[, check_bounds=False]])

    :param query_or_model: Either a :py:class:`Model` or a :py:class:`SelectQuery` instance containing the collection of records you wish to paginate.
    :param paginate_by: Number of objects per page.
    :param page_var: The name of the ``GET`` argument which contains the page number.
    :param check_bounds: Whether to check that the given page is a valid page. If ``check_bounds`` is ``True`` and an invalid page is specified, then a 404 will be returned.

    Helper class to perform pagination based on ``GET`` arguments.

    .. py:method:: get_page()

        Return the currently selected page, as indicated by the value of the
        ``page_var`` ``GET`` parameter. If no page is explicitly selected, then
        this method will return 1, indicating the first page.

    .. py:method:: get_page_count()

        Return the total number of possible pages.

    .. py:method:: get_object_list()

        Using the value of :py:meth:`~PaginatedQuery.get_page`, return the page
        of objects requested by the user. The return value is a
        :py:class:`SelectQuery` with the appropriate ``LIMIT`` and ``OFFSET``
        clauses.

        If ``check_bounds`` was set to ``True`` and the requested page contains
        no objects, then a 404 will be raised.

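The ``LIMIT``/``OFFSET`` arithmetic behind this kind of pagination can be
sketched in a few lines. These helper functions are illustrative only -- they
are not part of ``playhouse.flask_utils``:

.. code-block:: python

    import math

    def page_count(total_objects, paginate_by):
        # Total number of pages needed to display every object; an empty
        # result set still occupies one (empty) page in this sketch.
        return max(1, math.ceil(total_objects / paginate_by))

    def limit_offset(page, paginate_by):
        # LIMIT/OFFSET values for a 1-based page number: page 1 starts at
        # offset 0, page 2 at offset paginate_by, and so on.
        return paginate_by, (page - 1) * paginate_by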