beaker-1.12.1/.github/workflows/runtests.yml
name: Run Tests
on: [push, pull_request]

jobs:
  build:
    name: Run tests
    strategy:
      matrix:
        os: [ubuntu-latest]
        python-version: ["3.7", "3.8", "3.9", "3.10", "3.11"]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
      - name: Install locales
        run: sudo apt-get install -y locales language-pack-it
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -U --upgrade-strategy=eager --pre -e .[testsuite]
      - name: Start memcached
        uses: niden/actions-memcached@v7
      - name: Start Redis
        uses: supercharge/redis-github-action@1.4.0
      - name: Start MongoDB
        uses: supercharge/mongodb-github-action@1.8.0
      - name: Test with pytest
        run: |
          pytest -vv
beaker-1.12.1/.gitignore
*.egg
*.egg-info
*.pyc
*$py.class
*.pt.py
*.txt.py
*~
.coverage
.tox/
nosetests.xml
build/
dist/
bin/
lib/
include/
.idea/
distribute-*.tar.gz
bookenv/
jyenv/
pypyenv/
env*/
tests/test.db
/.eggs/
beaker-1.12.1/CHANGELOG
Release 1.12.1 (2023-01-04)
===========================
* Fix ext:database backend failing to initialize.
* Improved inline code documentation for the crypto module.
Release 1.12.0 (2022-12-07)
===========================
* Enabled testing on Python 3.10 and 3.11
* Fixed issue #122 - Session ignores deserializer json
* Remove ID generation fallback for when the uuid module is not found
* Port testing from nose to pytest
* Fixed issue #180 - KeyError when loading deleted session
Release 1.11.0 (2019-08-26)
===========================
* Fixed cookie path option not being properly set (`self._path` was removed, only `self.path` exists)
* Documented `SameSite` option
* Fixed cookie expiration being localised when it shouldn't.
Release 1.10.1 (2019-02-21)
===========================
* Fix issue with Redis namespace manager TTL
* Fix for SameSite cookie option not being set in some cases
* Fix for memcached tests on Python3
Release 1.10.0 (2018-06-04)
===========================
* Redis namespace manager now supports providing a TTL for session entries that had a ``timeout`` provided.
This will remove the need to manually clear expired sessions from the redis storage.
* ``nsscrypto`` backend is now properly identified as providing AES support.
* When a crypto backend doesn't support AES it will no longer crash if the ``encrypt_key`` is ``None``.
* Session cookies will now provide support for ``SameSite`` through the ``samesite`` option.
By default this will be ``Lax``, but can be set to ``Strict`` or ``None`` to disable it.
Release 1.9.1 (2018-04-09)
==========================
* When decorating a function with the @cache_region decorator, the function generated to update the cached value
will be named like the decorated function, so that during debugging it's easy to know which function is involved.
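As an illustration of the rename described above (a minimal sketch, not Beaker's actual implementation), a decorator can carry the wrapped function's name onto the generated updater with ``functools.wraps``:

```python
import functools

def cached(func):
    # functools.wraps copies __name__, __doc__, etc. from the decorated
    # function onto the generated updater, so tracebacks seen while
    # debugging name the original function rather than a generic wrapper.
    @functools.wraps(func)
    def update_value(*args, **kwargs):
        return func(*args, **kwargs)
    return update_value

@cached
def load_user(user_id):
    return {"id": user_id}

print(load_user.__name__)  # -> load_user
```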
* Removed usage of ``async`` as a variable in code, this fixes a compatibility problem with Python 3.7 where it's a keyword.
* Fixed a race condition in ``FileNamespaceManager``.
* ``ext.database`` backend will now properly close connections.
* Do not leave behind corrupted files if ``FileNamespaceManager`` is interrupted while writing a new file.
Replacing content of a file or writing a new one is now always an atomic operation.
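The usual pattern behind this kind of atomic replacement (a sketch, with a hypothetical ``atomic_write`` helper, not Beaker's code) is to write to a temporary file in the same directory and then rename it over the target:

```python
import os
import tempfile

def atomic_write(path, data):
    # Write to a temporary file in the same directory, then rename it
    # over the target. os.replace is atomic on both POSIX and Windows,
    # so an interrupted writer never leaves a half-written file behind.
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dirname)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        os.replace(tmp_path, path)
    except Exception:
        os.unlink(tmp_path)
        raise
```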
* ``DBMNamespaceManager`` and ``FileSynchronizer`` will now deal with directories disappearing while they try to write to them.
* The Redis and MongoDB backends are now exposed in documentation.
Release 1.9.0 (2017-06-19)
==========================
* Beaker now provides builtin ``ext:mongodb`` and ``ext:redis`` namespace managers.
Both come with a Synchronizer implemented on the storage backend instead of relying on file one.
* Fixed an issue where cookie options like ``Secure``, ``Domain`` and so on were lost.
* Improved support for cache entries expiration. NamespaceManagers that support it will expire their key automatically.
* Pycryptodome can be used instead of pycrypto.
* An issue with ``Cookie`` module import on case insensitive file systems should have been resolved.
* Cryptography module can now be used as a crypto function provider instead of pycrypto
Release 1.8.1 (2016-10-24)
==========================
* Sessions have a new option save_accessed_time which defaults to true for
backwards compatibility. Set to false to tell beaker not to update
_accessed_time if the session hasn't been changed, for non-cookie sessions
stores. This lets you avoid needless datastore writes. _accessed_time will
always be updated when the session is intentionally saved.
* data_serializer parameter in Session accepts a custom object with `dumps` and `loads` methods.
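The two-method interface mentioned above can be illustrated with a small JSON-based serializer object (a sketch of the ``dumps``/``loads`` pair; the exact bytes-vs-str expectations of Beaker's hook are not shown here):

```python
import json

class JSONSerializer:
    # Any object exposing dumps/loads can serve; this one round-trips
    # values through JSON-encoded UTF-8 bytes.
    def dumps(self, obj):
        return json.dumps(obj).encode("utf-8")

    def loads(self, data):
        return json.loads(data.decode("utf-8"))

serializer = JSONSerializer()
payload = serializer.dumps({"user": "alice"})
assert serializer.loads(payload) == {"user": "alice"}
```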
* Fixed a TypeError in exception reporting when failing to load a NamespaceManager
* Allow to change Cookie Expiration from a value back to None, previously it had no effect.
* Allow SessionMiddleware to setup a custom Session class through the `session_class` argument.
* Added `invalidate_corrupt` option to CookieSessions too for valid cookies containing invalid data.
Release 1.8.0 (2016-01-26)
==========================
* Encrypted sessions can now specify nonce length for salt generation through ``encrypt_nonce_bits`` parameter.
Set it to ``48`` for backward compatibility with sessions generated before 1.8.0
* kwargs support in @cache_region decorator
* annotations support in @cache_region decorator
* data_serializer parameter in Session can now specify ``json`` to avoid pickle security issues
* Invalid cookies are now skipped in cookie based sessions
* Memcached backend based on PyLibMC now shares the same connection pool for the same url
Release 1.7.0 (2015-04-20)
==========================
* Beaker no longer supports python 2.4 and 2.5
* Beaker now supports Python 2.6, 2.7, 3.2, 3.3, 3.4 without 2to3 usage
* Fixed Encrypted Cookie Session on Python3 #57
* New pbkdf2 module working on Python3 #21
* Fixed Test suite on Python 3.3 #53, #51
Release 1.6.5 (2015-02-06)
==========================
* @cached decorator now keeps docstring of decorated method.
* Fix crash when Session ``accessed_time`` is not available, this happened
when session ``encrypt_key`` was changed.
* Fix cache regions not providing a default key length even though this was
required and examples in the doc didn't provide it.
* Fix crash when cache expire wasn't an int, this happened when caching options
were loaded from a config file.
Release 1.6.4 (8/13/2012)
=========================
.. warning::
Session hashing for encrypted sessions using PyCrypto has changed. This
will result in sessions being invalidated upon upgrading if PyCrypto is
used.
* Fix bug with key_length not being coerced to a int for comparison. Patch by
Greg Lavallee.
* Fix bug with cookie invalidation not clearing the cookie data. Patch by
Vasiliy Lozovoy.
* Added ability to pass in cookie_path for the Session. Patch by Marcin
Kuzminski.
* Add NSS crypto support to Beaker. Patch by Miloslav Trmac of Redhat.
* Fix security bug with pycrypto not securing data such that an attacker could
possibly determine parts of the encrypted payload. Patch by Miloslav Trmac of
Redhat. See CVE-2012-3458.
* Add ability to specify schema for database-backed sessions. Patch by Vladimir
Tananko.
* Fix issue with long key names in memcached backend. Patch by Guillaume
Taglang.
Release 1.6.3 (2/29/2012)
=========================
* Fix bug with cookie deletion on leap years. Patch contributed by Greg
Nelson and Michael Wirth.
* Fix issue with referencing same module via different import paths. Patch
contributed by brianfrantz.
* Fix cookie expiration check. Patch contributed by Mike Dirolf.
Release 1.6.2 (12/13/2011)
==========================
* Updated dogpile lock so that it locks per namespace+key rather than on the
entire namespace. (#101)
* Added encryption option for any backend. Patch contributed by Toby Elliot.
Release 1.6.1 (10/20/2011)
==========================
* Remove stray print statement.
* Include .app for consistency instead of requiring wrap_app.
Release 1.6 (10/16/2011)
========================
* Fix bug with cache_key length calculation.
* Fix bug with how path was set so that its restored properly and propagated.
* Fix bug with CacheMiddleware clobbering enabled setting.
* Update option for ``cookie_expires`` so that it can now handle an integer
which will be used as the seconds till the cookie expires.
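The seconds-to-expiry conversion implied here can be sketched as follows (``expiry_from_seconds`` is a hypothetical helper, not Beaker's API; the resulting datetime would be rendered into the cookie's ``Expires`` attribute):

```python
from datetime import datetime, timedelta

def expiry_from_seconds(seconds, now=None):
    # An integer cookie_expires value is interpreted as "seconds from
    # now"; the caller formats the returned datetime for the cookie.
    if now is None:
        now = datetime.utcnow()
    return now + timedelta(seconds=seconds)

base = datetime(2023, 1, 1)
assert expiry_from_seconds(300, base) == datetime(2023, 1, 1, 0, 5)
```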
* Merge fix for Issue 31, can now handle unicode cache keys.
* Add ``key_length`` option for cache regions, and for keyword args passed
into the cache system. Cache keys longer than this will be SHA'd.
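The "SHA'd" behavior above amounts to replacing over-long keys with a fixed-length digest (a sketch with a hypothetical ``normalize_key`` helper; the default threshold here is illustrative, chosen because memcached caps keys at 250 bytes):

```python
import hashlib

def normalize_key(key, key_length=250):
    # Keys longer than key_length are replaced by their SHA-1 hex
    # digest (always 40 characters), keeping them under backend limits.
    if len(key) > key_length:
        return hashlib.sha1(key.encode("utf-8")).hexdigest()
    return key

assert normalize_key("short-key") == "short-key"
assert len(normalize_key("x" * 1000)) == 40
```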
* added runtime beaker.__version__
* Add ``webtest_varname`` option to configuration to optionally include
the session value in the environ vars when using Beaker with WebTest.
* Defer running of pkg_resources to look for external cache modules
until requested. #66
* memcached backend uses pylibmc.ThreadMappedPool to ensure thread-local
usage of pylibmc when that library is in use. (#60)
* memcached backend also has ``memcache_module`` string argument, allows
direct specification of the name of which memcache backend to use.
* Basic container/file-based Session support working in Py3K. (#72)
* Further Python 3 fixes
* Added an optimization to the FileNamespaceContainer when used with
Session, such that the pickled contents of the file are not
read a second time when session.save() is called. (#64)
* Fixed bug whereby CacheManager.invalidate wouldn't work for a function
decorated by cache.cache(). (#61)
* cache decorators @cache.cache(), @cache_region() won't include first
argument named 'self' or 'cls' as part of the cache key. This allows
reasonably safe usage for methods as well as functions. (#55)
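The first-argument check described above can be sketched by inspecting the wrapped function's signature (``cache_key_args`` is a hypothetical helper, not Beaker's code):

```python
import inspect

def cache_key_args(func, args):
    # Drop the first positional argument from the cache key when it is
    # the instance or class reference, so different instances of the
    # same class can share cached values for the same arguments.
    params = list(inspect.signature(func).parameters)
    if params and params[0] in ("self", "cls"):
        return args[1:]
    return args

class Repo:
    def get(self, item_id):
        return item_id

repo = Repo()
assert cache_key_args(Repo.get, (repo, 42)) == (42,)
```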
* file backend no longer squashes unpickling errors. This was inconsistent
behavior versus all the other backends.
* invalidate_corrupt flag on Session now emits a warning. (#52)
* cache.remove_value() removes the value even if it's already marked
'expired' (#42)
Release 1.5.4 (6/16/2010)
=========================
* Fix import error with InvalidCryptoBackendError.
* Fix for domain querying on property.
* Test cleanups
* Fix bug with warnings preventing proper running under Jython.
Release 1.5.3 (3/2/2010)
========================
* Fix Python 2.4 incompatibility with google import.
Release 1.5.2 (3/1/2010)
========================
* pkg_resources scanning for additional Beaker back-ends gracefully handles
situations where its not present (GAE). Fixes #36.
* Avoid timing attacks on hash comparison.
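The standard defense referenced here is a constant-time comparison, as a sketch (``hmac.compare_digest`` is the stdlib primitive; the wrapper name is illustrative):

```python
import hmac

def signatures_match(expected, provided):
    # hmac.compare_digest takes time independent of where the first
    # mismatching byte occurs, unlike ==, so an attacker cannot recover
    # a signature byte-by-byte from response timing.
    return hmac.compare_digest(expected, provided)

assert signatures_match(b"abc123", b"abc123")
assert not signatures_match(b"abc123", b"abc124")
```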
* Provide abstract base for MemoryNamespaceManager that deals with
"dictionaries".
* Added tests for invalidating cache, and fixed bug with function cache when
no args are present.
* The SQLAlchemy backends require SQLAlchemy 0.4 or greater (0.6 recommended).
* Rudimental Python 3 support is now available. Simply use Python 3 with
Distribute and "python setup.py install" to run 2to3 automatically,
or manually run 2to3 on "beaker" and "tests" to convert to a
Python 3 version.
* Added support for PyCrypto module to encrypted session, etc. in addition
to the existing pycryptopp support.
Release 1.5.1 (12/17/2009)
==========================
* Fix cache namespacing.
Release 1.5 (11/23/2009)
========================
* Update memcached to default to using pylibmc when available.
* Fix bug when cache value doesn't exist causing has_key to throw
an exception rather than return False. Fixes #24.
* Fix bug where getpid under GAE is used improperly to assume it
should be a non-string. Fixes #22.
* Add cache_region decorator that works *before* configuration of
the cache regions have been completed for use in module-level
decorations.
* Fix bug where has_value sees the value before its removed.
* Improved accuracy of "dogpile" checker by removing dependency
on "self" attributes, which seem to be slightly unreliable
in highly concurrent scenarios.
Release 1.4.2 (9/25/2009)
=========================
* Fix bug where memcached may yank a value after the has_value but before
the value can be fetched.
* Fix properties for setting the path. Fixes #15.
* Fix the 'TypeError: argument must be an int, or have a fileno()
method' error sporadically emitted by FileSynchronizer under moderate
load.
Release 1.4.1 (9/10/2009)
=========================
* Fix verification of options to throw an error if a beaker param is an
empty string.
* Add CacheManager.invalidate function to easily invalidate cache
spaces created by the use of the cache decorator.
* Add CacheManager.region_invalidate function to easily invalidate cache
spaces created by the use of the cache_region decorator.
* Fix the InvalidCryptoBackendError exception triggering a TypeError. Patch
from dz, fixes #13.
Release 1.4 (7/24/2009)
=======================
* Fix bug with hmac on Python 2.4. Patch from toshio, closes ticket #2133
from the TurboGears2 Trac.
* Fix bug with occasional ValueError from FileNamespaceManager.do_open.
Fixes #10.
* Fixed bug with session files being saved despite being new and not
saved.
* Fixed bug with CacheMiddleware overwriting configuration with default
arguments despite prior setting.
* Fixed bug with SyntaxError not being caught properly in entry point
discovery.
* Changed to using BlobProperty for Google Datastore.
* Added domain/path properties to the session. This allows one to
dynamically set the cookie's domain and/or path on the fly, which
will then be set on the cookie for the session.
* Added support for cookie-based sessions in Jython via the JCE (Java
Cryptography Extensions). Patch from Alex Grönholm.
* Update Beaker database extensions to work with SQLAlchemy 0.6
PostgreSQL, and Jython.
Release 1.3.1 (5/5/2009)
========================
* Added a whole bunch of Sphinx documentation for the updated site.
* Added corresponding remove as an alias to the caches remove_value.
* Fixed cookie session not having an invalidate function.
* Fix bug with CacheMiddleware not using proper function to load
configuration options, missing the cache regions.
Release 1.3 (4/6/2009)
======================
* Added last_accessed attribute to session to indicate the previous time the
session was last accessed.
* Added setuptools entry points to dynamically discover additional namespace
backends.
* Fixed bug with invalidate and locks, fixes #594.
* Added cache.cache decorator for arbitrary caching.
* Added cache.region decorator to the CacheManager object.
* Added cache regions. Can be provided in a configuration INI type, or
by adding in a cache_regions arg to the CacheManager.
* Fix bug with timeout not being saved properly.
* Fix bug with cookie-only sessions sending cookies for new sessions even
if they weren't supposed to be saved.
* Fix bug that caused a non-auto accessed session to not record the time it
was previously accessed resulting in session timeouts.
* Add function to parse configuration dicts as appropriate for use with the
CacheManager.
* The "expiretime" is no longer passed to the memcached backend - since
if memcached makes the expired item unavailable at the same time the
container expires it, then all actors must block until the new value
is available (i.e. breaks the anti-dogpile logic).
Release 1.2.3 (3/2/2009)
========================
* Fix accessed increment to take place *after* the accessed time is checked
to see if it has expired. Fixes #580.
* data_dir/lock_dir parameters are optional to most backends; if not
present, mutex-based locking will be used for creation functions
* Adjustments to Container to better account for backends which
don't provide read/write locks, such as memcached. As a result,
the plain "memory" cache no longer requires read/write mutexing.
Release 1.2.2 (2/14/2009)
=========================
* Fix delete bug reported by andres with session not being deleted.
Release 1.2.1 (2/09/2009)
=========================
* Fix memcached behavior as memcached returns None on nonexistent key
fetch which broke invalid session checking.
Release 1.2 (1/22/2009)
=======================
* Updated session to only save to the storage *once* under any/all
conditions rather than every time save() is called.
* Added session.revert() function that reverts the session to the state at
the beginning of the request.
* Updated session to store entire session data in a single namespace key,
this lets memcached work properly, and makes for more efficient use of the
storage system for sessions.
Release 1.1.3 (12/29/2008)
==========================
* Fix the 1.1.2 old cache/session upgrader to handle the has_current_value
method.
* Make InvalidCacheBackendError an ImportError.
Release 1.1.2 (11/24/2008)
==========================
* Upgrade Beaker pre-1.1 cache/session values to the new format rather than
throwing an exception.
Release 1.1.1 (11/24/2008)
==========================
* Fixed bug in Google extension which passed arguments it should no longer
pass to NamespaceManager.
* Fixed bug involving lockfiles left open during cache "value creation"
step.
Release 1.1 (11/16/2008)
========================
* file-based cache will not hold onto cached value once read from file;
will create new value if the file is deleted as opposed to re-using
what was last read. This allows external removal of files to be
used as a cache-invalidation mechanism.
* file-based locking will not unlink lockfiles; this can interfere
with the flock() mechanism in the event that a concurrent process
is accessing the files.
* Sending "type" and other namespace config arguments to cache.get()/
cache.put()/cache.remove_value() is deprecated. The namespace
configuration is now preferred at the Cache level, i.e. when you construct
a Cache or call cache_manager.get_cache(). This removes the ambiguity
of Cache's dictionary interface and has_key() methods, which have
no awareness of those arguments.
* the "expiretime" in use is stored in the cache itself, so that it is
always available when calling has_key() and other methods. Between
this change and the deprecation of 'type', the Cache no longer has
any need to store cache configuration in memory per cache key, which in a
dynamically-generated key scenario stores an arbitrarily large number
of configurations - essentially a memory leak.
* memcache caching has been vastly improved, no longer stores a list of
all keys, which along the same theme prevented efficient usage for an
arbitrarily large number of keys. The keys() method is now unimplemented,
and cache.remove() clears the entire memcache cache across all namespaces.
This is what the memcache API provides so it's the best we can do.
* memcache caching passes along "expiretime" to the memcached "time"
parameter, so that the cache itself can reduce its size for elements which
are expired (memcache seems to manage its size in any case, this is just a
hint to improve its operation).
* replaced homegrown ThreadLocal implementation with threading.local, falls
back to a 2.3 compat one for python<2.4
Release 1.0.3 (10/14/2008)
==========================
* Fixed os.getpid issue on GAE.
* CookieSession will add '_expires' value to data when an expire time is set,
and uses it
Release 1.0.2 (9/22/2008)
=========================
* Fixed bug caused when attempting to invalidate a session that hadn't
previously been created.
Release 1.0.1 (8/19/2008)
=========================
* Bug fix for cookie sessions to retain id before clearing values.
Release 1.0 (8/13/2008)
=======================
* Added cookie delete to both cookie only sessions and normal sessions, to
help with proxies and such that may determine whether a user is logged in
via a cookie. (cookie varies, etc.). Suggested by Felix Schwarz.
* cache.get_value() now uses the given **kwargs** in all cases in the same
manner as cache.set_value(). This way you can send a new createfunc
to cache.get_value() each time and it will be used.
Release 0.9.5 (6/19/2008)
=========================
* Fixed bug in memcached to be tolerant of keys disappearing when memcached
expires them.
* Fixed the cache functionality to actually work, previously set_value was
ignored if there was already a value set.
Release 0.9.4 (4/13/2008)
=========================
* Adding 'google' backend datastore, available by specifying 'google' as the
cache/session type. Note that this takes an optional table_name used to name
the model class used.
* SECURITY BUG: Fixed security issue with Beaker not properly removing
directory escaping characters from the session ID when un-signed sessions
are used. Reported with patch by Felix Schwarz.
* Fixed bug with Beaker not playing well with Registry when its placed above
it in the stack. Thanks Wichert Akkerman.
Release 0.9.3 (2/28/2008)
=========================
* Adding 'id' to cookie-based sessions for better compatibility.
* Fixed error with exception still raised for PyCrypto missing.
* WARNING: Session middleware no longer catches Paste HTTP Exceptions, apps
are now expected to capture and handle Paste HTTP Exceptions themselves.
* Fixed Python 2.4 compatibility bug in hmac.
* Fixed key lookup bug on cache object to only use the settings for the key
lookup. Found by Andrew Stromnov.
Release 0.9.2 (2/13/2008)
=========================
* Added option to make Beaker use a secure cookie.
* Removed CTRCipher as pycryptopp doesn't need it.
* Changed AES to use 256 bit.
* Fixed signing code to use hmac with sha for better signing security.
* Fixed memcached code to use delete_multi on clearing the keys for efficiency
and updated key retrieval to properly store and retrieve None values.
* Removing cookie.py and signed cookie middleware, as the environ_key option
for session middleware provides a close enough setting.
* Added option to use just cookie-based sessions without requiring
encryption.
* Switched encryption requirement from PyCrypto to pycryptopp which uses a
proper AES in Counter Mode.
Release 0.9.1 (2/4/2008)
========================
* Fixed bug in middleware using module that wasn't imported.
Release 0.9 (12/17/07)
======================
* Fixed bug in memcached replace to actually replace spaces properly.
* Fixed md5 cookie signature to use SHA-1 when available.
* Updated cookie-based session storage to use 256-bit AES-CTR mode with a
SHA-1 HMAC signature. Now requires PyCrypto to use for AES scheme.
* WARNING: Moved session and cache middleware to middleware, as per the old
deprecation warnings had said was going to happen for 0.8.
* Added cookie-only session storage with RC4 ciphered encryption, requires
Python 2.4.
* Add the ability to specify the cookie's domain for sessions.
Release 0.8.1 (11/15/07)
========================
* Fixed bug in database.py not properly handling missing sqlalchemy library.
Release 0.8 (10/17/07)
======================
* Fixed bug in prior db update causing session to occasionally not be written
back to the db.
* Fixed memcached key error with keys containing spaces. Thanks Jim Musil.
* WARNING: Major change to ext:database to use a single row per namespace.
Additionally, there's an accessed and created column present to support
easier deletion of old cache/session data. You *will* need to drop any
existing tables being used by the ext:database backend.
* Streamline ext:database backend to avoid unnecessary database selects for
repeat data.
* Added SQLAlchemy 0.4 support to ext:database backend.
Release 0.7.5 (08/18/07)
========================
* Fixed data_dir parsing for session string coercions, no longer picks up None
as a data_dir.
* Fixed session.get_by_id to lookup recently saved sessions properly, also
updates session with creation/access time upon save.
* Add unit tests for get_by_id function. Updated get_by_id to not result in
additional session files.
* Added session.get_by_id function to retrieve a session of the given id.
Release 0.7.4 (07/09/07)
========================
* Fixed issue with Beaker not properly handling arguments as Pylons may pass
them in.
* Fixed unit test to catch file removal exception.
* Fixed another bug in synchronization, this one involving reentrant
conditions with file synchronization
* If a file open fails due to pickling errors, locks just opened
are released unconditionally
Release 0.7.3 (06/08/07)
========================
* Beaker was not properly parsing input options to session middleware. Thanks
to Yannick Gingras and Timothy S for spotting the issue.
* Changed session to only send the cookie header if its a new session and
save() was called. Also only creates the session file under these
conditions.
Release 0.7.2 (05/19/07)
========================
* Added deprecation warning for middleware move, relocated middleware to cache
and session modules for backwards compatibility.
Release 0.7.1 (05/18/07)
=======================
* adjusted synchronization logic to account for Mako/new Cache object's
multithreaded usage of Container.
Release 0.7 (05/18/07)
======================
* WARNING: Cleaned up Cache object based on Mako cache object, this changes
the call interface slightly for creating a Cache object directly. The
middleware cache object is unaffected from an end-user view. This change
also avoids duplicate creations of Cache objects.
* Adding database backend and unit tests.
* Added memcached test, fixed memcached namespace arg passing.
* Fixed session and cache tests, still failing syncdict test. Added doctests
for Cache and Session middleware.
* Cleanup of container/cache/container_test
* Namespaces no longer require a context, removed NamespaceContext
* Logging in container.py uses logging module
* Cleanup of argument passing, use name **kwargs** instead of **params** for
generic kwargs
* Container classes contain a static create_namespace() method, namespaces are
accessed from the ContainerContext via string name + container class alone
* Implemented (but not yet tested) clear() method on Cache, locates all
Namespaces used thus far and clears each one based on its keys() collection
* Fixed Cache.clear() method to actually clear the Cache namespace.
* Updated memcached backend to split servers on ';' for multiple memcached
backends.
* Merging MyghtyUtils code into Beaker.
Release 0.6.3 (03/18/2007)
==========================
* Added api with customized Session that doesn't require a Myghty request
object, just a dict. Updated session to use the new version.
* Removing unicode keys as some dbm backends can't handle unicode keys.
* Adding core files that should've been here.
* More stringent checking for existence of a session.
* Avoid recreating the session object when it's empty.
beaker-1.12.1/LICENSE
Copyright (c) 2006, 2007 Ben Bangert, Mike Bayer, Philip Jenvey
and contributors.
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. The name of the author or contributors may not be used to endorse or
promote products derived from this software without specific prior
written permission.
THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
SUCH DAMAGE.
beaker-1.12.1/MANIFEST.in
recursive-include tests *.py
beaker-1.12.1/README.rst
=========================
Cache and Session Library
=========================
About
=====
Beaker is a web session and general caching library that includes WSGI
middleware for use in web applications.
As a general caching library, Beaker can handle storing for various times
any Python object that can be pickled with optional back-ends on a
fine-grained basis.
Beaker was built largely on the code from MyghtyUtils, then refactored and
extended with database support.
Beaker includes Cache and Session WSGI middleware to ease integration with
WSGI capable frameworks, and is automatically used by `Pylons
`_ and
`TurboGears `_.
Features
========
* Fast, robust performance
* Multiple reader/single writer lock system to avoid duplicate simultaneous
cache creation
* Cache back-ends include dbm, file, memory, memcached, Redis, MongoDB, and
database (Using SQLAlchemy for multiple-db vendor support)
* Signed cookies to prevent session hijacking/spoofing
* Cookie-only sessions to remove the need for a db or file backend (ideal
for clustered systems)
* Extensible Container object to support new back-ends
* Caches can be divided into namespaces (to represent templates, objects,
etc.) then keyed for different copies
* Create functions for automatic call-backs to create new cache copies after
expiration
* Fine-grained toggling of back-ends, keys, and expiration per Cache object
Documentation
=============
Documentation can be found on the `Official Beaker Docs site
`_.
Source
======
The latest developer version is available in a `GitHub repository
`_.
Contributing
============
Bugs can be filed on GitHub, **should be accompanied by a test case** to
retain current code coverage, and should be in a pull request when ready to be
accepted into the beaker code-base.
beaker-1.12.1/beaker/__init__.py
__version__ = '1.12.1'
beaker-1.12.1/beaker/_compat.py
from __future__ import absolute_import
import sys

# True if we are running on Python 2.
PY2 = sys.version_info[0] == 2
PYVER = sys.version_info[:2]
JYTHON = sys.platform.startswith('java')

if PY2 and not JYTHON:  # pragma: no cover
    import cPickle as pickle
else:  # pragma: no cover
    import pickle


if not PY2:  # pragma: no cover
    xrange_ = range
    NoneType = type(None)
    string_type = str
    unicode_text = str
    byte_string = bytes

    from urllib.parse import urlencode as url_encode
    from urllib.parse import quote as url_quote
    from urllib.parse import unquote as url_unquote
    from urllib.parse import urlparse as url_parse
    from urllib.request import url2pathname
    import http.cookies as http_cookies
    from base64 import b64decode as _b64decode, b64encode as _b64encode

    try:
        import dbm as anydbm
    except:
        import dumbdbm as anydbm

    def b64decode(b):
        return _b64decode(b.encode('ascii'))

    def b64encode(s):
        return _b64encode(s).decode('ascii')

    def u_(s):
        return str(s)

    def bytes_(s):
        if isinstance(s, byte_string):
            return s
        return str(s).encode('ascii', 'strict')

    def dictkeyslist(d):
        return list(d.keys())

else:
    xrange_ = xrange
    from types import NoneType
    string_type = basestring
    unicode_text = unicode
    byte_string = str

    from urllib import urlencode as url_encode
    from urllib import quote as url_quote
    from urllib import unquote as url_unquote
    from urlparse import urlparse as url_parse
    from urllib import url2pathname
    import Cookie as http_cookies
    from base64 import b64decode, b64encode
    import anydbm

    def u_(s):
        if isinstance(s, unicode_text):
            return s
        if not isinstance(s, byte_string):
            s = str(s)
        return unicode(s, 'utf-8')

    def bytes_(s):
        if isinstance(s, byte_string):
            return s
        return str(s)

    def dictkeyslist(d):
        return d.keys()


def im_func(f):
    if not PY2:  # pragma: no cover
        return getattr(f, '__func__', None)
    else:
        return getattr(f, 'im_func', None)


def default_im_func(f):
    if not PY2:  # pragma: no cover
        return getattr(f, '__func__', f)
    else:
        return getattr(f, 'im_func', f)


def im_self(f):
    if not PY2:  # pragma: no cover
        return getattr(f, '__self__', None)
    else:
        return getattr(f, 'im_self', None)


def im_class(f):
    if not PY2:  # pragma: no cover
        self = im_self(f)
        if self is not None:
            return self.__class__
        else:
            return None
    else:
        return getattr(f, 'im_class', None)


def add_metaclass(metaclass):
    """Class decorator for creating a class with a metaclass."""
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper


if not PY2:  # pragma: no cover
    import builtins
    exec_ = getattr(builtins, "exec")

    def reraise(tp, value, tb=None):
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value
else: # pragma: no cover
def exec_(code, globs=None, locs=None):
"""Execute code in a namespace."""
if globs is None:
frame = sys._getframe(1)
globs = frame.f_globals
if locs is None:
locs = frame.f_locals
del frame
elif locs is None:
locs = globs
exec("""exec code in globs, locs""")
exec_("""def reraise(tp, value, tb=None):
raise tp, value, tb
""")
try:
from inspect import signature as func_signature
except ImportError:
from funcsigs import signature as func_signature
def bindfuncargs(arginfo, args, kwargs):
boundargs = arginfo.bind(*args, **kwargs)
return boundargs.args, boundargs.kwargs
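``bindfuncargs`` exists so that positional and keyword call styles produce the same cache key; the effect can be reproduced with ``inspect.signature`` directly (the ``load`` function here is illustrative):

```python
from inspect import signature

def load(search_term, limit, offset):
    """Illustrative target function."""

def normalized(args, kwargs):
    # Binding to the signature folds keyword arguments back into
    # positional ones, so load('a', 10, 0) and
    # load('a', limit=10, offset=0) yield identical key material.
    bound = signature(load).bind(*args, **kwargs)
    return bound.args, bound.kwargs
```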
beaker-1.12.1/beaker/cache.py
"""This package contains the "front end" classes and functions
for Beaker caching.
Included are the :class:`.Cache` and :class:`.CacheManager` classes,
as well as the function decorators :func:`.region_decorate`,
:func:`.region_invalidate`.
"""
import warnings
from itertools import chain
from beaker._compat import u_, unicode_text, func_signature, bindfuncargs
import beaker.container as container
import beaker.util as util
from beaker.crypto.util import sha1
from beaker.exceptions import BeakerException, InvalidCacheBackendError
from beaker.synchronization import _threading
import beaker.ext.memcached as memcached
import beaker.ext.database as database
import beaker.ext.sqla as sqla
import beaker.ext.google as google
import beaker.ext.mongodb as mongodb
import beaker.ext.redisnm as redisnm
from functools import wraps
# Initialize the cache region dict
cache_regions = {}
"""Dictionary of 'region' arguments.
A "region" is a string name that refers to a series of cache
configuration arguments. An application may have multiple
"regions" - one which stores things in a memory cache, one
which writes data to files, etc.
The dictionary stores string key names mapped to dictionaries
of configuration arguments. Example::
from beaker.cache import cache_regions
cache_regions.update({
'short_term':{
'expire':60,
'type':'memory'
},
'long_term':{
'expire':1800,
'type':'dbm',
'data_dir':'/tmp',
}
})
"""
cache_managers = {}
class _backends(object):
initialized = False
def __init__(self, clsmap):
self._clsmap = clsmap
self._mutex = _threading.Lock()
def __getitem__(self, key):
try:
return self._clsmap[key]
except KeyError as e:
if not self.initialized:
self._mutex.acquire()
try:
if not self.initialized:
self._init()
self.initialized = True
return self._clsmap[key]
finally:
self._mutex.release()
raise e
def _init(self):
try:
import pkg_resources
# Load up the additional entry point defined backends
for entry_point in pkg_resources.iter_entry_points('beaker.backends'):
try:
namespace_manager = entry_point.load()
name = entry_point.name
if name in self._clsmap:
raise BeakerException("NamespaceManager name conflict,'%s' "
"already loaded" % name)
self._clsmap[name] = namespace_manager
except (InvalidCacheBackendError, SyntaxError):
# Ignore invalid backends
pass
except:
import sys
from pkg_resources import DistributionNotFound
# Warn when there's a problem loading a NamespaceManager
if not isinstance(sys.exc_info()[1], DistributionNotFound):
import traceback
try:
from StringIO import StringIO # Python2
except ImportError:
from io import StringIO # Python3
tb = StringIO()
traceback.print_exc(file=tb)
warnings.warn(
"Unable to load NamespaceManager "
"entry point: '%s': %s" % (
entry_point,
tb.getvalue()),
RuntimeWarning, 2)
except ImportError:
pass
# Initialize the basic available backends
clsmap = _backends({
'memory': container.MemoryNamespaceManager,
'dbm': container.DBMNamespaceManager,
'file': container.FileNamespaceManager,
'ext:memcached': memcached.MemcachedNamespaceManager,
'ext:database': database.DatabaseNamespaceManager,
'ext:sqla': sqla.SqlaNamespaceManager,
'ext:google': google.GoogleNamespaceManager,
'ext:mongodb': mongodb.MongoNamespaceManager,
'ext:redis': redisnm.RedisNamespaceManager
})
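Beyond the built-in backends above, ``_backends._init`` discovers third-party namespace managers through the ``beaker.backends`` entry point group. A hypothetical package could register one with a ``setup.py`` fragment like this (the project, module, and class names are illustrative; this is configuration, not runnable on its own):

```python
# setup.py of a hypothetical third-party backend package
from setuptools import setup

setup(
    name='beaker-mybackend',  # illustrative project name
    entry_points={
        'beaker.backends': [
            # makes type='mybackend' resolve to MyNamespaceManager
            'mybackend = beaker_mybackend:MyNamespaceManager',
        ],
    },
)
```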
def cache_region(region, *args):
"""Decorate a function such that its return result is cached,
using a "region" to indicate the cache arguments.
Example::
from beaker.cache import cache_regions, cache_region
# configure regions
cache_regions.update({
'short_term':{
'expire':60,
'type':'memory'
}
})
@cache_region('short_term', 'load_things')
def load(search_term, limit, offset):
'''Load from a database given a search term, limit, offset.'''
return database.query(search_term)[offset:offset + limit]
The decorator can also be used with object methods. The ``self``
argument is not part of the cache key. This is based on the
actual string name ``self`` being in the first argument
position (new in 1.6)::
class MyThing(object):
@cache_region('short_term', 'load_things')
def load(self, search_term, limit, offset):
'''Load from a database given a search term, limit, offset.'''
return database.query(search_term)[offset:offset + limit]
Classmethods work as well - use ``cls`` as the name of the class argument,
and place the decorator around the function underneath ``@classmethod``
(new in 1.6)::
class MyThing(object):
@classmethod
@cache_region('short_term', 'load_things')
def load(cls, search_term, limit, offset):
'''Load from a database given a search term, limit, offset.'''
return database.query(search_term)[offset:offset + limit]
:param region: String name of the region corresponding to the desired
caching arguments, established in :attr:`.cache_regions`.
:param *args: Optional ``str()``-compatible arguments which will uniquely
identify the key used by this decorated function, in addition
to the positional arguments passed to the function itself at call time.
This is recommended as it is needed to distinguish between any two functions
or methods that have the same name (regardless of parent class or not).
.. note::
The function being decorated must only be called with
positional arguments, and the arguments must support
being stringified with ``str()``. The concatenation
of the ``str()`` version of each argument, combined
with that of the ``*args`` sent to the decorator,
forms the unique cache key.
.. note::
When a method on a class is decorated, the ``self`` or ``cls``
argument in the first position is
not included in the "key" used for caching. New in 1.6.
"""
return _cache_decorate(args, None, None, region)
def region_invalidate(namespace, region, *args):
"""Invalidate a cache region corresponding to a function
decorated with :func:`.cache_region`.
:param namespace: The namespace of the cache to invalidate. This is typically
a reference to the original function (as returned by the :func:`.cache_region`
decorator), where the :func:`.cache_region` decorator applies a "memo" to
the function in order to locate the string name of the namespace.
:param region: String name of the region used with the decorator. This can be
``None`` in the usual case that the decorated function itself is passed,
not the string name of the namespace.
:param args: Stringifyable arguments that are used to locate the correct
key. This consists of the ``*args`` sent to the :func:`.cache_region`
decorator itself, plus the ``*args`` sent to the function itself
at runtime.
Example::
from beaker.cache import cache_regions, cache_region, region_invalidate
# configure regions
cache_regions.update({
'short_term':{
'expire':60,
'type':'memory'
}
})
@cache_region('short_term', 'load_data')
def load(search_term, limit, offset):
'''Load from a database given a search term, limit, offset.'''
return database.query(search_term)[offset:offset + limit]
def invalidate_search(search_term, limit, offset):
'''Invalidate the cached storage for a given search term, limit, offset.'''
region_invalidate(load, 'short_term', 'load_data', search_term, limit, offset)
Note that when a method on a class is decorated, the first argument ``cls``
or ``self`` is not included in the cache key. This means you don't send
it to :func:`.region_invalidate`::
class MyThing(object):
@cache_region('short_term', 'some_data')
def load(self, search_term, limit, offset):
'''Load from a database given a search term, limit, offset.'''
return database.query(search_term)[offset:offset + limit]
def invalidate_search(self, search_term, limit, offset):
'''Invalidate the cached storage for a given search term, limit, offset.'''
region_invalidate(self.load, 'short_term', 'some_data', search_term, limit, offset)
"""
if callable(namespace):
if not region:
region = namespace._arg_region
namespace = namespace._arg_namespace
if not region:
raise BeakerException("Region or callable function "
"namespace is required")
else:
region = cache_regions[region]
cache = Cache._get_cache(namespace, region)
_cache_decorator_invalidate(cache,
region.get('key_length', util.DEFAULT_CACHE_KEY_LENGTH),
args)
class Cache(object):
"""Front-end to the containment API implementing a data cache.
:param namespace: the namespace of this Cache
:param type: type of cache to use
:param expire: seconds to keep cached data
:param expiretime: seconds to keep cached data (legacy support)
    :param starttime: time when the cache was created; values stored
     before this time are considered expired
"""
def __init__(self, namespace, type='memory', expiretime=None,
starttime=None, expire=None, **nsargs):
try:
cls = clsmap[type]
if isinstance(cls, InvalidCacheBackendError):
raise cls
except KeyError:
raise TypeError("Unknown cache implementation %r" % type)
if expire is not None:
expire = int(expire)
self.namespace_name = namespace
self.namespace = cls(namespace, **nsargs)
self.expiretime = expiretime or expire
self.starttime = starttime
self.nsargs = nsargs
@classmethod
def _get_cache(cls, namespace, kw):
key = namespace + str(kw)
try:
return cache_managers[key]
except KeyError:
cache_managers[key] = cache = cls(namespace, **kw)
return cache
def put(self, key, value, **kw):
self._get_value(key, **kw).set_value(value)
set_value = put
def get(self, key, **kw):
"""Retrieve a cached value from the container"""
return self._get_value(key, **kw).get_value()
get_value = get
def remove_value(self, key, **kw):
mycontainer = self._get_value(key, **kw)
mycontainer.clear_value()
remove = remove_value
def _get_value(self, key, **kw):
if isinstance(key, unicode_text):
key = key.encode('ascii', 'backslashreplace')
if 'type' in kw:
return self._legacy_get_value(key, **kw)
kw.setdefault('expiretime', self.expiretime)
kw.setdefault('starttime', self.starttime)
return container.Value(key, self.namespace, **kw)
@util.deprecated("Specifying a "
"'type' and other namespace configuration with cache.get()/put()/etc. "
"is deprecated. Specify 'type' and other namespace configuration to "
"cache_manager.get_cache() and/or the Cache constructor instead.")
def _legacy_get_value(self, key, type, **kw):
expiretime = kw.pop('expiretime', self.expiretime)
starttime = kw.pop('starttime', None)
createfunc = kw.pop('createfunc', None)
kwargs = self.nsargs.copy()
kwargs.update(kw)
c = Cache(self.namespace.namespace, type=type, **kwargs)
return c._get_value(key, expiretime=expiretime, createfunc=createfunc,
starttime=starttime)
def clear(self):
"""Clear all the values from the namespace"""
self.namespace.remove()
# dict interface
def __getitem__(self, key):
return self.get(key)
def __contains__(self, key):
return self._get_value(key).has_current_value()
def has_key(self, key):
return key in self
def __delitem__(self, key):
self.remove_value(key)
def __setitem__(self, key, value):
self.put(key, value)
class CacheManager(object):
def __init__(self, **kwargs):
"""Initialize a CacheManager object with a set of options
Options should be parsed with the
:func:`~beaker.util.parse_cache_config_options` function to
ensure only valid options are used.
"""
self.kwargs = kwargs
self.regions = kwargs.pop('cache_regions', {})
# Add these regions to the module global
cache_regions.update(self.regions)
def get_cache(self, name, **kwargs):
kw = self.kwargs.copy()
kw.update(kwargs)
return Cache._get_cache(name, kw)
def get_cache_region(self, name, region):
if region not in self.regions:
raise BeakerException('Cache region not configured: %s' % region)
kw = self.regions[region]
return Cache._get_cache(name, kw)
def region(self, region, *args):
"""Decorate a function to cache itself using a cache region
        The region decorator requires extra arguments if more than one
        function of the same name exists in the same module, because the
        namespace used for a function's cache is based on the function's
        name and its module.
Example::
# Assuming a cache object is available like:
cache = CacheManager(dict_of_config_options)
def populate_things():
@cache.region('short_term', 'some_data')
def load(search_term, limit, offset):
return load_the_data(search_term, limit, offset)
return load('rabbits', 20, 0)
.. note::
The function being decorated must only be called with
positional arguments.
"""
return cache_region(region, *args)
def region_invalidate(self, namespace, region, *args):
"""Invalidate a cache region namespace or decorated function
This function only invalidates cache spaces created with the
cache_region decorator.
:param namespace: Either the namespace of the result to invalidate, or the
cached function
:param region: The region the function was cached to. If the function was
cached to a single region then this argument can be None
:param args: Arguments that were used to differentiate the cached
function as well as the arguments passed to the decorated
function
Example::
# Assuming a cache object is available like:
cache = CacheManager(dict_of_config_options)
def populate_things(invalidate=False):
@cache.region('short_term', 'some_data')
def load(search_term, limit, offset):
return load_the_data(search_term, limit, offset)
# If the results should be invalidated first
if invalidate:
cache.region_invalidate(load, None, 'some_data',
'rabbits', 20, 0)
return load('rabbits', 20, 0)
"""
return region_invalidate(namespace, region, *args)
def cache(self, *args, **kwargs):
"""Decorate a function to cache itself with supplied parameters
:param args: Used to make the key unique for this function, as in region()
above.
:param kwargs: Parameters to be passed to get_cache(), will override defaults
Example::
# Assuming a cache object is available like:
cache = CacheManager(dict_of_config_options)
def populate_things():
@cache.cache('mycache', expire=15)
def load(search_term, limit, offset):
return load_the_data(search_term, limit, offset)
return load('rabbits', 20, 0)
.. note::
The function being decorated must only be called with
positional arguments.
"""
return _cache_decorate(args, self, kwargs, None)
def invalidate(self, func, *args, **kwargs):
"""Invalidate a cache decorated function
This function only invalidates cache spaces created with the
cache decorator.
:param func: Decorated function to invalidate
:param args: Used to make the key unique for this function, as in region()
above.
:param kwargs: Parameters that were passed for use by get_cache(), note that
this is only required if a ``type`` was specified for the
function
Example::
# Assuming a cache object is available like:
cache = CacheManager(dict_of_config_options)
def populate_things(invalidate=False):
@cache.cache('mycache', type="file", expire=15)
def load(search_term, limit, offset):
return load_the_data(search_term, limit, offset)
# If the results should be invalidated first
if invalidate:
cache.invalidate(load, 'mycache', 'rabbits', 20, 0, type="file")
return load('rabbits', 20, 0)
"""
namespace = func._arg_namespace
cache = self.get_cache(namespace, **kwargs)
if hasattr(func, '_arg_region'):
cachereg = cache_regions[func._arg_region]
key_length = cachereg.get('key_length', util.DEFAULT_CACHE_KEY_LENGTH)
else:
key_length = kwargs.pop('key_length', util.DEFAULT_CACHE_KEY_LENGTH)
_cache_decorator_invalidate(cache, key_length, args)
def _cache_decorate(deco_args, manager, options, region):
"""Return a caching function decorator."""
cache = [None]
def decorate(func):
namespace = util.func_namespace(func)
skip_self = util.has_self_arg(func)
signature = func_signature(func)
@wraps(func)
def cached(*args, **kwargs):
if not cache[0]:
if region is not None:
if region not in cache_regions:
raise BeakerException(
'Cache region not configured: %s' % region)
reg = cache_regions[region]
if not reg.get('enabled', True):
return func(*args, **kwargs)
cache[0] = Cache._get_cache(namespace, reg)
elif manager:
cache[0] = manager.get_cache(namespace, **options)
else:
raise Exception("'manager + kwargs' or 'region' "
"argument is required")
cache_key_kwargs = []
if kwargs:
# kwargs provided, merge them in positional args
# to avoid having different cache keys.
args, kwargs = bindfuncargs(signature, args, kwargs)
cache_key_kwargs = [u_(':').join((u_(key), u_(value))) for key, value in kwargs.items()]
cache_key_args = args
if skip_self:
cache_key_args = args[1:]
cache_key = u_(" ").join(map(u_, chain(deco_args, cache_key_args, cache_key_kwargs)))
if region:
cachereg = cache_regions[region]
key_length = cachereg.get('key_length', util.DEFAULT_CACHE_KEY_LENGTH)
else:
key_length = options.pop('key_length', util.DEFAULT_CACHE_KEY_LENGTH)
# TODO: This is probably a bug as length is checked before converting to UTF8
# which will cause cache_key to grow in size.
if len(cache_key) + len(namespace) > int(key_length):
cache_key = sha1(cache_key.encode('utf-8')).hexdigest()
def go():
return func(*args, **kwargs)
# save org function name
go.__name__ = '_cached_%s' % (func.__name__,)
return cache[0].get_value(cache_key, createfunc=go)
cached._arg_namespace = namespace
if region is not None:
cached._arg_region = region
return cached
return decorate
def _cache_decorator_invalidate(cache, key_length, args):
"""Invalidate a cache key based on function arguments."""
cache_key = u_(" ").join(map(u_, args))
if len(cache_key) + len(cache.namespace_name) > key_length:
cache_key = sha1(cache_key.encode('utf-8')).hexdigest()
cache.remove_value(cache_key)
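The key-building rule shared by ``_cache_decorate`` and ``_cache_decorator_invalidate`` (space-joined ``str()`` arguments, replaced by a SHA-1 hex digest when the key plus namespace would exceed the configured key length) can be sketched in isolation; the 250-character default is used here for illustration:

```python
from hashlib import sha1

def make_cache_key(namespace, deco_args, call_args, key_length=250):
    # str() each argument and space-join, as the decorator does.
    cache_key = " ".join(str(a) for a in (*deco_args, *call_args))
    # Fall back to a fixed-size SHA-1 hex digest for oversized keys.
    if len(cache_key) + len(namespace) > key_length:
        cache_key = sha1(cache_key.encode('utf-8')).hexdigest()
    return cache_key
```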
beaker-1.12.1/beaker/container.py
"""Container and Namespace classes"""
import errno
from ._compat import pickle, anydbm, add_metaclass, PYVER, unicode_text
import beaker.util as util
import logging
import os
import time
from beaker.exceptions import CreationAbortedError, MissingCacheParameter
from beaker.synchronization import _threading, file_synchronizer, \
mutex_synchronizer, NameLock, null_synchronizer
__all__ = ['Value', 'Container', 'ContainerContext',
'MemoryContainer', 'DBMContainer', 'NamespaceManager',
'MemoryNamespaceManager', 'DBMNamespaceManager', 'FileContainer',
'OpenResourceNamespaceManager',
'FileNamespaceManager', 'CreationAbortedError']
logger = logging.getLogger('beaker.container')
if logger.isEnabledFor(logging.DEBUG):
debug = logger.debug
else:
def debug(message, *args):
pass
class NamespaceManager(object):
"""Handles dictionary operations and locking for a namespace of
values.
:class:`.NamespaceManager` provides a dictionary-like interface,
implementing ``__getitem__()``, ``__setitem__()``, and
``__contains__()``, as well as functions related to lock
acquisition.
The implementation for setting and retrieving the namespace data is
handled by subclasses.
NamespaceManager may be used alone, or may be accessed by
one or more :class:`.Value` objects. :class:`.Value` objects provide per-key
services like expiration times and automatic recreation of values.
Multiple NamespaceManagers created with a particular name will all
share access to the same underlying datasource and will attempt to
synchronize against a common mutex object. The scope of this
sharing may be within a single process or across multiple
processes, depending on the type of NamespaceManager used.
The NamespaceManager itself is generally threadsafe, except in the
case of the DBMNamespaceManager in conjunction with the gdbm dbm
implementation.
"""
@classmethod
def _init_dependencies(cls):
"""Initialize module-level dependent libraries required
by this :class:`.NamespaceManager`."""
def __init__(self, namespace):
self._init_dependencies()
self.namespace = namespace
def get_creation_lock(self, key):
"""Return a locking object that is used to synchronize
multiple threads or processes which wish to generate a new
cache value.
This function is typically an instance of
:class:`.FileSynchronizer`, :class:`.ConditionSynchronizer`,
or :class:`.null_synchronizer`.
The creation lock is only used when a requested value
does not exist, or has been expired, and is only used
by the :class:`.Value` key-management object in conjunction
with a "createfunc" value-creation function.
"""
raise NotImplementedError()
def do_remove(self):
"""Implement removal of the entire contents of this
:class:`.NamespaceManager`.
e.g. for a file-based namespace, this would remove
all the files.
The front-end to this method is the
:meth:`.NamespaceManager.remove` method.
"""
raise NotImplementedError()
def acquire_read_lock(self):
"""Establish a read lock.
This operation is called before a key is read. By
default the function does nothing.
"""
def release_read_lock(self):
"""Release a read lock.
This operation is called after a key is read. By
default the function does nothing.
"""
def acquire_write_lock(self, wait=True, replace=False):
"""Establish a write lock.
This operation is called before a key is written.
A return value of ``True`` indicates the lock has
been acquired.
By default the function returns ``True`` unconditionally.
'replace' is a hint indicating the full contents
of the namespace may be safely discarded. Some backends
        may implement this (e.g. the file backend won't unpickle the
current contents).
"""
return True
def release_write_lock(self):
"""Release a write lock.
This operation is called after a new value is written.
By default this function does nothing.
"""
def has_key(self, key):
"""Return ``True`` if the given key is present in this
:class:`.Namespace`.
"""
return self.__contains__(key)
def __getitem__(self, key):
raise NotImplementedError()
def __setitem__(self, key, value):
raise NotImplementedError()
def set_value(self, key, value, expiretime=None):
"""Sets a value in this :class:`.NamespaceManager`.
This is the same as ``__setitem__()``, but
also allows an expiration time to be passed
at the same time.
"""
self[key] = value
def __contains__(self, key):
raise NotImplementedError()
def __delitem__(self, key):
raise NotImplementedError()
def keys(self):
"""Return the list of all keys.
This method may not be supported by all
:class:`.NamespaceManager` implementations.
"""
raise NotImplementedError()
def remove(self):
"""Remove the entire contents of this
:class:`.NamespaceManager`.
e.g. for a file-based namespace, this would remove
all the files.
"""
self.do_remove()
class OpenResourceNamespaceManager(NamespaceManager):
"""A NamespaceManager where read/write operations require opening/
closing of a resource which is possibly mutexed.
"""
def __init__(self, namespace):
NamespaceManager.__init__(self, namespace)
self.access_lock = self.get_access_lock()
self.openers = 0
self.mutex = _threading.Lock()
def get_access_lock(self):
raise NotImplementedError()
def do_open(self, flags, replace):
raise NotImplementedError()
def do_close(self):
raise NotImplementedError()
def acquire_read_lock(self):
self.access_lock.acquire_read_lock()
try:
self.open('r', checkcount=True)
except:
self.access_lock.release_read_lock()
raise
def release_read_lock(self):
try:
self.close(checkcount=True)
finally:
self.access_lock.release_read_lock()
def acquire_write_lock(self, wait=True, replace=False):
r = self.access_lock.acquire_write_lock(wait)
try:
if (wait or r):
self.open('c', checkcount=True, replace=replace)
return r
except:
self.access_lock.release_write_lock()
raise
def release_write_lock(self):
try:
self.close(checkcount=True)
finally:
self.access_lock.release_write_lock()
def open(self, flags, checkcount=False, replace=False):
self.mutex.acquire()
try:
if checkcount:
if self.openers == 0:
self.do_open(flags, replace)
self.openers += 1
else:
self.do_open(flags, replace)
self.openers = 1
finally:
self.mutex.release()
def close(self, checkcount=False):
self.mutex.acquire()
try:
if checkcount:
self.openers -= 1
if self.openers == 0:
self.do_close()
else:
if self.openers > 0:
self.do_close()
self.openers = 0
finally:
self.mutex.release()
def remove(self):
self.access_lock.acquire_write_lock()
try:
self.close(checkcount=False)
self.do_remove()
finally:
self.access_lock.release_write_lock()
class Value(object):
"""Implements synchronization, expiration, and value-creation logic
for a single value stored in a :class:`.NamespaceManager`.
"""
__slots__ = 'key', 'createfunc', 'expiretime', 'expire_argument', 'starttime', 'storedtime',\
'namespace'
def __init__(self, key, namespace, createfunc=None, expiretime=None, starttime=None):
self.key = key
self.createfunc = createfunc
self.expire_argument = expiretime
self.starttime = starttime
self.storedtime = -1
self.namespace = namespace
def has_value(self):
"""return true if the container has a value stored.
This is regardless of it being expired or not.
"""
self.namespace.acquire_read_lock()
try:
return self.key in self.namespace
finally:
self.namespace.release_read_lock()
def can_have_value(self):
return self.has_current_value() or self.createfunc is not None
def has_current_value(self):
self.namespace.acquire_read_lock()
try:
has_value = self.key in self.namespace
if has_value:
try:
stored, expired, value = self._get_value()
return not self._is_expired(stored, expired)
except KeyError:
pass
return False
finally:
self.namespace.release_read_lock()
def _is_expired(self, storedtime, expiretime):
"""Return true if this container's value is expired."""
return (
(
self.starttime is not None and
storedtime < self.starttime
)
or
(
expiretime is not None and
time.time() >= expiretime + storedtime
)
)
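The two staleness conditions above can be exercised standalone (a direct transcription of the predicate, with ``now`` injectable for testing):

```python
import time

def is_expired(storedtime, expiretime, starttime=None, now=None):
    # Stale if the value predates the namespace's starttime, or if
    # expiretime seconds have elapsed since it was stored.
    now = time.time() if now is None else now
    return (
        (starttime is not None and storedtime < starttime)
        or (expiretime is not None and now >= expiretime + storedtime)
    )
```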
def get_value(self):
self.namespace.acquire_read_lock()
try:
has_value = self.has_value()
if has_value:
try:
stored, expired, value = self._get_value()
if not self._is_expired(stored, expired):
return value
except KeyError:
# guard against un-mutexed backends raising KeyError
has_value = False
if not self.createfunc:
raise KeyError(self.key)
finally:
self.namespace.release_read_lock()
has_createlock = False
creation_lock = self.namespace.get_creation_lock(self.key)
if has_value:
if not creation_lock.acquire(wait=False):
debug("get_value returning old value while new one is created")
return value
else:
                debug("lock_createfunc (didn't wait)")
has_createlock = True
if not has_createlock:
debug("lock_createfunc (waiting)")
creation_lock.acquire()
debug("lock_createfunc (waited)")
try:
# see if someone created the value already
self.namespace.acquire_read_lock()
try:
if self.has_value():
try:
stored, expired, value = self._get_value()
if not self._is_expired(stored, expired):
return value
except KeyError:
# guard against un-mutexed backends raising KeyError
pass
finally:
self.namespace.release_read_lock()
debug("get_value creating new value")
v = self.createfunc()
self.set_value(v)
return v
finally:
creation_lock.release()
debug("released create lock")
def _get_value(self):
value = self.namespace[self.key]
try:
stored, expired, value = value
except ValueError:
if not len(value) == 2:
raise
# Old format: upgrade
stored, value = value
expired = self.expire_argument
debug("get_value upgrading time %r expire time %r", stored, self.expire_argument)
self.namespace.release_read_lock()
self.set_value(value, stored)
self.namespace.acquire_read_lock()
except TypeError:
            # occurs when the value is None; memcached may have
            # evicted the entry out from under us, in which case
            # that's the result
raise KeyError(self.key)
return stored, expired, value
def set_value(self, value, storedtime=None):
self.namespace.acquire_write_lock()
try:
if storedtime is None:
storedtime = time.time()
debug("set_value stored time %r expire time %r", storedtime, self.expire_argument)
self.namespace.set_value(self.key, (storedtime, self.expire_argument, value),
expiretime=self.expire_argument)
finally:
self.namespace.release_write_lock()
def clear_value(self):
self.namespace.acquire_write_lock()
try:
debug("clear_value")
if self.key in self.namespace:
try:
del self.namespace[self.key]
except KeyError:
# guard against un-mutexed backends raising KeyError
pass
self.storedtime = -1
finally:
self.namespace.release_write_lock()
class AbstractDictionaryNSManager(NamespaceManager):
"""A subclassable NamespaceManager that places data in a dictionary.
Subclasses should provide a "dictionary" attribute or descriptor
which returns a dict-like object. The dictionary will store keys
that are local to the "namespace" attribute of this manager, so
ensure that the dictionary will not be used by any other namespace.
e.g.::
import collections
cached_data = collections.defaultdict(dict)
class MyDictionaryManager(AbstractDictionaryNSManager):
def __init__(self, namespace):
AbstractDictionaryNSManager.__init__(self, namespace)
self.dictionary = cached_data[self.namespace]
The above stores data in a global dictionary called "cached_data",
which is structured as a dictionary of dictionaries, keyed
first on namespace name to a sub-dictionary, then on actual
cache key to value.
"""
def get_creation_lock(self, key):
return NameLock(
identifier="memorynamespace/funclock/%s/%s" %
(self.namespace, key),
reentrant=True
)
def __getitem__(self, key):
return self.dictionary[key]
def __contains__(self, key):
return self.dictionary.__contains__(key)
def has_key(self, key):
return self.dictionary.__contains__(key)
def __setitem__(self, key, value):
self.dictionary[key] = value
def __delitem__(self, key):
del self.dictionary[key]
def do_remove(self):
self.dictionary.clear()
def keys(self):
return self.dictionary.keys()
class MemoryNamespaceManager(AbstractDictionaryNSManager):
""":class:`.NamespaceManager` that uses a Python dictionary for storage."""
namespaces = util.SyncDict()
def __init__(self, namespace, **kwargs):
AbstractDictionaryNSManager.__init__(self, namespace)
self.dictionary = MemoryNamespaceManager.\
namespaces.get(self.namespace, dict)
class DBMNamespaceManager(OpenResourceNamespaceManager):
""":class:`.NamespaceManager` that uses ``dbm`` files for storage."""
def __init__(self, namespace, dbmmodule=None, data_dir=None,
dbm_dir=None, lock_dir=None,
digest_filenames=True, **kwargs):
self.digest_filenames = digest_filenames
if not dbm_dir and not data_dir:
raise MissingCacheParameter("data_dir or dbm_dir is required")
elif dbm_dir:
self.dbm_dir = dbm_dir
else:
self.dbm_dir = data_dir + "/container_dbm"
util.verify_directory(self.dbm_dir)
if not lock_dir and not data_dir:
raise MissingCacheParameter("data_dir or lock_dir is required")
elif lock_dir:
self.lock_dir = lock_dir
else:
self.lock_dir = data_dir + "/container_dbm_lock"
util.verify_directory(self.lock_dir)
self.dbmmodule = dbmmodule or anydbm
self.dbm = None
OpenResourceNamespaceManager.__init__(self, namespace)
self.file = util.encoded_path(root=self.dbm_dir,
identifiers=[self.namespace],
extension='.dbm',
digest_filenames=self.digest_filenames)
debug("data file %s", self.file)
self._checkfile()
def get_access_lock(self):
return file_synchronizer(identifier=self.namespace,
lock_dir=self.lock_dir)
def get_creation_lock(self, key):
return file_synchronizer(
identifier="dbmcontainer/funclock/%s/%s" % (
self.namespace, key
),
lock_dir=self.lock_dir
)
def file_exists(self, file):
if os.access(file, os.F_OK):
return True
else:
for ext in ('db', 'dat', 'pag', 'dir'):
if os.access(file + os.extsep + ext, os.F_OK):
return True
return False
def _ensuredir(self, filename):
dirname = os.path.dirname(filename)
if not os.path.exists(dirname):
util.verify_directory(dirname)
def _checkfile(self):
if not self.file_exists(self.file):
self._ensuredir(self.file)
g = self.dbmmodule.open(self.file, 'c')
g.close()
def get_filenames(self):
list = []
if os.access(self.file, os.F_OK):
list.append(self.file)
for ext in ('pag', 'dir', 'db', 'dat'):
if os.access(self.file + os.extsep + ext, os.F_OK):
list.append(self.file + os.extsep + ext)
return list
def do_open(self, flags, replace):
debug("opening dbm file %s", self.file)
try:
self.dbm = self.dbmmodule.open(self.file, flags)
        except Exception:
self._checkfile()
self.dbm = self.dbmmodule.open(self.file, flags)
def do_close(self):
if self.dbm is not None:
debug("closing dbm file %s", self.file)
self.dbm.close()
def do_remove(self):
for f in self.get_filenames():
os.remove(f)
def __getitem__(self, key):
return pickle.loads(self.dbm[key])
def __contains__(self, key):
if PYVER == (3, 2):
# Looks like this is a bug that got solved in PY3.3 and PY3.4
# http://bugs.python.org/issue19288
if isinstance(key, unicode_text):
key = key.encode('UTF-8')
return key in self.dbm
def __setitem__(self, key, value):
self.dbm[key] = pickle.dumps(value)
def __delitem__(self, key):
del self.dbm[key]
def keys(self):
return self.dbm.keys()
class FileNamespaceManager(OpenResourceNamespaceManager):
""":class:`.NamespaceManager` that uses binary files for storage.
Each namespace is implemented as a single file storing a
dictionary of key/value pairs, serialized using the Python
``pickle`` module.
"""
def __init__(self, namespace, data_dir=None, file_dir=None, lock_dir=None,
digest_filenames=True, **kwargs):
self.digest_filenames = digest_filenames
if not file_dir and not data_dir:
raise MissingCacheParameter("data_dir or file_dir is required")
elif file_dir:
self.file_dir = file_dir
else:
self.file_dir = data_dir + "/container_file"
util.verify_directory(self.file_dir)
if not lock_dir and not data_dir:
raise MissingCacheParameter("data_dir or lock_dir is required")
elif lock_dir:
self.lock_dir = lock_dir
else:
self.lock_dir = data_dir + "/container_file_lock"
util.verify_directory(self.lock_dir)
OpenResourceNamespaceManager.__init__(self, namespace)
self.file = util.encoded_path(root=self.file_dir,
identifiers=[self.namespace],
extension='.cache',
digest_filenames=self.digest_filenames)
self.hash = {}
debug("data file %s", self.file)
def get_access_lock(self):
return file_synchronizer(identifier=self.namespace,
lock_dir=self.lock_dir)
def get_creation_lock(self, key):
return file_synchronizer(
identifier="dbmcontainer/funclock/%s/%s" % (
self.namespace, key
),
lock_dir=self.lock_dir
)
def file_exists(self, file):
return os.access(file, os.F_OK)
def do_open(self, flags, replace):
if not replace and self.file_exists(self.file):
try:
with open(self.file, 'rb') as fh:
self.hash = pickle.load(fh)
except IOError as e:
# Ignore EACCES and ENOENT as it just means we are no longer
# able to access the file or that it no longer exists
if e.errno not in [errno.EACCES, errno.ENOENT]:
raise
self.flags = flags
def do_close(self):
if self.flags == 'c' or self.flags == 'w':
pickled = pickle.dumps(self.hash)
util.safe_write(self.file, pickled)
self.hash = {}
self.flags = None
def do_remove(self):
try:
os.remove(self.file)
except OSError:
# for instance, because we haven't yet used this cache,
# but client code has asked for a clear() operation...
pass
self.hash = {}
def __getitem__(self, key):
return self.hash[key]
def __contains__(self, key):
return key in self.hash
def __setitem__(self, key, value):
self.hash[key] = value
def __delitem__(self, key):
del self.hash[key]
def keys(self):
return self.hash.keys()
#### legacy stuff to support the old "Container" class interface
namespace_classes = {}
ContainerContext = dict
class ContainerMeta(type):
def __init__(cls, classname, bases, dict_):
namespace_classes[cls] = cls.namespace_class
return type.__init__(cls, classname, bases, dict_)
def __call__(self, key, context, namespace, createfunc=None,
expiretime=None, starttime=None, **kwargs):
if namespace in context:
ns = context[namespace]
else:
nscls = namespace_classes[self]
context[namespace] = ns = nscls(namespace, **kwargs)
return Value(key, ns, createfunc=createfunc,
expiretime=expiretime, starttime=starttime)
@add_metaclass(ContainerMeta)
class Container(object):
"""Implements synchronization and value-creation logic
for a 'value' stored in a :class:`.NamespaceManager`.
:class:`.Container` and its subclasses are deprecated. The
:class:`.Value` class is now used for this purpose.
"""
namespace_class = NamespaceManager
class FileContainer(Container):
namespace_class = FileNamespaceManager
class MemoryContainer(Container):
namespace_class = MemoryNamespaceManager
class DBMContainer(Container):
namespace_class = DBMNamespaceManager
DbmContainer = DBMContainer
beaker-1.12.1/beaker/converters.py
from beaker._compat import string_type
# (c) 2005 Ian Bicking and contributors; written for Paste (http://pythonpaste.org)
# Licensed under the MIT license: http://www.opensource.org/licenses/mit-license.php
def asbool(obj):
if isinstance(obj, string_type):
obj = obj.strip().lower()
if obj in ['true', 'yes', 'on', 'y', 't', '1']:
return True
elif obj in ['false', 'no', 'off', 'n', 'f', '0']:
return False
else:
raise ValueError(
"String is not true/false: %r" % obj)
return bool(obj)
def aslist(obj, sep=None, strip=True):
if isinstance(obj, string_type):
lst = obj.split(sep)
if strip:
lst = [v.strip() for v in lst]
return lst
elif isinstance(obj, (list, tuple)):
return obj
elif obj is None:
return []
else:
return [obj]
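Taken together, the two converters behave as follows; this is a self-contained sketch for illustration, with ``string_type`` replaced by plain ``str`` (its Python 3 value):

```python
# Self-contained copies of asbool()/aslist() above, for illustration only;
# string_type is replaced by str, which is what it resolves to on Python 3.
def asbool(obj):
    if isinstance(obj, str):
        obj = obj.strip().lower()
        if obj in ['true', 'yes', 'on', 'y', 't', '1']:
            return True
        elif obj in ['false', 'no', 'off', 'n', 'f', '0']:
            return False
        else:
            raise ValueError("String is not true/false: %r" % obj)
    return bool(obj)

def aslist(obj, sep=None, strip=True):
    if isinstance(obj, str):
        lst = obj.split(sep)
        if strip:
            lst = [v.strip() for v in lst]
        return lst
    elif isinstance(obj, (list, tuple)):
        return obj
    elif obj is None:
        return []
    return [obj]

assert asbool(' Yes ') is True
assert asbool('0') is False
assert aslist('a, b, c', sep=',') == ['a', 'b', 'c']
assert aslist(None) == []
```

Note that ``aslist`` passes lists and tuples through unchanged, so callers cannot rely on always receiving a list back.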
beaker-1.12.1/beaker/cookie.py
import sys
from ._compat import http_cookies
# Some versions of Python 2.7 and later won't need this encoding bug fix:
_cookie_encodes_correctly = http_cookies.SimpleCookie().value_encode(';') == (';', '"\\073"')
# Cookie pickling bug is fixed in Python 2.7.9 and Python 3.4.3+
# http://bugs.python.org/issue22775
cookie_pickles_properly = (
(sys.version_info[:2] == (2, 7) and sys.version_info >= (2, 7, 9)) or
sys.version_info >= (3, 4, 3)
)
# Add support for the SameSite attribute (obsolete when PY37 is unsupported).
http_cookies.Morsel._reserved.setdefault('samesite', 'SameSite')
# Adapted from django.http.cookies, with the bad_cookies behaviour
# always enabled to cope with any invalid cookie key while keeping
# the session around.
class SimpleCookie(http_cookies.SimpleCookie):
if not cookie_pickles_properly:
def __setitem__(self, key, value):
# Apply the fix from http://bugs.python.org/issue22775 where
# it's not fixed in Python itself
if isinstance(value, http_cookies.Morsel):
# allow assignment of constructed Morsels (e.g. for pickling)
dict.__setitem__(self, key, value)
else:
super(SimpleCookie, self).__setitem__(key, value)
if not _cookie_encodes_correctly:
def value_encode(self, val):
# Some browsers do not support quoted-string from RFC 2109,
# including some versions of Safari and Internet Explorer.
# These browsers split on ';', and some versions of Safari
# are known to split on ', '. Therefore, we encode ';' and ','
# SimpleCookie already does the hard work of encoding and decoding.
# It uses octal sequences like '\\012' for newline etc.
# and non-ASCII chars. We just make use of this mechanism, to
# avoid introducing two encoding schemes which would be confusing
# and especially awkward for javascript.
# NB, contrary to Python docs, value_encode returns a tuple containing
# (real val, encoded_val)
val, encoded = super(SimpleCookie, self).value_encode(val)
encoded = encoded.replace(";", "\\073").replace(",", "\\054")
# If encoded now contains any quoted chars, we need double quotes
# around the whole string.
if "\\" in encoded and not encoded.startswith('"'):
encoded = '"' + encoded + '"'
return val, encoded
def load(self, rawdata):
self.bad_cookies = set()
super(SimpleCookie, self).load(rawdata)
for key in self.bad_cookies:
del self[key]
# override private __set() method:
    # (needed for using our Morsel, and for laxness with CookieError)
def _BaseCookie__set(self, key, real_value, coded_value):
try:
super(SimpleCookie, self)._BaseCookie__set(key, real_value, coded_value)
except http_cookies.CookieError:
if not hasattr(self, 'bad_cookies'):
self.bad_cookies = set()
self.bad_cookies.add(key)
dict.__setitem__(self, key, http_cookies.Morsel())
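The ``_cookie_encodes_correctly`` probe at the top of this module can be reproduced with the standard library alone; on modern CPython the stdlib already escapes ``;`` as the quoted octal sequence ``\073``, so the ``value_encode`` override above is skipped:

```python
from http.cookies import SimpleCookie as StdlibSimpleCookie

# value_encode returns (real value, encoded value); the stdlib quotes
# ';' as the octal escape \073, which is exactly the result the
# _cookie_encodes_correctly check above looks for.
val, encoded = StdlibSimpleCookie().value_encode(';')
assert (val, encoded) == (';', '"\\073"')
```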
beaker-1.12.1/beaker/crypto/__init__.py
"""Provide a crypto object, depending on the available modules.
The object has this interface:
aesEncrypt(DATA, KEY)
Encrypt the DATA with key KEY.
aesDecrypt(DATA, KEY):
Decrypt the DATA with key KEY.
has_aes
True if the encryption provides AES encryption.
getKeyLength()
Return the maximum size for keys for this crypto object, in bytes.
"""
from .._compat import JYTHON
from beaker.crypto.pbkdf2 import pbkdf2
from beaker.crypto.util import hmac, sha1, hmac_sha1, md5
from beaker import util
from beaker.exceptions import InvalidCryptoBackendError
keyLength = None
DEFAULT_NONCE_BITS = 128
CRYPTO_MODULES = {}
def load_default_module():
"""Load the default crypto module and return it.
Note: if no crypto module is available, return a dummy module
which does not encrypt at all.
"""
if JYTHON:
try:
from beaker.crypto import jcecrypto
return jcecrypto
except ImportError:
pass
else:
try:
from beaker.crypto import nsscrypto
return nsscrypto
except ImportError:
try:
from beaker.crypto import pycrypto
return pycrypto
except ImportError:
pass
from beaker.crypto import noencryption
return noencryption
def register_crypto_module(name, mod):
"""
Register the given module under the name given.
"""
CRYPTO_MODULES[name] = mod
def get_crypto_module(name):
"""
Get the active crypto module for this name
"""
if name not in CRYPTO_MODULES:
if name == 'default':
register_crypto_module('default', load_default_module())
elif name == 'nss':
from beaker.crypto import nsscrypto
register_crypto_module(name, nsscrypto)
elif name == 'pycrypto':
from beaker.crypto import pycrypto
register_crypto_module(name, pycrypto)
elif name == 'cryptography':
from beaker.crypto import pyca_cryptography
register_crypto_module(name, pyca_cryptography)
else:
raise InvalidCryptoBackendError(
"No crypto backend with name '%s' is registered." % name)
return CRYPTO_MODULES[name]
def generateCryptoKeys(master_key, salt, iterations, keylen):
# NB: We XOR parts of the keystream into the randomly-generated parts, just
# in case os.urandom() isn't as random as it should be. Note that if
# os.urandom() returns truly random data, this will have no effect on the
# overall security.
return pbkdf2(master_key, salt, iterations=iterations, dklen=keylen)
def get_nonce_size(number_of_bits):
if number_of_bits % 8:
raise ValueError('Nonce complexity currently supports multiples of 8')
bytes = number_of_bits // 8
b64bytes = ((4 * bytes // 3) + 3) & ~3
return bytes, b64bytes
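As a worked example of ``get_nonce_size`` above: the default 128-bit nonce occupies 16 raw bytes and 24 base64 characters, since ``((4 * 16 // 3) + 3) & ~3`` rounds 21 up to the next multiple of 4. This is a self-contained copy for illustration, with the ``bytes`` local renamed to avoid shadowing the builtin:

```python
# Self-contained copy of get_nonce_size() for illustration; nbytes
# replaces the original's `bytes` local to avoid shadowing the builtin.
def get_nonce_size(number_of_bits):
    if number_of_bits % 8:
        raise ValueError('Nonce complexity currently supports multiples of 8')
    nbytes = number_of_bits // 8
    # base64 expands every 3 bytes into 4 characters, rounded up to a
    # multiple of 4 to account for padding
    b64bytes = ((4 * nbytes // 3) + 3) & ~3
    return nbytes, b64bytes

assert get_nonce_size(128) == (16, 24)
assert get_nonce_size(48) == (6, 8)
```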
beaker-1.12.1/beaker/crypto/jcecrypto.py
"""
Encryption module that uses the Java Cryptography Extensions (JCE).
Note that in default installations of the Java Runtime Environment, the
maximum key length is limited to 128 bits due to US export
restrictions. This makes the generated keys incompatible with the ones
generated by pycryptopp, which has no such restrictions. To fix this,
download the "Unlimited Strength Jurisdiction Policy Files" from Sun,
which will allow encryption using 256 bit AES keys.
"""
from warnings import warn
from javax.crypto import Cipher
from javax.crypto.spec import SecretKeySpec, IvParameterSpec
import jarray
# Initialization vector filled with zeros
_iv = IvParameterSpec(jarray.zeros(16, 'b'))
def aesEncrypt(data, key):
cipher = Cipher.getInstance('AES/CTR/NoPadding')
skeySpec = SecretKeySpec(key, 'AES')
cipher.init(Cipher.ENCRYPT_MODE, skeySpec, _iv)
return cipher.doFinal(data).tostring()
# AES in CTR mode is symmetric: decryption is the same operation as
# encryption, so the same function serves both directions.
aesDecrypt = aesEncrypt
has_aes = True
def getKeyLength():
maxlen = Cipher.getMaxAllowedKeyLength('AES/CTR/NoPadding')
return min(maxlen, 256) / 8
if getKeyLength() < 32:
warn('Crypto implementation only supports key lengths up to %d bits. '
'Generated session cookies may be incompatible with other '
'environments' % (getKeyLength() * 8))
beaker-1.12.1/beaker/crypto/noencryption.py
"""Encryption module that does nothing"""
def aesEncrypt(data, key):
return data
def aesDecrypt(data, key):
return data
has_aes = False
def getKeyLength():
return 32
beaker-1.12.1/beaker/crypto/nsscrypto.py
"""Encryption module that uses nsscrypto"""
import nss.nss
nss.nss.nss_init_nodb()
# Apparently the rest of beaker doesn't care about the particular cipher,
# mode and padding used.
# NOTE: A constant IV!!! This is only secure if the KEY is never reused!!!
_mech = nss.nss.CKM_AES_CBC_PAD
_iv = '\0' * nss.nss.get_iv_length(_mech)
def aesEncrypt(data, key):
slot = nss.nss.get_best_slot(_mech)
key_obj = nss.nss.import_sym_key(slot, _mech, nss.nss.PK11_OriginGenerated,
nss.nss.CKA_ENCRYPT, nss.nss.SecItem(key))
param = nss.nss.param_from_iv(_mech, nss.nss.SecItem(_iv))
ctx = nss.nss.create_context_by_sym_key(_mech, nss.nss.CKA_ENCRYPT, key_obj,
param)
l1 = ctx.cipher_op(data)
# Yes, DIGEST. This needs fixing in NSS, but apparently nobody (including
# me :( ) cares enough.
l2 = ctx.digest_final()
return l1 + l2
def aesDecrypt(data, key):
slot = nss.nss.get_best_slot(_mech)
key_obj = nss.nss.import_sym_key(slot, _mech, nss.nss.PK11_OriginGenerated,
nss.nss.CKA_DECRYPT, nss.nss.SecItem(key))
param = nss.nss.param_from_iv(_mech, nss.nss.SecItem(_iv))
ctx = nss.nss.create_context_by_sym_key(_mech, nss.nss.CKA_DECRYPT, key_obj,
param)
l1 = ctx.cipher_op(data)
# Yes, DIGEST. This needs fixing in NSS, but apparently nobody (including
# me :( ) cares enough.
l2 = ctx.digest_final()
return l1 + l2
has_aes = True
def getKeyLength():
return 32
beaker-1.12.1/beaker/crypto/pbkdf2.py
"""
PBKDF2 Implementation adapted from django.utils.crypto.
This is used to generate the encryption key for enciphered sessions.
"""
from beaker._compat import bytes_, xrange_
import hmac
import struct
import hashlib
import binascii
def _bin_to_long(x):
"""Convert a binary string into a long integer"""
return int(binascii.hexlify(x), 16)
def _long_to_bin(x, hex_format_string):
"""
Convert a long integer into a binary string.
hex_format_string is like "%020x" for padding 10 characters.
"""
return binascii.unhexlify((hex_format_string % x).encode('ascii'))
if hasattr(hashlib, "pbkdf2_hmac"):
def pbkdf2(password, salt, iterations, dklen=0, digest=None):
"""
Implements PBKDF2 using the stdlib. This is used in Python 2.7.8+ and 3.4+.
HMAC+SHA256 is used as the default pseudo random function.
As of 2014, 100,000 iterations was the recommended default which took
100ms on a 2.7Ghz Intel i7 with an optimized implementation. This is
probably the bare minimum for security given 1000 iterations was
recommended in 2001.
"""
if digest is None:
digest = hashlib.sha1
if not dklen:
dklen = None
password = bytes_(password)
salt = bytes_(salt)
return hashlib.pbkdf2_hmac(
digest().name, password, salt, iterations, dklen)
else:
def pbkdf2(password, salt, iterations, dklen=0, digest=None):
"""
Implements PBKDF2 as defined in RFC 2898, section 5.2
HMAC+SHA256 is used as the default pseudo random function.
As of 2014, 100,000 iterations was the recommended default which took
100ms on a 2.7Ghz Intel i7 with an optimized implementation. This is
probably the bare minimum for security given 1000 iterations was
recommended in 2001. This code is very well optimized for CPython and
is about five times slower than OpenSSL's implementation.
"""
assert iterations > 0
if not digest:
digest = hashlib.sha1
password = bytes_(password)
salt = bytes_(salt)
hlen = digest().digest_size
if not dklen:
dklen = hlen
if dklen > (2 ** 32 - 1) * hlen:
raise OverflowError('dklen too big')
l = -(-dklen // hlen)
r = dklen - (l - 1) * hlen
hex_format_string = "%%0%ix" % (hlen * 2)
inner, outer = digest(), digest()
if len(password) > inner.block_size:
password = digest(password).digest()
password += b'\x00' * (inner.block_size - len(password))
inner.update(password.translate(hmac.trans_36))
outer.update(password.translate(hmac.trans_5C))
def F(i):
u = salt + struct.pack(b'>I', i)
result = 0
for j in xrange_(int(iterations)):
dig1, dig2 = inner.copy(), outer.copy()
dig1.update(u)
dig2.update(dig1.digest())
u = dig2.digest()
result ^= _bin_to_long(u)
return _long_to_bin(result, hex_format_string)
T = [F(x) for x in xrange_(1, l)]
return b''.join(T) + F(l)[:r]
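Both branches above default to SHA-1 and should therefore agree with the published PBKDF2 vectors; a quick stdlib check against RFC 6070 test vector 1, using the same ``hashlib.pbkdf2_hmac`` call the first branch makes:

```python
import binascii
import hashlib

# RFC 6070, test vector 1 for PBKDF2-HMAC-SHA1:
# P = "password", S = "salt", c = 1, dkLen = 20
derived = hashlib.pbkdf2_hmac('sha1', b'password', b'salt', 1, 20)
assert binascii.hexlify(derived) == b'0c60c80f961f0e71f3a9b524af6012062fe037a6'
```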
beaker-1.12.1/beaker/crypto/pyca_cryptography.py
"""Encryption module that uses pyca/cryptography"""
import os
import json
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.ciphers import (
Cipher, algorithms, modes
)
def aesEncrypt(data, key):
# Generate a random 96-bit IV.
iv = os.urandom(12)
# Construct an AES-GCM Cipher object with the given key and a
# randomly generated IV.
encryptor = Cipher(
algorithms.AES(key),
modes.GCM(iv),
backend=default_backend()
).encryptor()
# Encrypt the plaintext and get the associated ciphertext.
# GCM does not require padding.
ciphertext = encryptor.update(data) + encryptor.finalize()
return iv + encryptor.tag + ciphertext
def aesDecrypt(data, key):
iv = data[:12]
tag = data[12:28]
ciphertext = data[28:]
# Construct a Cipher object, with the key, iv, and additionally the
# GCM tag used for authenticating the message.
decryptor = Cipher(
algorithms.AES(key),
modes.GCM(iv, tag),
backend=default_backend()
).decryptor()
# Decryption gets us the authenticated plaintext.
# If the tag does not match an InvalidTag exception will be raised.
return decryptor.update(ciphertext) + decryptor.finalize()
has_aes = True
def getKeyLength():
return 32
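The payloads produced by ``aesEncrypt`` above are laid out as ``IV (12 bytes) || GCM tag (16 bytes) || ciphertext``; ``split_payload`` below is an illustrative helper (not part of Beaker) that mirrors the slicing done at the top of ``aesDecrypt``:

```python
import os

# Hypothetical helper mirroring aesDecrypt's slicing: a 96-bit IV,
# then a 128-bit GCM authentication tag, then the ciphertext itself.
def split_payload(data):
    return data[:12], data[12:28], data[28:]

payload = os.urandom(12) + os.urandom(16) + b'ciphertext'
iv, tag, ciphertext = split_payload(payload)
assert len(iv) == 12 and len(tag) == 16 and ciphertext == b'ciphertext'
```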
beaker-1.12.1/beaker/crypto/pycrypto.py
"""Encryption module that uses pycryptopp or pycrypto"""
try:
# Pycryptopp is preferred over Crypto because Crypto has had
# various periods of not being maintained, and pycryptopp uses
# the Crypto++ library which is generally considered the 'gold standard'
# of crypto implementations
from pycryptopp.cipher import aes
def aesEncrypt(data, key):
cipher = aes.AES(key)
return cipher.process(data)
    # AES in CTR mode is symmetric: decryption is the same operation as
    # encryption, so the same function serves both directions.
    aesDecrypt = aesEncrypt
except ImportError:
from Crypto.Cipher import AES
from Crypto.Util import Counter
def aesEncrypt(data, key):
cipher = AES.new(key, AES.MODE_CTR,
counter=Counter.new(128, initial_value=0))
return cipher.encrypt(data)
def aesDecrypt(data, key):
cipher = AES.new(key, AES.MODE_CTR,
counter=Counter.new(128, initial_value=0))
return cipher.decrypt(data)
has_aes = True
def getKeyLength():
return 32
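Both fallbacks above use AES in CTR mode, where the ciphertext is the plaintext XORed with a key-dependent keystream; since XOR is its own inverse, applying the cipher twice restores the input. A toy (non-AES!) keystream cipher demonstrates the same property:

```python
import hashlib
import itertools

# Toy keystream cipher -- NOT AES, for illustration only.  Like CTR
# mode it XORs data with a keystream derived from (key, counter), so
# the same function both encrypts and decrypts, which is why the
# pycryptopp branch above can simply alias aesDecrypt = aesEncrypt.
def xor_ctr(data, key):
    keystream = b''
    for counter in itertools.count():
        if len(keystream) >= len(data):
            break
        keystream += hashlib.sha256(key + counter.to_bytes(16, 'big')).digest()
    return bytes(d ^ k for d, k in zip(data, keystream))

message = b'attack at dawn'
key = b'k' * 32
assert xor_ctr(xor_ctr(message, key), key) == message
```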
beaker-1.12.1/beaker/crypto/util.py
from hashlib import md5
try:
# Use PyCrypto (if available)
from Crypto.Hash import HMAC as hmac, SHA as hmac_sha1
sha1 = hmac_sha1.new
except ImportError:
# PyCrypto not available. Use the Python standard library.
import hmac
# NOTE: We have to use the callable with hashlib (hashlib.sha1),
# otherwise hmac only accepts the sha module object itself
from hashlib import sha1
hmac_sha1 = sha1
beaker-1.12.1/beaker/docs/Makefile
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d build/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help clean html web pickle htmlhelp latex changes linkcheck
help:
	@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " changes to make an overview over all changed/added/deprecated items"
@echo " linkcheck to check all external links for integrity"
clean:
-rm -rf build/*
html:
mkdir -p build/html build/doctrees
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) build/html
@echo
@echo "Build finished. The HTML pages are in build/html."
pickle:
mkdir -p build/pickle build/doctrees
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) build/pickle
@echo
@echo "Build finished; now you can process the pickle files."
web: pickle
json:
mkdir -p build/json build/doctrees
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) build/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
mkdir -p build/htmlhelp build/doctrees
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) build/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in build/htmlhelp."
latex:
mkdir -p build/latex build/doctrees
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) build/latex
@echo
@echo "Build finished; the LaTeX files are in build/latex."
@echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \
"run these through (pdf)latex."
changes:
mkdir -p build/changes build/doctrees
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) build/changes
@echo
@echo "The overview file is in build/changes."
linkcheck:
mkdir -p build/linkcheck build/doctrees
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) build/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in build/linkcheck/output.txt."
beaker-1.12.1/beaker/docs/caching.rst
.. _caching:
=======
Caching
=======
About
=====
Beaker's caching system was originally based off the Perl Cache::Cache module,
which was ported for use in `Myghty`_. Beaker was then extracted from this
code, and has been substantially rewritten and modernized.
Several concepts still exist from this origin though. Beaker's caching (and
its sessions, behind the scenes) utilize the concept of
:term:`NamespaceManager`, and :term:`Container` objects to handle storing
cached data.
Each back-end utilizes a customized version of each of these objects to handle
storing data appropriately depending on the type of the back-end.
The :class:`~beaker.cache.CacheManager` is responsible for getting the
appropriate NamespaceManager, which then stores the cached values. Each
namespace corresponds to a single ``thing`` that should be cached. Usually
a single ``thing`` to be cached might vary slightly depending on parameters,
for example a template might need several different copies of itself stored
depending on whether a user is logged in or not. Each one of these copies
is then ``keyed`` under the NamespaceManager and stored in a Container.
There are three schemes for using Beaker's caching; the first and most
traditional style is the programmatic API. This exposes the namespaces
and retrieves a :class:`~beaker.cache.Cache` object that handles storing
keyed values in a NamespaceManager with Container objects.
The more elegant system, introduced in Beaker 1.3, is to use the
:ref:`cache decorators `, these also support the
use of :term:`Cache Regions`.
Introduced in Beaker 1.5 is a more flexible :func:`~beaker.cache.cache_region`
decorator capable of decorating functions for use with Beaker's
:ref:`caching_with_regions` **before** Beaker has been configured. This makes
it possible to easily use Beaker's region caching decorator on functions in
the module level.
Creating the CacheManager Instance
==================================
Before using Beaker's caching, an instance of the
:class:`~beaker.cache.CacheManager` class should be created. All of the
examples below assume that it has already been created.
Creating the cache instance::
from beaker.cache import CacheManager
from beaker.util import parse_cache_config_options
cache_opts = {
'cache.type': 'file',
'cache.data_dir': '/tmp/cache/data',
'cache.lock_dir': '/tmp/cache/lock'
}
cache = CacheManager(**parse_cache_config_options(cache_opts))
Additional configuration options are documented in the :ref:`Configuration`
section of the Beaker docs.
Programmatic API
================
.. _programmatic:
To store data for a cache value, first, a NamespaceManager has to be
retrieved to manage the keys for a ``thing`` to be cached::
# Assuming that cache is an already created CacheManager instance
tmpl_cache = cache.get_cache('mytemplate.html', type='dbm', expire=3600)
.. note::
In addition to the defaults supplied to the
:class:`~beaker.cache.CacheManager` instance, any of the Cache options
can be changed on a per-namespace basis, as this example demonstrates
by setting a ``type``, and ``expire`` option.
Individual values should be stored using a creation function, which will
be called anytime the cache has expired or a new copy needs to be made. The
creation function must not accept any arguments as it won't be called with
any. Options affecting the created value can be passed in by using closure
scope on the creation function::
search_param = 'gophers'
def get_results():
# do something to retrieve data
data = get_data(search_param)
return data
# Cache this function, based on the search_param, using the tmpl_cache
# instance from the prior example
results = tmpl_cache.get(key=search_param, createfunc=get_results)
Invalidating
------------
All of the values for a particular namespace can be removed by calling the
:meth:`~beaker.cache.Cache.clear` method::
tmpl_cache.clear()
Note that this only clears the keys in the namespace that this particular
Cache instance is aware of. Therefore, it is recommended to manually clear out
specific keys in a cache namespace that should be removed::
tmpl_cache.remove_value(key=search_param)
Decorator API
=============
.. _decorator_api:
When using the decorator API, a namespace does not need to be specified and
will instead be created for you with the name of the module + the name of the
function that will have its output cached.
Since it's possible that multiple functions in the same module might have the
same name, additional arguments can be provided to the decorators that will be
used in the namespace to prevent multiple functions from caching their values
in the same location.
For example::
# Assuming that cache is an already created CacheManager instance
@cache.cache('my_search_func', expire=3600)
def get_results(search_param):
# do something to retrieve data
data = get_data(search_param)
return data
results = get_results('gophers')
The non-keyword arguments to the :meth:`~beaker.cache.CacheManager.cache`
method are the additional ones used to ensure this function's cache results
won't clash with another function in this module called ``get_results``.
The cache expire argument is specified as a keyword argument. Other valid
arguments to the :meth:`~beaker.cache.CacheManager.get_cache` method such
as ``type`` can also be passed in.
When using the decorator, the function to cache can have arguments, which
will be used as the cache key, just as the key was in the
:ref:`Programmatic API <programmatic>` for the data generated.
.. warning::
These arguments can **not** be keyword arguments.
Invalidating
------------
Since the :meth:`~beaker.cache.CacheManager.cache` decorator hides the
namespace used, manually removing the key requires the use of the
:meth:`~beaker.cache.CacheManager.invalidate` function. To invalidate
the 'gophers' result that the prior example referred to::
cache.invalidate(get_results, 'my_search_func', 'gophers')
If however, a type was specified for the cached function, the type must
also be given to the :meth:`~beaker.cache.CacheManager.invalidate`
function so that it can remove the value from the appropriate back-end.
Example::
# Assuming that cache is an already created CacheManager instance
@cache.cache('my_search_func', type="file", expire=3600)
def get_results(search_param):
# do something to retrieve data
data = get_data(search_param)
return data
cache.invalidate(get_results, 'my_search_func', 'gophers', type="file")
.. note::
Both the arguments used to specify the additional namespace info to the
cache decorator **and** the arguments sent to the function need to be
    given to the :meth:`~beaker.cache.CacheManager.invalidate`
function so that it can properly locate the namespace and cache key
to remove.
.. _caching_with_regions:
Cache Regions
=============
Rather than having to specify the expiration, or toggle the type used for
caching different functions, commonly used cache parameters can be defined
as :term:`Cache Regions`. These user-defined regions may be used
with the :meth:`~beaker.cache.CacheManager.region` decorator rather than
passing the configuration.
This can be useful if there are a few common cache schemes used by an
application that should be setup in a single place then used as appropriate
throughout the application.
Setting up cache regions is documented in the
``cache region options`` section in
:ref:`configuration`.
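For instance, the two regions used below might be declared alongside the
cache defaults, reusing the imports from the first example in this
document (the option values here are purely illustrative)::

    cache_opts = {
        'cache.type': 'memory',
        'cache.regions': 'short_term, long_term',
        'cache.short_term.type': 'memory',
        'cache.short_term.expire': '3600',
        'cache.long_term.type': 'file',
        'cache.long_term.data_dir': '/tmp/cache/data',
        'cache.long_term.lock_dir': '/tmp/cache/lock',
        'cache.long_term.expire': '86400',
    }

    cache = CacheManager(**parse_cache_config_options(cache_opts))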
Assuming a ``long_term`` and ``short_term`` region were setup, the
:meth:`~beaker.cache.CacheManager.region` decorator can be used::
@cache.region('short_term', 'my_search_func')
def get_results(search_param):
# do something to retrieve data
data = get_data(search_param)
return data
results = get_results('gophers')
Or using the :func:`~beaker.cache.cache_region` decorator::
@cache_region('short_term', 'my_search_func')
def get_results(search_param):
# do something to retrieve data
data = get_data(search_param)
return data
results = get_results('gophers')
The only difference with the :func:`~beaker.cache.cache_region` decorator is
that the cache does not need to be configured when it is used. This allows one
to decorate functions in a module before the Beaker cache is configured.
Invalidating
------------
Since the :meth:`~beaker.cache.CacheManager.region` decorator hides the
namespace used, manually removing the key requires the use of the
:meth:`~beaker.cache.CacheManager.region_invalidate` function. To invalidate
the 'gophers' result that the prior example referred to::
cache.region_invalidate(get_results, None, 'my_search_func', 'gophers')
Or when using the :func:`~beaker.cache.cache_region` decorator, the
:func:`beaker.cache.region_invalidate` function should be used::
region_invalidate(get_results, None, 'my_search_func', 'gophers')
.. note::
Both the arguments used to specify the additional namespace info to the
cache decorator **and** the arguments sent to the function need to be
given to the :meth:`~beaker.cache.CacheManager.region_invalidate`
function so that it can properly locate the namespace and cache key
to remove.
.. _Myghty: http://www.myghty.org/
beaker-1.12.1/beaker/docs/changes.rst
:tocdepth: 2
.. _changes:
Changes in Beaker
*****************
.. include:: ../../CHANGELOG
beaker-1.12.1/beaker/docs/conf.py
# -*- coding: utf-8 -*-
#
# Beaker documentation build configuration file, created by
# sphinx-quickstart on Fri Sep 19 15:12:15 2008.
#
# This file is execfile()d with the current directory set to its containing dir.
#
# The contents of this file are pickled, so don't put values in the namespace
# that aren't pickleable (module imports are okay, they're removed automatically).
#
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys
import os
# If your extensions are in another directory, add it here. If the directory
# is relative to the documentation root, use os.path.abspath to make it
# absolute, like shown here.
sys.path.insert(0, os.path.abspath('../..'))
# General configuration
# ---------------------
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc']
# Add any paths that contain templates here, relative to this directory.
# templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'Beaker'
copyright = u'2008-2016, Ben Bangert, Mike Bayer'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = '1.12'
# The full version, including alpha/beta/rc tags.
release = '1.12.1'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
# List of documents that shouldn't be included in the build.
#unused_docs = []
# List of directories, relative to source directory, that shouldn't be searched
# for source files.
exclude_trees = []
# The reST default role (used for this markup: `text`) to use for all documents.
#default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
show_authors = True
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'pastie'
# Options for HTML output
# -----------------------
# The style sheet to use for HTML and HTML Help pages. A file of that name
# must exist either in Sphinx' static/ path, or in one of the custom paths
# given in html_static_path.
# html_style = 'default.css'
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
#html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
#html_short_title = None
# The name of an image file (within the static path) to place at the top of
# the sidebar.
#html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True
# html_index = 'contents.html'
# Custom sidebar templates, maps document names to template names.
# html_sidebars = {'index': 'indexsidebar.html'}
# Additional templates that should be rendered to pages, maps page names to
# template names.
# html_additional_pages = {'index': 'index.html'}
html_theme_options = {
}
# If false, no module index is generated.
#html_use_modindex = True
# If false, no index is generated.
#html_use_index = True
# If true, the index is split into individual pages for each letter.
#html_split_index = False
# If true, the reST sources are included in the HTML build as _sources/.
#html_copy_source = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
html_use_opensearch = 'https://beaker.readthedocs.io/'
# If nonempty, this is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = ''
# Output file base name for HTML help builder.
htmlhelp_basename = 'Beakerdoc'
# Options for LaTeX output
# ------------------------
# The paper size ('letter' or 'a4').
#latex_paper_size = 'letter'
# The font size ('10pt', '11pt' or '12pt').
#latex_font_size = '10pt'
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, document class [howto/manual]).
latex_documents = [
('contents', 'Beaker.tex', u'Beaker Documentation',
u'Ben Bangert, Mike Bayer', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False
# Additional stuff for the LaTeX preamble.
latex_preamble = r'''
\usepackage{palatino}
\definecolor{TitleColor}{rgb}{0.7,0,0}
\definecolor{InnerLinkColor}{rgb}{0.7,0,0}
\definecolor{OuterLinkColor}{rgb}{0.8,0,0}
\definecolor{VerbatimColor}{rgb}{0.985,0.985,0.985}
\definecolor{VerbatimBorderColor}{rgb}{0.8,0.8,0.8}
'''
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
latex_use_modindex = False
# Added to handle docs in middleware.py
autoclass_content = "both"
beaker-1.12.1/beaker/docs/configuration.rst
.. _configuration:
=============
Configuration
=============
Beaker can be configured several different ways, depending on how it is to be
used. The most recommended style is to use a dictionary of preferences that
are to be passed to either the :class:`~beaker.middleware.SessionMiddleware` or
the :class:`~beaker.cache.CacheManager`.
Since both Beaker's sessions and caching use the same back-end container
storage system, there are some options applicable to both of them in
addition to the session- and cache-specific configuration.
Most options can be specified as strings (necessary for options configured in
INI files) and will be coerced to the appropriate type. Only datetime and
timedelta values cannot be coerced and must be passed as actual objects.
Frameworks using Beaker usually allow both caching and sessions to be
configured in the same spot; Beaker assumes this as well and requires options
for caching and sessions to be prefixed appropriately.
For example, to configure the ``cookie_expires`` option for Beaker sessions
below, an appropriate entry in a `Pylons`_ INI file would be::
# Setting cookie_expires = true causes Beaker to omit the
# expires= field from the Set-Cookie: header, signaling the cookie
# should be discarded when the browser closes.
beaker.session.cookie_expires = true
.. note::
When using the options in a framework like `Pylons`_ or `TurboGears2`_, these
options must be prefixed by ``beaker.``, for example in a `Pylons`_ INI file::
beaker.session.data_dir = %(here)s/data/sessions/data
beaker.session.lock_dir = %(here)s/data/sessions/lock
Or when using stand-alone with the :class:`~beaker.middleware.SessionMiddleware`:
.. code-block:: python
from beaker.middleware import SessionMiddleware
session_opts = {
'session.cookie_expires': True
}
app = SomeWSGIAPP()
app = SessionMiddleware(app, session_opts)
Or when using the :class:`~beaker.cache.CacheManager`:
.. code-block:: python
from beaker.cache import CacheManager
from beaker.util import parse_cache_config_options
cache_opts = {
'cache.type': 'file',
'cache.data_dir': '/tmp/cache/data',
'cache.lock_dir': '/tmp/cache/lock'
}
cache = CacheManager(**parse_cache_config_options(cache_opts))
.. note::
When using the CacheManager directly, all dict options must be run through the
:func:`beaker.util.parse_cache_config_options` function to ensure they're valid
and of the appropriate type.
Options For Sessions and Caching
================================
data_dir (**optional**, string)
Used with any back-end that stores its data in physical files, such as the
dbm or file-based back-ends. This path should be an absolute path to the
directory that stores the files.
lock_dir (**required**, string)
    Used with every back-end to coordinate locking. With caching, this lock
    file is used to ensure that multiple processes/threads aren't attempting
    to re-create the same value at the same time (the :term:`Dog-Pile Effect`).
memcache_module (**optional**, string)
One of the names ``memcache``, ``cmemcache``, ``pylibmc``, or ``auto``.
Default is ``auto``. Specifies which memcached client library should
be imported when using the ext:memcached backend. If left at its
default of ``auto``, ``pylibmc`` is favored first, then ``cmemcache``,
then ``memcache``. New in 1.5.5.
type (**required**, string)
The name of the back-end to use for storing the sessions or cache objects.
Available back-ends supplied with Beaker: ``file``, ``dbm``, ``memory``,
``ext:memcached``, ``ext:database``, ``ext:google``, ``ext:mongodb``,
and ``ext:redis``.
For sessions, the additional type of ``cookie`` is available which
will store all the session data in the cookie itself. As such, size
limitations apply (4096 bytes).
Some of these back-ends require the url option as listed below.
webtest_varname (**optional**, string)
The name of the attribute to use when stashing the session object into
the environ for use with WebTest. The name provided here is where the
session object will be attached to the WebTest TestApp return value.
url (**optional**, string)
URL is specific to use of either ``ext:memcached``, ``ext:database``,
``ext:mongodb``, or ``ext:redis``. When using one of those types, this
option is **required**.
When used with ``ext:memcached``, this should be either a single, or
semi-colon separated list of memcached servers::
session_opts = {
'session.type': 'ext:memcached',
'session.url': '127.0.0.1:11211',
}
When used with ``ext:database``, this should be a valid `SQLAlchemy`_ database
string.
    When used with ``ext:redis``, this should be a URL as passed to
    ``StrictRedis.from_url()``.
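For example, a Redis-backed session configuration might look like this (the URL shown is an assumption for a local Redis instance on the default port):

```python
session_opts = {
    'session.type': 'ext:redis',
    'session.url': 'redis://localhost:6379/0',
}
```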
Session Options
===============
The Session handling takes a variety of additional options relevant to how it
stores session id's in cookies, and when using the optional encryption.
auto (**optional**, bool)
When set to True, the session will save itself anytime it is accessed
during a request, negating the need to issue the
:meth:`~beaker.session.Session.save` method.
Defaults to False.
cookie_expires (**optional**, bool, datetime, timedelta, int)
Determines when the cookie used to track the client-side of the session
will expire. When set to a boolean value, it will either expire at the
end of the browsers session, or never expire.
Setting to a datetime forces a hard ending time for the session (generally
used for setting a session to a far off date).
Setting to an integer will result in the cookie being set to expire in
that many seconds. I.e. a value of ``300`` will result in the cookie being
set to expire in 300 seconds.
Defaults to never expiring.
.. _cookie_domain_config:
cookie_domain (**optional**, string)
What domain the cookie should be set to. When using sub-domains, this
should be set to the main domain the cookie should be valid for. For
example, if a cookie should be valid under ``www.nowhere.com`` **and**
``files.nowhere.com`` then it should be set to ``.nowhere.com``.
Defaults to the current domain in its entirety.
    Alternatively, the domain can be set dynamically on the session;
    see :ref:`cookie_attributes`.
key (**required**, string)
Name of the cookie key used to save the session under.
save_accessed_time (**optional**, bool)
Whether beaker should save the session's access time (true) or only
modification time (false).
Defaults to true.
secret (**required**, string)
Used with the HMAC to ensure session integrity. This value should
ideally be a randomly generated string.
When using in a cluster environment, the secret must be the same on
every machine.
secure (**optional**, bool)
Whether or not the session cookie should be marked as secure. When
marked as secure, browsers are instructed to not send the cookie over
anything other than an SSL connection.
timeout (**optional**, integer)
Seconds until the session is considered invalid, after which it will
be ignored and invalidated. This number is based on the time since
the session was last accessed, not from when the session was created.
Defaults to never expiring.
Requires that save_accessed_time be true.
Encryption Options
------------------
These options should be used *instead* of the ``secret`` option when a
**cookie**-only session is used, and *together* with the ``secret`` option
when a server-side session is used.
encrypt_key (**required**, string)
Encryption key to use for the AES cipher. This should be a fairly long
randomly generated string.
validate_key (**required**, string)
Validation key used to sign the AES encrypted data.
crypto_type (**optional**, string)
Encryption backend to use. If ``default`` is used, one of the installed
backends is picked.
Other valid choices are ``cryptography``, ``nss``, ``pycrypto``.
.. note::
You may need to install additional libraries to use Beaker's
cookie-based session encryption. See the :ref:`encryption` section for
more information.
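A cookie-only session configured with encryption might look like the following sketch (the key values are placeholders; in practice they should be long, randomly generated strings):

```python
session_opts = {
    'session.type': 'cookie',
    'session.key': 'mysession',
    # Placeholder values -- replace with long, randomly generated strings
    'session.encrypt_key': 'REPLACE-WITH-RANDOM-ENCRYPT-KEY',
    'session.validate_key': 'REPLACE-WITH-RANDOM-VALIDATE-KEY',
}
```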
Cache Options
=============
For caching, options may be directly specified on a per-use basis with the
:meth:`~beaker.cache.CacheManager.cache` decorator, with the rest of these
options used as fallback should one of them not be specified in the call.
Only the ``lock_dir`` option is strictly required, unless using the file-based
back-ends as noted with the sessions.
expire (**optional**, integer)
Seconds until the cache is considered old and a new value is created.
Cache Region Options
--------------------
.. _cache_region_options:
Starting in Beaker 1.3, cache regions are now supported. These can be thought
of as bundles of configuration options to apply, rather than specifying the
type and expiration on a per-usage basis.
enabled (**optional**, bool)
Quick toggle to disable or enable caching across an entire application.
This should generally be used when testing an application or in
development when caching should be ignored.
Defaults to True.
regions (**optional**, list, tuple)
Names of the regions that are to be configured.
For each region, all of the other cache options are valid and will
be read out of the cache options for that key. Options that are not
listed under a region will be used globally in the cache unless a
region specifies a different value.
For example, to specify two batches of options, one called ``long-term``,
and one called ``short-term``::
    cache_opts = {
        'cache.data_dir': '/tmp/cache/data',
        'cache.lock_dir': '/tmp/cache/lock',
        'cache.regions': 'short_term, long_term',
        'cache.short_term.type': 'ext:memcached',
        'cache.short_term.url': '127.0.0.1:11211',
        'cache.short_term.expire': '3600',
        'cache.long_term.type': 'file',
        'cache.long_term.expire': '86400',
    }
.. _Pylons: https://pylonsproject.org/
.. _TurboGears2: http://turbogears.org/
.. _SQLAlchemy: http://www.sqlalchemy.org/
beaker-1.12.1/beaker/docs/glossary.rst
.. _glossary:
Glossary
========
.. glossary::
Cache Regions
Bundles of configuration options keyed to a user-defined variable
for use with the :meth:`beaker.cache.CacheManager.region`
decorator.
Container
A Beaker container is a storage object for a specific cache value
and the key under the namespace it has been assigned.
Dog-Pile Effect
What occurs when a cached object expires, and multiple requests to
fetch it are made at the same time. In systems that don't lock or
use a scheme to prevent multiple instances from simultaneously
creating the same thing, every request will cause the system to
create a new value to be cached.
Beaker alleviates this with file locking to ensure that only a single
copy is re-created while other requests for the same object are
instead given the old value until the new one is ready.
NamespaceManager
A Beaker namespace manager, is best thought of as a collection of
containers with various keys. For example, a single template to be
cached might vary slightly depending on search term, or user login, so
the template would be keyed based on the variable that changes its
output.
The namespace would be the template name, while each container would
correspond to one of the values and the key it responds to.
beaker-1.12.1/beaker/docs/index.rst
Beaker Documentation
====================
Beaker is a library for caching and sessions for use with web applications and
stand-alone Python scripts and applications. It comes with WSGI middleware for
easy drop-in use with WSGI based web applications, and caching decorators for
ease of use with any Python based application.
* **Lazy-Loading Sessions**: No performance hit for having sessions active in a request unless they're actually used
* **Performance**: Utilizes a multiple-reader / single-writer locking system to prevent the Dog Pile effect when caching.
* **Multiple Back-ends**: File-based, DBM files, memcached, memory, Redis, MongoDB, and database (via SQLAlchemy) back-ends available for sessions and caching
* **Cookie-based Sessions**: SHA-1 signatures with optional AES encryption for client-side cookie-based session storage
* **Flexible Caching**: Data can be cached per function to different back-ends, with different expirations, and different keys
* **Extensible Back-ends**: Add support for new back-ends using setuptools entry points.
.. toctree::
:maxdepth: 2
configuration
sessions
caching
.. toctree::
:maxdepth: 1
changes
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
* :ref:`glossary`
Module Listing
--------------
.. toctree::
:maxdepth: 2
modules/cache
modules/container
modules/middleware
modules/session
modules/synchronization
modules/util
modules/database
modules/memcached
modules/mongodb
modules/redis
modules/google
modules/sqla
modules/pbkdf2
beaker-1.12.1/beaker/docs/modules/cache.rst
:mod:`beaker.cache` -- Cache module
================================================
.. automodule:: beaker.cache
Module Contents
---------------
.. autodata:: beaker.cache.cache_regions
.. autofunction:: cache_region
.. autofunction:: region_invalidate
.. autoclass:: Cache
:members: get, clear
.. autoclass:: CacheManager
:members: region, region_invalidate, cache, invalidate
beaker-1.12.1/beaker/docs/modules/container.rst
:mod:`beaker.container` -- Container and Namespace classes
==========================================================
.. automodule:: beaker.container
Module Contents
---------------
.. autoclass:: DBMNamespaceManager
:show-inheritance:
.. autoclass:: FileNamespaceManager
:show-inheritance:
.. autoclass:: MemoryNamespaceManager
:show-inheritance:
.. autoclass:: NamespaceManager
:members:
.. autoclass:: OpenResourceNamespaceManager
:show-inheritance:
.. autoclass:: Value
:members:
:undoc-members:
Deprecated Classes
------------------
.. autoclass:: Container
.. autoclass:: ContainerMeta
:show-inheritance:
.. autoclass:: DBMContainer
:show-inheritance:
.. autoclass:: FileContainer
:show-inheritance:
.. autoclass:: MemoryContainer
:show-inheritance:
beaker-1.12.1/beaker/docs/modules/database.rst
:mod:`beaker.ext.database` -- Database Container and NameSpace Manager classes
==============================================================================
.. automodule:: beaker.ext.database
Module Contents
---------------
.. autoclass:: DatabaseContainer
.. autoclass:: DatabaseNamespaceManager
beaker-1.12.1/beaker/docs/modules/google.rst
:mod:`beaker.ext.google` -- Google Container and NameSpace Manager classes
==========================================================================
.. automodule:: beaker.ext.google
Module Contents
---------------
.. autoclass:: GoogleContainer
.. autoclass:: GoogleNamespaceManager
beaker-1.12.1/beaker/docs/modules/memcached.rst
:mod:`beaker.ext.memcached` -- Memcached Container and NameSpace Manager classes
================================================================================
.. automodule:: beaker.ext.memcached
Module Contents
---------------
.. autoclass:: MemcachedContainer
:show-inheritance:
.. autoclass:: MemcachedNamespaceManager
:show-inheritance:
.. autoclass:: PyLibMCNamespaceManager
:show-inheritance:
beaker-1.12.1/beaker/docs/modules/middleware.rst
:mod:`beaker.middleware` -- Middleware classes
==============================================
.. automodule:: beaker.middleware
Module Contents
---------------
.. autoclass:: CacheMiddleware
.. autoclass:: SessionMiddleware
beaker-1.12.1/beaker/docs/modules/mongodb.rst
:mod:`beaker.ext.mongodb` -- MongoDB NameSpace Manager and Synchronizer
==============================================================================
.. automodule:: beaker.ext.mongodb
Module Contents
---------------
.. autoclass:: MongoNamespaceManager
.. autoclass:: MongoSynchronizer
beaker-1.12.1/beaker/docs/modules/pbkdf2.rst
:mod:`beaker.crypto.pbkdf2` -- PKCS#5 v2.0 Password-Based Key Derivation classes
================================================================================
.. automodule:: beaker.crypto.pbkdf2
Module Contents
---------------
.. autofunction:: pbkdf2
beaker-1.12.1/beaker/docs/modules/redis.rst
:mod:`beaker.ext.redisnm` -- Redis NameSpace Manager and Synchronizer
==============================================================================
.. automodule:: beaker.ext.redisnm
Module Contents
---------------
.. autoclass:: RedisNamespaceManager
.. autoclass:: RedisSynchronizer
beaker-1.12.1/beaker/docs/modules/session.rst
:mod:`beaker.session` -- Session classes
========================================
.. automodule:: beaker.session
Module Contents
---------------
.. autoclass:: CookieSession
:members: save, expire, delete, invalidate
.. autoclass:: Session
:members: save, revert, lock, unlock, delete, invalidate
.. autoclass:: SessionObject
:members: persist, get_by_id, accessed
.. autoclass:: SignedCookie
.. autodata:: InvalidSignature
beaker-1.12.1/beaker/docs/modules/sqla.rst
:mod:`beaker.ext.sqla` -- SqlAlchemy Container and NameSpace Manager classes
============================================================================
.. automodule:: beaker.ext.sqla
Module Contents
---------------
.. autofunction:: make_cache_table
.. autoclass:: SqlaContainer
.. autoclass:: SqlaNamespaceManager
beaker-1.12.1/beaker/docs/modules/synchronization.rst
:mod:`beaker.synchronization` -- Synchronization classes
========================================================
.. automodule:: beaker.synchronization
Module Contents
---------------
.. autoclass:: ConditionSynchronizer
.. autoclass:: FileSynchronizer
.. autoclass:: NameLock
.. autoclass:: null_synchronizer
.. autoclass:: SynchronizerImpl
:members:
beaker-1.12.1/beaker/docs/modules/util.rst
:mod:`beaker.util` -- Beaker Utilities
========================================================
.. automodule:: beaker.util
Module Contents
---------------
.. autofunction:: encoded_path
.. autofunction:: func_namespace
.. autoclass:: SyncDict
.. autoclass:: ThreadLocal
.. autofunction:: verify_directory
.. autofunction:: parse_cache_config_options

beaker-1.12.1/beaker/docs/sessions.rst
.. _sessions:
========
Sessions
========
About
=====
Sessions provide a place to persist data in web applications. Beaker's session
system simplifies session implementation details by providing WSGI middleware
that handles them.
All cookies are signed with an HMAC signature to prevent tampering by the
client.
Lazy-Loading
------------
Only when a session object is actually accessed will the session be loaded
from the file-system, preventing performance hits on pages that don't use
the session.
Using
=====
The session object provided by Beaker's
:class:`~beaker.middleware.SessionMiddleware` implements a dict-style interface
with a few additional object methods. Once the SessionMiddleware is in place,
a session object will be made available as ``beaker.session`` in the WSGI
environ.
When a session is created on the backend, a cookie is placed in the response to
the client.
Getting data out of the session::
myvar = session['somekey']
Testing for a value::
logged_in = 'user_id' in session
Adding data to the session::
session['name'] = 'Fred Smith'
Complete example using a basic WSGI app with sessions::
from beaker.middleware import SessionMiddleware
def simple_app(environ, start_response):
# Get the session object from the environ
session = environ['beaker.session']
# Check to see if a value is in the session
user = 'logged_in' in session
# Set some other session variable
session['user_id'] = 10
start_response('200 OK', [('Content-type', 'text/plain')])
        return [('User is logged in: %s' % user).encode('utf-8')]
# Configure the SessionMiddleware
session_opts = {
'session.type': 'file',
'session.cookie_expires': True,
}
wsgi_app = SessionMiddleware(simple_app, session_opts)
Now ``wsgi_app`` is a replacement for the original ``simple_app`` application.
You should specify it as the request handler in your WSGI configuration file.
.. note::
This example does **not** actually save the session for the next request.
Adding the :meth:`~beaker.session.Session.save` call explained below is
required, or having the session set to auto-save.
.. _cookie_attributes:
Session Attributes / Keys
-------------------------
Sessions have several special attributes that can be used as needed by an
application.
* id - Unique session ID string (generated from uuid4 by default; historically
  a 40-character SHA hash)
* last_accessed - The last time the session was accessed before the current
access, if save_accessed_time is true; the last time it was modified if false;
will be None if the session was just made
There's several special session keys populated as well:
* _accessed_time - When the session was loaded if save_accessed_time is true;
when it was last written if false
* _creation_time - When the session was created
Saving
======
Sessions can be saved using the :meth:`~beaker.session.Session.save` method
on the session object::
session.save()
.. warning::
Beaker relies on Python's pickle module to pickle data objects for storage
in the session. Objects that cannot be pickled should **not** be stored in
the session. It's suggested to switch to **json** ``data_serializer`` to avoid
possible security issues with pickle.
This flags a session to be saved, and it will be stored on the chosen back-end
at the end of the request.
.. warning::
    When using the ``memory`` backend, the session will only be valid for the
    process that created it and will be lost when the process is restarted. It
    is usually suggested to only use the ``memory`` backend for development and
    not for production.
If it's necessary to immediately save the session to the back-end, the
:meth:`~beaker.session.SessionObject.persist` method should be used::
session.persist()
This is not usually the case however, as a session generally should not be
saved should something catastrophic happen during a request.
**Order Matters**: When using the Beaker middleware, you **must call save before
the headers are sent to the client**. Since Beaker's middleware watches for when
the ``start_response`` function is called to know that it should add its cookie
header, the session must be saved before it is called.
Keep in mind that Response objects in popular frameworks (WebOb, Werkzeug,
etc.) call start_response immediately, so if you are using one of those
objects to handle your Response, you must call .save() before the Response
object is called::
# this would apply to WebOb and possibly others too
from werkzeug.wrappers import Response
# this will work
def sessions_work(environ, start_response):
environ['beaker.session']['count'] += 1
resp = Response('hello')
environ['beaker.session'].save()
return resp(environ, start_response)
# this will not work
def sessions_broken(environ, start_response):
environ['beaker.session']['count'] += 1
resp = Response('hello')
retval = resp(environ, start_response)
environ['beaker.session'].save()
return retval
Auto-save
---------
Saves can be done automatically by setting the ``auto`` configuration option
for sessions. When set, calling the :meth:`~beaker.session.Session.save` method
is no longer required, and the session will be saved automatically anytime it is
accessed during a request.
Deleting
========
Calling the :meth:`~beaker.session.Session.delete` method deletes the session
from the back-end storage and sends an expiration on the cookie requesting the
browser to clear it::
    session.delete()
This should be used at the end of a request when the session should be deleted
and will not be used further in the request.
If a session should be invalidated, and a new session created and used during
the request, the :meth:`~beaker.session.Session.invalidate` method should be
used::
    session.invalidate()
Removing Expired/Old Sessions
-----------------------------
Beaker does **not** automatically delete expired or old cookies on any of its
back-ends. This task is left up to the developer based on how sessions are
being used, and on what back-end.
The database backend records the last accessed time as a column in the database
so a script could be run to delete session rows in the database that haven't
been used in a long time.
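A hypothetical cleanup script along these lines is sketched below. The table
and column names (``beaker_cache``, ``accessed``) match the database backend's
defaults, and SQLite is used purely for illustration; any DB-API connection
works similarly.

```python
# Hypothetical cleanup script: delete session rows not accessed for N days.
# 'beaker_cache' and 'accessed' are the backend's default table/column names;
# adjust them to match your configuration.
import sqlite3
from datetime import datetime, timedelta

def purge_old_sessions(db_path, max_age_days=3, table='beaker_cache'):
    cutoff = datetime.now() - timedelta(days=max_age_days)
    conn = sqlite3.connect(db_path)
    with conn:  # commits on success, rolls back on error
        conn.execute(
            "DELETE FROM %s WHERE accessed < ?" % table,
            (cutoff.isoformat(' '),))
    conn.close()
```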
When using the file-based sessions, a script could run to remove files that
haven't been touched in a long time, for example (in the session's data dir):
.. code-block:: bash
    find . -type f -mtime +3 -print -exec rm {} \;
Cookie Domain and Path
======================
In addition to setting a default cookie domain with the
:ref:`cookie domain setting `, the cookie's domain and
path can be set dynamically for a session with the domain and path properties.
These settings will persist as long as the cookie exists, or until changed.
Example::
    # Setting the session's cookie domain and path
    session.domain = '.domain.com'
    session.path = '/admin'
Cookie Security
======================
Beaker defaults to setting the cookie attributes ``httponly`` and ``secure``
to False. You may want to set them to True in production. The ``samesite``
attribute defaults to ``Lax``; you can choose ``Strict`` for stronger
protection. The reasons for using these cookie attributes are explained in
these OWASP guides: `HttpOnly`_, `SecureFlag`_ and `SameSite`_.
Example::
    # Best practice cookie flags for security
    session.httponly = True
    session.secure = True
    session.samesite = 'Lax'  # or 'Strict'
.. _SecureFlag: https://www.owasp.org/index.php/SecureFlag
.. _HttpOnly: https://www.owasp.org/index.php/HttpOnly#Mitigating_the_Most_Common_XSS_attack_using_HttpOnly
.. _SameSite: https://www.owasp.org/index.php/SameSite
Cookie-Based
============
Session can be stored purely on the client-side using cookie-based sessions.
This option can be turned on by setting the session type to ``cookie``.
Using cookie-based sessions carries the limitation of how large a cookie can
be (generally 4096 bytes). An exception will be thrown should a session grow
too large to fit in a cookie, so cookie-based sessions should be used
carefully, storing only small bits of data in them (the user's login
name, admin status, etc.).
Large cookies can slow down page-loads as they increase latency to every
page request since the cookie is sent for every request under that domain.
Static content such as images and Javascript should be served off a domain
that the cookie is not valid for to prevent this.
Cookie-based sessions scale easily in a clustered environment as there's no
need for a shared storage system when different servers handle the same
session.
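A minimal configuration sketch for cookie-based sessions follows; the secret
is a placeholder, and the ``validate_key`` value is what Beaker uses to sign
the cookie so clients cannot tamper with it.

```python
# Illustrative configuration for signed, cookie-based sessions.
# The secret below is a placeholder; use a long random string in production.
session_opts = {
    'session.type': 'cookie',
    'session.validate_key': 'replace-with-a-long-random-secret',
    'session.httponly': True,
    'session.secure': True,
}
```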
.. _encryption:
Encryption
----------
In the event that the cookie-based sessions should also be encrypted to
prevent the user from being able to decode the data (in addition to not
being able to tamper with it), Beaker can use 256-bit AES encryption to
secure the contents of the cookie.
Depending on the Python implementation used, Beaker may require an additional
library to provide AES encryption.
On CPython (the regular Python), one of the following libraries is required:
* The `python-nss`_ library
* The `pycryptopp`_ library
* The `cryptography`_ library
* The `PyCrypto`_ library
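With one of the libraries above installed, encryption is enabled by supplying
an ``encrypt_key`` in addition to the ``validate_key``; a sketch with
placeholder secrets:

```python
# Illustrative configuration for encrypted cookie-based sessions.
# Both key values are placeholders; generate long random secrets in production.
session_opts = {
    'session.type': 'cookie',
    'session.encrypt_key': 'replace-with-a-random-encryption-secret',
    'session.validate_key': 'replace-with-a-random-validation-secret',
}
```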
On Jython, no additional packages are required, but at least on the Sun JRE,
the size of the encryption key is by default limited to 128 bits, which causes
generated sessions to be incompatible with those generated in CPython, and vice
versa. To overcome this limitation, you need to install the unlimited strength
jurisdiction policy files from Sun:
* `Policy files for Java 5 `_
* `Policy files for Java 6 `_
.. _cryptography: https://pypi.python.org/pypi/cryptography/
.. _python-nss: https://pypi.python.org/pypi/python-nss/
.. _pycryptopp: https://pypi.python.org/pypi/pycryptopp/
.. _PyCrypto: https://pypi.python.org/pypi/pycrypto/
beaker-1.12.1/beaker/exceptions.py000066400000000000000000000006731436751141500170260ustar00rootroot00000000000000"""Beaker exception classes"""
class BeakerException(Exception):
pass
class BeakerWarning(RuntimeWarning):
"""Issued at runtime."""
class CreationAbortedError(Exception):
"""Deprecated."""
class InvalidCacheBackendError(BeakerException, ImportError):
pass
class MissingCacheParameter(BeakerException):
pass
class LockError(BeakerException):
pass
class InvalidCryptoBackendError(BeakerException):
pass
beaker-1.12.1/beaker/ext/000077500000000000000000000000001436751141500150655ustar00rootroot00000000000000beaker-1.12.1/beaker/ext/__init__.py000066400000000000000000000000001436751141500171640ustar00rootroot00000000000000beaker-1.12.1/beaker/ext/database.py000066400000000000000000000061451436751141500172110ustar00rootroot00000000000000from beaker._compat import pickle
import logging
from datetime import datetime
from beaker.container import OpenResourceNamespaceManager, Container
from beaker.exceptions import InvalidCacheBackendError, MissingCacheParameter
from beaker.synchronization import file_synchronizer, null_synchronizer
from beaker.util import verify_directory, SyncDict
from beaker.ext.sqla import SqlaNamespaceManager
log = logging.getLogger(__name__)
sa = None
types = None
class DatabaseNamespaceManager(SqlaNamespaceManager):
@classmethod
def _init_dependencies(cls):
SqlaNamespaceManager._init_dependencies()
global sa, types
if sa is not None:
return
# SqlaNamespaceManager will already error
import sqlalchemy as sa
from sqlalchemy import types
def __init__(self, namespace, url=None, sa_opts=None, table_name='beaker_cache',
data_dir=None, lock_dir=None, schema_name=None, **params):
"""Creates a database namespace manager
``url``
SQLAlchemy compliant db url
``sa_opts``
A dictionary of SQLAlchemy keyword options to initialize the engine
with.
``table_name``
The table name to use in the database for the cache.
``schema_name``
The schema name to use in the database for the cache.
"""
OpenResourceNamespaceManager.__init__(self, namespace)
if sa_opts is None:
sa_opts = {}
self.lock_dir = None
if lock_dir:
self.lock_dir = lock_dir
elif data_dir:
self.lock_dir = data_dir + "/container_db_lock"
if self.lock_dir:
verify_directory(self.lock_dir)
# Check to see if the table's been created before
sa_opts['sa.url'] = url = url or sa_opts['sa.url']
table_key = url + table_name
def make_table(engine):
meta = sa.MetaData()
meta.bind = engine
cache_table = sa.Table(table_name, meta,
sa.Column('id', types.Integer, primary_key=True),
sa.Column('namespace', types.String(255), nullable=False),
sa.Column('accessed', types.DateTime, nullable=False),
sa.Column('created', types.DateTime, nullable=False),
sa.Column('data', types.PickleType, nullable=False),
sa.UniqueConstraint('namespace'),
schema=schema_name if schema_name else meta.schema)
cache_table.create(bind=engine, checkfirst=True)
return cache_table
engine = self.__class__.binds.get(url, lambda: sa.engine_from_config(sa_opts, 'sa.'))
table = self.__class__.tables.get(table_key, lambda: make_table(engine))
SqlaNamespaceManager.__init__(self, namespace, engine, table,
data_dir=data_dir, lock_dir=lock_dir)
class DatabaseContainer(Container):
namespace_manager = DatabaseNamespaceManager
beaker-1.12.1/beaker/ext/google.py000066400000000000000000000075711436751141500167250ustar00rootroot00000000000000from beaker._compat import pickle
import logging
from datetime import datetime
from beaker.container import OpenResourceNamespaceManager, Container
from beaker.exceptions import InvalidCacheBackendError
from beaker.synchronization import null_synchronizer
log = logging.getLogger(__name__)
db = None
class GoogleNamespaceManager(OpenResourceNamespaceManager):
tables = {}
@classmethod
def _init_dependencies(cls):
global db
if db is not None:
return
try:
db = __import__('google.appengine.ext.db').appengine.ext.db
except ImportError:
raise InvalidCacheBackendError("Datastore cache backend requires the "
"'google.appengine.ext' library")
def __init__(self, namespace, table_name='beaker_cache', **params):
"""Creates a datastore namespace manager"""
OpenResourceNamespaceManager.__init__(self, namespace)
def make_cache():
table_dict = dict(created=db.DateTimeProperty(),
accessed=db.DateTimeProperty(),
data=db.BlobProperty())
table = type(table_name, (db.Model,), table_dict)
return table
self.table_name = table_name
self.cache = GoogleNamespaceManager.tables.setdefault(table_name, make_cache())
self.hash = {}
self._is_new = False
self.loaded = False
self.log_debug = logging.DEBUG >= log.getEffectiveLevel()
# Google wants namespaces to start with letters, change the namespace
# to start with a letter
self.namespace = 'p%s' % self.namespace
def get_access_lock(self):
return null_synchronizer()
def get_creation_lock(self, key):
# this is weird, should probably be present
return null_synchronizer()
def do_open(self, flags, replace):
# If we already loaded the data, don't bother loading it again
if self.loaded:
self.flags = flags
return
item = self.cache.get_by_key_name(self.namespace)
if not item:
self._is_new = True
self.hash = {}
else:
self._is_new = False
try:
self.hash = pickle.loads(str(item.data))
except (IOError, OSError, EOFError, pickle.PickleError):
if self.log_debug:
                    log.debug("Couldn't load pickle data, creating new storage")
self.hash = {}
self._is_new = True
self.flags = flags
self.loaded = True
def do_close(self):
if self.flags is not None and (self.flags == 'c' or self.flags == 'w'):
if self._is_new:
item = self.cache(key_name=self.namespace)
item.data = pickle.dumps(self.hash)
item.created = datetime.now()
item.accessed = datetime.now()
item.put()
self._is_new = False
else:
item = self.cache.get_by_key_name(self.namespace)
item.data = pickle.dumps(self.hash)
item.accessed = datetime.now()
item.put()
self.flags = None
def do_remove(self):
item = self.cache.get_by_key_name(self.namespace)
item.delete()
self.hash = {}
# We can retain the fact that we did a load attempt, but since the
# file is gone this will be a new namespace should it be saved.
self._is_new = True
def __getitem__(self, key):
return self.hash[key]
def __contains__(self, key):
return key in self.hash
def __setitem__(self, key, value):
self.hash[key] = value
def __delitem__(self, key):
del self.hash[key]
def keys(self):
return self.hash.keys()
class GoogleContainer(Container):
namespace_class = GoogleNamespaceManager
beaker-1.12.1/beaker/ext/memcached.py000066400000000000000000000155041436751141500173520ustar00rootroot00000000000000from .._compat import PY2
from beaker.container import NamespaceManager, Container
from beaker.crypto.util import sha1
from beaker.exceptions import InvalidCacheBackendError, MissingCacheParameter
from beaker.synchronization import file_synchronizer
from beaker.util import verify_directory, SyncDict, parse_memcached_behaviors
import warnings
MAX_KEY_LENGTH = 250
_client_libs = {}
def _load_client(name='auto'):
if name in _client_libs:
return _client_libs[name]
def _pylibmc():
global pylibmc
import pylibmc
return pylibmc
def _cmemcache():
global cmemcache
import cmemcache
warnings.warn("cmemcache is known to have serious "
"concurrency issues; consider using 'memcache' "
"or 'pylibmc'")
return cmemcache
def _memcache():
global memcache
import memcache
return memcache
def _bmemcached():
global bmemcached
import bmemcached
return bmemcached
def _auto():
for _client in (_pylibmc, _cmemcache, _memcache, _bmemcached):
try:
return _client()
except ImportError:
pass
else:
            raise InvalidCacheBackendError(
                "Memcached cache backend requires one of: "
                "'pylibmc', 'memcache', 'cmemcache' or 'bmemcached' "
                "to be installed.")
clients = {
'pylibmc': _pylibmc,
'cmemcache': _cmemcache,
'memcache': _memcache,
'bmemcached': _bmemcached,
'auto': _auto
}
_client_libs[name] = clib = clients[name]()
return clib
def _is_configured_for_pylibmc(memcache_module_config, memcache_client):
return memcache_module_config == 'pylibmc' or \
memcache_client.__name__.startswith('pylibmc')
class MemcachedNamespaceManager(NamespaceManager):
"""Provides the :class:`.NamespaceManager` API over a memcache client library."""
clients = SyncDict()
def __new__(cls, *args, **kw):
memcache_module = kw.pop('memcache_module', 'auto')
memcache_client = _load_client(memcache_module)
if _is_configured_for_pylibmc(memcache_module, memcache_client):
return object.__new__(PyLibMCNamespaceManager)
else:
return object.__new__(MemcachedNamespaceManager)
def __init__(self, namespace, url,
memcache_module='auto',
data_dir=None, lock_dir=None,
**kw):
NamespaceManager.__init__(self, namespace)
_memcache_module = _client_libs[memcache_module]
if not url:
raise MissingCacheParameter("url is required")
self.lock_dir = None
if lock_dir:
self.lock_dir = lock_dir
elif data_dir:
self.lock_dir = data_dir + "/container_mcd_lock"
if self.lock_dir:
verify_directory(self.lock_dir)
# Check for pylibmc namespace manager, in which case client will be
# instantiated by subclass __init__, to handle behavior passing to the
# pylibmc client
if not _is_configured_for_pylibmc(memcache_module, _memcache_module):
self.mc = MemcachedNamespaceManager.clients.get(
(memcache_module, url),
_memcache_module.Client,
url.split(';'))
def get_creation_lock(self, key):
return file_synchronizer(
identifier="memcachedcontainer/funclock/%s/%s" %
(self.namespace, key), lock_dir=self.lock_dir)
def _format_key(self, key):
if not isinstance(key, str):
key = key.decode('ascii')
formated_key = (self.namespace + '_' + key).replace(' ', '\302\267')
if len(formated_key) > MAX_KEY_LENGTH:
if not PY2:
formated_key = formated_key.encode('utf-8')
formated_key = sha1(formated_key).hexdigest()
return formated_key
def __getitem__(self, key):
return self.mc.get(self._format_key(key))
def __contains__(self, key):
value = self.mc.get(self._format_key(key))
return value is not None
def has_key(self, key):
return key in self
def set_value(self, key, value, expiretime=None):
if expiretime:
self.mc.set(self._format_key(key), value, time=expiretime)
else:
self.mc.set(self._format_key(key), value)
def __setitem__(self, key, value):
self.set_value(key, value)
def __delitem__(self, key):
self.mc.delete(self._format_key(key))
def do_remove(self):
self.mc.flush_all()
def keys(self):
raise NotImplementedError(
"Memcache caching does not "
"support iteration of all cache keys")
class PyLibMCNamespaceManager(MemcachedNamespaceManager):
"""Provide thread-local support for pylibmc."""
pools = SyncDict()
def __init__(self, *arg, **kw):
super(PyLibMCNamespaceManager, self).__init__(*arg, **kw)
memcache_module = kw.get('memcache_module', 'auto')
_memcache_module = _client_libs[memcache_module]
protocol = kw.get('protocol', 'text')
username = kw.get('username', None)
password = kw.get('password', None)
url = kw.get('url')
behaviors = parse_memcached_behaviors(kw)
self.mc = MemcachedNamespaceManager.clients.get(
(memcache_module, url),
_memcache_module.Client,
servers=url.split(';'), behaviors=behaviors,
binary=(protocol == 'binary'), username=username,
password=password)
self.pool = PyLibMCNamespaceManager.pools.get(
(memcache_module, url),
pylibmc.ThreadMappedPool, self.mc)
def __getitem__(self, key):
with self.pool.reserve() as mc:
return mc.get(self._format_key(key))
def __contains__(self, key):
with self.pool.reserve() as mc:
value = mc.get(self._format_key(key))
return value is not None
def has_key(self, key):
return key in self
def set_value(self, key, value, expiretime=None):
with self.pool.reserve() as mc:
if expiretime:
mc.set(self._format_key(key), value, time=expiretime)
else:
mc.set(self._format_key(key), value)
def __setitem__(self, key, value):
self.set_value(key, value)
def __delitem__(self, key):
with self.pool.reserve() as mc:
mc.delete(self._format_key(key))
def do_remove(self):
with self.pool.reserve() as mc:
mc.flush_all()
class MemcachedContainer(Container):
"""Container class which invokes :class:`.MemcacheNamespaceManager`."""
namespace_class = MemcachedNamespaceManager
beaker-1.12.1/beaker/ext/mongodb.py000066400000000000000000000150771436751141500170760ustar00rootroot00000000000000import datetime
import os
import threading
import time
import pickle
try:
import pymongo
import pymongo.errors
import bson
except ImportError:
pymongo = None
bson = None
from beaker.container import NamespaceManager
from beaker.synchronization import SynchronizerImpl
from beaker.util import SyncDict, machine_identifier
from beaker.crypto.util import sha1
from beaker._compat import string_type, PY2
class MongoNamespaceManager(NamespaceManager):
"""Provides the :class:`.NamespaceManager` API over MongoDB.
Provided ``url`` can be both a mongodb connection string or
an already existing MongoClient instance.
    The data will be stored in the ``backer_cache`` collection of the
*default database*, so make sure your connection string or
MongoClient point to a default database.
"""
MAX_KEY_LENGTH = 1024
clients = SyncDict()
def __init__(self, namespace, url, **kw):
super(MongoNamespaceManager, self).__init__(namespace)
self.lock_dir = None # MongoDB uses mongo itself for locking.
if pymongo is None:
raise RuntimeError('pymongo3 is not available')
if isinstance(url, string_type):
self.client = MongoNamespaceManager.clients.get(url, pymongo.MongoClient, url)
else:
self.client = url
self.db = self.client.get_default_database()
def _format_key(self, key):
if not isinstance(key, str):
key = key.decode('ascii')
if len(key) > (self.MAX_KEY_LENGTH - len(self.namespace) - 1):
if not PY2:
key = key.encode('utf-8')
key = sha1(key).hexdigest()
return '%s:%s' % (self.namespace, key)
def get_creation_lock(self, key):
return MongoSynchronizer(self._format_key(key), self.client)
def __getitem__(self, key):
self._clear_expired()
entry = self.db.backer_cache.find_one({'_id': self._format_key(key)})
if entry is None:
raise KeyError(key)
return pickle.loads(entry['value'])
def __contains__(self, key):
self._clear_expired()
entry = self.db.backer_cache.find_one({'_id': self._format_key(key)})
return entry is not None
def has_key(self, key):
return key in self
def set_value(self, key, value, expiretime=None):
self._clear_expired()
expiration = None
if expiretime is not None:
expiration = time.time() + expiretime
value = pickle.dumps(value)
self.db.backer_cache.update_one({'_id': self._format_key(key)},
{'$set': {'value': bson.Binary(value),
'expiration': expiration}},
upsert=True)
def __setitem__(self, key, value):
self.set_value(key, value)
def __delitem__(self, key):
self._clear_expired()
self.db.backer_cache.delete_many({'_id': self._format_key(key)})
def do_remove(self):
self.db.backer_cache.delete_many({'_id': {'$regex': '^%s' % self.namespace}})
def keys(self):
        return [e['_id'].split(':', 1)[-1] for e in self.db.backer_cache.find(
            {'_id': {'$regex': '^%s' % self.namespace}}
        )]
def _clear_expired(self):
now = time.time()
self.db.backer_cache.delete_many({'_id': {'$regex': '^%s' % self.namespace},
'expiration': {'$ne': None, '$lte': now}})
class MongoSynchronizer(SynchronizerImpl):
"""Provides a Writer/Reader lock based on MongoDB.
Provided ``url`` can be both a mongodb connection string or
an already existing MongoClient instance.
The data will be stored into ``beaker_locks`` collection of the
*default database*, so make sure your connection string or
MongoClient point to a default database.
Locks are identified by local machine, PID and threadid, so
are suitable for use in both local and distributed environments.
"""
    # A cache entry generation function can take a while,
    # but 15 minutes is more than a reasonable upper bound.
LOCK_EXPIRATION = 900
MACHINE_ID = machine_identifier()
def __init__(self, identifier, url):
super(MongoSynchronizer, self).__init__()
self.identifier = identifier
if isinstance(url, string_type):
self.client = MongoNamespaceManager.clients.get(url, pymongo.MongoClient, url)
else:
self.client = url
self.db = self.client.get_default_database()
def _clear_expired_locks(self):
now = datetime.datetime.utcnow()
expired = now - datetime.timedelta(seconds=self.LOCK_EXPIRATION)
self.db.beaker_locks.delete_many({'_id': self.identifier, 'timestamp': {'$lte': expired}})
return now
def _get_owner_id(self):
return '%s-%s-%s' % (self.MACHINE_ID, os.getpid(), threading.current_thread().ident)
def do_release_read_lock(self):
owner_id = self._get_owner_id()
self.db.beaker_locks.update_one({'_id': self.identifier, 'readers': owner_id},
{'$pull': {'readers': owner_id}})
def do_acquire_read_lock(self, wait):
now = self._clear_expired_locks()
owner_id = self._get_owner_id()
while True:
try:
self.db.beaker_locks.update_one({'_id': self.identifier, 'owner': None},
{'$set': {'timestamp': now},
'$push': {'readers': owner_id}},
upsert=True)
return True
except pymongo.errors.DuplicateKeyError:
if not wait:
return False
time.sleep(0.2)
def do_release_write_lock(self):
self.db.beaker_locks.delete_one({'_id': self.identifier, 'owner': self._get_owner_id()})
def do_acquire_write_lock(self, wait):
now = self._clear_expired_locks()
owner_id = self._get_owner_id()
while True:
try:
self.db.beaker_locks.update_one({'_id': self.identifier, 'owner': None,
'readers': []},
{'$set': {'owner': owner_id,
'timestamp': now}},
upsert=True)
return True
except pymongo.errors.DuplicateKeyError:
if not wait:
return False
time.sleep(0.2)
beaker-1.12.1/beaker/ext/redisnm.py000066400000000000000000000107751436751141500171120ustar00rootroot00000000000000import os
import threading
import time
import pickle
try:
import redis
except ImportError:
redis = None
from beaker.container import NamespaceManager
from beaker.synchronization import SynchronizerImpl
from beaker.util import SyncDict, machine_identifier
from beaker.crypto.util import sha1
from beaker._compat import string_type, PY2
class RedisNamespaceManager(NamespaceManager):
"""Provides the :class:`.NamespaceManager` API over Redis.
Provided ``url`` can be both a redis connection string or
an already existing StrictRedis instance.
The data will be stored into redis keys, with their name
starting with ``beaker_cache:``. So make sure you provide
a specific database number if you don't want to mix them
with your own data.
"""
MAX_KEY_LENGTH = 1024
clients = SyncDict()
def __init__(self, namespace, url, timeout=None, **kw):
super(RedisNamespaceManager, self).__init__(namespace)
self.lock_dir = None # Redis uses redis itself for locking.
self.timeout = timeout
if redis is None:
raise RuntimeError('redis is not available')
if isinstance(url, string_type):
self.client = RedisNamespaceManager.clients.get(url, redis.StrictRedis.from_url, url)
else:
self.client = url
def _format_key(self, key):
if not isinstance(key, str):
key = key.decode('ascii')
if len(key) > (self.MAX_KEY_LENGTH - len(self.namespace) - len('beaker_cache:') - 1):
if not PY2:
key = key.encode('utf-8')
key = sha1(key).hexdigest()
return 'beaker_cache:%s:%s' % (self.namespace, key)
def get_creation_lock(self, key):
return RedisSynchronizer(self._format_key(key), self.client)
def __getitem__(self, key):
entry = self.client.get(self._format_key(key))
if entry is None:
raise KeyError(key)
return pickle.loads(entry)
def __contains__(self, key):
return self.client.exists(self._format_key(key))
def has_key(self, key):
return key in self
def set_value(self, key, value, expiretime=None):
value = pickle.dumps(value)
if expiretime is None and self.timeout is not None:
expiretime = self.timeout
if expiretime is not None:
self.client.setex(self._format_key(key), int(expiretime), value)
else:
self.client.set(self._format_key(key), value)
def __setitem__(self, key, value):
self.set_value(key, value)
def __delitem__(self, key):
self.client.delete(self._format_key(key))
def do_remove(self):
for k in self.keys():
self.client.delete(k)
def keys(self):
return self.client.keys('beaker_cache:%s:*' % self.namespace)
class RedisSynchronizer(SynchronizerImpl):
"""Synchronizer based on redis.
Provided ``url`` can be both a redis connection string or
an already existing StrictRedis instance.
    This Synchronizer only supports 1 reader or 1 writer at a time, not concurrent readers.
"""
    # A cache entry generation function can take a while,
    # but 15 minutes is more than a reasonable upper bound.
LOCK_EXPIRATION = 900
MACHINE_ID = machine_identifier()
def __init__(self, identifier, url):
super(RedisSynchronizer, self).__init__()
self.identifier = 'beaker_lock:%s' % identifier
if isinstance(url, string_type):
self.client = RedisNamespaceManager.clients.get(url, redis.StrictRedis.from_url, url)
else:
self.client = url
def _get_owner_id(self):
return (
'%s-%s-%s' % (self.MACHINE_ID, os.getpid(), threading.current_thread().ident)
).encode('ascii')
def do_release_read_lock(self):
self.do_release_write_lock()
def do_acquire_read_lock(self, wait):
self.do_acquire_write_lock(wait)
def do_release_write_lock(self):
identifier = self.identifier
owner_id = self._get_owner_id()
def execute_release(pipe):
lock_value = pipe.get(identifier)
if lock_value == owner_id:
pipe.delete(identifier)
self.client.transaction(execute_release, identifier)
def do_acquire_write_lock(self, wait):
owner_id = self._get_owner_id()
while True:
if self.client.set(self.identifier, owner_id, ex=self.LOCK_EXPIRATION, nx=True):
return True
if not wait:
return False
time.sleep(0.2)
beaker-1.12.1/beaker/ext/sqla.py000066400000000000000000000114131436751141500163770ustar00rootroot00000000000000from beaker._compat import pickle
import logging
from datetime import datetime
from beaker.container import OpenResourceNamespaceManager, Container
from beaker.exceptions import InvalidCacheBackendError, MissingCacheParameter
from beaker.synchronization import file_synchronizer, null_synchronizer
from beaker.util import verify_directory, SyncDict
log = logging.getLogger(__name__)
sa = None
class SqlaNamespaceManager(OpenResourceNamespaceManager):
binds = SyncDict()
tables = SyncDict()
@classmethod
def _init_dependencies(cls):
global sa
if sa is not None:
return
try:
import sqlalchemy as sa
except ImportError:
raise InvalidCacheBackendError("SQLAlchemy, which is required by "
"this backend, is not installed")
def __init__(self, namespace, bind, table, data_dir=None, lock_dir=None,
**kwargs):
"""Create a namespace manager for use with a database table via
SQLAlchemy.
``bind``
SQLAlchemy ``Engine`` or ``Connection`` object
``table``
SQLAlchemy ``Table`` object in which to store namespace data.
This should usually be something created by ``make_cache_table``.
"""
OpenResourceNamespaceManager.__init__(self, namespace)
if lock_dir:
self.lock_dir = lock_dir
elif data_dir:
self.lock_dir = data_dir + "/container_db_lock"
if self.lock_dir:
verify_directory(self.lock_dir)
self.bind = self.__class__.binds.get(str(bind.url), lambda: bind)
self.table = self.__class__.tables.get('%s:%s' % (bind.url, table.name),
lambda: table)
self.hash = {}
self._is_new = False
self.loaded = False
def get_access_lock(self):
return null_synchronizer()
def get_creation_lock(self, key):
return file_synchronizer(
identifier="databasecontainer/funclock/%s" % self.namespace,
lock_dir=self.lock_dir)
def do_open(self, flags, replace):
if self.loaded:
self.flags = flags
return
select = sa.select(self.table.c.data).where(self.table.c.namespace == self.namespace)
with self.bind.connect() as conn:
result = conn.execute(select).fetchone()
if not result:
self._is_new = True
self.hash = {}
else:
self._is_new = False
try:
self.hash = result.data
            except (IOError, OSError, EOFError, pickle.PickleError):
                log.debug("Couldn't load pickle data, creating new storage")
self.hash = {}
self._is_new = True
self.flags = flags
self.loaded = True
def do_close(self):
if self.flags is not None and (self.flags == 'c' or self.flags == 'w'):
with self.bind.begin() as conn:
if self._is_new:
insert = self.table.insert()
conn.execute(insert, dict(namespace=self.namespace, data=self.hash,
accessed=datetime.now(), created=datetime.now()))
self._is_new = False
else:
update = self.table.update().where(self.table.c.namespace == self.namespace)
conn.execute(update, dict(data=self.hash, accessed=datetime.now()))
self.flags = None
def do_remove(self):
delete = self.table.delete().where(self.table.c.namespace == self.namespace)
with self.bind.begin() as conn:
conn.execute(delete)
self.hash = {}
self._is_new = True
def __getitem__(self, key):
return self.hash[key]
def __contains__(self, key):
return key in self.hash
def __setitem__(self, key, value):
self.hash[key] = value
def __delitem__(self, key):
del self.hash[key]
def keys(self):
return self.hash.keys()
class SqlaContainer(Container):
namespace_manager = SqlaNamespaceManager
def make_cache_table(metadata, table_name='beaker_cache', schema_name=None):
"""Return a ``Table`` object suitable for storing cached values for the
namespace manager. Do not create the table."""
return sa.Table(table_name, metadata,
sa.Column('namespace', sa.String(255), primary_key=True),
sa.Column('accessed', sa.DateTime, nullable=False),
sa.Column('created', sa.DateTime, nullable=False),
sa.Column('data', sa.PickleType, nullable=False),
schema=schema_name if schema_name else metadata.schema)
beaker-1.12.1/beaker/middleware.py000066400000000000000000000145241436751141500167620ustar00rootroot00000000000000import warnings
try:
from paste.registry import StackedObjectProxy
beaker_session = StackedObjectProxy(name="Beaker Session")
beaker_cache = StackedObjectProxy(name="Cache Manager")
except ImportError:
beaker_cache = None
beaker_session = None
from beaker.cache import CacheManager
from beaker.session import Session, SessionObject
from beaker.util import coerce_cache_params, coerce_session_params, \
parse_cache_config_options
class CacheMiddleware(object):
cache = beaker_cache
def __init__(self, app, config=None, environ_key='beaker.cache', **kwargs):
"""Initialize the Cache Middleware
The Cache middleware will make a CacheManager instance available
every request under the ``environ['beaker.cache']`` key by
default. The location in environ can be changed by setting
``environ_key``.
``config``
dict All settings should be prefixed by 'cache.'. This
method of passing variables is intended for Paste and other
setups that accumulate multiple component settings in a
single dictionary. If config contains *no cache. prefixed
args*, then *all* of the config options will be used to
initialize the Cache objects.
``environ_key``
            Location where the Cache instance will be keyed in the WSGI
            environ
``**kwargs``
All keyword arguments are assumed to be cache settings and
will override any settings found in ``config``
"""
self.app = app
config = config or {}
self.options = {}
# Update the options with the parsed config
self.options.update(parse_cache_config_options(config))
# Add any options from kwargs, but leave out the defaults this
# time
self.options.update(
parse_cache_config_options(kwargs, include_defaults=False))
# Assume all keys are intended for cache if none are prefixed with
# 'cache.'
if not self.options and config:
self.options = config
self.options.update(kwargs)
self.cache_manager = CacheManager(**self.options)
self.environ_key = environ_key
def __call__(self, environ, start_response):
if environ.get('paste.registry'):
if environ['paste.registry'].reglist:
environ['paste.registry'].register(self.cache,
self.cache_manager)
environ[self.environ_key] = self.cache_manager
return self.app(environ, start_response)
class SessionMiddleware(object):
session = beaker_session
def __init__(self, wrap_app, config=None, environ_key='beaker.session',
**kwargs):
"""Initialize the Session Middleware
The Session middleware will make a lazy session instance
available on every request under the ``environ['beaker.session']``
key by default. The location in environ can be changed by
setting ``environ_key``.
``config``
dict All settings should be prefixed by 'session.'. This
method of passing variables is intended for Paste and other
setups that accumulate multiple component settings in a
single dictionary. If config contains *no session. prefixed
args*, then *all* of the config options will be used to
initialize the Session objects.
``environ_key``
Location where the Session instance will be keyed in the WSGI
environ
``**kwargs``
All keyword arguments are assumed to be session settings and
will override any settings found in ``config``
"""
config = config or {}
# Load up the default params
self.options = dict(invalidate_corrupt=True, type=None,
data_dir=None, key='beaker.session.id',
timeout=None, save_accessed_time=True, secret=None,
log_file=None)
# Pull out any config args meant for beaker session. if there are any
for dct in [config, kwargs]:
for key, val in dct.items():
if key.startswith('beaker.session.'):
self.options[key[15:]] = val
if key.startswith('session.'):
self.options[key[8:]] = val
if key.startswith('session_'):
warnings.warn('Session options should start with session. '
'instead of session_.', DeprecationWarning, 2)
self.options[key[8:]] = val
# Coerce and validate session params
coerce_session_params(self.options)
# Assume all keys are intended for session if none are prefixed with
# 'session.'
if not self.options and config:
self.options = config
self.options.update(kwargs)
self.wrap_app = self.app = wrap_app
self.environ_key = environ_key
def __call__(self, environ, start_response):
session = SessionObject(environ, **self.options)
if environ.get('paste.registry'):
if environ['paste.registry'].reglist:
environ['paste.registry'].register(self.session, session)
environ[self.environ_key] = session
environ['beaker.get_session'] = self._get_session
if 'paste.testing_variables' in environ and 'webtest_varname' in self.options:
environ['paste.testing_variables'][self.options['webtest_varname']] = session
def session_start_response(status, headers, exc_info=None):
if session.accessed():
session.persist()
if session.__dict__['_headers']['set_cookie']:
cookie = session.__dict__['_headers']['cookie_out']
if cookie:
headers.append(('Set-Cookie', cookie))
return start_response(status, headers, exc_info)
return self.wrap_app(environ, session_start_response)
def _get_session(self):
return Session({}, use_cookies=False, **self.options)
def session_filter_factory(global_conf, **kwargs):
def filter(app):
return SessionMiddleware(app, global_conf, **kwargs)
return filter
def session_filter_app_factory(app, global_conf, **kwargs):
return SessionMiddleware(app, global_conf, **kwargs)
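The environ-key pattern used by both middlewares above (stash a shared object in the WSGI environ under a configurable key so downstream code can find it) can be sketched with a toy middleware. All names here are illustrative, not Beaker's API:

```python
class EnvironKeyMiddleware:
    """Toy middleware: expose a shared object under a configurable environ key."""

    def __init__(self, app, obj, environ_key='example.obj'):
        self.app = app
        self.obj = obj
        self.environ_key = environ_key

    def __call__(self, environ, start_response):
        environ[self.environ_key] = self.obj  # visible to the wrapped app
        return self.app(environ, start_response)


def app(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [environ['example.obj'].encode('utf-8')]


wrapped = EnvironKeyMiddleware(app, 'hello')
status_seen = []
body = wrapped({'REQUEST_METHOD': 'GET'},
               lambda status, headers, exc_info=None: status_seen.append(status))
```

The real middlewares add config parsing and paste.registry support on top, but the environ handoff is the same shape.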
beaker-1.12.1/beaker/session.py
from ._compat import PY2, pickle, http_cookies, unicode_text, b64encode, b64decode, string_type
import os
import time
from datetime import datetime, timedelta
from beaker.crypto import hmac as HMAC, hmac_sha1 as SHA1, sha1, get_nonce_size, DEFAULT_NONCE_BITS, get_crypto_module
from beaker import crypto, util
from beaker.cache import clsmap
from beaker.exceptions import BeakerException, InvalidCryptoBackendError
from beaker.cookie import SimpleCookie
import uuid
months = (None, "Jan", "Feb", "Mar", "Apr", "May", "Jun",
"Jul", "Aug", "Sep", "Oct", "Nov", "Dec")
weekdays = ("Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun")
__all__ = ['SignedCookie', 'Session', 'InvalidSignature']
class _InvalidSignatureType(object):
"""Returned from SignedCookie when the value's signature was invalid."""
def __nonzero__(self):
return False
def __bool__(self):
return False
InvalidSignature = _InvalidSignatureType()
def _session_id():
return uuid.uuid4().hex
class SignedCookie(SimpleCookie):
"""Extends python cookie to give digital signature support"""
def __init__(self, secret, input=None):
self.secret = secret.encode('UTF-8')
http_cookies.BaseCookie.__init__(self, input)
def value_decode(self, val):
val = val.strip('"')
if not val:
return None, val
sig = HMAC.new(self.secret, val[40:].encode('utf-8'), SHA1).hexdigest()
# Avoid timing attacks
invalid_bits = 0
input_sig = val[:40]
if len(sig) != len(input_sig):
return InvalidSignature, val
for a, b in zip(sig, input_sig):
invalid_bits += a != b
if invalid_bits:
return InvalidSignature, val
else:
return val[40:], val
def value_encode(self, val):
sig = HMAC.new(self.secret, val.encode('utf-8'), SHA1).hexdigest()
return str(val), ("%s%s" % (sig, val))
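The scheme in ``value_encode``/``value_decode`` above can be reproduced with the standard library alone: the first 40 hex characters on the wire are the HMAC-SHA1 of the remainder. This is a sketch of the on-wire format only, not Beaker's public API:

```python
import hashlib
import hmac

secret = b'my-secret'
session_id = 'abc123'

# value_encode: prepend a 40-char hex HMAC-SHA1 signature to the value
sig = hmac.new(secret, session_id.encode('utf-8'), hashlib.sha1).hexdigest()
wire_value = sig + session_id

# value_decode: re-sign the tail and compare against the 40-char prefix
check = hmac.new(secret, wire_value[40:].encode('utf-8'), hashlib.sha1).hexdigest()
valid = hmac.compare_digest(check, wire_value[:40])
```

Beaker's own comparison loop counts mismatched characters instead of short-circuiting, for the same constant-time reason ``hmac.compare_digest`` exists.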
class _ConfigurableSession(dict):
"""Provides support for configurable Session objects.
Provides a way to ensure some properties of sessions
are always available with pre-configured values
when they are not available in the session cookie itself.
"""
def __init__(self, cookie_domain=None, cookie_path='/'):
self._config = {
'_domain': cookie_domain,
'_path': cookie_path
}
def clear(self):
"""Clears Session data. Preserves session configuration."""
super(_ConfigurableSession, self).clear()
self.update(self._config)
class Session(_ConfigurableSession):
"""Session object that uses container package for storage.
:param invalidate_corrupt: How to handle corrupt data when loading. When
set to True, then corrupt data will be silently
invalidated and a new session created,
otherwise invalid data will cause an exception.
:type invalidate_corrupt: bool
:param use_cookies: Whether or not cookies should be created. When set to
False, it is assumed the user will handle storing the
session on their own.
:type use_cookies: bool
:param type: What data backend type should be used to store the underlying
session data
:param key: The name the cookie should be set to.
:param timeout: How long session data is considered valid. This is used
regardless of the cookie being present or not to determine
whether session data is still valid. Can be set to None to
disable session time out.
:type timeout: int or None
:param save_accessed_time: Whether beaker should save the session's access
time (True) or only modification time (False).
Defaults to True.
:param cookie_expires: Expiration date for cookie
:param cookie_domain: Domain to use for the cookie.
:param cookie_path: Path to use for the cookie.
:param data_serializer: If ``"json"`` or ``"pickle"`` should be used
to serialize data. Can also be an object with
``loads`` and ``dumps`` methods. By default
``"pickle"`` is used.
:param secure: Whether or not the cookie should only be sent over SSL.
:param httponly: Whether or not the cookie should only be accessible by
the browser not by JavaScript.
:param encrypt_key: The key to use for the local session encryption, if not
provided the session will not be encrypted.
:param validate_key: The key used to sign the local encrypted session
:param encrypt_nonce_bits: Number of bits used to generate nonce for encryption key salt.
For security reasons this is 128 bits by default. If you want
to keep backward compatibility with sessions generated before 1.8.0,
set this to 48.
:param crypto_type: encryption module to use
:param samesite: SameSite value for the cookie -- should be either 'Lax',
'Strict', or None.
"""
def __init__(self, request, id=None, invalidate_corrupt=False,
use_cookies=True, type=None, data_dir=None,
key='beaker.session.id', timeout=None, save_accessed_time=True,
cookie_expires=True, cookie_domain=None, cookie_path='/',
data_serializer='pickle', secret=None,
secure=False, namespace_class=None, httponly=False,
encrypt_key=None, validate_key=None, encrypt_nonce_bits=DEFAULT_NONCE_BITS,
crypto_type='default', samesite='Lax',
**namespace_args):
_ConfigurableSession.__init__(
self,
cookie_domain=cookie_domain,
cookie_path=cookie_path
)
self.clear()
if not type:
if data_dir:
self.type = 'file'
else:
self.type = 'memory'
else:
self.type = type
self.namespace_class = namespace_class or clsmap[self.type]
self.namespace_args = namespace_args
self.request = request
self.data_dir = data_dir
self.key = key
if timeout and not save_accessed_time:
raise BeakerException("timeout requires save_accessed_time")
self.timeout = timeout
# If a timeout was provided, forward it to the backend too, so the backend
# can automatically expire entries if it's supported.
if self.timeout is not None:
# The backend expiration should always be a bit longer than the
# session expiration itself to prevent the case where the backend data expires while
# the session is being read (PR#153). 2 Minutes seems a reasonable time.
self.namespace_args['timeout'] = self.timeout + 60 * 2
self.save_atime = save_accessed_time
self.use_cookies = use_cookies
self.cookie_expires = cookie_expires
self._set_serializer(data_serializer)
# Default cookie domain/path
self.was_invalidated = False
self.secret = secret
self.secure = secure
self.httponly = httponly
self.samesite = samesite
self.encrypt_key = encrypt_key
self.validate_key = validate_key
self.encrypt_nonce_size = get_nonce_size(encrypt_nonce_bits)
self.crypto_module = get_crypto_module(crypto_type)
self.id = id
self.accessed_dict = {}
self.invalidate_corrupt = invalidate_corrupt
if self.use_cookies:
cookieheader = request.get('cookie', '')
if secret:
try:
self.cookie = SignedCookie(
secret,
input=cookieheader,
)
except http_cookies.CookieError:
self.cookie = SignedCookie(
secret,
input=None,
)
else:
self.cookie = SimpleCookie(input=cookieheader)
if not self.id and self.key in self.cookie:
cookie_data = self.cookie[self.key].value
# Should we check invalidate_corrupt here?
if cookie_data is InvalidSignature:
cookie_data = None
self.id = cookie_data
self.is_new = self.id is None
if self.is_new:
self._create_id()
self['_accessed_time'] = self['_creation_time'] = time.time()
else:
try:
self.load()
except Exception as e:
if self.invalidate_corrupt:
util.warn(
"Invalidating corrupt session %s; "
"error was: %s. Set invalidate_corrupt=False "
"to propagate this exception." % (self.id, e))
self.invalidate()
else:
raise
def _set_serializer(self, data_serializer):
self.data_serializer = data_serializer
if self.data_serializer == 'json':
self.serializer = util.JsonSerializer()
elif self.data_serializer == 'pickle':
self.serializer = util.PickleSerializer()
elif isinstance(self.data_serializer, string_type):
raise BeakerException('Invalid value for data_serializer: %s' % data_serializer)
else:
self.serializer = data_serializer
def has_key(self, name):
return name in self
def _set_cookie_values(self, expires=None):
self.cookie[self.key] = self.id
if self.domain:
self.cookie[self.key]['domain'] = self.domain
if self.secure:
self.cookie[self.key]['secure'] = True
if self.samesite:
self.cookie[self.key]['samesite'] = self.samesite
self._set_cookie_http_only()
self.cookie[self.key]['path'] = self.path
self._set_cookie_expires(expires)
@staticmethod
def serialize_cookie_date(v):
    """Format a datetime as a locale-independent cookie date string."""
    v = v.timetuple()
    r = time.strftime("%%s, %d-%%s-%Y %H:%M:%S GMT", v)
    return r % (weekdays[v[6]], months[v[1]])
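# The double-%% strftime trick above sidesteps locale-dependent %a/%b
# names: strftime fills the numeric fields first, then the English
# weekday/month names are substituted by hand. A standalone sketch:

```python
import time
from datetime import datetime

months = (None, "Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec")
weekdays = ("Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun")

v = datetime(2023, 1, 4, 12, 0, 0).timetuple()
# %%s survives strftime as a literal %s, so only the numeric fields
# are expanded here; the names are filled in by the % below
r = time.strftime("%%s, %d-%%s-%Y %H:%M:%S GMT", v)
cookie_date = r % (weekdays[v[6]], months[v[1]])  # -> "Wed, 04-Jan-2023 12:00:00 GMT"
```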
def _set_cookie_expires(self, expires):
if expires is None:
expires = self.cookie_expires
if expires is False:
expires_date = datetime.fromtimestamp(0x7FFFFFFF)
elif isinstance(expires, timedelta):
expires_date = datetime.utcnow() + expires
elif isinstance(expires, datetime):
expires_date = expires
elif expires is not True:
raise ValueError("Invalid argument for cookie_expires: %s"
% repr(self.cookie_expires))
self.cookie_expires = expires
if not self.cookie or self.key not in self.cookie:
self.cookie[self.key] = self.id
if expires is True:
self.cookie[self.key]['expires'] = ''
return True
self.cookie[self.key]['expires'] = \
self.serialize_cookie_date(expires_date)
return expires_date
def _update_cookie_out(self, set_cookie=True):
self._set_cookie_values()
cookie_out = self.cookie[self.key].output(header='')
if not isinstance(cookie_out, str):
cookie_out = cookie_out.encode('latin1')
self.request['cookie_out'] = cookie_out
self.request['set_cookie'] = set_cookie
def _set_cookie_http_only(self):
try:
if self.httponly:
self.cookie[self.key]['httponly'] = True
except http_cookies.CookieError as e:
if 'Invalid Attribute httponly' not in str(e):
raise
util.warn('Python 2.6+ is required to use httponly')
def _create_id(self, set_new=True):
self.id = _session_id()
if set_new:
self.is_new = True
self.last_accessed = None
if self.use_cookies:
sc = set_new is False
self._update_cookie_out(set_cookie=sc)
@property
def created(self):
return self['_creation_time']
def _set_domain(self, domain):
self['_domain'] = domain
self._update_cookie_out()
def _get_domain(self):
return self['_domain']
domain = property(_get_domain, _set_domain)
def _set_path(self, path):
self['_path'] = path
self._update_cookie_out()
def _get_path(self):
return self.get('_path', '/')
path = property(_get_path, _set_path)
def _encrypt_data(self, session_data=None):
"""Serialize, encipher, and base64 the session dict"""
session_data = session_data or self.copy()
if self.encrypt_key:
nonce_len, nonce_b64len = self.encrypt_nonce_size
nonce = b64encode(os.urandom(nonce_len))[:nonce_b64len]
encrypt_key = crypto.generateCryptoKeys(self.encrypt_key,
self.validate_key + nonce,
1,
self.crypto_module.getKeyLength())
data = self.serializer.dumps(session_data)
return nonce + b64encode(self.crypto_module.aesEncrypt(data, encrypt_key))
else:
data = self.serializer.dumps(session_data)
return b64encode(data)
def _decrypt_data(self, session_data):
"""Base64, decipher, then un-serialize the data for the session
dict"""
if self.encrypt_key:
__, nonce_b64len = self.encrypt_nonce_size
nonce = session_data[:nonce_b64len]
encrypt_key = crypto.generateCryptoKeys(self.encrypt_key,
self.validate_key + nonce,
1,
self.crypto_module.getKeyLength())
payload = b64decode(session_data[nonce_b64len:])
data = self.crypto_module.aesDecrypt(payload, encrypt_key)
else:
data = b64decode(session_data)
return self.serializer.loads(data)
def _delete_cookie(self):
self.request['set_cookie'] = True
expires = datetime.utcnow() - timedelta(365)
self._set_cookie_values(expires)
self._update_cookie_out()
def delete(self):
"""Deletes the session from the persistent storage, and sends
an expired cookie out"""
if self.use_cookies:
self._delete_cookie()
self.clear()
def invalidate(self):
"""Invalidates this session, creates a new session id, returns
to the is_new state"""
self.clear()
self.was_invalidated = True
self._create_id()
self.load()
def load(self):
"Loads the data for this session from persistent storage"
self.namespace = self.namespace_class(self.id,
data_dir=self.data_dir,
digest_filenames=False,
**self.namespace_args)
now = time.time()
if self.use_cookies:
self.request['set_cookie'] = True
self.namespace.acquire_read_lock()
timed_out = False
try:
self.clear()
try:
session_data = self.namespace['session']
if session_data is not None:
session_data = self._decrypt_data(session_data)
# Memcached always returns a key; it's None when it's
# not present
if session_data is None:
session_data = {
'_creation_time': now,
'_accessed_time': now
}
self.is_new = True
except (KeyError, TypeError):
session_data = {
'_creation_time': now,
'_accessed_time': now
}
self.is_new = True
if session_data is None or len(session_data) == 0:
session_data = {
'_creation_time': now,
'_accessed_time': now
}
self.is_new = True
if self.timeout is not None and \
'_accessed_time' in session_data and \
now - session_data['_accessed_time'] > self.timeout:
timed_out = True
else:
# Properly set the last_accessed time, which is different
# than the *currently* _accessed_time
if self.is_new or '_accessed_time' not in session_data:
self.last_accessed = None
else:
self.last_accessed = session_data['_accessed_time']
# Update the current _accessed_time
session_data['_accessed_time'] = now
self.update(session_data)
self.accessed_dict = session_data.copy()
finally:
self.namespace.release_read_lock()
if timed_out:
self.invalidate()
def save(self, accessed_only=False):
"""Saves the data for this session to persistent storage
If accessed_only is True, then only the original data loaded
at the beginning of the request will be saved, with the updated
last accessed time.
"""
# Look to see if it's a new session that was only accessed.
# Don't save it in that case.
if accessed_only and (self.is_new or not self.save_atime):
return None
# this session might not have a namespace yet or the session id
# might have been regenerated
if not hasattr(self, 'namespace') or self.namespace.namespace != self.id:
self.namespace = self.namespace_class(
self.id,
data_dir=self.data_dir,
digest_filenames=False,
**self.namespace_args)
self.namespace.acquire_write_lock(replace=True)
try:
if accessed_only:
data = dict(self.accessed_dict.items())
else:
data = dict(self.items())
data = self._encrypt_data(data)
# Save the data
if not data and 'session' in self.namespace:
del self.namespace['session']
else:
self.namespace['session'] = data
finally:
self.namespace.release_write_lock()
if self.use_cookies and self.is_new:
self.request['set_cookie'] = True
def revert(self):
"""Revert the session to its original state from its first
access in the request"""
self.clear()
self.update(self.accessed_dict)
def regenerate_id(self):
"""
Creates a new session id, retains all session data.
It's a good security practice to regenerate the id after a client
elevates privileges.
"""
self._create_id(set_new=False)
# TODO: I think both these methods should be removed. They're from
# the original mod_python code I was ripping off but they really
# have no use here.
def lock(self):
"""Locks this session against other processes/threads. This is
automatic when load/save is called.
***use with caution*** and always with a corresponding 'unlock'
inside a "finally:" block, as a stray lock typically cannot be
unlocked without shutting down the whole application.
"""
self.namespace.acquire_write_lock()
def unlock(self):
"""Unlocks this session against other processes/threads. This
is automatic when load/save is called.
***use with caution*** and always within a "finally:" block, as
a stray lock typically cannot be unlocked without shutting down
the whole application.
"""
self.namespace.release_write_lock()
class CookieSession(Session):
"""Pure cookie-based session
Options recognized when using cookie-based sessions are slightly
more restricted than general sessions.
:param key: The name the cookie should be set to.
:param timeout: How long session data is considered valid. This is used
regardless of the cookie being present or not to determine
whether session data is still valid.
:type timeout: int
:param save_accessed_time: Whether beaker should save the session's access
time (True) or only modification time (False).
Defaults to True.
:param cookie_expires: Expiration date for cookie
:param cookie_domain: Domain to use for the cookie.
:param cookie_path: Path to use for the cookie.
:param data_serializer: If ``"json"`` or ``"pickle"`` should be used
to serialize data. Can also be an object with
``loads`` and ``dumps`` methods. By default
``"pickle"`` is used.
:param secure: Whether or not the cookie should only be sent over SSL.
:param httponly: Whether or not the cookie should only be accessible by
the browser not by JavaScript.
:param encrypt_key: The key to use for the local session encryption, if not
provided the session will not be encrypted.
:param validate_key: The key used to sign the local encrypted session
:param invalidate_corrupt: How to handle corrupt data when loading. When
set to True, then corrupt data will be silently
invalidated and a new session created,
otherwise invalid data will cause an exception.
:type invalidate_corrupt: bool
:param crypto_type: The crypto module to use.
:param samesite: SameSite value for the cookie -- should be either 'Lax',
'Strict', or None.
"""
def __init__(self, request, key='beaker.session.id', timeout=None,
save_accessed_time=True, cookie_expires=True, cookie_domain=None,
cookie_path='/', encrypt_key=None, validate_key=None, secure=False,
httponly=False, data_serializer='pickle',
encrypt_nonce_bits=DEFAULT_NONCE_BITS, invalidate_corrupt=False,
crypto_type='default', samesite='Lax',
**kwargs):
_ConfigurableSession.__init__(
self,
cookie_domain=cookie_domain,
cookie_path=cookie_path
)
self.clear()
self.crypto_module = get_crypto_module(crypto_type)
if encrypt_key and not self.crypto_module.has_aes:
raise InvalidCryptoBackendError("No AES library is installed, can't generate "
"encrypted cookie-only Session.")
self.request = request
self.key = key
self.timeout = timeout
self.save_atime = save_accessed_time
self.cookie_expires = cookie_expires
self.encrypt_key = encrypt_key
self.validate_key = validate_key
self.encrypt_nonce_size = get_nonce_size(encrypt_nonce_bits)
self.request['set_cookie'] = False
self.secure = secure
self.httponly = httponly
self.samesite = samesite
self.invalidate_corrupt = invalidate_corrupt
self._set_serializer(data_serializer)
try:
cookieheader = request['cookie']
except KeyError:
cookieheader = ''
if validate_key is None:
raise BeakerException("No validate_key specified for Cookie only "
"Session.")
if timeout and not save_accessed_time:
raise BeakerException("timeout requires save_accessed_time")
try:
self.cookie = SignedCookie(
validate_key,
input=cookieheader,
)
except http_cookies.CookieError:
self.cookie = SignedCookie(
validate_key,
input=None,
)
self['_id'] = _session_id()
self.is_new = True
# If we have a cookie, load it
if self.key in self.cookie and self.cookie[self.key].value is not None:
self.is_new = False
try:
cookie_data = self.cookie[self.key].value
if cookie_data is InvalidSignature:
raise BeakerException("Invalid signature")
self.update(self._decrypt_data(cookie_data))
except Exception as e:
if self.invalidate_corrupt:
util.warn(
"Invalidating corrupt session %s; "
"error was: %s. Set invalidate_corrupt=False "
"to propagate this exception." % (self.id, e))
self.invalidate()
else:
raise
if self.timeout is not None:
now = time.time()
last_accessed_time = self.get('_accessed_time', now)
if now - last_accessed_time > self.timeout:
self.clear()
self.accessed_dict = self.copy()
self._create_cookie()
def created(self):
return self['_creation_time']
created = property(created)
def id(self):
return self['_id']
id = property(id)
def _set_domain(self, domain):
self['_domain'] = domain
def _get_domain(self):
return self['_domain']
domain = property(_get_domain, _set_domain)
def _set_path(self, path):
self['_path'] = path
def _get_path(self):
return self['_path']
path = property(_get_path, _set_path)
def save(self, accessed_only=False):
"""Saves the data for this session to persistent storage"""
if accessed_only and (self.is_new or not self.save_atime):
return
if accessed_only:
self.clear()
self.update(self.accessed_dict)
self._create_cookie()
def expire(self):
"""Delete the 'expires' attribute on this Session, if any."""
self.pop('_expires', None)
def _create_cookie(self):
if '_creation_time' not in self:
self['_creation_time'] = time.time()
if '_id' not in self:
self['_id'] = _session_id()
self['_accessed_time'] = time.time()
val = self._encrypt_data()
if len(val) > 4064:
raise BeakerException("Cookie value is too long to store")
self.cookie[self.key] = val
if '_expires' in self:
expires = self['_expires']
else:
expires = None
expires = self._set_cookie_expires(expires)
if expires is not None:
self['_expires'] = expires
if self.domain:
self.cookie[self.key]['domain'] = self.domain
if self.secure:
self.cookie[self.key]['secure'] = True
if self.samesite:
self.cookie[self.key]['samesite'] = self.samesite
self._set_cookie_http_only()
self.cookie[self.key]['path'] = self.get('_path', '/')
cookie_out = self.cookie[self.key].output(header='')
if not isinstance(cookie_out, str):
cookie_out = cookie_out.encode('latin1')
self.request['cookie_out'] = cookie_out
self.request['set_cookie'] = True
def delete(self):
"""Delete the cookie, and clear the session"""
# Send a delete cookie request
self._delete_cookie()
self.clear()
def invalidate(self):
"""Clear the contents and start a new session"""
self.clear()
self['_id'] = _session_id()
class SessionObject(object):
"""Session proxy/lazy creator
This object proxies access to the actual session object, so that if
the session hasn't been used before, it will be set up. This avoids
creating and loading the session from persistent storage unless it is
actually used during the request.
"""
def __init__(self, environ, **params):
self.__dict__['_params'] = params
self.__dict__['_environ'] = environ
self.__dict__['_sess'] = None
self.__dict__['_headers'] = {}
def _session(self):
"""Lazy initial creation of session object"""
if self.__dict__['_sess'] is None:
params = self.__dict__['_params']
environ = self.__dict__['_environ']
self.__dict__['_headers'] = req = {'cookie_out': None}
req['cookie'] = environ.get('HTTP_COOKIE')
session_cls = params.get('session_class', None)
if session_cls is None:
if params.get('type') == 'cookie':
session_cls = CookieSession
else:
session_cls = Session
else:
assert issubclass(session_cls, Session), \
    "Not a Session: %r" % session_cls
self.__dict__['_sess'] = session_cls(req, **params)
return self.__dict__['_sess']
def __getattr__(self, attr):
return getattr(self._session(), attr)
def __setattr__(self, attr, value):
setattr(self._session(), attr, value)
def __delattr__(self, name):
self._session().__delattr__(name)
def __getitem__(self, key):
return self._session()[key]
def __setitem__(self, key, value):
self._session()[key] = value
def __delitem__(self, key):
self._session().__delitem__(key)
def __repr__(self):
return self._session().__repr__()
def __iter__(self):
"""Only works for proxying to a dict"""
return iter(self._session().keys())
def __contains__(self, key):
return key in self._session()
def has_key(self, key):
return key in self._session()
def get_by_id(self, id):
"""Loads a session given a session ID"""
params = self.__dict__['_params']
session = Session({}, use_cookies=False, id=id, **params)
if session.is_new:
return None
return session
def save(self):
self.__dict__['_dirty'] = True
def delete(self):
self.__dict__['_dirty'] = True
self._session().delete()
def persist(self):
"""Persist the session to the storage
Always saves the whole session if save() or delete() have been called.
If they haven't:
- If autosave is set to true, saves the entire session regardless.
- If save_accessed_time is set to true or unset, only saves the updated
access time.
- If save_accessed_time is set to false, doesn't save anything.
"""
if self.__dict__['_params'].get('auto'):
self._session().save()
elif self.__dict__['_params'].get('save_accessed_time', True):
if self.dirty():
self._session().save()
else:
self._session().save(accessed_only=True)
else: # save_accessed_time is false
if self.dirty():
self._session().save()
def dirty(self):
"""Returns True if save() or delete() have been called"""
return self.__dict__.get('_dirty', False)
def accessed(self):
"""Returns whether or not the session has been accessed"""
return self.__dict__['_sess'] is not None
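SessionObject's lazy-creation pattern (store internals directly in ``__dict__`` so ``__getattr__``/``__setattr__`` can proxy everything else without recursing) can be sketched in miniature. The names below are illustrative, not Beaker's:

```python
class LazyProxy:
    """Defer building the real object until first attribute access."""

    def __init__(self, factory):
        # write through __dict__ so attribute proxying never recurses
        self.__dict__['_factory'] = factory
        self.__dict__['_obj'] = None

    def _target(self):
        if self.__dict__['_obj'] is None:
            self.__dict__['_obj'] = self.__dict__['_factory']()
        return self.__dict__['_obj']

    def __getattr__(self, name):
        return getattr(self._target(), name)


built = []

def make_session():
    built.append(True)
    return {'user': 'alice'}

proxy = LazyProxy(make_session)
was_built_early = bool(built)      # False: nothing constructed yet
value = proxy.get('user')          # first access triggers construction
```

This is why ``accessed()`` above can simply check ``__dict__['_sess']``: construction only happens once something actually touches the session.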
beaker-1.12.1/beaker/synchronization.py
"""Synchronization functions.
File- and mutex-based mutual exclusion synchronizers are provided,
as well as a name-based mutex which locks within an application
based on a string name.
"""
import errno
import os
import sys
import tempfile
try:
import threading as _threading
except ImportError:
import dummy_threading as _threading
# check for fcntl module
try:
sys.getwindowsversion()
has_flock = False
except AttributeError:  # sys.getwindowsversion() only exists on Windows
try:
import fcntl
has_flock = True
except ImportError:
has_flock = False
from beaker import util
from beaker.exceptions import LockError
__all__ = ["file_synchronizer", "mutex_synchronizer", "null_synchronizer",
"NameLock", "_threading"]
class NameLock(object):
"""a proxy for an RLock object that is stored in a name based
registry.
Multiple threads can get a reference to the same RLock based on the
name alone, and synchronize operations related to that name.
"""
locks = util.WeakValuedRegistry()
class NLContainer(object):
def __init__(self, reentrant):
if reentrant:
self.lock = _threading.RLock()
else:
self.lock = _threading.Lock()
def __call__(self):
return self.lock
def __init__(self, identifier=None, reentrant=False):
if identifier is None:
self._lock = NameLock.NLContainer(reentrant)
else:
self._lock = NameLock.locks.get(identifier, NameLock.NLContainer,
reentrant)
def acquire(self, wait=True):
return self._lock().acquire(wait)
def release(self):
self._lock().release()
_synchronizers = util.WeakValuedRegistry()
def _synchronizer(identifier, cls, **kwargs):
return _synchronizers.sync_get((identifier, cls), cls, identifier, **kwargs)
def file_synchronizer(identifier, **kwargs):
if not has_flock or 'lock_dir' not in kwargs:
return mutex_synchronizer(identifier)
else:
return _synchronizer(identifier, FileSynchronizer, **kwargs)
def mutex_synchronizer(identifier, **kwargs):
return _synchronizer(identifier, ConditionSynchronizer, **kwargs)
class null_synchronizer(object):
"""A 'null' synchronizer, which provides the :class:`.SynchronizerImpl` interface
without any locking.
"""
def acquire_write_lock(self, wait=True):
return True
def acquire_read_lock(self):
pass
def release_write_lock(self):
pass
def release_read_lock(self):
pass
acquire = acquire_write_lock
release = release_write_lock
class SynchronizerImpl(object):
"""Base class for a synchronization object that allows
multiple readers, single writers.
"""
def __init__(self):
self._state = util.ThreadLocal()
class SyncState(object):
__slots__ = 'reentrantcount', 'writing', 'reading'
def __init__(self):
self.reentrantcount = 0
self.writing = False
self.reading = False
def state(self):
if not self._state.has():
state = SynchronizerImpl.SyncState()
self._state.put(state)
return state
else:
return self._state.get()
state = property(state)
def release_read_lock(self):
state = self.state
if state.writing:
raise LockError("lock is in writing state")
if not state.reading:
raise LockError("lock is not in reading state")
if state.reentrantcount == 1:
self.do_release_read_lock()
state.reading = False
state.reentrantcount -= 1
def acquire_read_lock(self, wait=True):
state = self.state
if state.writing:
raise LockError("lock is in writing state")
if state.reentrantcount == 0:
x = self.do_acquire_read_lock(wait)
if (wait or x):
state.reentrantcount += 1
state.reading = True
return x
elif state.reading:
state.reentrantcount += 1
return True
def release_write_lock(self):
state = self.state
if state.reading:
raise LockError("lock is in reading state")
if not state.writing:
raise LockError("lock is not in writing state")
if state.reentrantcount == 1:
self.do_release_write_lock()
state.writing = False
state.reentrantcount -= 1
release = release_write_lock
def acquire_write_lock(self, wait=True):
state = self.state
if state.reading:
raise LockError("lock is in reading state")
if state.reentrantcount == 0:
x = self.do_acquire_write_lock(wait)
if (wait or x):
state.reentrantcount += 1
state.writing = True
return x
elif state.writing:
state.reentrantcount += 1
return True
acquire = acquire_write_lock
def do_release_read_lock(self):
raise NotImplementedError()
def do_acquire_read_lock(self, wait):
raise NotImplementedError()
def do_release_write_lock(self):
raise NotImplementedError()
def do_acquire_write_lock(self, wait):
raise NotImplementedError()
class FileSynchronizer(SynchronizerImpl):
"""A synchronizer which locks using flock().
"""
def __init__(self, identifier, lock_dir):
super(FileSynchronizer, self).__init__()
self._filedescriptor = util.ThreadLocal()
if lock_dir is None:
    lock_dir = tempfile.gettempdir()
self.filename = util.encoded_path(
lock_dir,
[identifier],
extension='.lock'
)
self.lock_dir = os.path.dirname(self.filename)
def _filedesc(self):
return self._filedescriptor.get()
_filedesc = property(_filedesc)
def _ensuredir(self):
if not os.path.exists(self.lock_dir):
util.verify_directory(self.lock_dir)
def _open(self, mode):
filedescriptor = self._filedesc
if filedescriptor is None:
self._ensuredir()
filedescriptor = os.open(self.filename, mode)
self._filedescriptor.put(filedescriptor)
return filedescriptor
def do_acquire_read_lock(self, wait):
filedescriptor = self._open(os.O_CREAT | os.O_RDONLY)
if not wait:
try:
fcntl.flock(filedescriptor, fcntl.LOCK_SH | fcntl.LOCK_NB)
return True
except IOError:
os.close(filedescriptor)
self._filedescriptor.remove()
return False
else:
fcntl.flock(filedescriptor, fcntl.LOCK_SH)
return True
def do_acquire_write_lock(self, wait):
filedescriptor = self._open(os.O_CREAT | os.O_WRONLY)
if not wait:
try:
fcntl.flock(filedescriptor, fcntl.LOCK_EX | fcntl.LOCK_NB)
return True
except IOError:
os.close(filedescriptor)
self._filedescriptor.remove()
return False
else:
fcntl.flock(filedescriptor, fcntl.LOCK_EX)
return True
def do_release_read_lock(self):
self._release_all_locks()
def do_release_write_lock(self):
self._release_all_locks()
def _release_all_locks(self):
filedescriptor = self._filedesc
if filedescriptor is not None:
fcntl.flock(filedescriptor, fcntl.LOCK_UN)
os.close(filedescriptor)
self._filedescriptor.remove()
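FileSynchronizer's read and write locks map directly onto flock()'s shared and exclusive modes. A POSIX-only sketch of the semantics it relies on (the lock file path here is a throwaway temp file, not Beaker's):

```python
import fcntl
import os
import tempfile

# Shared (read) locks on separate descriptors coexist, but an exclusive
# (write) lock requested with LOCK_NB fails while readers hold the file.
path = os.path.join(tempfile.mkdtemp(), 'demo.lock')
fd1 = os.open(path, os.O_CREAT | os.O_RDONLY)
fd2 = os.open(path, os.O_CREAT | os.O_RDONLY)
fcntl.flock(fd1, fcntl.LOCK_SH | fcntl.LOCK_NB)  # first reader succeeds
fcntl.flock(fd2, fcntl.LOCK_SH | fcntl.LOCK_NB)  # second reader also succeeds
fd3 = os.open(path, os.O_CREAT | os.O_WRONLY)
try:
    fcntl.flock(fd3, fcntl.LOCK_EX | fcntl.LOCK_NB)
    writer_blocked = False
except OSError:  # EWOULDBLOCK: shared locks are still held
    writer_blocked = True
for fd in (fd1, fd2, fd3):
    fcntl.flock(fd, fcntl.LOCK_UN)
    os.close(fd)
```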
class ConditionSynchronizer(SynchronizerImpl):
"""a synchronizer using a Condition."""
def __init__(self, identifier):
super(ConditionSynchronizer, self).__init__()
# counts how many asynchronous methods are executing
self.asynch = 0
# pointer to thread that is the current sync operation
self.current_sync_operation = None
# condition object to lock on
self.condition = _threading.Condition(_threading.Lock())
def do_acquire_read_lock(self, wait=True):
self.condition.acquire()
try:
# see if a synchronous operation is waiting to start
# or is already running, in which case we wait (or just
# give up and return)
if wait:
while self.current_sync_operation is not None:
self.condition.wait()
else:
if self.current_sync_operation is not None:
return False
self.asynch += 1
finally:
self.condition.release()
if not wait:
return True
def do_release_read_lock(self):
self.condition.acquire()
try:
self.asynch -= 1
# check if we are the last asynchronous reader thread
# out the door.
if self.asynch == 0:
# yes. so if a sync operation is waiting, notify_all to wake
# it up
if self.current_sync_operation is not None:
self.condition.notify_all()
elif self.asynch < 0:
raise LockError("Synchronizer error - too many "
"release_read_locks called")
finally:
self.condition.release()
def do_acquire_write_lock(self, wait=True):
self.condition.acquire()
try:
# here, we are not a synchronous reader, and after returning,
# assuming waiting or immediate availability, we will be.
if wait:
# if another sync is working, wait
while self.current_sync_operation is not None:
self.condition.wait()
else:
# if another sync is working,
                # we don't want to wait, so forget it
if self.current_sync_operation is not None:
return False
# establish ourselves as the current sync
# this indicates to other read/write operations
# that they should wait until this is None again
self.current_sync_operation = _threading.current_thread()
# now wait again for asyncs to finish
if self.asynch > 0:
if wait:
# wait
self.condition.wait()
else:
                    # we don't want to wait, so forget it
self.current_sync_operation = None
return False
finally:
self.condition.release()
if not wait:
return True
def do_release_write_lock(self):
self.condition.acquire()
try:
if self.current_sync_operation is not _threading.current_thread():
raise LockError("Synchronizer error - current thread doesnt "
"have the write lock")
# reset the current sync operation so
# another can get it
self.current_sync_operation = None
# tell everyone to get ready
self.condition.notify_all()
finally:
# everyone go !!
self.condition.release()
beaker-1.12.1/beaker/util.py000066400000000000000000000413421436751141500156200ustar00rootroot00000000000000"""Beaker utilities"""
import hashlib
import socket
import binascii
from ._compat import PY2, string_type, unicode_text, NoneType, dictkeyslist, im_class, im_func, pickle, func_signature, \
default_im_func
try:
import threading as _threading
except ImportError:
import dummy_threading as _threading
from datetime import datetime, timedelta
import os
import re
import string
import types
import weakref
import warnings
import sys
import inspect
import json
import zlib
from beaker.converters import asbool
from beaker import exceptions
from threading import local as _tlocal
DEFAULT_CACHE_KEY_LENGTH = 250
__all__ = ["ThreadLocal", "WeakValuedRegistry", "SyncDict", "encoded_path",
"verify_directory",
"serialize", "deserialize"]
def function_named(fn, name):
"""Return a function with a given __name__.
Will assign to __name__ and return the original function if possible on
the Python implementation, otherwise a new function will be constructed.
"""
fn.__name__ = name
return fn
def skip_if(predicate, reason=None):
"""Skip a test if predicate is true."""
reason = reason or predicate.__name__
from unittest import SkipTest
def decorate(fn):
fn_name = fn.__name__
def maybe(*args, **kw):
if predicate():
msg = "'%s' skipped: %s" % (
fn_name, reason)
raise SkipTest(msg)
else:
return fn(*args, **kw)
return function_named(maybe, fn_name)
return decorate
def assert_raises(except_cls, callable_, *args, **kw):
"""Assert the given exception is raised by the given function + arguments."""
try:
callable_(*args, **kw)
success = False
except except_cls:
success = True
# assert outside the block so it works for AssertionError too !
assert success, "Callable did not raise an exception"
def verify_directory(dir):
"""verifies and creates a directory. tries to
ignore collisions with other threads and processes."""
tries = 0
while not os.access(dir, os.F_OK):
try:
tries += 1
os.makedirs(dir)
        except OSError:
            # another thread or process may have created the directory first
if tries > 5:
raise
def has_self_arg(func):
"""Return True if the given function has a 'self' argument."""
args = list(func_signature(func).parameters)
if args and args[0] in ('self', 'cls'):
return True
else:
return False
def warn(msg, stacklevel=3):
"""Issue a warning."""
if isinstance(msg, string_type):
warnings.warn(msg, exceptions.BeakerWarning, stacklevel=stacklevel)
else:
warnings.warn(msg, stacklevel=stacklevel)
def deprecated(message):
def wrapper(fn):
def deprecated_method(*args, **kargs):
warnings.warn(message, DeprecationWarning, 2)
return fn(*args, **kargs)
# TODO: use decorator ? functools.wrapper ?
deprecated_method.__name__ = fn.__name__
deprecated_method.__doc__ = "%s\n\n%s" % (message, fn.__doc__)
return deprecated_method
return wrapper
class ThreadLocal(object):
"""stores a value on a per-thread basis"""
__slots__ = '_tlocal'
def __init__(self):
self._tlocal = _tlocal()
def put(self, value):
self._tlocal.value = value
def has(self):
return hasattr(self._tlocal, 'value')
def get(self, default=None):
return getattr(self._tlocal, 'value', default)
def remove(self):
del self._tlocal.value
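ThreadLocal above is a thin wrapper over `threading.local`; this sketch shows the isolation it relies on: a value stored by the main thread is invisible to a worker thread, which falls back to the default.

```python
import threading

# Each thread sees only its own attribute on a threading.local instance.
store = threading.local()
store.value = 'main-thread'
seen = []

def worker():
    # the worker never stored a value, so it gets the default
    seen.append(getattr(store, 'value', 'unset'))

t = threading.Thread(target=worker)
t.start()
t.join()
```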
class SyncDict(object):
"""
An efficient/threadsafe singleton map algorithm, a.k.a.
"get a value based on this key, and create if not found or not
valid" paradigm:
exists && isvalid ? get : create
Designed to work with weakref dictionaries to expect items
to asynchronously disappear from the dictionary.
Use python 2.3.3 or greater ! a major bug was just fixed in Nov.
2003 that was driving me nuts with garbage collection/weakrefs in
this section.
"""
def __init__(self):
self.mutex = _threading.Lock()
self.dict = {}
def get(self, key, createfunc, *args, **kwargs):
try:
if key in self.dict:
return self.dict[key]
else:
return self.sync_get(key, createfunc, *args, **kwargs)
except KeyError:
return self.sync_get(key, createfunc, *args, **kwargs)
def sync_get(self, key, createfunc, *args, **kwargs):
self.mutex.acquire()
try:
try:
if key in self.dict:
return self.dict[key]
else:
return self._create(key, createfunc, *args, **kwargs)
except KeyError:
return self._create(key, createfunc, *args, **kwargs)
finally:
self.mutex.release()
def _create(self, key, createfunc, *args, **kwargs):
self[key] = obj = createfunc(*args, **kwargs)
return obj
def has_key(self, key):
return key in self.dict
def __contains__(self, key):
return self.dict.__contains__(key)
def __getitem__(self, key):
return self.dict.__getitem__(key)
def __setitem__(self, key, value):
self.dict.__setitem__(key, value)
def __delitem__(self, key):
return self.dict.__delitem__(key)
def clear(self):
self.dict.clear()
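The "exists ? get : create" contract SyncDict documents can be sketched with a plain dict plus a mutex; the re-check under the lock is what keeps two threads from both running `createfunc` (all names below are illustrative):

```python
import threading

mutex = threading.Lock()
registry = {}
calls = []

def get_or_create(key, createfunc):
    if key in registry:          # fast path: no lock taken
        return registry[key]
    with mutex:
        if key not in registry:  # re-check: another thread may have won
            registry[key] = createfunc()
        return registry[key]

def make_value():
    calls.append(1)
    return object()

a = get_or_create('k', make_value)
b = get_or_create('k', make_value)   # cache hit, createfunc not called again
```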
class WeakValuedRegistry(SyncDict):
def __init__(self):
self.mutex = _threading.RLock()
self.dict = weakref.WeakValueDictionary()
sha1 = None
def encoded_path(root, identifiers, extension=".enc", depth=3,
digest_filenames=True):
"""Generate a unique file-accessible path from the given list of
identifiers starting at the given root directory."""
ident = "_".join(identifiers)
global sha1
if sha1 is None:
from beaker.crypto import sha1
if digest_filenames:
if isinstance(ident, unicode_text):
ident = sha1(ident.encode('utf-8')).hexdigest()
else:
ident = sha1(ident).hexdigest()
ident = os.path.basename(ident)
tokens = []
for d in range(1, depth):
tokens.append(ident[0:d])
dir = os.path.join(root, *tokens)
verify_directory(dir)
return os.path.join(dir, ident + extension)
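The directory-sharding scheme of `encoded_path` above can be illustrated directly: the identifier is sha1-hashed, and its first `depth - 1` prefixes become nested directory names (the root used here is hypothetical):

```python
import hashlib
import os

# 'container_file' is a stand-in identifier; with depth=3 the hash's
# 1- and 2-character prefixes become the nesting directories.
ident = hashlib.sha1('container_file'.encode('utf-8')).hexdigest()
depth = 3
tokens = [ident[0:d] for d in range(1, depth)]
path = os.path.join('beaker_data', *(tokens + [ident + '.enc']))
```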
def asint(obj):
if isinstance(obj, int):
return obj
elif isinstance(obj, string_type) and re.match(r'^\d+$', obj):
return int(obj)
else:
raise Exception("This is not a proper int")
def verify_options(opt, types, error):
if not isinstance(opt, types):
if not isinstance(types, tuple):
types = (types,)
coerced = False
for typ in types:
try:
if typ in (list, tuple):
opt = [x.strip() for x in opt.split(',')]
else:
if typ == bool:
typ = asbool
elif typ == int:
typ = asint
elif typ in (timedelta, datetime):
if not isinstance(opt, typ):
raise Exception("%s requires a timedelta type", typ)
opt = typ(opt)
coerced = True
except:
pass
if coerced:
break
if not coerced:
raise Exception(error)
elif isinstance(opt, str) and not opt.strip():
raise Exception("Empty strings are invalid for: %s" % error)
return opt
def verify_rules(params, ruleset):
for key, types, message in ruleset:
if key in params:
params[key] = verify_options(params[key], types, message)
return params
def coerce_session_params(params):
rules = [
('data_dir', (str, NoneType), "data_dir must be a string referring to a directory."),
('lock_dir', (str, NoneType), "lock_dir must be a string referring to a directory."),
('type', (str, NoneType), "Session type must be a string."),
('cookie_expires', (bool, datetime, timedelta, int),
"Cookie expires was not a boolean, datetime, int, or timedelta instance."),
('cookie_domain', (str, NoneType), "Cookie domain must be a string."),
('cookie_path', (str, NoneType), "Cookie path must be a string."),
('id', (str,), "Session id must be a string."),
('key', (str,), "Session key must be a string."),
('secret', (str, NoneType), "Session secret must be a string."),
        ('validate_key', (str, NoneType), "Session validate_key must be a string."),
        ('encrypt_key', (str, NoneType), "Session encrypt_key must be a string."),
('encrypt_nonce_bits', (int, NoneType), "Session encrypt_nonce_bits must be a number"),
('secure', (bool, NoneType), "Session secure must be a boolean."),
('httponly', (bool, NoneType), "Session httponly must be a boolean."),
('timeout', (int, NoneType), "Session timeout must be an integer."),
('save_accessed_time', (bool, NoneType),
"Session save_accessed_time must be a boolean (defaults to true)."),
('auto', (bool, NoneType), "Session is created if accessed."),
('webtest_varname', (str, NoneType), "Session varname must be a string."),
('data_serializer', (str,), "data_serializer must be a string.")
]
opts = verify_rules(params, rules)
cookie_expires = opts.get('cookie_expires')
if cookie_expires and isinstance(cookie_expires, int) and \
not isinstance(cookie_expires, bool):
opts['cookie_expires'] = timedelta(seconds=cookie_expires)
if opts.get('timeout') is not None and not opts.get('save_accessed_time', True):
raise Exception("save_accessed_time must be true to use timeout")
return opts
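The `cookie_expires` coercion above is worth isolating: a plain int is read as seconds and turned into a `timedelta`, while booleans (and timedeltas) pass through untouched. A sketch of just that rule (the helper name is illustrative):

```python
from datetime import timedelta

def coerce_cookie_expires(value):
    # bool is a subclass of int, so it must be excluded explicitly
    if value and isinstance(value, int) and not isinstance(value, bool):
        return timedelta(seconds=value)
    return value

five_minutes = coerce_cookie_expires(300)
```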
def coerce_cache_params(params):
rules = [
('data_dir', (str, NoneType), "data_dir must be a string referring to a directory."),
('lock_dir', (str, NoneType), "lock_dir must be a string referring to a directory."),
('type', (str,), "Cache type must be a string."),
('enabled', (bool, NoneType), "enabled must be true/false if present."),
('expire', (int, NoneType),
"expire must be an integer representing how many seconds the cache is valid for"),
('regions', (list, tuple, NoneType),
"Regions must be a comma separated list of valid regions"),
('key_length', (int, NoneType),
"key_length must be an integer which indicates the longest a key can be before hashing"),
]
return verify_rules(params, rules)
def coerce_memcached_behaviors(behaviors):
rules = [
('cas', (bool, int), 'cas must be a boolean or an integer'),
('no_block', (bool, int), 'no_block must be a boolean or an integer'),
('receive_timeout', (int,), 'receive_timeout must be an integer'),
('send_timeout', (int,), 'send_timeout must be an integer'),
('ketama_hash', (str,),
'ketama_hash must be a string designating a valid hashing strategy option'),
('_poll_timeout', (int,), '_poll_timeout must be an integer'),
        ('auto_eject', (bool, int), 'auto_eject must be a boolean or an integer'),
('retry_timeout', (int,), 'retry_timeout must be an integer'),
        ('_sort_hosts', (bool, int), '_sort_hosts must be a boolean or an integer'),
('_io_msg_watermark', (int,), '_io_msg_watermark must be an integer'),
('ketama', (bool, int), 'ketama must be a boolean or an integer'),
('ketama_weighted', (bool, int), 'ketama_weighted must be a boolean or an integer'),
('_io_key_prefetch', (int, bool), '_io_key_prefetch must be a boolean or an integer'),
('_hash_with_prefix_key', (bool, int),
'_hash_with_prefix_key must be a boolean or an integer'),
('tcp_nodelay', (bool, int), 'tcp_nodelay must be a boolean or an integer'),
('failure_limit', (int,), 'failure_limit must be an integer'),
('buffer_requests', (bool, int), 'buffer_requests must be a boolean or an integer'),
('_socket_send_size', (int,), '_socket_send_size must be an integer'),
('num_replicas', (int,), 'num_replicas must be an integer'),
('remove_failed', (int,), 'remove_failed must be an integer'),
('_noreply', (bool, int), '_noreply must be a boolean or an integer'),
('_io_bytes_watermark', (int,), '_io_bytes_watermark must be an integer'),
('_socket_recv_size', (int,), '_socket_recv_size must be an integer'),
('distribution', (str,),
'distribution must be a string designating a valid distribution option'),
('connect_timeout', (int,), 'connect_timeout must be an integer'),
('hash', (str,), 'hash must be a string designating a valid hashing option'),
('verify_keys', (bool, int), 'verify_keys must be a boolean or an integer'),
('dead_timeout', (int,), 'dead_timeout must be an integer')
]
return verify_rules(behaviors, rules)
def parse_cache_config_options(config, include_defaults=True):
"""Parse configuration options and validate for use with the
CacheManager"""
# Load default cache options
if include_defaults:
options = dict(type='memory', data_dir=None, expire=None,
log_file=None)
else:
options = {}
for key, val in config.items():
if key.startswith('beaker.cache.'):
options[key[13:]] = val
if key.startswith('cache.'):
options[key[6:]] = val
coerce_cache_params(options)
# Set cache to enabled if not turned off
if 'enabled' not in options and include_defaults:
options['enabled'] = True
# Configure region dict if regions are available
regions = options.pop('regions', None)
if regions:
region_configs = {}
for region in regions:
if not region: # ensure region name is valid
continue
# Setup the default cache options
region_options = dict(data_dir=options.get('data_dir'),
lock_dir=options.get('lock_dir'),
type=options.get('type'),
enabled=options['enabled'],
expire=options.get('expire'),
key_length=options.get('key_length', DEFAULT_CACHE_KEY_LENGTH))
region_prefix = '%s.' % region
region_len = len(region_prefix)
for key in dictkeyslist(options):
if key.startswith(region_prefix):
region_options[key[region_len:]] = options.pop(key)
coerce_cache_params(region_options)
region_configs[region] = region_options
options['cache_regions'] = region_configs
return options
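The prefix handling in `parse_cache_config_options` above reduces to a simple rule: `beaker.cache.*` and `cache.*` keys are stripped down to bare option names, anything else is ignored. A sketch with illustrative values:

```python
config = {
    'beaker.cache.type': 'memory',
    'cache.expire': '300',
    'unrelated.key': 'ignored',
}
options = {}
for key, val in config.items():
    if key.startswith('beaker.cache.'):
        options[key[13:]] = val      # strip 'beaker.cache.'
    elif key.startswith('cache.'):
        options[key[6:]] = val       # strip 'cache.'
```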
def parse_memcached_behaviors(config):
"""Parse behavior options and validate for use with pylibmc
client/PylibMCNamespaceManager, or potentially other memcached
NamespaceManagers that support behaviors"""
behaviors = {}
for key, val in config.items():
if key.startswith('behavior.'):
behaviors[key[9:]] = val
coerce_memcached_behaviors(behaviors)
return behaviors
def func_namespace(func):
"""Generates a unique namespace for a function"""
kls = None
if hasattr(func, 'im_func') or hasattr(func, '__func__'):
kls = im_class(func)
func = im_func(func)
if kls:
return '%s.%s' % (kls.__module__, kls.__name__)
else:
return '%s|%s' % (inspect.getsourcefile(func), func.__name__)
class PickleSerializer(object):
def loads(self, data_string):
return pickle.loads(data_string)
def dumps(self, data):
return pickle.dumps(data, 2)
class JsonSerializer(object):
def loads(self, data_string):
return json.loads(zlib.decompress(data_string).decode('utf-8'))
def dumps(self, data):
return zlib.compress(json.dumps(data).encode('utf-8'))
def serialize(data, method):
if method == 'json':
serializer = JsonSerializer()
else:
serializer = PickleSerializer()
return serializer.dumps(data)
def deserialize(data_string, method):
if method == 'json':
serializer = JsonSerializer()
else:
serializer = PickleSerializer()
return serializer.loads(data_string)
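A round-trip sketch of the two serializers above: the `'json'` path zlib-compresses a UTF-8 JSON payload, and anything else falls back to pickle protocol 2.

```python
import json
import pickle
import zlib

data = {'user': 'alice', 'count': 3}

# json path: dump -> encode -> compress, then the reverse
blob = zlib.compress(json.dumps(data).encode('utf-8'))
restored = json.loads(zlib.decompress(blob).decode('utf-8'))

# pickle path: protocol 2, as in PickleSerializer above
restored_pickle = pickle.loads(pickle.dumps(data, 2))
```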
def machine_identifier():
machine_hash = hashlib.md5()
if not PY2:
machine_hash.update(socket.gethostname().encode())
else:
machine_hash.update(socket.gethostname())
return binascii.hexlify(machine_hash.digest()[0:3]).decode('ascii')
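Mirroring `machine_identifier` above: the hostname is md5-hashed and the first three digest bytes kept as a six-character hex tag (the hostname below is a stand-in for `socket.gethostname()`):

```python
import binascii
import hashlib

digest = hashlib.md5('example-host'.encode()).digest()
tag = binascii.hexlify(digest[0:3]).decode('ascii')   # 3 bytes -> 6 hex chars
```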
def safe_write(filepath, contents):
if os.name == 'posix':
tempname = '%s.temp' % (filepath)
fh = open(tempname, 'wb')
fh.write(contents)
fh.close()
os.rename(tempname, filepath)
else:
fh = open(filepath, 'wb')
fh.write(contents)
fh.close()
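The POSIX branch of `safe_write` above follows the classic atomic-write pattern: write the payload to a sibling temp file, then `rename()` it over the destination in one step, so readers never see a partial file. A sketch using throwaway temp paths:

```python
import os
import tempfile

target = os.path.join(tempfile.mkdtemp(), 'data.bin')
tmp = '%s.temp' % target
with open(tmp, 'wb') as fh:
    fh.write(b'payload')
os.rename(tmp, target)        # atomic replace on POSIX filesystems
with open(target, 'rb') as fh:
    content = fh.read()
```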
beaker-1.12.1/setup.cfg000066400000000000000000000003501436751141500146530ustar00rootroot00000000000000#[egg_info]
#tag_build = dev
#tag_svn_revision = false
[tool:pytest]
where=tests
verbose=True
detailed-errors=True
with-doctest=True
#with-coverage=True
cover-package=beaker
cover-inclusive=True
ignore-files=annotated_functions.py
beaker-1.12.1/setup.py000066400000000000000000000070531436751141500145530ustar00rootroot00000000000000import os
import sys
import re
import inspect
from setuptools import setup, find_packages
py_version = sys.version_info[:2]
here = os.path.abspath(os.path.dirname(__file__))
v = open(os.path.join(here, 'beaker', '__init__.py'))
VERSION = re.compile(r".*__version__ = '(.*?)'", re.S).match(v.read()).group(1)
v.close()
try:
README = open(os.path.join(here, 'README.rst')).read()
except IOError:
README = ''
INSTALL_REQUIRES = []
if not hasattr(inspect, 'signature'):
# On Python 2.6, 2.7 and 3.2 we need funcsigs dependency
INSTALL_REQUIRES.append('funcsigs')
TESTS_REQUIRE = ['pytest', 'Mock', 'pycryptodome']
if py_version == (2, 6):
TESTS_REQUIRE.append('WebTest<2.0.24')
TESTS_REQUIRE.append('pycparser==2.18')
else:
TESTS_REQUIRE.append('webtest')
if py_version == (3, 2):
TESTS_REQUIRE.append('coverage < 4.0')
else:
TESTS_REQUIRE.append('coverage')
if py_version == (3, 3):
TESTS_REQUIRE.append('cryptography < 2.1.0')
else:
TESTS_REQUIRE.append('cryptography')
if not sys.platform.startswith('java') and not sys.platform == 'cli':
if py_version == (2, 6):
TESTS_REQUIRE.append('sqlalchemy < 1.2')
else:
TESTS_REQUIRE.append('sqlalchemy')
TESTS_REQUIRE.extend(['pymongo', 'redis'])
try:
import sqlite3
except ImportError:
TESTS_REQUIRE.append('pysqlite')
TESTS_REQUIRE.extend(['pylibmc', 'python-memcached'])
setup(name='Beaker',
version=VERSION,
description="A Session and Caching library with WSGI Middleware",
long_description=README,
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Web Environment',
'Intended Audience :: Developers',
'License :: OSI Approved :: BSD License',
'Programming Language :: Python',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
          'Topic :: Internet :: WWW/HTTP :: Dynamic Content',
'Topic :: Internet :: WWW/HTTP :: WSGI',
'Topic :: Internet :: WWW/HTTP :: WSGI :: Middleware',
],
keywords='wsgi myghty session web cache middleware',
author='Ben Bangert, Mike Bayer, Philip Jenvey, Alessandro Molina',
author_email='ben@groovie.org, pjenvey@groovie.org, amol@turbogears.org',
url='https://beaker.readthedocs.io/',
license='BSD',
license_files=['LICENSE'],
packages=find_packages(exclude=['ez_setup', 'examples', 'tests', 'tests.*']),
zip_safe=False,
install_requires=INSTALL_REQUIRES,
extras_require={
'crypto': ['pycryptopp>=0.5.12'],
'pycrypto': ['pycrypto'],
'pycryptodome': ['pycryptodome'],
'cryptography': ['cryptography'],
'testsuite': [TESTS_REQUIRE]
},
test_suite='tests',
tests_require=TESTS_REQUIRE,
entry_points="""
[paste.filter_factory]
beaker_session = beaker.middleware:session_filter_factory
[paste.filter_app_factory]
beaker_session = beaker.middleware:session_filter_app_factory
[beaker.backends]
database = beaker.ext.database:DatabaseNamespaceManager
memcached = beaker.ext.memcached:MemcachedNamespaceManager
google = beaker.ext.google:GoogleNamespaceManager
sqla = beaker.ext.sqla:SqlaNamespaceManager
"""
)
beaker-1.12.1/tests/000077500000000000000000000000001436751141500141765ustar00rootroot00000000000000beaker-1.12.1/tests/__init__.py000066400000000000000000000000001436751141500162750ustar00rootroot00000000000000beaker-1.12.1/tests/annotated_functions.py000066400000000000000000000007611436751141500206210ustar00rootroot00000000000000# -*- coding: utf-8 -*-
"""This is a collection of annotated functions used by tests.
They are grouped here to provide an easy way to import them at runtime
to check whenever tests for annotated functions should be skipped or not
on current python version.
"""
from beaker.cache import cache_region
import time
class AnnotatedAlfredCacher(object):
@cache_region('short_term')
def alfred_self(self, xx: int, y=None) -> str:
return str(time.time()) + str(self) + str(xx) + str(y)
beaker-1.12.1/tests/test_cache.py000066400000000000000000000235461436751141500166640ustar00rootroot00000000000000# coding: utf-8
from beaker._compat import u_, bytes_
import os
import platform
import shutil
import tarfile
import tempfile
import time
from beaker.middleware import CacheMiddleware
from beaker import util
from beaker.cache import Cache
from unittest import SkipTest
from beaker.util import skip_if
import base64
import zlib
try:
from webtest import TestApp as WebTestApp
except ImportError:
WebTestApp = None
# Tarballs of the output of:
# >>> from beaker.cache import Cache
# >>> c = Cache('test', data_dir='db', type='dbm')
# >>> c['foo'] = 'bar'
# in the old format, Beaker @ revision: 24f57102d310
dbm_cache_tar = bytes_("""\
eJzt3EtOwkAAgOEBjTHEBDfu2ekKZ6bTTnsBL+ABzPRB4osSRBMXHsNruXDl3nMYLaEbpYRAaIn6
f8kwhFcn/APLSeNTUTdZsL4/m4Pg21wSqiCt9D1PC6mUZ7Xo+bWvrHB/N3HjXk+MrrLhQ/a48HXL
nv+l0vg0yYcTdznMxhdpfFvHbpj1lyv0N8oq+jdhrr/b/A5Yo79R9G9ERX8XbXgLrNHfav7/G1Hd
30XGhYPMT5JYRbELVGISGVov9SKVRaGNQj2I49TrF+8oxpJrTAMHxizob+b7ay+Y/v5lE1/AP+8v
9o5ccdsWYvdViMPpIwdCtMRsiP3yTrucd8r5pJxbz8On9/KT2uVo3H5rG1cFAAAAAOD3aIuP7lv3
pRjbXgkAAAAAAFjVyc1Idc6U1lYGgbSmL0Mjpe248+PYjY87I91x/UGeb3udAAAAAACgfh+fAAAA
AADgr/t5/sPFTZ5cb/38D19Lzn9pRHX/zR4CtEZ/o+nfiEX9N3kI0Gr9vWl/W0z0BwAAAAAAAAAA
AAAAAAAAqPAFyOvcKA==
""")
dbm_cache_tar = zlib.decompress(base64.b64decode(dbm_cache_tar))
# dumbdbm format
dumbdbm_cache_tar = bytes_("""\
eJzt191qgzAYBmCPvYqc2UGx+ZKY6A3scCe7gJKoha6binOD3f2yn5Ouf3TTlNH3AQlEJcE3nyGV
W0RT457Jsq9W6632W0Se0JI49/1E0vCIZZPPzHt5HmzPWNQ91M1r/XbwuVP3/6nKLcq2Gey6qftl
5Z6mWA3n56/IKOQfwk7+dvwV8Iv8FSH/IPbkb4uRl8BZ+fvg/WUE8g9if/62UDZf1VlZOiqc1VSq
kudGVrKgushNkYuVc5VM/Rups5vjY3wErJU6nD+Z7fyFNFpEjIf4AFeef7Jq22TOZnzOpLiJLz0d
CGyE+q/scHyMk/Wv+E79G0L9hzC7JSFMpv0PN0+J4rv7xNk+iTuKh07E6aXnB9Mao/7X/fExzt//
FecS9R8C9v/r9rP+l49tubnk+e/z/J8JjvMfAAAAAAAAAADAn70DFJAAwQ==
""")
dumbdbm_cache_tar = zlib.decompress(base64.b64decode(dumbdbm_cache_tar))
def simple_app(environ, start_response):
clear = False
if environ.get('beaker.clear'):
clear = True
cache = environ['beaker.cache'].get_cache('testcache')
if clear:
cache.clear()
try:
value = cache.get_value('value')
except:
value = 0
cache.set_value('value', value+1)
start_response('200 OK', [('Content-type', 'text/plain')])
msg = 'The current value is: %s' % cache.get_value('value')
return [msg.encode('utf-8')]
def cache_manager_app(environ, start_response):
cm = environ['beaker.cache']
cm.get_cache('test')['test_key'] = 'test value'
start_response('200 OK', [('Content-type', 'text/plain')])
yield ("test_key is: %s\n" % cm.get_cache('test')['test_key']).encode('utf-8')
cm.get_cache('test').clear()
try:
test_value = cm.get_cache('test')['test_key']
except KeyError:
yield "test_key cleared".encode('utf-8')
else:
test_value = cm.get_cache('test')['test_key']
yield ("test_key wasn't cleared, is: %s\n" % test_value).encode('utf-8')
def test_has_key():
cache = Cache('test', data_dir='./cache', type='dbm')
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
assert "test" in cache
assert not cache.has_key("foo")
assert "foo" not in cache
cache.remove_value("test")
assert not cache.has_key("test")
def test_expire_changes():
cache = Cache('test_bar', data_dir='./cache', type='dbm')
cache.set_value('test', 10)
assert cache.has_key('test')
assert cache['test'] == 10
# ensure that we can change a never-expiring value
cache.set_value('test', 20, expiretime=1)
assert cache.has_key('test')
assert cache['test'] == 20
time.sleep(1)
assert not cache.has_key('test')
# test that we can change it before its expired
cache.set_value('test', 30, expiretime=50)
assert cache.has_key('test')
assert cache['test'] == 30
cache.set_value('test', 40, expiretime=3)
assert cache.has_key('test')
assert cache['test'] == 40
time.sleep(3)
assert not cache.has_key('test')
def test_fresh_createfunc():
cache = Cache('test_foo', data_dir='./cache', type='dbm')
x = cache.get_value('test', createfunc=lambda: 10, expiretime=2)
assert x == 10
x = cache.get_value('test', createfunc=lambda: 12, expiretime=2)
assert x == 10
x = cache.get_value('test', createfunc=lambda: 14, expiretime=2)
assert x == 10
time.sleep(2)
x = cache.get_value('test', createfunc=lambda: 16, expiretime=2)
assert x == 16
x = cache.get_value('test', createfunc=lambda: 18, expiretime=2)
assert x == 16
cache.remove_value('test')
assert not cache.has_key('test')
x = cache.get_value('test', createfunc=lambda: 20, expiretime=2)
assert x == 20
def test_has_key_multicache():
cache = Cache('test', data_dir='./cache', type='dbm')
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
assert "test" in cache
cache = Cache('test', data_dir='./cache', type='dbm')
assert cache.has_key("test")
def test_unicode_keys():
cache = Cache('test', data_dir='./cache', type='dbm')
o = object()
cache.set_value(u_('hiŏ'), o)
assert u_('hiŏ') in cache
assert u_('hŏa') not in cache
cache.remove_value(u_('hiŏ'))
assert u_('hiŏ') not in cache
def test_remove_stale():
"""test that remove_value() removes even if the value is expired."""
cache = Cache('test', type='memory')
o = object()
cache.namespace[b'key'] = (time.time() - 60, 5, o)
container = cache._get_value('key')
assert not container.has_current_value()
assert b'key' in container.namespace
cache.remove_value('key')
assert b'key' not in container.namespace
# safe to call again
cache.remove_value('key')
def test_multi_keys():
cache = Cache('newtests', data_dir='./cache', type='dbm')
cache.clear()
called = {}
def create_func():
called['here'] = True
return 'howdy'
try:
cache.get_value('key1')
except KeyError:
pass
else:
raise Exception("Failed to keyerror on nonexistent key")
assert 'howdy' == cache.get_value('key2', createfunc=create_func)
assert called['here'] == True
del called['here']
try:
cache.get_value('key3')
except KeyError:
pass
else:
raise Exception("Failed to keyerror on nonexistent key")
try:
cache.get_value('key1')
except KeyError:
pass
else:
raise Exception("Failed to keyerror on nonexistent key")
assert 'howdy' == cache.get_value('key2', createfunc=create_func)
assert called == {}
@skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_increment():
app = WebTestApp(CacheMiddleware(simple_app))
res = app.get('/', extra_environ={'beaker.type':type, 'beaker.clear':True})
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
@skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_cache_manager():
app = WebTestApp(CacheMiddleware(cache_manager_app))
res = app.get('/')
assert 'test_key is: test value' in res
assert 'test_key cleared' in res
def test_clsmap_nonexistent():
from beaker.cache import clsmap
try:
clsmap['fake']
assert False
except KeyError:
pass
def test_clsmap_present():
from beaker.cache import clsmap
assert clsmap['memory']
def test_legacy_cache():
cache = Cache('newtests', data_dir='./cache', type='dbm')
cache.set_value('x', '1')
assert cache.get_value('x') == '1'
cache.set_value('x', '2', type='file', data_dir='./cache')
assert cache.get_value('x') == '1'
assert cache.get_value('x', type='file', data_dir='./cache') == '2'
cache.remove_value('x')
cache.remove_value('x', type='file', data_dir='./cache')
assert cache.get_value('x', expiretime=1, createfunc=lambda: '5') == '5'
assert cache.get_value('x', expiretime=1, createfunc=lambda: '6', type='file', data_dir='./cache') == '6'
assert cache.get_value('x', expiretime=1, createfunc=lambda: '7') == '5'
assert cache.get_value('x', expiretime=1, createfunc=lambda: '8', type='file', data_dir='./cache') == '6'
time.sleep(1)
assert cache.get_value('x', expiretime=1, createfunc=lambda: '9') == '9'
assert cache.get_value('x', expiretime=1, createfunc=lambda: '10', type='file', data_dir='./cache') == '10'
assert cache.get_value('x', expiretime=1, createfunc=lambda: '11') == '9'
assert cache.get_value('x', expiretime=1, createfunc=lambda: '12', type='file', data_dir='./cache') == '10'
def test_upgrade():
# If we're on OSX, lets run this since its OSX dump files, otherwise
# we have to skip it
if platform.system() != 'Darwin':
return
for test in _test_upgrade_has_key, _test_upgrade_in, _test_upgrade_setitem:
for mod, tar in (('dbm', dbm_cache_tar),
('dumbdbm', dumbdbm_cache_tar)):
try:
__import__(mod)
except ImportError:
continue
dir = tempfile.mkdtemp()
fd, name = tempfile.mkstemp(dir=dir)
fp = os.fdopen(fd, 'wb')
fp.write(tar)
fp.close()
tar = tarfile.open(name)
for member in tar.getmembers():
tar.extract(member, dir)
tar.close()
try:
test(os.path.join(dir, 'db'))
finally:
shutil.rmtree(dir)
def _test_upgrade_has_key(dir):
cache = Cache('test', data_dir=dir, type='dbm')
assert cache.has_key('foo')
assert cache.has_key('foo')
def _test_upgrade_in(dir):
cache = Cache('test', data_dir=dir, type='dbm')
assert 'foo' in cache
assert 'foo' in cache
def _test_upgrade_setitem(dir):
cache = Cache('test', data_dir=dir, type='dbm')
assert cache['foo'] == 'bar'
assert cache['foo'] == 'bar'
def teardown_module():
import shutil
shutil.rmtree('./cache', True)
beaker-1.12.1/tests/test_cache_decorator.py000066400000000000000000000172201436751141500207160ustar00rootroot00000000000000import time
from datetime import datetime
from beaker.cache import CacheManager, cache_region, region_invalidate
from beaker import util
from unittest import SkipTest
defaults = {'cache.data_dir':'./cache', 'cache.type':'dbm', 'cache.expire': 2}
def teardown_module():
import shutil
shutil.rmtree('./cache', True)
@cache_region('short_term')
def fred(x):
return time.time()
@cache_region('short_term')
def george(x):
return time.time()
@cache_region('short_term')
def albert(x):
"""A doc string"""
return time.time()
@cache_region('short_term')
def alfred(x, xx, y=None):
return str(time.time()) + str(x) + str(xx) + str(y)
class AlfredCacher(object):
@cache_region('short_term')
def alfred_self(self, xx, y=None):
return str(time.time()) + str(self) + str(xx) + str(y)
try:
from .annotated_functions import AnnotatedAlfredCacher
except (ImportError, SyntaxError):
AnnotatedAlfredCacher = None
def make_cache_obj(**kwargs):
opts = defaults.copy()
opts.update(kwargs)
cache = CacheManager(**util.parse_cache_config_options(opts))
return cache
def make_cached_func(**opts):
cache = make_cache_obj(**opts)
@cache.cache()
def load(person):
now = datetime.now()
return "Hi there %s, its currently %s" % (person, now)
return cache, load
def make_region_cached_func():
opts = {}
opts['cache.regions'] = 'short_term, long_term'
opts['cache.short_term.expire'] = '2'
cache = make_cache_obj(**opts)
@cache_region('short_term', 'region_loader')
def load(person):
now = datetime.now()
return "Hi there %s, its currently %s" % (person, now)
return load
def make_region_cached_func_2():
opts = {}
opts['cache.regions'] = 'short_term, long_term'
opts['cache.short_term.expire'] = '2'
cache = make_cache_obj(**opts)
@cache_region('short_term')
def load_person(person):
now = datetime.now()
return "Hi there %s, its currently %s" % (person, now)
return load_person
def test_check_region_decorator():
func = make_region_cached_func()
result = func('Fred')
assert 'Fred' in result
result2 = func('Fred')
assert result == result2
result3 = func('George')
assert 'George' in result3
result4 = func('George')
assert result3 == result4
time.sleep(2) # Now it should have expired as cache is 2secs
result2 = func('Fred')
assert result != result2
def test_different_default_names():
result = fred(1)
time.sleep(0.1)
result2 = george(1)
assert result != result2
def test_check_invalidate_region():
func = make_region_cached_func()
result = func('Fred')
assert 'Fred' in result
result2 = func('Fred')
assert result == result2
region_invalidate(func, None, 'region_loader', 'Fred')
result3 = func('Fred')
assert result3 != result2
result2 = func('Fred')
assert result3 == result2
# Invalidate a non-existent key
region_invalidate(func, None, 'region_loader', 'Fredd')
assert result3 == result2
def test_check_invalidate_region_2():
func = make_region_cached_func_2()
result = func('Fred')
assert 'Fred' in result
result2 = func('Fred')
assert result == result2
region_invalidate(func, None, 'Fred')
result3 = func('Fred')
assert result3 != result2
result2 = func('Fred')
assert result3 == result2
# Invalidate a non-existent key
region_invalidate(func, None, 'Fredd')
assert result3 == result2
def test_invalidate_cache():
cache, func = make_cached_func()
val = func('foo')
time.sleep(0.1)
val2 = func('foo')
assert val == val2
cache.invalidate(func, 'foo')
val3 = func('foo')
assert val3 != val
def test_class_key_cache():
cache = make_cache_obj()
class Foo(object):
@cache.cache('method')
def go(self, x, y):
return "hi foo"
@cache.cache('standalone')
def go(x, y):
return "hi standalone"
x = Foo().go(1, 2)
y = go(1, 2)
ns = go._arg_namespace
assert cache.get_cache(ns).get('method 1 2') == x
assert cache.get_cache(ns).get('standalone 1 2') == y
def test_func_namespace():
def go(x, y):
return "hi standalone"
assert 'test_cache_decorator' in util.func_namespace(go)
assert util.func_namespace(go).endswith('go')
def test_class_key_region():
opts = {}
opts['cache.regions'] = 'short_term'
opts['cache.short_term.expire'] = '2'
cache = make_cache_obj(**opts)
class Foo(object):
@cache_region('short_term', 'method')
def go(self, x, y):
return "hi foo"
@cache_region('short_term', 'standalone')
def go(x, y):
return "hi standalone"
x = Foo().go(1, 2)
y = go(1, 2)
ns = go._arg_namespace
assert cache.get_cache_region(ns, 'short_term').get('method 1 2') == x
assert cache.get_cache_region(ns, 'short_term').get('standalone 1 2') == y
def test_classmethod_key_region():
opts = {}
opts['cache.regions'] = 'short_term'
opts['cache.short_term.expire'] = '2'
cache = make_cache_obj(**opts)
class Foo(object):
@classmethod
@cache_region('short_term', 'method')
def go(cls, x, y):
return "hi"
x = Foo.go(1, 2)
ns = Foo.go._arg_namespace
assert cache.get_cache_region(ns, 'short_term').get('method 1 2') == x
def test_class_key_region_invalidate():
opts = {}
opts['cache.regions'] = 'short_term'
opts['cache.short_term.expire'] = '2'
cache = make_cache_obj(**opts)
class Foo(object):
@cache_region('short_term', 'method')
def go(self, x, y):
now = datetime.now()
return "hi %s" % now
def invalidate(self, x, y):
region_invalidate(self.go, None, "method", x, y)
x = Foo().go(1, 2)
time.sleep(0.1)
y = Foo().go(1, 2)
Foo().invalidate(1, 2)
z = Foo().go(1, 2)
assert x == y
assert x != z
def test_check_region_decorator_keeps_docstring_and_name():
result = albert(1)
time.sleep(0.1)
result2 = albert(1)
assert result == result2
assert albert.__doc__ == "A doc string"
assert albert.__name__ == "albert"
def test_check_region_decorator_with_kwargs():
result = alfred(1, xx=5, y=3)
time.sleep(0.1)
result2 = alfred(1, y=3, xx=5)
assert result == result2
result3 = alfred(1, 5, y=5)
assert result != result3
result4 = alfred(1, 5, 3)
assert result == result4
result5 = alfred(1, 5, y=3)
assert result == result5
def test_check_region_decorator_with_kwargs_and_self():
a1 = AlfredCacher()
a2 = AlfredCacher()
result = a1.alfred_self(xx=5, y='blah')
time.sleep(0.1)
result2 = a2.alfred_self(y='blah', xx=5)
assert result == result2
result3 = a2.alfred_self(5, y=5)
assert result != result3
result4 = a2.alfred_self(5, 'blah')
assert result == result4
result5 = a2.alfred_self(5, y='blah')
assert result == result5
result6 = a2.alfred_self(6, 'blah')
assert result != result6
def test_check_region_decorator_with_kwargs_self_and_annotations():
if AnnotatedAlfredCacher is None:
raise SkipTest('Python version not supporting annotations')
a1 = AnnotatedAlfredCacher()
a2 = AnnotatedAlfredCacher()
result = a1.alfred_self(xx=5, y='blah')
time.sleep(0.1)
result2 = a2.alfred_self(y='blah', xx=5)
assert result == result2
result3 = a2.alfred_self(5, y=5)
assert result != result3
result4 = a2.alfred_self(5, 'blah')
assert result == result4
result5 = a2.alfred_self(5, y='blah')
assert result == result5
result6 = a2.alfred_self(6, 'blah')
assert result != result6
# beaker-1.12.1/tests/test_cachemanager.py
import time
from datetime import datetime
import shutil
from beaker.cache import CacheManager, cache_regions
from beaker.util import parse_cache_config_options
defaults = {'cache.data_dir':'./cache', 'cache.type':'dbm', 'cache.expire': 2}
def teardown_module():
import shutil
shutil.rmtree('./cache', True)
def make_cache_obj(**kwargs):
opts = defaults.copy()
opts.update(kwargs)
cache = CacheManager(**parse_cache_config_options(opts))
return cache
def make_region_cached_func():
global _cache_obj
opts = {}
opts['cache.regions'] = 'short_term, long_term'
opts['cache.short_term.expire'] = '2'
cache = make_cache_obj(**opts)
@cache.region('short_term', 'region_loader')
def load(person):
now = datetime.now()
return "Hi there %s, its currently %s" % (person, now)
_cache_obj = cache
return load
def make_cached_func():
global _cache_obj
cache = make_cache_obj()
@cache.cache('loader')
def load(person):
now = datetime.now()
return "Hi there %s, its currently %s" % (person, now)
_cache_obj = cache
return load
def test_parse_doesnt_allow_none():
opts = {}
opts['cache.regions'] = 'short_term, long_term'
for region, params in parse_cache_config_options(opts)['cache_regions'].items():
for k, v in params.items():
assert v != 'None', k
def test_parse_doesnt_allow_empty_region_name():
opts = {}
opts['cache.regions'] = ''
regions = parse_cache_config_options(opts)['cache_regions']
assert len(regions) == 0
def test_decorators():
for func in (make_region_cached_func, make_cached_func):
check_decorator(func())
def check_decorator(func):
result = func('Fred')
assert 'Fred' in result
result2 = func('Fred')
assert result == result2
result3 = func('George')
assert 'George' in result3
result4 = func('George')
assert result3 == result4
time.sleep(2)
result2 = func('Fred')
assert result != result2
def test_check_invalidate_region():
func = make_region_cached_func()
result = func('Fred')
assert 'Fred' in result
result2 = func('Fred')
assert result == result2
_cache_obj.region_invalidate(func, None, 'region_loader', 'Fred')
result3 = func('Fred')
assert result3 != result2
result2 = func('Fred')
assert result3 == result2
# Invalidate a non-existent key
_cache_obj.region_invalidate(func, None, 'region_loader', 'Fredd')
assert result3 == result2
def test_check_invalidate():
func = make_cached_func()
result = func('Fred')
assert 'Fred' in result
result2 = func('Fred')
assert result == result2
_cache_obj.invalidate(func, 'loader', 'Fred')
result3 = func('Fred')
assert result3 != result2
result2 = func('Fred')
assert result3 == result2
# Invalidate a non-existent key
_cache_obj.invalidate(func, 'loader', 'Fredd')
assert result3 == result2
def test_long_name():
func = make_cached_func()
name = 'Fred' * 250
result = func(name)
assert name in result
result2 = func(name)
assert result == result2
# This won't actually invalidate it since the key won't be sha'd
_cache_obj.invalidate(func, 'loader', name, key_length=8000)
result3 = func(name)
assert result3 == result2
# And now this should invalidate it
_cache_obj.invalidate(func, 'loader', name)
result4 = func(name)
assert result3 != result4
def test_cache_region_has_default_key_length():
try:
cache = CacheManager(cache_regions={
'short_term_without_key_length':{
'expire': 60,
'type': 'memory'
}
})
# Check CacheManager registered the region in global regions
assert 'short_term_without_key_length' in cache_regions
@cache.region('short_term_without_key_length')
def load_without_key_length(person):
now = datetime.now()
return "Hi there %s, its currently %s" % (person, now)
# Ensure that same person gets same time
msg = load_without_key_length('fred')
msg2 = load_without_key_length('fred')
assert msg == msg2, (msg, msg2)
# Ensure that different person gets different time
msg3 = load_without_key_length('george')
assert msg3.split(',')[-1] != msg2.split(',')[-1]
finally:
# throw away region for this test
cache_regions.pop('short_term_without_key_length', None)
def test_cache_region_expire_is_always_int():
try:
cache = CacheManager(cache_regions={
'short_term_with_string_expire': {
'expire': '60',
'type': 'memory'
}
})
# Check CacheManager registered the region in global regions
assert 'short_term_with_string_expire' in cache_regions
@cache.region('short_term_with_string_expire')
def load_with_str_expire(person):
now = datetime.now()
return "Hi there %s, its currently %s" % (person, now)
# Ensure that same person gets same time
msg = load_with_str_expire('fred')
msg2 = load_with_str_expire('fred')
assert msg == msg2, (msg, msg2)
finally:
# throw away region for this test
cache_regions.pop('short_term_with_string_expire', None)
def test_directory_goes_away():
cache = CacheManager(cache_regions={
'short_term_without_key_length':{
'expire': 60,
'type': 'dbm',
'data_dir': '/tmp/beaker-tests/cache/data',
'lock_dir': '/tmp/beaker-tests/cache/lock'
}
})
@cache.region('short_term_without_key_length')
def load_with_str_expire(person):
now = datetime.now()
return "Hi there %s, its currently %s" % (person, now)
# Ensure that same person gets same time
msg = load_with_str_expire('fred')
msg2 = load_with_str_expire('fred')
shutil.rmtree('/tmp/beaker-tests')
msg3 = load_with_str_expire('fred')
assert msg == msg2, (msg, msg2)
assert msg2 != msg3, (msg2, msg3)
# beaker-1.12.1/tests/test_container.py
import os
import pickle
import random
import shutil
import sys
import time
import pytest
from beaker.container import *
from beaker.synchronization import _synchronizers
from beaker.cache import clsmap
from threading import Thread
class CachedWidget(object):
totalcreates = 0
delay = 0
def __init__(self):
CachedWidget.totalcreates += 1
time.sleep(CachedWidget.delay)
self.time = time.time()
def _run_container_test(cls, totaltime, expiretime, delay, threadlocal):
print("\ntesting %s for %d secs with expiretime %s delay %d" % (
cls, totaltime, expiretime, delay))
CachedWidget.totalcreates = 0
CachedWidget.delay = delay
# allow for python overhead when checking current time against expire times
fudge = 10
starttime = time.time()
running = [True]
class RunThread(Thread):
def run(self):
print("%s starting" % self)
if threadlocal:
localvalue = Value(
'test',
cls('test', data_dir='./cache'),
createfunc=CachedWidget,
expiretime=expiretime,
starttime=starttime)
localvalue.clear_value()
else:
localvalue = value
try:
while running[0]:
item = localvalue.get_value()
if expiretime is not None:
currenttime = time.time()
itemtime = item.time
assert itemtime + expiretime + delay + fudge >= currenttime, \
"created: %f expire: %f delay: %f currenttime: %f" % \
(itemtime, expiretime, delay, currenttime)
time.sleep(random.random() * .00001)
                except Exception:
running[0] = False
raise
print("%s finishing" % self)
if not threadlocal:
value = Value(
'test',
cls('test', data_dir='./cache'),
createfunc=CachedWidget,
expiretime=expiretime,
starttime=starttime)
value.clear_value()
else:
value = None
threads = [RunThread() for i in range(1, 8)]
for t in threads:
t.start()
time.sleep(totaltime)
failed = not running[0]
running[0] = False
for t in threads:
t.join()
assert not failed, "One or more threads failed"
if expiretime is None:
expected = 1
else:
expected = totaltime / expiretime + 1
assert CachedWidget.totalcreates <= expected, \
"Number of creates %d exceeds expected max %d" % (CachedWidget.totalcreates, expected)
def test_memory_container(totaltime=10, expiretime=None, delay=0, threadlocal=False):
_run_container_test(clsmap['memory'],
totaltime, expiretime, delay, threadlocal)
def test_dbm_container(totaltime=10, expiretime=None, delay=0):
_run_container_test(clsmap['dbm'], totaltime, expiretime, delay, False)
def test_file_container(totaltime=10, expiretime=None, delay=0, threadlocal=False):
_run_container_test(clsmap['file'], totaltime, expiretime, delay, threadlocal)
def test_memory_container_tlocal():
test_memory_container(expiretime=15, delay=2, threadlocal=True)
def test_memory_container_2():
test_memory_container(expiretime=12)
def test_memory_container_3():
test_memory_container(expiretime=15, delay=2)
def test_dbm_container_2():
test_dbm_container(expiretime=12)
def test_dbm_container_3():
test_dbm_container(expiretime=15, delay=2)
def test_file_container_2():
test_file_container(expiretime=12)
def test_file_container_3():
test_file_container(expiretime=15, delay=2)
def test_file_container_tlocal():
test_file_container(expiretime=15, delay=2, threadlocal=True)
@pytest.mark.skipif(sys.version_info < (3, 6),
reason="Cryptography not supported on Python 3 lower than 3.6")
def test_file_open_bug():
"""ensure errors raised during reads or writes don't lock the namespace open."""
value = Value('test', clsmap['file']('reentrant_test', data_dir='./cache'))
if os.path.exists(value.namespace.file):
os.remove(value.namespace.file)
value.set_value("x")
f = open(value.namespace.file, 'w')
f.write("BLAH BLAH BLAH")
f.close()
with pytest.raises(pickle.UnpicklingError):
value.set_value("y")
_synchronizers.clear()
value = Value('test', clsmap['file']('reentrant_test', data_dir='./cache'))
with pytest.raises(pickle.UnpicklingError):
value.set_value("z")
def test_removing_file_refreshes():
"""test that the cache doesn't ignore file removals"""
x = [0]
def create():
x[0] += 1
return x[0]
value = Value('test',
clsmap['file']('refresh_test', data_dir='./cache'),
createfunc=create, starttime=time.time()
)
if os.path.exists(value.namespace.file):
os.remove(value.namespace.file)
assert value.get_value() == 1
assert value.get_value() == 1
os.remove(value.namespace.file)
assert value.get_value() == 2
def teardown_module():
shutil.rmtree('./cache', True)
# beaker-1.12.1/tests/test_converters.py
from beaker._compat import u_
import unittest
from beaker.converters import asbool, aslist
class AsBool(unittest.TestCase):
def test_truth_str(self):
for v in ('true', 'yes', 'on', 'y', 't', '1'):
self.assertTrue(asbool(v), "%s should be considered True" % (v,))
v = v.upper()
self.assertTrue(asbool(v), "%s should be considered True" % (v,))
def test_false_str(self):
for v in ('false', 'no', 'off', 'n', 'f', '0'):
self.assertFalse(asbool(v), v)
v = v.upper()
self.assertFalse(asbool(v), v)
def test_coerce(self):
"""Things that can coerce right straight to booleans."""
self.assertTrue(asbool(True))
self.assertTrue(asbool(1))
self.assertTrue(asbool(42))
self.assertFalse(asbool(False))
self.assertFalse(asbool(0))
def test_bad_values(self):
self.assertRaises(ValueError, asbool, ('mommy!'))
self.assertRaises(ValueError, asbool, (u_('Blargl?')))
class AsList(unittest.TestCase):
def test_string(self):
self.assertEqual(aslist('abc'), ['abc'])
self.assertEqual(aslist('1a2a3', 'a'), ['1', '2', '3'])
def test_None(self):
self.assertEqual(aslist(None), [])
def test_listy_noops(self):
"""Lists and tuples should come back unchanged."""
x = [1, 2, 3]
self.assertEqual(aslist(x), x)
y = ('z', 'y', 'x')
self.assertEqual(aslist(y), y)
def test_listify(self):
"""Other objects should just result in a single item list."""
self.assertEqual(aslist(dict()), [{}])
if __name__ == '__main__':
unittest.main()
# beaker-1.12.1/tests/test_cookie_domain_only.py
import pytest
from beaker.middleware import SessionMiddleware
from beaker import crypto
webtest = pytest.importorskip("webtest")
# Assign to pytestmark so the skip actually applies to the whole module;
# a bare pytest.mark.skipif(...) expression is a no-op.
pytestmark = pytest.mark.skipif(not crypto.get_crypto_module('default').has_aes,
                                reason="No AES library is installed, can't test "
                                       "cookie-only Sessions")
def simple_app(environ, start_response):
session = environ['beaker.session']
if 'value' not in session:
session['value'] = 0
session['value'] += 1
domain = environ.get('domain')
if domain:
session.domain = domain
if not environ['PATH_INFO'].startswith('/nosave'):
session.save()
start_response('200 OK', [('Content-type', 'text/plain')])
msg = 'The current value is: %d and cookie is %s' % (session['value'], session)
return [msg.encode('utf-8')]
def test_increment():
options = {'session.validate_key':'hoobermas',
'session.type':'cookie'}
app = webtest.TestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/', extra_environ=dict(domain='.hoop.com',
HTTP_HOST='www.hoop.com'))
assert 'current value is: 1' in res
assert 'Domain=.hoop.com' in res.headers['Set-Cookie']
res = app.get('/', extra_environ=dict(HTTP_HOST='www.hoop.com'))
assert 'Domain=.hoop.com' in res.headers['Set-Cookie']
assert 'current value is: 2' in res
def test_cookie_attributes_are_preserved():
options = {'session.type': 'memory',
'session.httponly': True,
'session.secure': True,
'session.cookie_path': '/app',
'session.cookie_domain': 'localhost'}
app = webtest.TestApp(SessionMiddleware(simple_app, **options))
res = app.get('/app', extra_environ=dict(
HTTP_COOKIE='beaker.session.id=oldsessid', domain='.hoop.com'))
cookie = res.headers['Set-Cookie']
assert 'domain=.hoop.com' in cookie.lower()
assert 'path=/app' in cookie.lower()
assert 'secure' in cookie.lower()
assert 'httponly' in cookie.lower()
assert 'samesite=lax' in cookie.lower()
if __name__ == '__main__':
from paste import httpserver
wsgi_app = SessionMiddleware(simple_app, {})
httpserver.serve(wsgi_app, host='127.0.0.1', port=8080)
# beaker-1.12.1/tests/test_cookie_expires.py
from beaker.middleware import SessionMiddleware
from beaker.session import Session
import datetime
import re
def test_cookie_expires():
"""Explore valid arguments for cookie_expires."""
def app(*args, **kw):
pass
key = 'beaker.session.cookie_expires'
now = datetime.datetime.now()
values = ['300', 300,
True, 'True', 'true', 't',
False, 'False', 'false', 'f',
datetime.timedelta(minutes=5), now]
expected = [datetime.timedelta(seconds=300),
datetime.timedelta(seconds=300),
True, True, True, True,
False, False, False, False,
datetime.timedelta(minutes=5), now]
actual = []
for pos, v in enumerate(values):
try:
s = SessionMiddleware(app, config={key:v})
val = s.options['cookie_expires']
except:
val = None
assert val == expected[pos]
def cookie_expiration(session):
cookie = session.cookie.output()
expiry_m = re.match('Set-Cookie: beaker.session.id=[0-9a-f]{32}(; expires=[^;]+)?; Path=/', cookie)
assert expiry_m
expiry = expiry_m.group(1)
if expiry is None:
return True
if re.match('; expires=(Mon|Tue), 1[89]-Jan-2038 [0-9:]{8} GMT', expiry):
return False
else:
return expiry[10:]
def test_cookie_expires_2():
"""Exhibit Set-Cookie: values."""
expires = cookie_expiration(Session({}, cookie_expires=True))
assert expires is True, expires
no_expires = cookie_expiration(Session({}, cookie_expires=False))
assert no_expires is False, no_expires
def test_cookie_expires_different_locale():
from locale import setlocale, LC_TIME
expires_date = datetime.datetime(2019, 5, 22)
setlocale(LC_TIME, 'it_IT.UTF-8')
# if you get locale.Error: unsupported locale setting. you have to enable that locale in your OS.
assert expires_date.strftime("%a, %d-%b-%Y %H:%M:%S GMT").startswith('mer,')
session = Session({}, cookie_expires=True, validate_key='validate_key')
assert session._set_cookie_expires(expires_date)
expires = cookie_expiration(session)
assert expires == 'Wed, 22-May-2019 00:00:00 GMT', expires
setlocale(LC_TIME, '') # restore default locale for further tests
def test_set_cookie_expires():
"""Exhibit Set-Cookie: values."""
session = Session({}, cookie_expires=True)
assert cookie_expiration(session) is True
session._set_cookie_expires(False)
assert cookie_expiration(session) is False
session._set_cookie_expires(True)
assert cookie_expiration(session) is True
# beaker-1.12.1/tests/test_cookie_only.py
import datetime
import time
import re
import json
import beaker.session
import beaker.util
from beaker.session import SignedCookie
from beaker._compat import b64decode
from beaker.middleware import SessionMiddleware
from unittest import SkipTest
try:
from webtest import TestApp as WebTestApp
except ImportError:
raise SkipTest("webtest not installed")
from beaker import crypto
if not crypto.get_crypto_module('default').has_aes:
raise SkipTest("No AES library is installed, can't test cookie-only "
"Sessions")
def simple_app(environ, start_response):
session = environ['beaker.session']
if 'value' not in session:
session['value'] = 0
session['value'] += 1
if not environ['PATH_INFO'].startswith('/nosave'):
session.save()
start_response('200 OK', [('Content-type', 'text/plain')])
msg = 'The current value is: %d and cookie is %s' % (session['value'], session)
return [msg.encode('UTF-8')]
def test_increment():
options = {'session.validate_key':'hoobermas', 'session.type':'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
def test_invalid_cookie():
# This is not actually a cookie only session, but we still test the cookie part.
options = {'session.validate_key':'hoobermas'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
# Set an invalid cookie.
app.set_cookie('cb_/zabbix/actionconf.php_parts', 'HI')
res = app.get('/')
assert 'current value is: 2' in res, res
res = app.get('/')
assert 'current value is: 3' in res, res
def test_invalid_cookie_cookietype():
# This is not actually a cookie only session, but we still test the cookie part.
options = {'session.validate_key':'hoobermas', 'session.type':'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
# Set an invalid cookie.
app.set_cookie('cb_/zabbix/actionconf.php_parts', 'HI')
res = app.get('/')
assert 'current value is: 2' in res, res
res = app.get('/')
assert 'current value is: 3' in res, res
def test_json_serializer():
options = {'session.validate_key':'hoobermas', 'session.type':'cookie', 'data_serializer': 'json'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
cookie = SignedCookie('hoobermas')
session_data = cookie.value_decode(app.cookies['beaker.session.id'])[0]
session_data = b64decode(session_data)
data = beaker.util.deserialize(session_data, 'json')
assert data['value'] == 2
res = app.get('/')
assert 'current value is: 3' in res
def test_pickle_serializer():
options = {'session.validate_key':'hoobermas', 'session.type':'cookie', 'data_serializer': 'pickle'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
cookie = SignedCookie('hoobermas')
session_data = cookie.value_decode(app.cookies['beaker.session.id'])[0]
session_data = b64decode(session_data)
data = beaker.util.deserialize(session_data, 'pickle')
assert data['value'] == 2
res = app.get('/')
assert 'current value is: 3' in res
def test_custom_serializer():
was_used = [False, False]
class CustomSerializer(object):
def loads(self, data_string):
was_used[0] = True
return json.loads(data_string.decode('utf-8'))
def dumps(self, data):
was_used[1] = True
return json.dumps(data).encode('utf-8')
serializer = CustomSerializer()
options = {'session.validate_key':'hoobermas', 'session.type':'cookie', 'data_serializer': serializer}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
cookie = SignedCookie('hoobermas')
session_data = cookie.value_decode(app.cookies['beaker.session.id'])[0]
session_data = b64decode(session_data)
data = serializer.loads(session_data)
assert data['value'] == 2
res = app.get('/')
assert 'current value is: 3' in res
assert all(was_used)
def test_expires():
options = {'session.validate_key':'hoobermas', 'session.type':'cookie',
'session.cookie_expires': datetime.timedelta(days=1)}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'expires=' in res.headers.getall('Set-Cookie')[0]
assert 'current value is: 1' in res
def test_different_sessions():
options = {'session.validate_key':'hoobermas', 'session.type':'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
app2 = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app2.get('/')
assert 'current value is: 1' in res
res = app2.get('/')
res = app2.get('/')
res = app2.get('/')
res2 = app.get('/')
assert 'current value is: 2' in res2
assert 'current value is: 4' in res
def test_nosave():
options = {'session.validate_key':'hoobermas', 'session.type':'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/nosave')
assert 'current value is: 1' in res
assert [] == res.headers.getall('Set-Cookie')
res = app.get('/nosave')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 1' in res
assert len(res.headers.getall('Set-Cookie')) > 0
res = app.get('/')
assert 'current value is: 2' in res
def test_increment_with_encryption():
options = {'session.encrypt_key':'666a19cf7f61c64c', 'session.validate_key':'hoobermas',
'session.type':'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
def test_different_sessions_with_encryption():
options = {'session.encrypt_key':'666a19cf7f61c64c', 'session.validate_key':'hoobermas',
'session.type':'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
app2 = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app2.get('/')
assert 'current value is: 1' in res
res = app2.get('/')
res = app2.get('/')
res = app2.get('/')
res2 = app.get('/')
assert 'current value is: 2' in res2
assert 'current value is: 4' in res
def test_nosave_with_encryption():
options = {'session.encrypt_key':'666a19cf7f61c64c', 'session.validate_key':'hoobermas',
'session.type':'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/nosave')
assert 'current value is: 1' in res
assert [] == res.headers.getall('Set-Cookie')
res = app.get('/nosave')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 1' in res
assert len(res.headers.getall('Set-Cookie')) > 0
res = app.get('/')
assert 'current value is: 2' in res
def test_cookie_id():
options = {'session.encrypt_key':'666a19cf7f61c64c', 'session.validate_key':'hoobermas',
'session.type':'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert "_id':" in res
sess_id = re.sub(r".*'_id': '(.*?)'.*", r'\1', res.body.decode('utf-8'))
res = app.get('/')
new_id = re.sub(r".*'_id': '(.*?)'.*", r'\1', res.body.decode('utf-8'))
assert new_id == sess_id
def test_invalidate_with_save_does_not_delete_session():
def invalidate_session_app(environ, start_response):
session = environ['beaker.session']
session.invalidate()
session.save()
start_response('200 OK', [('Content-type', 'text/plain')])
return [('Cookie is %s' % session).encode('UTF-8')]
options = {'session.encrypt_key':'666a19cf7f61c64c', 'session.validate_key':'hoobermas',
'session.type':'cookie'}
app = WebTestApp(SessionMiddleware(invalidate_session_app, **options))
res = app.get('/')
assert 'expires=' not in res.headers.getall('Set-Cookie')[0]
def test_changing_encrypt_key_with_timeout():
COMMON_ENCRYPT_KEY = '666a19cf7f61c64c'
DIFFERENT_ENCRYPT_KEY = 'hello-world'
options = {'session.encrypt_key': COMMON_ENCRYPT_KEY,
'session.timeout': 300,
'session.validate_key': 'hoobermas',
'session.type': 'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'The current value is: 1' in res, res
# Get the session cookie, so we can reuse it.
cookies = res.headers['Set-Cookie']
# Check that we get the same session with the same cookie
options = {'session.encrypt_key': COMMON_ENCRYPT_KEY,
'session.timeout': 300,
'session.validate_key': 'hoobermas',
'session.type': 'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/', headers={'Cookie': cookies})
assert 'The current value is: 2' in res, res
# Now that we are sure that it reuses the same session,
# change the encrypt_key so that it is unable to understand the cookie.
options = {'session.encrypt_key': DIFFERENT_ENCRYPT_KEY,
'session.timeout': 300,
'session.validate_key': 'hoobermas',
'session.type': 'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/', headers={'Cookie': cookies})
# Let's check it created a new session as the old one is invalid
# in the past it just crashed.
assert 'The current value is: 1' in res, res
def test_cookie_properly_expires():
COMMON_ENCRYPT_KEY = '666a19cf7f61c64c'
options = {'session.encrypt_key': COMMON_ENCRYPT_KEY,
'session.timeout': 1,
'session.validate_key': 'hoobermas',
'session.type': 'cookie'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'The current value is: 1' in res, res
res = app.get('/')
assert 'The current value is: 2' in res, res
# Wait session to expire and check it starts with a clean one
time.sleep(1)
res = app.get('/')
assert 'The current value is: 1' in res, res
def test_cookie_attributes_are_preserved():
options = {'session.type': 'cookie',
'session.validate_key': 'hoobermas',
'session.httponly': True,
'session.secure': True,
'session.samesite': 'Strict'}
app = WebTestApp(SessionMiddleware(simple_app, options))
res = app.get('/')
cookie = res.headers['Set-Cookie']
assert 'secure' in cookie.lower()
assert 'httponly' in cookie.lower()
assert 'samesite=strict' in cookie.lower()
def test_cookie_path_properly_set_after_init():
COOKIE_PATH = '/app'
options = {
'session.validate_key': 'hoobermas',
'session.type': 'cookie',
'session.cookie_path': COOKIE_PATH,
}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/app')
cookie = res.headers['Set-Cookie']
assert ('path=%s' % COOKIE_PATH) in cookie.lower()
def test_cookie_path_properly_set_after_load():
COOKIE_PATH = '/app'
options = {
'session.validate_key': 'hoobermas',
'session.type': 'cookie',
'session.cookie_path': COOKIE_PATH,
}
app = WebTestApp(SessionMiddleware(simple_app, **options))
# Perform one request to set the cookie
res = app.get('/app')
# Perform another request to load the previous session from the cookie
res = app.get('/app')
cookie = res.headers['Set-Cookie']
assert ('path=%s' % COOKIE_PATH) in cookie.lower()
def test_cookie_path_properly_set_after_delete():
COOKIE_PATH = '/app'
def delete_session_app(environ, start_response):
session = environ['beaker.session']
session.delete()
start_response('200 OK', [('Content-type', 'text/plain')])
return [('Cookie is %s' % session).encode('UTF-8')]
options = {
'session.validate_key': 'hoobermas',
'session.type': 'cookie',
'session.cookie_path': COOKIE_PATH,
}
app = WebTestApp(SessionMiddleware(delete_session_app, **options))
res = app.get('/app')
cookie = res.headers['Set-Cookie']
assert ('path=%s' % COOKIE_PATH) in cookie.lower()
if __name__ == '__main__':
from paste import httpserver
wsgi_app = SessionMiddleware(simple_app, {})
httpserver.serve(wsgi_app, host='127.0.0.1', port=8080)

# beaker-1.12.1/tests/test_database.py
# coding: utf-8
from beaker._compat import u_
from beaker.cache import clsmap, Cache, util
from beaker.exceptions import InvalidCacheBackendError
from beaker.middleware import CacheMiddleware
from unittest import SkipTest
try:
from webtest import TestApp as WebTestApp
except ImportError:
WebTestApp = None
try:
clsmap['ext:database']._init_dependencies()
except InvalidCacheBackendError:
raise SkipTest("an appropriate SQLAlchemy backend is not installed")
db_url = 'sqlite:///test.db'
def simple_app(environ, start_response):
extra_args = {}
clear = False
if environ.get('beaker.clear'):
clear = True
extra_args['type'] = 'ext:database'
extra_args['url'] = db_url
extra_args['data_dir'] = './cache'
cache = environ['beaker.cache'].get_cache('testcache', **extra_args)
if clear:
cache.clear()
try:
value = cache.get_value('value')
except:
value = 0
cache.set_value('value', value+1)
start_response('200 OK', [('Content-type', 'text/plain')])
return [('The current value is: %s' % cache.get_value('value')).encode('utf-8')]
def cache_manager_app(environ, start_response):
cm = environ['beaker.cache']
cm.get_cache('test')['test_key'] = 'test value'
start_response('200 OK', [('Content-type', 'text/plain')])
yield ("test_key is: %s\n" % cm.get_cache('test')['test_key']).encode('utf-8')
cm.get_cache('test').clear()
try:
test_value = cm.get_cache('test')['test_key']
except KeyError:
yield ("test_key cleared").encode('utf-8')
else:
yield ("test_key wasn't cleared, is: %s\n" % test_value).encode('utf-8')
def test_has_key():
cache = Cache('test', data_dir='./cache', url=db_url, type='ext:database')
o = object()
cache.set_value("test", o)
assert "test" in cache
assert "test" in cache
assert "foo" not in cache
assert "foo" not in cache
cache.remove_value("test")
assert "test" not in cache
def test_has_key_multicache():
cache = Cache('test', data_dir='./cache', url=db_url, type='ext:database')
o = object()
cache.set_value("test", o)
assert "test" in cache
assert "test" in cache
cache = Cache('test', data_dir='./cache', url=db_url, type='ext:database')
assert "test" in cache
cache.remove_value('test')
def test_clear():
cache = Cache('test', data_dir='./cache', url=db_url, type='ext:database')
o = object()
cache.set_value("test", o)
assert "test" in cache
cache.clear()
assert "test" not in cache
def test_unicode_keys():
cache = Cache('test', data_dir='./cache', url=db_url, type='ext:database')
o = object()
cache.set_value(u_('hiŏ'), o)
assert u_('hiŏ') in cache
assert u_('hŏa') not in cache
cache.remove_value(u_('hiŏ'))
assert u_('hiŏ') not in cache
@util.skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_increment():
app = WebTestApp(CacheMiddleware(simple_app))
res = app.get('/', extra_environ={'beaker.clear':True})
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
@util.skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_cache_manager():
app = WebTestApp(CacheMiddleware(cache_manager_app))
res = app.get('/')
assert 'test_key is: test value' in res
assert 'test_key cleared' in res

# beaker-1.12.1/tests/test_domain_setting.py
from beaker.middleware import SessionMiddleware
from unittest import SkipTest
try:
from webtest import TestApp as WebTestApp
except ImportError:
raise SkipTest("webtest not installed")
def teardown_module():
import shutil
shutil.rmtree('./cache', True)
def simple_app(environ, start_response):
session = environ['beaker.session']
domain = environ.get('domain')
if domain:
session.domain = domain
if 'value' not in session:
session['value'] = 0
session['value'] += 1
if not environ['PATH_INFO'].startswith('/nosave'):
session.save()
start_response('200 OK', [('Content-type', 'text/plain')])
msg = 'The current value is: %s, session id is %s' % (session.get('value', 0),
session.id)
return [msg.encode('utf-8')]
def test_same_domain():
options = {'session.data_dir':'./cache',
'session.secret':'blah',
'session.cookie_domain': '.hoop.com'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/', extra_environ=dict(HTTP_HOST='subdomain.hoop.com'))
assert 'current value is: 1' in res
assert 'Domain=.hoop.com' in res.headers['Set-Cookie']
res = app.get('/', extra_environ=dict(HTTP_HOST='another.hoop.com'))
assert 'current value is: 2' in res
assert [] == res.headers.getall('Set-Cookie')
res = app.get('/', extra_environ=dict(HTTP_HOST='more.subdomain.hoop.com'))
assert 'current value is: 3' in res
def test_different_domain():
options = {'session.data_dir':'./cache',
'session.secret':'blah'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/', extra_environ=dict(domain='.hoop.com',
HTTP_HOST='www.hoop.com'))
res = app.get('/', extra_environ=dict(domain='.hoop.co.uk',
HTTP_HOST='www.hoop.com'))
assert 'Domain=.hoop.co.uk' in res.headers['Set-Cookie']
assert 'current value is: 2' in res
res = app.get('/', extra_environ=dict(domain='.hoop.co.uk',
HTTP_HOST='www.test.com'))
assert 'current value is: 1' in res
if __name__ == '__main__':
from paste import httpserver
wsgi_app = SessionMiddleware(simple_app, {})
httpserver.serve(wsgi_app, host='127.0.0.1', port=8080)

# beaker-1.12.1/tests/test_increment.py
import re
import unittest
from beaker.middleware import SessionMiddleware
try:
from webtest import TestApp as WebTestApp
except ImportError:
raise unittest.SkipTest("webtest not installed")
def teardown_module():
import shutil
shutil.rmtree('./cache', True)
def no_save_app(environ, start_response):
session = environ['beaker.session']
sess_id = environ.get('SESSION_ID')
start_response('200 OK', [('Content-type', 'text/plain')])
msg = 'The current value is: %s, session id is %s' % (session.get('value'),
session.id)
return [msg.encode('utf-8')]
def simple_app(environ, start_response):
session = environ['beaker.session']
sess_id = environ.get('SESSION_ID')
if sess_id:
session = session.get_by_id(sess_id)
if not session:
start_response('200 OK', [('Content-type', 'text/plain')])
return [("No session id of %s found." % sess_id).encode('utf-8')]
    if 'value' not in session:
session['value'] = 0
session['value'] += 1
if not environ['PATH_INFO'].startswith('/nosave'):
session.save()
start_response('200 OK', [('Content-type', 'text/plain')])
msg = 'The current value is: %s, session id is %s' % (session.get('value'),
session.id)
return [msg.encode('utf-8')]
def simple_auto_app(environ, start_response):
"""Like the simple_app, but assume that sessions auto-save"""
session = environ['beaker.session']
sess_id = environ.get('SESSION_ID')
if sess_id:
session = session.get_by_id(sess_id)
if not session:
start_response('200 OK', [('Content-type', 'text/plain')])
return [("No session id of %s found." % sess_id).encode('utf-8')]
    if 'value' not in session:
session['value'] = 0
session['value'] += 1
if environ['PATH_INFO'].startswith('/nosave'):
session.revert()
start_response('200 OK', [('Content-type', 'text/plain')])
msg = 'The current value is: %s, session id is %s' % (session.get('value', 0),
session.id)
return [msg.encode('utf-8')]
def test_no_save():
options = {'session.data_dir':'./cache', 'session.secret':'blah'}
app = WebTestApp(SessionMiddleware(no_save_app, **options))
res = app.get('/')
assert 'current value is: None' in res
assert [] == res.headers.getall('Set-Cookie')
def test_increment():
options = {'session.data_dir':'./cache', 'session.secret':'blah'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
def test_increment_auto():
options = {'session.data_dir':'./cache', 'session.secret':'blah'}
app = WebTestApp(SessionMiddleware(simple_auto_app, auto=True, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
def test_different_sessions():
options = {'session.data_dir':'./cache', 'session.secret':'blah'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
app2 = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app2.get('/')
assert 'current value is: 1' in res
res = app2.get('/')
res = app2.get('/')
res = app2.get('/')
res2 = app.get('/')
assert 'current value is: 2' in res2
assert 'current value is: 4' in res
def test_different_sessions_auto():
options = {'session.data_dir':'./cache', 'session.secret':'blah'}
app = WebTestApp(SessionMiddleware(simple_auto_app, auto=True, **options))
app2 = WebTestApp(SessionMiddleware(simple_auto_app, auto=True, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app2.get('/')
assert 'current value is: 1' in res
res = app2.get('/')
res = app2.get('/')
res = app2.get('/')
res2 = app.get('/')
assert 'current value is: 2' in res2
assert 'current value is: 4' in res
def test_nosave():
options = {'session.data_dir':'./cache', 'session.secret':'blah'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/nosave')
assert 'current value is: 1' in res
res = app.get('/nosave')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
def test_revert():
options = {'session.data_dir':'./cache', 'session.secret':'blah'}
app = WebTestApp(SessionMiddleware(simple_auto_app, auto=True, **options))
res = app.get('/nosave')
assert 'current value is: 0' in res
res = app.get('/nosave')
assert 'current value is: 0' in res
res = app.get('/')
assert 'current value is: 1' in res
assert [] == res.headers.getall('Set-Cookie')
res = app.get('/')
assert [] == res.headers.getall('Set-Cookie')
assert 'current value is: 2' in res
# Finally, ensure that reverting shows the proper one
res = app.get('/nosave')
assert [] == res.headers.getall('Set-Cookie')
assert 'current value is: 2' in res
def test_load_session_by_id():
options = {'session.data_dir':'./cache', 'session.secret':'blah'}
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
res = app.get('/')
assert 'current value is: 3' in res
    # Pass re.M via the flags keyword; passing it positionally would be
    # interpreted as the `count` argument of re.sub.
    old_id = re.sub(r'^.*?session id is (\S+)$', r'\1', res.body.decode('utf-8'), flags=re.M)
# Clear the cookies and do a new request
app = WebTestApp(SessionMiddleware(simple_app, **options))
res = app.get('/')
assert 'current value is: 1' in res
# Load a bogus session to see that its not there
res = app.get('/', extra_environ={'SESSION_ID': 'jil2j34il2j34ilj23'})
assert 'No session id of' in res
# Saved session was at 3, now it'll be 4
res = app.get('/', extra_environ={'SESSION_ID': str(old_id)})
assert 'current value is: 4' in res
# Prior request is now up to 2
res = app.get('/')
assert 'current value is: 2' in res
if __name__ == '__main__':
from paste import httpserver
wsgi_app = SessionMiddleware(simple_app, {})
httpserver.serve(wsgi_app, host='127.0.0.1', port=8080)

# beaker-1.12.1/tests/test_managers/__init__.py
# (empty file)

# beaker-1.12.1/tests/test_managers/base.py
# coding: utf-8
import threading
import unittest
import time
import datetime
from beaker._compat import u_
from beaker.cache import Cache
from beaker.middleware import SessionMiddleware, CacheMiddleware
from webtest import TestApp as WebTestApp
class CacheManagerBaseTests(unittest.TestCase):
SUPPORTS_EXPIRATION = True
SUPPORTS_TIMEOUT = True
CACHE_ARGS = {}
@classmethod
def setUpClass(cls):
def simple_session_app(environ, start_response):
session = environ['beaker.session']
sess_id = environ.get('SESSION_ID')
if environ['PATH_INFO'].startswith('/invalid'):
# Attempt to access the session
id = session.id
session['value'] = 2
else:
if sess_id:
session = session.get_by_id(sess_id)
if not session:
start_response('200 OK', [('Content-type', 'text/plain')])
return [("No session id of %s found." % sess_id).encode('utf-8')]
if not session.has_key('value'):
session['value'] = 0
session['value'] += 1
if not environ['PATH_INFO'].startswith('/nosave'):
session.save()
start_response('200 OK', [('Content-type', 'text/plain')])
return [('The current value is: %d, session id is %s' % (session['value'],
session.id)).encode('utf-8')]
def simple_app(environ, start_response):
extra_args = cls.CACHE_ARGS
clear = False
if environ.get('beaker.clear'):
clear = True
cache = environ['beaker.cache'].get_cache('testcache', **extra_args)
if clear:
cache.clear()
try:
value = cache.get_value('value')
except:
value = 0
cache.set_value('value', value + 1)
start_response('200 OK', [('Content-type', 'text/plain')])
return [('The current value is: %s' % cache.get_value('value')).encode('utf-8')]
def using_none_app(environ, start_response):
extra_args = cls.CACHE_ARGS
clear = False
if environ.get('beaker.clear'):
clear = True
cache = environ['beaker.cache'].get_cache('testcache', **extra_args)
if clear:
cache.clear()
try:
value = cache.get_value('value')
except:
value = 10
cache.set_value('value', None)
start_response('200 OK', [('Content-type', 'text/plain')])
return [('The current value is: %s' % value).encode('utf-8')]
def cache_manager_app(environ, start_response):
cm = environ['beaker.cache']
cm.get_cache('test')['test_key'] = 'test value'
start_response('200 OK', [('Content-type', 'text/plain')])
yield ("test_key is: %s\n" % cm.get_cache('test')['test_key']).encode('utf-8')
cm.get_cache('test').clear()
try:
test_value = cm.get_cache('test')['test_key']
except KeyError:
yield "test_key cleared".encode('utf-8')
else:
yield (
"test_key wasn't cleared, is: %s\n" % cm.get_cache('test')['test_key']
).encode('utf-8')
cls.simple_session_app = staticmethod(simple_session_app)
cls.simple_app = staticmethod(simple_app)
cls.using_none_app = staticmethod(using_none_app)
cls.cache_manager_app = staticmethod(cache_manager_app)
def setUp(self):
Cache('test', **self.CACHE_ARGS).clear()
def test_session(self):
app = WebTestApp(SessionMiddleware(self.simple_session_app, **self.CACHE_ARGS))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
def test_session_invalid(self):
app = WebTestApp(SessionMiddleware(self.simple_session_app, **self.CACHE_ARGS))
res = app.get('/invalid', headers=dict(
Cookie='beaker.session.id=df7324911e246b70b5781c3c58328442; Path=/'))
assert 'current value is: 2' in res
def test_session_timeout(self):
app = WebTestApp(SessionMiddleware(self.simple_session_app, timeout=1, **self.CACHE_ARGS))
session = app.app._get_session()
session.save()
if self.SUPPORTS_TIMEOUT:
assert session.namespace.timeout == 121
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
def test_has_key(self):
cache = Cache('test', **self.CACHE_ARGS)
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
assert "test" in cache
assert not cache.has_key("foo")
assert "foo" not in cache
cache.remove_value("test")
assert not cache.has_key("test")
def test_clear(self):
cache = Cache('test', **self.CACHE_ARGS)
cache.set_value('test', 20)
cache.set_value('fred', 10)
assert cache.has_key('test')
assert 'test' in cache
assert cache.has_key('fred')
cache.clear()
assert not cache.has_key("test")
def test_has_key_multicache(self):
cache = Cache('test', **self.CACHE_ARGS)
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
assert "test" in cache
cache = Cache('test', **self.CACHE_ARGS)
assert cache.has_key("test")
def test_unicode_keys(self):
cache = Cache('test', **self.CACHE_ARGS)
o = object()
cache.set_value(u_('hiŏ'), o)
assert u_('hiŏ') in cache
assert u_('hŏa') not in cache
cache.remove_value(u_('hiŏ'))
assert u_('hiŏ') not in cache
def test_long_unicode_keys(self):
cache = Cache('test', **self.CACHE_ARGS)
o = object()
long_str = u_(
'Очень длинная строка, которая не влезает в сто двадцать восемь байт и поэтому не проходит ограничение в check_key, что очень прискорбно, не правда ли, друзья? Давайте же скорее исправим это досадное недоразумение!'
)
cache.set_value(long_str, o)
assert long_str in cache
cache.remove_value(long_str)
assert long_str not in cache
def test_spaces_in_unicode_keys(self):
cache = Cache('test', **self.CACHE_ARGS)
o = object()
cache.set_value(u_('hi ŏ'), o)
assert u_('hi ŏ') in cache
assert u_('hŏa') not in cache
cache.remove_value(u_('hi ŏ'))
assert u_('hi ŏ') not in cache
def test_spaces_in_keys(self):
cache = Cache('test', **self.CACHE_ARGS)
cache.set_value("has space", 24)
assert cache.has_key("has space")
assert 24 == cache.get_value("has space")
cache.set_value("hasspace", 42)
assert cache.has_key("hasspace")
assert 42 == cache.get_value("hasspace")
def test_increment(self):
app = WebTestApp(CacheMiddleware(self.simple_app))
res = app.get('/', extra_environ={'beaker.clear': True})
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
app = WebTestApp(CacheMiddleware(self.simple_app))
res = app.get('/', extra_environ={'beaker.clear': True})
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
def test_cache_manager(self):
app = WebTestApp(CacheMiddleware(self.cache_manager_app))
res = app.get('/')
assert 'test_key is: test value' in res
assert 'test_key cleared' in res
def test_store_none(self):
app = WebTestApp(CacheMiddleware(self.using_none_app))
res = app.get('/', extra_environ={'beaker.clear': True})
assert 'current value is: 10' in res
res = app.get('/')
assert 'current value is: None' in res
def test_expiretime(self):
cache = Cache('test', **self.CACHE_ARGS)
cache.set_value("has space", 24, expiretime=1)
assert cache.has_key("has space")
time.sleep(1.1)
assert not cache.has_key("has space")
def test_expiretime_automatic(self):
if not self.SUPPORTS_EXPIRATION:
self.skipTest('NamespaceManager does not support automatic expiration')
cache = Cache('test', **self.CACHE_ARGS)
cache.set_value("has space", 24, expiretime=1)
assert cache.namespace.has_key("has space")
time.sleep(1.1)
assert not cache.namespace.has_key("has space")
def test_createfunc(self):
cache = Cache('test', **self.CACHE_ARGS)
def createfunc():
createfunc.count += 1
return createfunc.count
createfunc.count = 0
def keepitlocked():
lock = cache.namespace.get_creation_lock('test')
lock.acquire()
keepitlocked.acquired = True
time.sleep(1.0)
lock.release()
keepitlocked.acquired = False
v0 = cache.get_value('test', createfunc=createfunc)
self.assertEqual(v0, 1)
v0 = cache.get_value('test', createfunc=createfunc)
self.assertEqual(v0, 1)
cache.remove_value('test')
begin = datetime.datetime.utcnow()
t = threading.Thread(target=keepitlocked)
t.start()
while not keepitlocked.acquired:
# Wait for the thread that should lock the cache to start.
time.sleep(0.001)
v0 = cache.get_value('test', createfunc=createfunc)
self.assertEqual(v0, 2)
# Ensure that the `get_value` was blocked by the concurrent thread.
assert datetime.datetime.utcnow() - begin > datetime.timedelta(seconds=1)
t.join()

# beaker-1.12.1/tests/test_managers/test_ext_mongodb.py
from beaker.cache import Cache
from . import base
class TestMongoDB(base.CacheManagerBaseTests):
SUPPORTS_TIMEOUT = False
CACHE_ARGS = {
'type': 'ext:mongodb',
'url': 'mongodb://localhost:27017/beaker_testdb'
}
def test_client_reuse(self):
cache1 = Cache('test1', **self.CACHE_ARGS)
cli1 = cache1.namespace.client
cache2 = Cache('test2', **self.CACHE_ARGS)
cli2 = cache2.namespace.client
        self.assertTrue(cli1 is cli2)


# beaker-1.12.1/tests/test_managers/test_ext_redis.py
from beaker.cache import Cache
from . import base
class TestRedis(base.CacheManagerBaseTests):
CACHE_ARGS = {
'type': 'ext:redis',
'url': 'redis://localhost:6379/13'
}
def test_client_reuse(self):
cache1 = Cache('test1', **self.CACHE_ARGS)
cli1 = cache1.namespace.client
cache2 = Cache('test2', **self.CACHE_ARGS)
cli2 = cache2.namespace.client
        self.assertTrue(cli1 is cli2)


# beaker-1.12.1/tests/test_memcached.py
# coding: utf-8
from beaker._compat import u_
import mock
from beaker.cache import Cache, CacheManager, util
from beaker.middleware import CacheMiddleware, SessionMiddleware
from beaker.exceptions import InvalidCacheBackendError
from beaker.util import parse_cache_config_options
import unittest
try:
from webtest import TestApp as WebTestApp
except ImportError:
WebTestApp = None
try:
from beaker.ext import memcached
client = memcached._load_client()
except InvalidCacheBackendError:
raise unittest.SkipTest("an appropriate memcached backend is not installed")
mc_url = '127.0.0.1:11211'
c = client.Client([mc_url])
c.set('x', 'y')
if not c.get('x'):
raise unittest.SkipTest("Memcached is not running at %s" % mc_url)
def teardown_module():
import shutil
shutil.rmtree('./cache', True)
def simple_session_app(environ, start_response):
session = environ['beaker.session']
sess_id = environ.get('SESSION_ID')
if environ['PATH_INFO'].startswith('/invalid'):
# Attempt to access the session
id = session.id
session['value'] = 2
else:
if sess_id:
session = session.get_by_id(sess_id)
if not session:
start_response('200 OK', [('Content-type', 'text/plain')])
            # WSGI bodies must be bytes on Python 3, so encode the message.
            return [("No session id of %s found." % sess_id).encode('utf-8')]
if not session.has_key('value'):
session['value'] = 0
session['value'] += 1
if not environ['PATH_INFO'].startswith('/nosave'):
session.save()
start_response('200 OK', [('Content-type', 'text/plain')])
return [
('The current value is: %d, session id is %s' % (
session['value'], session.id
)).encode('utf-8')
]
def simple_app(environ, start_response):
extra_args = {}
clear = False
if environ.get('beaker.clear'):
clear = True
extra_args['type'] = 'ext:memcached'
extra_args['url'] = mc_url
extra_args['data_dir'] = './cache'
cache = environ['beaker.cache'].get_cache('testcache', **extra_args)
if clear:
cache.clear()
try:
value = cache.get_value('value')
except:
value = 0
cache.set_value('value', value+1)
start_response('200 OK', [('Content-type', 'text/plain')])
return [
('The current value is: %s' % cache.get_value('value')).encode('utf-8')
]
def using_none_app(environ, start_response):
extra_args = {}
clear = False
if environ.get('beaker.clear'):
clear = True
extra_args['type'] = 'ext:memcached'
extra_args['url'] = mc_url
extra_args['data_dir'] = './cache'
cache = environ['beaker.cache'].get_cache('testcache', **extra_args)
if clear:
cache.clear()
try:
value = cache.get_value('value')
except:
value = 10
cache.set_value('value', None)
start_response('200 OK', [('Content-type', 'text/plain')])
return [
('The current value is: %s' % value).encode('utf-8')
]
def cache_manager_app(environ, start_response):
cm = environ['beaker.cache']
cm.get_cache('test')['test_key'] = 'test value'
start_response('200 OK', [('Content-type', 'text/plain')])
yield (
"test_key is: %s\n" % cm.get_cache('test')['test_key']
).encode('utf-8')
cm.get_cache('test').clear()
try:
test_value = cm.get_cache('test')['test_key']
except KeyError:
yield "test_key cleared".encode('utf-8')
else:
yield ("test_key wasn't cleared, is: %s\n" % (
cm.get_cache('test')['test_key'],
)).encode('utf-8')
@util.skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_session():
app = WebTestApp(SessionMiddleware(simple_session_app, data_dir='./cache', type='ext:memcached', url=mc_url))
res = app.get('/')
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
@util.skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_session_invalid():
app = WebTestApp(SessionMiddleware(simple_session_app, data_dir='./cache', type='ext:memcached', url=mc_url))
res = app.get('/invalid', headers=dict(Cookie='beaker.session.id=df7324911e246b70b5781c3c58328442; Path=/'))
assert 'current value is: 2' in res
def test_has_key():
cache = Cache('test', data_dir='./cache', url=mc_url, type='ext:memcached')
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
assert "test" in cache
assert not cache.has_key("foo")
assert "foo" not in cache
cache.remove_value("test")
assert not cache.has_key("test")
def test_dropping_keys():
cache = Cache('test', data_dir='./cache', url=mc_url, type='ext:memcached')
cache.set_value('test', 20)
cache.set_value('fred', 10)
assert cache.has_key('test')
assert 'test' in cache
assert cache.has_key('fred')
# Directly nuke the actual key, to simulate it being removed by memcached
cache.namespace.mc.delete('test_test')
assert not cache.has_key('test')
assert cache.has_key('fred')
# Nuke the keys dict, it might die, who knows
cache.namespace.mc.delete('test:keys')
assert cache.has_key('fred')
# And we still need clear to work, even if it won't work well
cache.clear()
def test_deleting_keys():
cache = Cache('test', data_dir='./cache', url=mc_url, type='ext:memcached')
cache.set_value('test', 20)
# Nuke the keys dict, it might die, who knows
cache.namespace.mc.delete('test:keys')
assert cache.has_key('test')
# make sure we can still delete keys even though our keys dict got nuked
del cache['test']
assert not cache.has_key('test')
def test_has_key_multicache():
cache = Cache('test', data_dir='./cache', url=mc_url, type='ext:memcached')
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
assert "test" in cache
cache = Cache('test', data_dir='./cache', url=mc_url, type='ext:memcached')
assert cache.has_key("test")
def test_unicode_keys():
cache = Cache('test', data_dir='./cache', url=mc_url, type='ext:memcached')
o = object()
cache.set_value(u_('hiŏ'), o)
assert u_('hiŏ') in cache
assert u_('hŏa') not in cache
cache.remove_value(u_('hiŏ'))
assert u_('hiŏ') not in cache
def test_long_unicode_keys():
cache = Cache('test', data_dir='./cache', url=mc_url, type='ext:memcached')
o = object()
long_str = u_('Очень длинная строка, которая не влезает в сто двадцать восемь байт и поэтому не проходит ограничение в check_key, что очень прискорбно, не правда ли, друзья? Давайте же скорее исправим это досадное недоразумение!')
cache.set_value(long_str, o)
assert long_str in cache
cache.remove_value(long_str)
assert long_str not in cache
def test_spaces_in_unicode_keys():
cache = Cache('test', data_dir='./cache', url=mc_url, type='ext:memcached')
o = object()
cache.set_value(u_('hi ŏ'), o)
assert u_('hi ŏ') in cache
assert u_('hŏa') not in cache
cache.remove_value(u_('hi ŏ'))
assert u_('hi ŏ') not in cache
def test_spaces_in_keys():
cache = Cache('test', data_dir='./cache', url=mc_url, type='ext:memcached')
cache.set_value("has space", 24)
assert cache.has_key("has space")
assert 24 == cache.get_value("has space")
cache.set_value("hasspace", 42)
assert cache.has_key("hasspace")
assert 42 == cache.get_value("hasspace")
@util.skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_increment():
app = WebTestApp(CacheMiddleware(simple_app))
res = app.get('/', extra_environ={'beaker.clear':True})
assert 'current value is: 1' in res.text
res = app.get('/')
assert 'current value is: 2' in res.text
res = app.get('/')
assert 'current value is: 3' in res.text
app = WebTestApp(CacheMiddleware(simple_app))
res = app.get('/', extra_environ={'beaker.clear':True})
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
@util.skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_cache_manager():
app = WebTestApp(CacheMiddleware(cache_manager_app))
res = app.get('/')
assert 'test_key is: test value' in res.text
assert 'test_key cleared' in res.text
@util.skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_store_none():
app = WebTestApp(CacheMiddleware(using_none_app))
res = app.get('/', extra_environ={'beaker.clear':True})
assert 'current value is: 10' in res.text
res = app.get('/')
assert 'current value is: None' in res.text
class TestPylibmcInit(unittest.TestCase):
def setUp(self):
from beaker.ext import memcached
try:
import pylibmc as memcache
except:
import memcache
memcached._client_libs['pylibmc'] = memcached.pylibmc = memcache
from contextlib import contextmanager
class ThreadMappedPool(dict):
"a mock of pylibmc's ThreadMappedPool"
def __init__(self, master):
self.master = master
@contextmanager
def reserve(self):
yield self.master
memcache.ThreadMappedPool = ThreadMappedPool
def test_uses_pylibmc_client(self):
from beaker.ext import memcached
cache = Cache('test', data_dir='./cache',
memcache_module='pylibmc',
url=mc_url, type="ext:memcached")
assert isinstance(cache.namespace, memcached.PyLibMCNamespaceManager)
def test_dont_use_pylibmc_client(self):
from beaker.ext.memcached import _load_client
load_mock = mock.Mock()
load_mock.return_value = _load_client('memcache')
with mock.patch('beaker.ext.memcached._load_client', load_mock):
cache = Cache('test', data_dir='./cache', url=mc_url, type="ext:memcached")
assert not isinstance(cache.namespace, memcached.PyLibMCNamespaceManager)
assert isinstance(cache.namespace, memcached.MemcachedNamespaceManager)
def test_client(self):
cache = Cache('test', data_dir='./cache', url=mc_url, type="ext:memcached",
protocol='binary')
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
assert "test" in cache
assert not cache.has_key("foo")
assert "foo" not in cache
cache.remove_value("test")
assert not cache.has_key("test")
def test_client_behaviors(self):
config = {
'cache.lock_dir':'./lock',
'cache.data_dir':'./cache',
'cache.type':'ext:memcached',
'cache.url':mc_url,
'cache.memcache_module':'pylibmc',
'cache.protocol':'binary',
'cache.behavior.ketama': 'True',
'cache.behavior.cas':False,
'cache.behavior.receive_timeout':'3600',
'cache.behavior.send_timeout':1800,
'cache.behavior.tcp_nodelay':1,
'cache.behavior.auto_eject':"0"
}
cache_manager = CacheManager(**parse_cache_config_options(config))
cache = cache_manager.get_cache('test_behavior', expire=6000)
with cache.namespace.pool.reserve() as mc:
assert "ketama" in mc.behaviors
assert mc.behaviors["ketama"] == 1
assert "cas" in mc.behaviors
assert mc.behaviors["cas"] == 0
assert "receive_timeout" in mc.behaviors
assert mc.behaviors["receive_timeout"] == 3600
assert "send_timeout" in mc.behaviors
assert mc.behaviors["send_timeout"] == 1800
assert "tcp_nodelay" in mc.behaviors
assert mc.behaviors["tcp_nodelay"] == 1
assert "auto_eject" in mc.behaviors
assert mc.behaviors["auto_eject"] == 0
def test_pylibmc_pool_sharing(self):
from beaker.ext import memcached
cache_1a = Cache('test_1a', data_dir='./cache',
memcache_module='pylibmc',
url=mc_url, type="ext:memcached")
cache_1b = Cache('test_1b', data_dir='./cache',
memcache_module='pylibmc',
url=mc_url, type="ext:memcached")
cache_2 = Cache('test_2', data_dir='./cache',
memcache_module='pylibmc',
url='127.0.0.1:11212', type="ext:memcached")
assert (cache_1a.namespace.pool is cache_1b.namespace.pool)
assert (cache_1a.namespace.pool is not cache_2.namespace.pool)
# beaker-1.12.1/tests/test_namespacing.py
import os
import sys
def teardown_module():
import shutil
shutil.rmtree('./cache', True)
def test_consistent_namespacing():
sys.path.append(os.path.dirname(__file__))
from tests.test_namespacing_files.namespace_go import go
go()
# beaker-1.12.1/tests/test_namespacing_files/__init__.py (empty)
# beaker-1.12.1/tests/test_namespacing_files/namespace_get.py
from beaker.cache import CacheManager
from beaker.util import parse_cache_config_options
from datetime import datetime
defaults = {'cache.data_dir':'./cache', 'cache.type':'dbm', 'cache.expire': 60, 'cache.regions': 'short_term'}
cache = CacheManager(**parse_cache_config_options(defaults))
def get_cached_value():
@cache.region('short_term', 'test_namespacing')
def get_value():
return datetime.now()
return get_value()
# beaker-1.12.1/tests/test_namespacing_files/namespace_go.py
from __future__ import print_function
import time
def go():
from . import namespace_get
a = namespace_get.get_cached_value()
time.sleep(0.3)
b = namespace_get.get_cached_value()
time.sleep(0.3)
from ..test_namespacing_files import namespace_get as upper_ns_get
c = upper_ns_get.get_cached_value()
time.sleep(0.3)
d = upper_ns_get.get_cached_value()
print(a)
print(b)
print(c)
print(d)
assert a == b, 'Basic caching problem - should never happen'
assert c == d, 'Basic caching problem - should never happen'
assert a == c, 'Namespaces not consistent when using different import paths'
# beaker-1.12.1/tests/test_pbkdf2.py
from __future__ import unicode_literals
from binascii import b2a_hex, a2b_hex
from beaker.crypto.pbkdf2 import pbkdf2
def test_pbkdf2_test1():
result = pbkdf2("password", "ATHENA.MIT.EDUraeburn", 1, dklen=16)
expected = a2b_hex(b"cdedb5281bb2f801565a1122b2563515")
assert result == expected, (result, expected)
def test_pbkdf2_test2():
result = b2a_hex(pbkdf2("password", "ATHENA.MIT.EDUraeburn", 1200, dklen=32))
expected = b"5c08eb61fdf71e4e4ec3cf6ba1f5512ba7e52ddbc5e5142f708a31e2e62b1e13"
assert result == expected, (result, expected)
def test_pbkdf2_test3():
result = b2a_hex(pbkdf2("X"*64, "pass phrase equals block size", 1200, dklen=32))
expected = b"139c30c0966bc32ba55fdbf212530ac9c5ec59f1a452f5cc9ad940fea0598ed1"
assert result == expected, (result, expected)
def test_pbkdf2_test4():
result = b2a_hex(pbkdf2("X"*65, "pass phrase exceeds block size", 1200, dklen=32))
expected = b"9ccad6d468770cd51b10e6a68721be611a8b4d282601db3b36be9246915ec82a"
assert result == expected, (result, expected)
def test_pbkdf2_issue81():
"""Regression test for incorrect behavior of bytes_() under Python 3.4.
https://github.com/bbangert/beaker/issues/81
"""
result = pbkdf2("MASTER_KEY", b"SALT", 1)
expected = pbkdf2("MASTER_KEY", "SALT", 1)
assert result == expected, (result, expected)
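The expected digests above are the standard PBKDF2-HMAC-SHA1 test vectors from RFC 3962. As an independent cross-check (a sketch, not part of the test suite), the first vector can be reproduced with the standard library's `hashlib.pbkdf2_hmac`:

```python
import hashlib

# RFC 3962 test vector 1: PBKDF2-HMAC-SHA1, 1 iteration, 16-byte derived key.
dk = hashlib.pbkdf2_hmac('sha1', b"password", b"ATHENA.MIT.EDUraeburn", 1, dklen=16)
assert dk.hex() == "cdedb5281bb2f801565a1122b2563515"
```

If beaker's `pbkdf2` and `hashlib.pbkdf2_hmac` ever disagree on these inputs, the bug is on beaker's side.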
# beaker-1.12.1/tests/test_session.py
# -*- coding: utf-8 -*-
from beaker._compat import u_, pickle, b64decode
import binascii
import shutil
import sys
import time
import unittest
import warnings
import pytest
from beaker.container import MemoryNamespaceManager
from beaker.crypto import get_crypto_module
from beaker.exceptions import BeakerException
from beaker.session import CookieSession, Session, SessionObject
from beaker.util import deserialize
def get_session(**kwargs):
"""A shortcut for creating :class:`Session` instance"""
options = {}
options.update(**kwargs)
return Session({}, **options)
COOKIE_REQUEST = {}
def setup_cookie_request():
COOKIE_REQUEST.clear()
def get_cookie_session(**kwargs):
"""A shortcut for creating :class:`CookieSession` instance"""
options = {'validate_key': 'test_key'}
options.update(**kwargs)
if COOKIE_REQUEST.get('set_cookie'):
COOKIE_REQUEST['cookie'] = COOKIE_REQUEST.get('cookie_out')
return CookieSession(COOKIE_REQUEST, **options)
def test_session():
setup_cookie_request()
for test_case in (
check_save_load,
check_save_load_encryption,
check_save_load_encryption_cryptography,
check_decryption_failure,
check_delete,
check_revert,
check_invalidate,
check_timeout,
):
for session_getter in (get_session, get_cookie_session,):
setup_cookie_request()
test_case(session_getter)
def check_save_load(session_getter):
"""Test if the data is actually persistent across requests"""
session = session_getter()
session[u_('Suomi')] = u_('Kimi Räikkönen')
session[u_('Great Britain')] = u_('Jenson Button')
session[u_('Deutchland')] = u_('Sebastian Vettel')
session.save()
session = session_getter(id=session.id)
assert u_('Suomi') in session
assert u_('Great Britain') in session
assert u_('Deutchland') in session
assert session[u_('Suomi')] == u_('Kimi Räikkönen')
assert session[u_('Great Britain')] == u_('Jenson Button')
assert session[u_('Deutchland')] == u_('Sebastian Vettel')
@pytest.mark.skipif(not get_crypto_module('default').has_aes,
reason="default crypto module lacks AES support")
def check_save_load_encryption(session_getter):
"""Test if the data is actually persistent across requests"""
session = session_getter(encrypt_key='666a19cf7f61c64c',
validate_key='hoobermas')
session[u_('Suomi')] = u_('Kimi Räikkönen')
session[u_('Great Britain')] = u_('Jenson Button')
session[u_('Deutchland')] = u_('Sebastian Vettel')
session.save()
session = session_getter(id=session.id, encrypt_key='666a19cf7f61c64c',
validate_key='hoobermas')
assert u_('Suomi') in session
assert u_('Great Britain') in session
assert u_('Deutchland') in session
assert session[u_('Suomi')] == u_('Kimi Räikkönen')
assert session[u_('Great Britain')] == u_('Jenson Button')
assert session[u_('Deutchland')] == u_('Sebastian Vettel')
# cryptography only works for py3.3+, so skip for python 3.2
@pytest.mark.skipif(sys.version_info[0] == 3 and sys.version_info[1] < 3,
reason="Cryptography not supported on Python 3 lower than 3.3")
def check_save_load_encryption_cryptography(session_getter):
"""Test if the data is actually persistent across requests"""
try:
get_crypto_module('cryptography').has_aes
except BeakerException:
raise unittest.SkipTest("cryptography backend is not available")
session = session_getter(
encrypt_key='666a19cf7f61c64c',
validate_key='hoobermas',
crypto_type='cryptography')
session[u_('Suomi')] = u_('Kimi Räikkönen')
session[u_('Great Britain')] = u_('Jenson Button')
session[u_('Deutchland')] = u_('Sebastian Vettel')
session.save()
session = session_getter(
id=session.id, encrypt_key='666a19cf7f61c64c',
validate_key='hoobermas',
crypto_type='cryptography')
assert u_('Suomi') in session
assert u_('Great Britain') in session
assert u_('Deutchland') in session
assert session[u_('Suomi')] == u_('Kimi Räikkönen')
assert session[u_('Great Britain')] == u_('Jenson Button')
assert session[u_('Deutchland')] == u_('Sebastian Vettel')
@pytest.mark.skipif(not get_crypto_module('default').has_aes,
reason="default crypto module lacks AES support")
def check_decryption_failure(session_getter):
"""Test if the data fails without the right keys"""
session = session_getter(encrypt_key='666a19cf7f61c64c',
validate_key='hoobermas')
session[u_('Suomi')] = u_('Kimi Räikkönen')
session[u_('Great Britain')] = u_('Jenson Button')
session[u_('Deutchland')] = u_('Sebastian Vettel')
session.save()
session = session_getter(id=session.id, encrypt_key='asfdasdfadsfsadf',
validate_key='hoobermas', invalidate_corrupt=True)
assert u_('Suomi') not in session
assert u_('Great Britain') not in session
def check_delete(session_getter):
"""Test :meth:`Session.delete`"""
session = session_getter()
session[u_('Suomi')] = u_('Kimi Räikkönen')
session[u_('Great Britain')] = u_('Jenson Button')
session[u_('Deutchland')] = u_('Sebastian Vettel')
session.delete()
assert u_('Suomi') not in session
assert u_('Great Britain') not in session
assert u_('Deutchland') not in session
def check_revert(session_getter):
"""Test :meth:`Session.revert`"""
session = session_getter()
session[u_('Suomi')] = u_('Kimi Räikkönen')
session[u_('Great Britain')] = u_('Jenson Button')
session[u_('Deutchland')] = u_('Sebastian Vettel')
session.save()
session = session_getter(id=session.id)
del session[u_('Suomi')]
session[u_('Great Britain')] = u_('Lewis Hamilton')
session[u_('Deutchland')] = u_('Michael Schumacher')
session[u_('España')] = u_('Fernando Alonso')
session.revert()
assert session[u_('Suomi')] == u_('Kimi Räikkönen')
assert session[u_('Great Britain')] == u_('Jenson Button')
assert session[u_('Deutchland')] == u_('Sebastian Vettel')
assert u_('España') not in session
def check_invalidate(session_getter):
"""Test :meth:`Session.invalidate`"""
session = session_getter()
session.save()
id = session.id
created = session.created
session[u_('Suomi')] = u_('Kimi Räikkönen')
session[u_('Great Britain')] = u_('Jenson Button')
session[u_('Deutchland')] = u_('Sebastian Vettel')
session.invalidate()
session.save()
assert session.id != id
assert session.created != created
assert u_('Suomi') not in session
assert u_('Great Britain') not in session
assert u_('Deutchland') not in session
def test_regenerate_id():
"""Test :meth:`Session.regenerate_id`"""
# new session & save
setup_cookie_request()
session = get_session()
orig_id = session.id
session[u_('foo')] = u_('bar')
session.save()
# load session
session = get_session(id=session.id)
# data should still be there
assert session[u_('foo')] == u_('bar')
# regenerate the id
session.regenerate_id()
assert session.id != orig_id
# data is still there
assert session[u_('foo')] == u_('bar')
# should be the new id
assert 'beaker.session.id=%s' % session.id in session.request['cookie_out']
# get a new session before calling save
bunk_sess = get_session(id=session.id)
assert u_('foo') not in bunk_sess
# save it
session.save()
# make sure we get the data back
session = get_session(id=session.id)
assert session[u_('foo')] == u_('bar')
def check_timeout(session_getter):
"""Test if the session times out properly"""
session = session_getter(timeout=2)
session.save()
id = session.id
created = session.created
session[u_('Suomi')] = u_('Kimi Räikkönen')
session[u_('Great Britain')] = u_('Jenson Button')
session[u_('Deutchland')] = u_('Sebastian Vettel')
session.save()
session = session_getter(id=session.id, timeout=2)
assert session.id == id
assert session.created == created
assert session[u_('Suomi')] == u_('Kimi Räikkönen')
assert session[u_('Great Britain')] == u_('Jenson Button')
assert session[u_('Deutchland')] == u_('Sebastian Vettel')
time.sleep(2)
session = session_getter(id=session.id, timeout=2)
assert session.id != id
assert session.created != created
assert u_('Suomi') not in session
assert u_('Great Britain') not in session
assert u_('Deutchland') not in session
def test_timeout_requires_accessed_time():
"""Test that it doesn't allow setting save_accessed_time to False with
timeout enabled
"""
setup_cookie_request()
get_session(timeout=None, save_accessed_time=True) # is ok
get_session(timeout=None, save_accessed_time=False) # is ok
with pytest.raises(BeakerException):
get_session(timeout=2, save_accessed_time=False)
def test_cookies_enabled():
"""
Test if cookies are sent out properly when ``use_cookies``
is set to ``True``
"""
setup_cookie_request()
session = get_session(use_cookies=True)
assert 'cookie_out' in session.request
assert not session.request['set_cookie']
session.domain = 'example.com'
session.path = '/example'
assert session.request['set_cookie']
assert 'beaker.session.id=%s' % session.id in session.request['cookie_out']
assert 'Domain=example.com' in session.request['cookie_out']
assert 'Path=/' in session.request['cookie_out']
session = get_session(use_cookies=True)
session.save()
assert session.request['set_cookie']
assert 'beaker.session.id=%s' % session.id in session.request['cookie_out']
session = get_session(use_cookies=True, id=session.id)
session.delete()
assert session.request['set_cookie']
assert 'beaker.session.id=%s' % session.id in session.request['cookie_out']
assert 'expires=' in session.request['cookie_out']
# test for secure
session = get_session(use_cookies=True, secure=True)
cookie = session.request['cookie_out'].lower() # Python3.4.3 outputs "Secure", while previous output "secure"
assert 'secure' in cookie, cookie
# test for httponly
class ShowWarning(object):
def __init__(self):
self.msg = None
def __call__(self, message, category, filename, lineno, file=None, line=None):
self.msg = str(message)
orig_sw = warnings.showwarning
sw = ShowWarning()
warnings.showwarning = sw
session = get_session(use_cookies=True, httponly=True)
if sys.version_info < (2, 6):
assert sw.msg == 'Python 2.6+ is required to use httponly'
else:
# Python3.4.3 outputs "HttpOnly", while previous output "httponly"
cookie = session.request['cookie_out'].lower()
assert 'httponly' in cookie, cookie
warnings.showwarning = orig_sw
def test_cookies_disabled():
"""
Test that no cookies are sent when ``use_cookies`` is set to ``False``
"""
session = get_session(use_cookies=False)
assert 'set_cookie' not in session.request
assert 'cookie_out' not in session.request
session.save()
assert 'set_cookie' not in session.request
assert 'cookie_out' not in session.request
session = get_session(use_cookies=False, id=session.id)
assert 'set_cookie' not in session.request
assert 'cookie_out' not in session.request
session.delete()
assert 'set_cookie' not in session.request
assert 'cookie_out' not in session.request
def test_file_based_replace_optimization():
"""Test the file-based backend with session,
which includes the 'replace' optimization.
"""
setup_cookie_request()
session = get_session(use_cookies=False, type='file',
data_dir='./cache')
session['foo'] = 'foo'
session['bar'] = 'bar'
session.save()
session = get_session(use_cookies=False, type='file',
data_dir='./cache', id=session.id)
assert session['foo'] == 'foo'
assert session['bar'] == 'bar'
session['bar'] = 'bat'
session['bat'] = 'hoho'
session.save()
session.namespace.do_open('c', False)
session.namespace['test'] = 'some test'
session.namespace.do_close()
session = get_session(use_cookies=False, type='file',
data_dir='./cache', id=session.id)
session.namespace.do_open('r', False)
assert session.namespace['test'] == 'some test'
session.namespace.do_close()
assert session['foo'] == 'foo'
assert session['bar'] == 'bat'
assert session['bat'] == 'hoho'
session.save()
# the file has been replaced, so our out-of-session
# key is gone
session.namespace.do_open('r', False)
assert 'test' not in session.namespace
session.namespace.do_close()
def test_use_json_serializer_without_encryption_key():
setup_cookie_request()
so = get_session(use_cookies=False, type='file', data_dir='./cache', data_serializer='json')
so['foo'] = 'bar'
so.save()
session = get_session(id=so.id, use_cookies=False, type='file', data_dir='./cache', data_serializer='json')
assert 'foo' in session
with open(session.namespace.file, 'rb') as f:
    serialized_session = f.read()
memory_state = pickle.loads(serialized_session)
session_data = b64decode(memory_state.get('session'))
data = deserialize(session_data, 'json')
assert 'foo' in data
def test_invalidate_corrupt():
setup_cookie_request()
session = get_session(use_cookies=False, type='file',
data_dir='./cache')
session['foo'] = 'bar'
session.save()
f = open(session.namespace.file, 'w')
f.write("crap")
f.close()
with pytest.raises((pickle.UnpicklingError, EOFError, TypeError, binascii.Error,)):
get_session(use_cookies=False, type='file',
data_dir='./cache', id=session.id)
session = get_session(use_cookies=False, type='file',
invalidate_corrupt=True,
data_dir='./cache', id=session.id)
assert "foo" not in dict(session)
def test_invalidate_empty_cookie():
setup_cookie_request()
kwargs = {'validate_key': 'test_key', 'encrypt_key': 'encrypt'}
session = get_cookie_session(**kwargs)
session['foo'] = 'bar'
session.save()
COOKIE_REQUEST['cookie_out'] = ' beaker.session.id='
session = get_cookie_session(id=session.id, invalidate_corrupt=False, **kwargs)
assert "foo" not in dict(session)
def test_unrelated_cookie():
setup_cookie_request()
kwargs = {'validate_key': 'test_key', 'encrypt_key': 'encrypt'}
session = get_cookie_session(**kwargs)
session['foo'] = 'bar'
session.save()
COOKIE_REQUEST['cookie_out'] = COOKIE_REQUEST['cookie_out'] + '; some.other=cookie'
session = get_cookie_session(id=session.id, invalidate_corrupt=False, **kwargs)
assert "foo" in dict(session)
def test_invalidate_invalid_signed_cookie():
setup_cookie_request()
kwargs = {'validate_key': 'test_key', 'encrypt_key': 'encrypt'}
session = get_cookie_session(**kwargs)
session['foo'] = 'bar'
session.save()
COOKIE_REQUEST['cookie_out'] = (
COOKIE_REQUEST['cookie_out'][:20] +
'aaaaa' +
COOKIE_REQUEST['cookie_out'][25:]
)
with pytest.raises(BeakerException):
get_cookie_session(id=session.id, invalidate_corrupt=False)
def test_invalidate_invalid_signed_cookie_invalidate_corrupt():
setup_cookie_request()
kwargs = {'validate_key': 'test_key', 'encrypt_key': 'encrypt'}
session = get_cookie_session(**kwargs)
session['foo'] = 'bar'
session.save()
COOKIE_REQUEST['cookie_out'] = (
COOKIE_REQUEST['cookie_out'][:20] +
'aaaaa' +
COOKIE_REQUEST['cookie_out'][25:]
)
session = get_cookie_session(id=session.id, invalidate_corrupt=True, **kwargs)
assert "foo" not in dict(session)
def test_load_deleted_from_storage_session__not_loaded():
req = {'cookie': {'beaker.session.id': 123}}
session = Session(req, timeout=1)
session.delete()
session.save()
Session(req, timeout=1)
class TestSaveAccessedTime(unittest.TestCase):
# These tests can't use the memory session type since it seems that loading
# winds up with references to the underlying storage and makes changes to
# sessions even though they aren't save()ed.
def setUp(self):
# Ignore errors because in most cases the dir won't exist.
shutil.rmtree('./cache', ignore_errors=True)
def tearDown(self):
shutil.rmtree('./cache')
def test_saves_if_session_written_and_accessed_time_false(self):
session = get_session(data_dir='./cache', save_accessed_time=False)
# New sessions are treated a little differently so save the session
# before getting into the meat of the test.
session.save()
session = get_session(data_dir='./cache', save_accessed_time=False,
id=session.id)
last_accessed = session.last_accessed
session.save(accessed_only=False)
session = get_session(data_dir='./cache', save_accessed_time=False,
id=session.id)
# If the second save saved, we'll have a new last_accessed time.
# Python 2.6 doesn't have assertGreater :-(
assert session.last_accessed > last_accessed, (
'%r is not greater than %r' %
(session.last_accessed, last_accessed))
def test_saves_if_session_not_written_and_accessed_time_true(self):
session = get_session(data_dir='./cache', save_accessed_time=True)
# New sessions are treated a little differently so save the session
# before getting into the meat of the test.
session.save()
session = get_session(data_dir='./cache', save_accessed_time=True,
id=session.id)
last_accessed = session.last_accessed
session.save(accessed_only=True) # this is the save we're really testing
session = get_session(data_dir='./cache', save_accessed_time=True,
id=session.id)
# If the second save saved, we'll have a new last_accessed time.
# Python 2.6 doesn't have assertGreater :-(
assert session.last_accessed > last_accessed, (
'%r is not greater than %r' %
(session.last_accessed, last_accessed))
def test_doesnt_save_if_session_not_written_and_accessed_time_false(self):
session = get_session(data_dir='./cache', save_accessed_time=False)
# New sessions are treated a little differently so save the session
# before getting into the meat of the test.
session.save()
session = get_session(data_dir='./cache', save_accessed_time=False,
id=session.id)
last_accessed = session.last_accessed
session.save(accessed_only=True) # this shouldn't actually save
session = get_session(data_dir='./cache', save_accessed_time=False,
id=session.id)
self.assertEqual(session.last_accessed, last_accessed)
class TestSessionObject(unittest.TestCase):
def setUp(self):
# Sanity check that we are in fact using the memory backend...
assert get_session().namespace_class == MemoryNamespaceManager
# so we can be sure we're clearing the right state.
MemoryNamespaceManager.namespaces.clear()
def test_no_autosave_saves_atime_without_save(self):
so = SessionObject({}, auto=False)
so['foo'] = 'bar'
so.persist()
session = get_session(id=so.id)
assert '_accessed_time' in session
assert 'foo' not in session # because we didn't save()
def test_no_autosave_saves_with_save(self):
so = SessionObject({}, auto=False)
so['foo'] = 'bar'
so.save()
so.persist()
session = get_session(id=so.id)
assert '_accessed_time' in session
assert 'foo' in session
def test_no_autosave_saves_with_delete(self):
req = {'cookie': {'beaker.session.id': 123}}
so = SessionObject(req, auto=False)
so['foo'] = 'bar'
so.save()
so.persist()
session = get_session(id=so.id)
assert 'foo' in session
so2 = SessionObject(req, auto=False)
so2.delete()
so2.persist()
session = get_session(id=so2.id)
assert 'foo' not in session
def test_auto_save_saves_without_save(self):
so = SessionObject({}, auto=True)
so['foo'] = 'bar'
# look ma, no save()!
so.persist()
session = get_session(id=so.id)
assert 'foo' in session
def test_accessed_time_off_saves_atime_when_saving(self):
so = SessionObject({}, save_accessed_time=False)
atime = so['_accessed_time']
so['foo'] = 'bar'
so.save()
so.persist()
session = get_session(id=so.id, save_accessed_time=False)
assert 'foo' in session
assert '_accessed_time' in session
self.assertEqual(session.last_accessed, atime)
def test_accessed_time_off_doesnt_save_without_save(self):
req = {'cookie': {'beaker.session.id': 123}}
so = SessionObject(req, save_accessed_time=False)
so.persist() # so we can do a set on a non-new session
so2 = SessionObject(req, save_accessed_time=False)
so2['foo'] = 'bar'
# no save()
so2.persist()
session = get_session(id=so.id, save_accessed_time=False)
assert 'foo' not in session
# beaker-1.12.1/tests/test_sqla.py
# coding: utf-8
from beaker._compat import u_
from beaker.cache import clsmap, Cache, util
from beaker.exceptions import InvalidCacheBackendError
from beaker.middleware import CacheMiddleware
from unittest import SkipTest
try:
from webtest import TestApp as WebTestApp
except ImportError:
WebTestApp = None
try:
clsmap['ext:sqla']._init_dependencies()
except InvalidCacheBackendError:
raise SkipTest("an appropriate SQLAlchemy backend is not installed")
import sqlalchemy as sa
from beaker.ext.sqla import make_cache_table
engine = sa.create_engine('sqlite://')
metadata = sa.MetaData()
cache_table = make_cache_table(metadata)
metadata.create_all(engine)
def simple_app(environ, start_response):
extra_args = {}
clear = False
if environ.get('beaker.clear'):
clear = True
extra_args['type'] = 'ext:sqla'
extra_args['bind'] = engine
extra_args['table'] = cache_table
extra_args['data_dir'] = './cache'
cache = environ['beaker.cache'].get_cache('testcache', **extra_args)
if clear:
cache.clear()
try:
value = cache.get_value('value')
except KeyError:
value = 0
cache.set_value('value', value+1)
start_response('200 OK', [('Content-type', 'text/plain')])
return [('The current value is: %s' % cache.get_value('value')).encode('utf-8')]
def cache_manager_app(environ, start_response):
cm = environ['beaker.cache']
cm.get_cache('test')['test_key'] = 'test value'
start_response('200 OK', [('Content-type', 'text/plain')])
yield ("test_key is: %s\n" % cm.get_cache('test')['test_key']).encode('utf-8')
cm.get_cache('test').clear()
try:
test_value = cm.get_cache('test')['test_key']
except KeyError:
yield ("test_key cleared").encode('utf-8')
else:
yield ("test_key wasn't cleared, is: %s\n" % test_value).encode('utf-8')
def make_cache():
"""Return a ``Cache`` for use by the unit tests."""
return Cache('test', data_dir='./cache', bind=engine, table=cache_table,
type='ext:sqla')
def test_has_key():
cache = make_cache()
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
assert "test" in cache
assert not cache.has_key("foo")
assert "foo" not in cache
cache.remove_value("test")
assert not cache.has_key("test")
def test_has_key_multicache():
cache = make_cache()
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
assert "test" in cache
cache = make_cache()
assert cache.has_key("test")
cache.remove_value('test')
def test_clear():
cache = make_cache()
o = object()
cache.set_value("test", o)
assert cache.has_key("test")
cache.clear()
assert not cache.has_key("test")
def test_unicode_keys():
cache = make_cache()
o = object()
cache.set_value(u_('hiŏ'), o)
assert u_('hiŏ') in cache
assert u_('hŏa') not in cache
cache.remove_value(u_('hiŏ'))
assert u_('hiŏ') not in cache
@util.skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_increment():
app = WebTestApp(CacheMiddleware(simple_app))
res = app.get('/', extra_environ={'beaker.clear': True})
assert 'current value is: 1' in res
res = app.get('/')
assert 'current value is: 2' in res
res = app.get('/')
assert 'current value is: 3' in res
@util.skip_if(lambda: WebTestApp is None, "webtest not installed")
def test_cache_manager():
app = WebTestApp(CacheMiddleware(cache_manager_app))
res = app.get('/')
assert 'test_key is: test value' in res
assert 'test_key cleared' in res
# beaker-1.12.1/tests/test_syncdict.py
from beaker.util import SyncDict, WeakValuedRegistry
import random, time, weakref
import threading
class Value(object):
values = {}
def do_something(self, id):
Value.values[id] = self
def stop_doing_something(self, id):
del Value.values[id]
mutex = threading.Lock()
def create(id):
assert not Value.values, "values still remain"
global totalcreates
totalcreates += 1
return Value()
def threadtest(s, id):
print("create thread %d starting" % id)
global running
global totalgets
while running:
try:
value = s.get('test', lambda: create(id))
value.do_something(id)
except Exception as e:
print("Error", e)
running = False
break
else:
totalgets += 1
time.sleep(random.random() * .01)
value.stop_doing_something(id)
del value
time.sleep(random.random() * .01)
def runtest(s):
global values
values = {}
global totalcreates
totalcreates = 0
global totalgets
totalgets = 0
global running
running = True
threads = []
for id_ in range(1, 20):
t = threading.Thread(target=threadtest, args=(s, id_))
t.start()
threads.append(t)
for i in range(0, 10):
if not running:
break
time.sleep(1)
failed = not running
running = False
for t in threads:
t.join()
assert not failed, "test failed"
print("total object creates %d" % totalcreates)
print("total object gets %d" % totalgets)
def test_dict():
# normal dictionary test, where we will remove the value
# periodically. the number of creates should be equal to
# the number of removes plus one.
print("\ntesting with normal dict")
runtest(SyncDict())
def test_weakdict():
print("\ntesting with weak dict")
runtest(WeakValuedRegistry())
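The get-or-create behaviour these threads stress-test can be sketched with a lock-guarded dict. This is a deliberately simplified stand-in for `SyncDict` (beaker's real implementation does not hold the lock while the creator runs), shown only to illustrate the contract being asserted above:

```python
import threading


class GetOrCreateDict:
    """Simplified SyncDict stand-in: the creator runs at most once per key."""

    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}

    def get(self, key, creator):
        # Unlike beaker's SyncDict, the lock is held across creation,
        # which serializes creators but keeps the sketch short.
        with self._lock:
            if key not in self._data:
                self._data[key] = creator()
            return self._data[key]


creates = []
d = GetOrCreateDict()
v1 = d.get('test', lambda: creates.append(1) or object())
v2 = d.get('test', lambda: creates.append(1) or object())
assert v1 is v2          # both callers see the same object
assert len(creates) == 1  # the creator ran exactly once
```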
# beaker-1.12.1/tests/test_synchronizer.py
from beaker.synchronization import *
# TODO: spawn threads, test locking.
def teardown_module():
import shutil
shutil.rmtree('./cache', True)
def test_reentrant_file():
sync1 = file_synchronizer('test', lock_dir='./cache')
sync2 = file_synchronizer('test', lock_dir='./cache')
sync1.acquire_write_lock()
sync2.acquire_write_lock()
sync2.release_write_lock()
sync1.release_write_lock()
def test_null():
sync = null_synchronizer()
assert sync.acquire_write_lock()
sync.release_write_lock()
def test_mutex():
sync = mutex_synchronizer('someident')
sync.acquire_write_lock()
sync.release_write_lock()
# beaker-1.12.1/tests/test_unicode_cache_keys.py
# coding: utf-8
"""If we try to use a character outside the ASCII range as a cache key, we get a
UnicodeEncodeError. See
https://bitbucket.org/bbangert/beaker/issue/31/cached-function-decorators-break-when-some
for more on this.
"""
from beaker._compat import u_
from beaker.cache import CacheManager
def eq_(a, b, msg=''):
assert a == b, msg
memory_cache = CacheManager(type='memory')
@memory_cache.cache('foo')
def foo(whatever):
return whatever
class bar(object):
@memory_cache.cache('baz')
def baz(self, qux):
return qux
@classmethod
@memory_cache.cache('bar')
def quux(cls, garply):
return garply
def test_A_unicode_encode_key_str():
eq_(foo('Espanol'), 'Espanol')
eq_(foo(12334), 12334)
eq_(foo(u_('Espanol')), u_('Espanol'))
eq_(foo(u_('Español')), u_('Español'))
b = bar()
eq_(b.baz('Espanol'), 'Espanol')
eq_(b.baz(12334), 12334)
eq_(b.baz(u_('Espanol')), u_('Espanol'))
eq_(b.baz(u_('Español')), u_('Español'))
eq_(b.quux('Espanol'), 'Espanol')
eq_(b.quux(12334), 12334)
eq_(b.quux(u_('Espanol')), u_('Espanol'))
eq_(b.quux(u_('Español')), u_('Español'))
def test_B_replacing_non_ascii():
"""we replace the offending character with other non ascii one. Since
the function distinguishes between the two it should not return the
past value
"""
assert foo(u_('Espaáol')) != u_('Español')
eq_(foo(u_('Espaáol')), u_('Espaáol'))
def test_C_more_unicode():
"""We again test the same stuff but this time we use
http://tools.ietf.org/html/draft-josefsson-idn-test-vectors-00#section-5
as keys"""
keys = [
# arabic (egyptian)
u_("\u0644\u064a\u0647\u0645\u0627\u0628\u062a\u0643\u0644\u0645\u0648\u0634\u0639\u0631\u0628\u064a\u061f"),
# Chinese (simplified)
u_("\u4ed6\u4eec\u4e3a\u4ec0\u4e48\u4e0d\u8bf4\u4e2d\u6587"),
# Chinese (traditional)
u_("\u4ed6\u5011\u7232\u4ec0\u9ebd\u4e0d\u8aaa\u4e2d\u6587"),
# czech
u_("\u0050\u0072\u006f\u010d\u0070\u0072\u006f\u0073\u0074\u011b\u006e\u0065\u006d\u006c\u0075\u0076\u00ed\u010d\u0065\u0073\u006b\u0079"),
# hebrew
u_("\u05dc\u05de\u05d4\u05d4\u05dd\u05e4\u05e9\u05d5\u05d8\u05dc\u05d0\u05de\u05d3\u05d1\u05e8\u05d9\u05dd\u05e2\u05d1\u05e8\u05d9\u05ea"),
# Hindi (Devanagari)
u_("\u092f\u0939\u0932\u094b\u0917\u0939\u093f\u0928\u094d\u0926\u0940\u0915\u094d\u092f\u094b\u0902\u0928\u0939\u0940\u0902\u092c\u094b\u0932\u0938\u0915\u0924\u0947\u0939\u0948\u0902"),
# Japanese (kanji and hiragana)
u_("\u306a\u305c\u307f\u3093\u306a\u65e5\u672c\u8a9e\u3092\u8a71\u3057\u3066\u304f\u308c\u306a\u3044\u306e\u304b"),
# Russian (Cyrillic)
u_("\u043f\u043e\u0447\u0435\u043c\u0443\u0436\u0435\u043e\u043d\u0438\u043d\u0435\u0433\u043e\u0432\u043e\u0440\u044f\u0442\u043f\u043e\u0440\u0443\u0441\u0441\u043a\u0438"),
# Spanish
u_("\u0050\u006f\u0072\u0071\u0075\u00e9\u006e\u006f\u0070\u0075\u0065\u0064\u0065\u006e\u0073\u0069\u006d\u0070\u006c\u0065\u006d\u0065\u006e\u0074\u0065\u0068\u0061\u0062\u006c\u0061\u0072\u0065\u006e\u0045\u0073\u0070\u0061\u00f1\u006f\u006c"),
# Vietnamese
u_("\u0054\u1ea1\u0069\u0073\u0061\u006f\u0068\u1ecd\u006b\u0068\u00f4\u006e\u0067\u0074\u0068\u1ec3\u0063\u0068\u1ec9\u006e\u00f3\u0069\u0074\u0069\u1ebf\u006e\u0067\u0056\u0069\u1ec7\u0074"),
# Japanese
u_("\u0033\u5e74\u0042\u7d44\u91d1\u516b\u5148\u751f"),
# Japanese
u_("\u5b89\u5ba4\u5948\u7f8e\u6075\u002d\u0077\u0069\u0074\u0068\u002d\u0053\u0055\u0050\u0045\u0052\u002d\u004d\u004f\u004e\u004b\u0045\u0059\u0053"),
# Japanese
u_("\u0048\u0065\u006c\u006c\u006f\u002d\u0041\u006e\u006f\u0074\u0068\u0065\u0072\u002d\u0057\u0061\u0079\u002d\u305d\u308c\u305e\u308c\u306e\u5834\u6240"),
# Japanese
u_("\u3072\u3068\u3064\u5c4b\u6839\u306e\u4e0b\u0032"),
# Japanese
u_("\u004d\u0061\u006a\u0069\u3067\u004b\u006f\u0069\u3059\u308b\u0035\u79d2\u524d"),
# Japanese
u_("\u30d1\u30d5\u30a3\u30fc\u0064\u0065\u30eb\u30f3\u30d0"),
# Japanese
u_("\u305d\u306e\u30b9\u30d4\u30fc\u30c9\u3067"),
# greek
u_("\u03b5\u03bb\u03bb\u03b7\u03bd\u03b9\u03ba\u03ac"),
# Maltese (Malti)
u_("\u0062\u006f\u006e\u0121\u0075\u0073\u0061\u0127\u0127\u0061"),
# Russian (Cyrillic)
u_("\u043f\u043e\u0447\u0435\u043c\u0443\u0436\u0435\u043e\u043d\u0438\u043d\u0435\u0433\u043e\u0432\u043e\u0440\u044f\u0442\u043f\u043e\u0440\u0443\u0441\u0441\u043a\u0438")
]
for i in keys:
eq_(foo(i),i)
def test_D_invalidate():
"""Invalidate cache"""
memory_cache.invalidate(foo)
eq_(foo('Espanol'), 'Espanol')
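The decorate-then-invalidate pattern exercised throughout this file memoizes a function per (namespace, arguments) pair. A self-contained sketch of that pattern, using a plain dict as a stand-in for beaker's memory backend (this is not beaker's implementation, just the contract the tests rely on):

```python
import functools


def cache_in(store, namespace):
    """Simplified stand-in for beaker's @cache decorator."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args):
            key = (namespace,) + args  # unicode args work fine as dict keys
            if key not in store:
                store[key] = func(*args)
            return store[key]
        return wrapper
    return decorator


store = {}
calls = []


@cache_in(store, 'demo')
def upper(value):
    calls.append(value)
    return value.upper()


assert upper('espanol') == 'ESPANOL'
assert upper('espanol') == 'ESPANOL'
assert calls == ['espanol']   # second call was served from the store
store.clear()                 # analogous to CacheManager.invalidate()
assert upper('espanol') == 'ESPANOL'
assert calls == ['espanol', 'espanol']
```

The UnicodeEncodeError from issue 31 arose because real backends must serialize the key, whereas a plain dict accepts any hashable object directly.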